TensorBoard - Plot training and validation losses on the same graph?

不思量自难忘° 2020-12-02 18:34

Is there a way to plot both the training losses and validation losses on the same graph?

It's easy to have two separate scalar summaries for each of them individually, but this puts them on separate graphs.

8 Answers
  • 2020-12-02 19:09

    The work-around I have been doing is to use two SummaryWriters with different log dirs, one for the training set and one for the cross-validation set. Point TensorBoard at the parent directory; because both writers log under the same tag, the two curves are overlaid on the same graph (the original answer included a screenshot of the result).
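    A minimal sketch of this two-writer pattern, here using PyTorch's torch.utils.tensorboard.SummaryWriter (the same idea works with TensorFlow's summary writers); the log-dir names and loss values are placeholders:

    ```python
    import os
    import tempfile
    from torch.utils.tensorboard import SummaryWriter

    log_root = tempfile.mkdtemp()  # placeholder; use a real log directory
    train_writer = SummaryWriter(log_dir=os.path.join(log_root, "train"))
    val_writer = SummaryWriter(log_dir=os.path.join(log_root, "val"))

    for step in range(100):
        # Placeholder values standing in for real training/validation losses
        train_loss = 1.0 / (step + 1)
        val_loss = 1.2 / (step + 1)
        # Logging the SAME tag from both writers makes TensorBoard
        # overlay the two curves on one graph, one color per run dir.
        train_writer.add_scalar("loss", train_loss, step)
        val_writer.add_scalar("loss", val_loss, step)

    train_writer.close()
    val_writer.close()
    # Then launch: tensorboard --logdir <log_root>
    ```

    TensorBoard treats each subdirectory as a separate "run", so pointing --logdir at the parent shows both series on the shared "loss" chart.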

  • 2020-12-02 19:10

    TensorBoard is a really nice tool, but its declarative nature can make it difficult to get it to do exactly what you want.

    I recommend you check out Losswise (https://losswise.com) for plotting and keeping track of loss functions as an alternative to TensorBoard. With Losswise you specify exactly what should be graphed together:

    import losswise

    losswise.set_api_key("project api key")
    session = losswise.Session(tag='my_special_lstm', max_iter=10)
    loss_graph = session.graph('loss', kind='min')

    for x in range(10):
        # train an iteration of your model here, computing
        # train_loss and validation_loss for step x...
        loss_graph.append(x, {'train_loss': train_loss, 'validation_loss': validation_loss})

    session.done()

    You then get a single graph with both loss curves plotted together (the original answer included a screenshot here).

    Notice how the data is fed to a particular graph explicitly via the loss_graph.append call; that data then appears in your project's dashboard.

    In addition, for the above example Losswise automatically generates a table with columns for min(training_loss) and min(validation_loss), so you can easily compare summary statistics across a large number of experiments.
