Use TensorBoard with Keras Tuner

悲&欢浪女 2021-02-06 12:33

I ran into an apparent circular dependency trying to use log data for TensorBoard during a hyper-parameter search done with Keras Tuner, for a model built with TF2. The typical …

1 Answer
  • 2021-02-06 12:41

    Keras Tuner creates a subdirectory for each run (this behavior is probably version dependent).

    I suspect that finding the right mix of versions is important.

    Here is how it works for me in JupyterLab.

    Prerequisites:

    1. pip requirements
        keras-tuner==1.0.1
        tensorboard==2.1.1
        tensorflow==2.1.0
        Keras==2.2.4
        jupyterlab==1.1.4
    

    2. JupyterLab installed, built, and running [standard build arguments: production:minimize]
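
    For reference, assuming pip runs in the same Python environment, the version pins above can be installed in one line:

    pip install keras-tuner==1.0.1 tensorboard==2.1.1 tensorflow==2.1.0 Keras==2.2.4 jupyterlab==1.1.4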

    Here is the actual code. First I define the log folder and the callbacks:

    import datetime
    import tensorflow as tf
    from tensorflow.keras.callbacks import EarlyStopping
    
    # run parameter: one log folder per run, named by timestamp
    log_dir = "logs/" + datetime.datetime.now().strftime("%m%d-%H%M")
    
    # training meta: stop early once the loss stops improving
    stop_callback = EarlyStopping(
        monitor='loss', patience=1, verbose=0, mode='auto')
    
    # TensorBoard callback writing histograms, embeddings and the graph
    hist_callback = tf.keras.callbacks.TensorBoard(
        log_dir=log_dir,
        histogram_freq=1,
        embeddings_freq=1,
        write_graph=True,
        update_freq='batch')
    
    print("log_dir", log_dir)
    

    Then I define my hypermodel, which I do not want to disclose.
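
    Since the hypermodel itself is not disclosed, here is a minimal hypothetical sketch of what get_my_hpyermodel could return. The layers, input shape, and tuned values are purely illustrative assumptions; only the pattern of a build function that takes an hp object and returns a compiled model matters.

    from tensorflow import keras
    
    def get_my_hpyermodel():
        # Hypothetical stand-in -- the real model is not disclosed.
        # Keras Tuner accepts any callable that takes a HyperParameters
        # object `hp` and returns a compiled model.
        def build_model(hp):
            model = keras.Sequential()
            model.add(keras.layers.Dense(
                units=hp.Int('units', min_value=32, max_value=256, step=32),
                activation='relu',
                input_shape=(10,)))  # input shape is an illustrative assumption
            model.add(keras.layers.Dense(1))
            model.compile(
                optimizer=keras.optimizers.Adam(
                    hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])),
                loss='mse')  # matches the tuner objective 'loss' below
            return model
        return build_model

    Afterwards I set up the hyperparameter search: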

    from kerastuner.tuners import Hyperband
    
    hypermodel = get_my_hpyermodel()
    
    tuner = Hyperband(
        hypermodel,
        max_epochs=40,
        objective='loss',
        executions_per_trial=5,
        directory=log_dir,  # tuner output goes under the TensorBoard log folder
        project_name='test'
    )
    

    which I then execute:

    # train_data/labels and val_data/val_labels are the author's datasets (not shown)
    tuner.search(
        train_data,
        labels,
        epochs=10,
        validation_data=(val_data, val_labels),
        callbacks=[hist_callback],
        use_multiprocessing=True)
    
    tuner.search_space_summary()
    

    While the notebook with this code searches for adequate hyperparameters, I monitor the loss in another notebook. Since TF v2, TensorBoard can be launched via a magic function:

    Cell 1

    import tensorboard
    

    Cell 2

    %load_ext tensorboard
    

    Cell 3

    %tensorboard --logdir 'logs/'
    

    Side note: since I run JupyterLab in a Docker container, I have to specify the appropriate address and port for TensorBoard and also forward that port in the Dockerfile.
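
    The exact values are not shown here; as a sketch, assuming TensorBoard's default port 6006, the magic call and the Docker side could look like this (host, port, and mapping are illustrative):

    # in the monitoring notebook: bind to all interfaces so the host can reach it
    %tensorboard --logdir 'logs/' --host 0.0.0.0 --port 6006
    
    # in the Dockerfile: document the TensorBoard port
    EXPOSE 6006
    
    # when starting the container, publish the port
    docker run -p 6006:6006 <image>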

    The result is not really predictable for me... I do not yet understand when to expect histograms and distributions in TensorBoard. For some runs the loading time seems really excessive, so have patience.

    Under Scalars I find a list of the runs as follows:

    "logdir"/"model_has"/execution[iter]/[train/validation]

    E.g.
    0101-1010/bb7981e03d05b05106d8a35923353ec46570e4b6/execution0/train
    0101-1010/bb7981e03d05b05106d8a35923353ec46570e4b6/execution0/validation
