How to use evaluation_loop with train_loop in tf-slim

长情又很酷 2021-02-04 10:57

I'm trying to implement a few different models and train them on CIFAR-10, and I want to use TF-slim to do this. It looks like TF-slim has two main loops that are useful during training: train_loop and evaluation_loop.

3 Answers
  •  温柔的废话 2021-02-04 11:53

    Adding my 2 cents:

    You mentioned: "I currently have this model for the evaluation_loop hogging up an entire GPU, but it's rarely being used."

    An evaluation model usually needs less GPU memory than training. You can keep TensorFlow from grabbing the whole GPU by setting gpu_options.allow_growth to True in the session config, which lets the same GPU serve both training and evaluation.

    Example @ Training

    # TF 1.x; slim is tf.contrib.slim. train_tensor and train_log_dir come
    # from the surrounding training setup.
    import tensorflow as tf

    slim = tf.contrib.slim

    # Grow GPU memory on demand instead of reserving the whole device.
    session_config = tf.ConfigProto()
    session_config.gpu_options.allow_growth = True

    slim.learning.train(train_tensor,
                        logdir=train_log_dir,
                        local_init_op=tf.local_variables_initializer(),
                        save_summaries_secs=FLAGS.save_summaries_secs,
                        save_interval_secs=FLAGS.save_interval_secs,
                        session_config=session_config)
    

    Example @ validation

    # Same allow_growth setting on the evaluation side, so both jobs can
    # share one GPU.
    import tensorflow as tf

    slim = tf.contrib.slim

    session_config = tf.ConfigProto()
    session_config.gpu_options.allow_growth = True

    slim.evaluation.evaluation_loop(
        '',                               # master: '' runs a local session
        checkpoint_dir=train_log_dir,     # read checkpoints written by training
        logdir=train_log_dir,
        num_evals=FLAGS.num_eval_batches,
        eval_op=list(names_to_updates.values()),
        summary_op=tf.summary.merge(summary_ops),
        eval_interval_secs=FLAGS.eval_interval_secs,
        session_config=session_config)
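
    For context, eval_op and summary_ops above come from the metric setup in the evaluation graph, which the snippet doesn't show. A minimal sketch of how they are typically built with TF-Slim follows; the metric choices and the predictions/labels tensors are placeholders for whatever your model defines, not part of the original answer.

    # Hypothetical metric setup feeding the evaluation_loop call above.
    # predictions and labels stand in for the model's outputs and ground truth.
    names_to_values, names_to_updates = slim.metrics.aggregate_metric_map({
        'accuracy': slim.metrics.streaming_accuracy(predictions, labels),
        'precision': slim.metrics.streaming_precision(predictions, labels),
    })

    # One scalar summary per metric so evaluation_loop can write them to logdir.
    summary_ops = []
    for name, value in names_to_values.items():
        summary_ops.append(tf.summary.scalar(name, value))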
    
