Question
I'm looking for a way to implement a search of learning rate as described here: https://arxiv.org/pdf/1506.01186.pdf .
My network is implemented using the Estimator API and I'd like to stick to that, but unfortunately I'm not able to force the Estimator to skip saving checkpoints. Do you know a way to simply run one epoch of training without saving checkpoints?
Answer 1:
According to the docs for tf.estimator.RunConfig:

> If both save_checkpoints_steps and save_checkpoints_secs are None, then checkpoints are disabled.

Since save_checkpoints_steps already defaults to None, it is enough to also pass save_checkpoints_secs=None.
So the code is as follows:

run_config = tf.estimator.RunConfig(save_summary_steps=None,
                                    save_checkpoints_secs=None)
estimator = tf.estimator.Estimator(model_fn=model_fn, config=run_config)
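With checkpoints disabled, the learning-rate range test from the linked paper (Smith, "Cyclical Learning Rates for Training Neural Networks") boils down to training briefly at a sequence of learning rates that increase from a small lower bound to a large upper bound, and watching where the loss starts to diverge. As a minimal sketch of that schedule (this helper and its name are my own illustration, not part of the Estimator API):

```python
def range_test_lrs(min_lr, max_lr, num_steps):
    """Return exponentially spaced learning rates from min_lr to max_lr.

    In a range test, you train for a few steps at each rate (with
    checkpointing disabled, as above) and record the loss; a good
    learning rate lies just before the loss begins to blow up.
    """
    ratio = (max_lr / min_lr) ** (1.0 / (num_steps - 1))
    return [min_lr * ratio ** i for i in range(num_steps)]


# Example: sweep five rates spanning four orders of magnitude.
lrs = range_test_lrs(1e-5, 1e-1, 5)  # [1e-05, 1e-04, 1e-03, 1e-02, 1e-01]
```

Each rate would then be fed into your model_fn (e.g. via params) for a short estimator.train() call, with nothing written to the model directory.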
Source: https://stackoverflow.com/questions/48361703/how-to-run-estimator-train-without-saving-checkpoints