TensorFlow not being deterministic where it should be

旧巷少年郎 2021-01-14 06:46

I have a small network, trained [many hours] and saved to a checkpoint. Now I want to restore from the checkpoint in a different script and use it. I recreate the session: bu

1 Answer
  •  礼貌的吻别
    2021-01-14 06:55

    It seems this question was already answered in the comments, but no one has written down the answer explicitly yet, so here it is:

    You were expecting the computation graph to always return the same values even with different random seeds, because you thought there was no op in your graph that depends on the random seed.

    You forgot about the dropout.

    In any case, I would keep the random seed fixed anyway. Then dropout and any other random ops become deterministic, and your whole training can be as well. If you ever wonder how much variance different seeds introduce, you can explicitly try other random seeds.
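    To illustrate the point, here is a minimal sketch using NumPy rather than TensorFlow (to keep it self-contained): an inverted-dropout mask, like the one `tf.nn.dropout` applies, is just a draw from a random number generator, so fixing the seed makes its output reproducible across runs. In TensorFlow 1.x, fixing `tf.set_random_seed(...)` before building the graph plays the role of the fixed seed here.

    ```python
    import numpy as np

    def dropout(x, rate, rng):
        # Zero each element with probability `rate` and scale the survivors
        # by 1/(1-rate) -- the "inverted dropout" scheme.
        mask = rng.random(x.shape) >= rate
        return x * mask / (1.0 - rate)

    x = np.ones(8)

    # Two runs seeded identically produce the exact same dropout output...
    a = dropout(x, 0.5, np.random.default_rng(42))
    b = dropout(x, 0.5, np.random.default_rng(42))
    assert np.array_equal(a, b)

    # ...while a different seed generally yields a different mask.
    c = dropout(x, 0.5, np.random.default_rng(7))
    ```

    The same reasoning applies to any op that consumes randomness (weight initializers, data shuffling, and so on): with the seed fixed, the whole pipeline is reproducible; varying the seed lets you measure run-to-run variance explicitly.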
