Question
I got the results by running the code provided in this link: Neural Network – Predicting Values of Multiple Variables. I was able to compute losses, accuracy, etc. However, every time I run this code, I get a new result. Is it possible to get the same (consistent) result?
Answer 1:
The code is full of random.randint() calls! Furthermore, the weights are usually initialized randomly as well, and the batch_size also has an influence (although a pretty minor one) on the result:
- Y_train, X_test and X_train are generated randomly.
- Using adam as the optimizer means you'll be performing stochastic gradient descent, starting the iterations from a random initial point in order to converge.
- A batch_size of 8 means each batch will consist of 8 randomly selected samples.
Solution:
- Set a random seed in your code with np.random.seed(), so the same random values are generated every time.
- This doesn't cause much of an issue, only minor deviations.
- Same as 2.
If I find a way to get consistent sampling for the batch_size/epoch issue, I will edit my answer.
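A minimal sketch of that seeding step, assuming a NumPy-based pipeline: seed every source of randomness once, up front, and repeated runs produce identical draws. (For a Keras/TensorFlow model you would additionally seed the framework, e.g. tf.random.set_seed(); that call is an assumption here, since the question's exact stack isn't shown.)

```python
import random
import numpy as np

def set_global_seeds(seed):
    """Seed the common sources of randomness.

    A Keras/TensorFlow pipeline would also need the framework's
    own seed (e.g. tf.random.set_seed(seed)) -- assumed, not shown.
    """
    random.seed(seed)       # Python's built-in RNG
    np.random.seed(seed)    # NumPy's global RNG

set_global_seeds(42)
first = np.random.random(3)
set_global_seeds(42)
second = np.random.random(3)
print((first == second).all())  # True
```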
Answer 2:
There are lots of random arrays in there. Use np.random.seed() to get the same ones each time. For example:
import numpy as np

np.random.seed(42)
for _ in range(3):
    print(np.random.random(3))
Every time you run this code, you'll get the same result. On my machine:
[0.37454012 0.95071431 0.73199394]
[0.59865848 0.15601864 0.15599452]
[0.05808361 0.86617615 0.60111501]
Note that lots of other bits of the machine learning pipeline use randomization too. For example:
- Splitting into train, validation and test datasets with train_test_split().
- Setting initial weights in a neural network.
- Optimization pathways.
Most ML functions allow you to pass a seed as an argument. Have a look in the documentation. Depending on what you are doing, and which libraries you're using, you may or may not be able to make the entire pipeline reproducible.
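As a concrete example of passing a seed as an argument (scikit-learn calls this parameter random_state, e.g. train_test_split(X, y, random_state=42)), NumPy's Generator API takes the seed at construction time, so the stream is reproducible without touching any global state:

```python
import numpy as np

# Two generators built from the same seed produce identical streams,
# independently of np.random.seed() and of each other.
rng1 = np.random.default_rng(42)
rng2 = np.random.default_rng(42)
print((rng1.random(3) == rng2.random(3)).all())  # True
```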
You might also like this article or this one about getting reproducible results with Keras.
Source: https://stackoverflow.com/questions/58241065/result-changes-every-time-i-run-neural-network-code