I trained my CNN (VGG) through Google Colab and generated an .h5 file. Now the problem is, I can predict my output successfully through Google Colab, but when I download that .h5 trained model and try to load it locally on my system, it fails.
Wow, I just spent 6 hours of my life trying to figure this out. Dmitri posted a solution to this here: I trained a keras model on google colab. Now not able to load it locally on my system.
I'm just reposting it here because it worked for me.
This looks like some kind of serialization bug in Keras. If you wrap your load_model call in the CustomObjectScope below, all should work:
import keras
from keras.models import load_model
from keras.utils import CustomObjectScope
from keras.initializers import glorot_uniform
with CustomObjectScope({'GlorotUniform': glorot_uniform()}):
    model = load_model('imdb_mlp_model.h5')
Changing
from keras.models import load_model
to
from tensorflow.keras.models import load_model
solved my problem!
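To illustrate the fix above, here is a minimal sketch that saves and reloads a model using only the tensorflow.keras namespace (the model and file name are placeholders, not from the question):

```python
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential, load_model
from tensorflow.keras.layers import Dense

# A tiny stand-in model; the question's real model is a VGG-style CNN.
model = Sequential([Input(shape=(8,)), Dense(4, activation="relu"), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.save("model.h5")  # HDF5 format, as in the question

# Loading through the same tf.keras namespace avoids the
# 'GlorotUniform' deserialization error seen with mixed imports.
reloaded = load_model("model.h5")

x = np.zeros((1, 8), dtype="float32")
assert np.allclose(model.predict(x), reloaded.predict(x))
```

The key point is that `Sequential`, `Dense`, and `load_model` all come from `tensorflow.keras`; pulling some of them from standalone `keras` is what triggers the mismatch.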
To eliminate errors, import everything consistently from either Keras or TensorFlow. Mixing both of them in the same project may result in problems.
Something that helped me which wasn't in any of the answers: passing the initializer via the custom_objects argument of load_model:
custom_objects={'GlorotUniform': glorot_uniform()}
I had the same problem and fixed it this way: just don't save the optimizer with the model. Change the save line to this:
the_model.save(file_path, overwrite=True, include_optimizer=False)
The second parameter tells Keras whether to overwrite the model if the file already exists, and the third tells it not to save the optimizer with the model.
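A runnable sketch of this approach (model and file names are placeholders): note that a model saved without its optimizer loads fine for inference, but must be compiled again before any further training.

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential, load_model
from tensorflow.keras.layers import Dense

the_model = Sequential([Input(shape=(4,)), Dense(1)])
the_model.compile(optimizer="adam", loss="mse")

# overwrite=True replaces any existing file;
# include_optimizer=False drops the optimizer state from the .h5
the_model.save("model_no_opt.h5", overwrite=True, include_optimizer=False)

reloaded = load_model("model_no_opt.h5")
# No optimizer was saved, so re-compile before calling fit() again.
reloaded.compile(optimizer="adam", loss="mse")
```

Dropping the optimizer shrinks the file and sidesteps optimizer-state deserialization issues, at the cost of losing the training state (e.g. Adam's moment estimates).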
Edit: I ran into the problem again on another system today and this did not help me this time, so I saved the model config as JSON and the weights as h5, and used them to rebuild the model on the other machine. Save like this:
json_string = model.to_json()
# write json_string to a file
model.save_weights(weights_filepath, save_format="h5")
Rebuild the model like this:
# read the file back into json_string
model = keras.models.model_from_json(json_string)
model.load_weights(weights_filepath)
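A self-contained sketch of that JSON + weights round trip (file names are placeholders; newer Keras versions require the weights file name to end in `.weights.h5`, which older versions also accept):

```python
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential, model_from_json
from tensorflow.keras.layers import Dense

model = Sequential([Input(shape=(3,)), Dense(2)])

# Machine A: save the architecture and the weights separately
with open("model.json", "w") as f:
    f.write(model.to_json())
model.save_weights("model.weights.h5")

# Machine B: rebuild the architecture from JSON, then load the weights
with open("model.json") as f:
    rebuilt = model_from_json(f.read())
rebuilt.load_weights("model.weights.h5")

x = np.zeros((1, 3), dtype="float32")
assert np.allclose(model.predict(x), rebuilt.predict(x))
```

Because the architecture travels as plain JSON and the weights as raw arrays, nothing version-specific like optimizer or initializer objects needs to be deserialized, which is why this route avoids the error.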