I created a model using the Keras library and saved the model as .json and its weights with a .h5 extension. How can I download these onto my local machine?
To download the model to the local system, the following code works. Downloading the JSON file:
from google.colab import files

model_json = model.to_json()
with open("model1.json", "w") as json_file:
    json_file.write(model_json)
files.download("model1.json")
Downloading the weights:
model.save_weights('weights.h5')
files.download('weights.h5')
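Once both files are on your local machine, a minimal sketch for rebuilding the model from them (filenames taken from the snippet above):
# rebuild the architecture from the JSON, then restore the weights
from keras.models import model_from_json

with open("model1.json") as json_file:
    model = model_from_json(json_file.read())
model.load_weights("weights.h5")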
This worked for me. Use the PyDrive API:
!pip install -U -q PyDrive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
# 1. Authenticate and create the PyDrive client.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
# 2. Save the Keras model or weights on Google Drive
# the file is first created in the Colab working directory
model.save('model.h5')
model_file = drive.CreateFile({'title' : 'model.h5'})
model_file.SetContentFile('model.h5')
model_file.Upload()
# the uploaded file can later be fetched again via its id
drive.CreateFile({'id': model_file.get('id')})
The same for the weights:
model.save_weights('model_weights.h5')
weights_file = drive.CreateFile({'title' : 'model_weights.h5'})
weights_file.SetContentFile('model_weights.h5')
weights_file.Upload()
drive.CreateFile({'id': weights_file.get('id')})
Now, check your Google Drive.
On the next run, try reloading the weights:
# 3. reload weights from google drive into the model
# use (get shareable link) to get file id
last_weight_file = drive.CreateFile({'id': '1sj...'})
last_weight_file.GetContentFile('last_weights.mat')
model.load_weights('last_weights.mat')
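The file id is the long token in the shareable link, i.e. the <id> part of a URL of the form https://drive.google.com/file/d/<id>/view.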
A better, newer way to do it (after a Colab update); the previous approach also still works:
# Load the Drive helper and mount
from google.colab import drive
drive.mount('/content/drive')
You will be prompted for authorization: go to the URL shown in a browser (something like accounts.google.com/o/oauth2/auth?client_id=...), obtain the auth code from that link, and paste it into the prompt.
Then you can use the mounted drive like a normal disk. Save the weights, or even the full model, directly into it (note the path under /content/drive):
model.save_weights('/content/drive/My Drive/my_model_weights.h5')
model.save('/content/drive/My Drive/my_model.h5')
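To reload it in a later session, remount the drive and load from the same path (a minimal sketch, assuming the path used above):
from keras.models import load_model

# drive must be re-mounted with drive.mount('/content/drive') in the new session
model = load_model('/content/drive/My Drive/my_model.h5')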
Even better, use callbacks: a ModelCheckpoint automatically checks after each epoch whether the model did better than the best one saved so far, and keeps the one with the best validation loss.
from keras.callbacks import EarlyStopping, ReduceLROnPlateau, ModelCheckpoint

my_callbacks = [
    EarlyStopping(patience=4, verbose=1),
    ReduceLROnPlateau(factor=0.1, patience=3, min_lr=0.00001, verbose=1),
    ModelCheckpoint(filepath=filePath + 'my_model.h5',  # filePath: your save directory
                    verbose=1, save_best_only=True, save_weights_only=False)
]
And pass the callbacks to model.fit_generator:
model.fit_generator(generator = train_generator,
epochs = 10,
verbose = 1,
validation_data = vald_generator,
callbacks = my_callbacks)
You can load it later, even if it was compiled with a user-defined loss function:
from keras.models import load_model
model = load_model(filePath + 'my_model.h5',
custom_objects={'loss':balanced_cross_entropy(0.20)})
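Note that balanced_cross_entropy is the answerer's own loss factory, not a Keras built-in; a minimal sketch of what such a factory could look like (the exact weighting scheme is an assumption):
import keras.backend as K

def balanced_cross_entropy(beta):
    # beta weights the positive class; this is one common formulation,
    # not necessarily the answerer's original definition
    def loss(y_true, y_pred):
        y_pred = K.clip(y_pred, K.epsilon(), 1 - K.epsilon())  # avoid log(0)
        pos = -beta * y_true * K.log(y_pred)
        neg = -(1 - beta) * (1 - y_true) * K.log(1 - y_pred)
        return K.mean(pos + neg)
    return loss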
files.download does not let you directly download large files. A workaround is to save your weights on Google Drive, using the PyDrive snippet below. Just change filename.csv to your weights.h5 file.
# Install the PyDrive wrapper & import libraries.
# This only needs to be done once in a notebook.
!pip install -U -q PyDrive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
# Authenticate and create the PyDrive client.
# This only needs to be done once in a notebook.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
# Create & upload a file.
uploaded = drive.CreateFile({'title': 'filename.csv'})
uploaded.SetContentFile('filename.csv')
uploaded.Upload()
print('Uploaded file with ID {}'.format(uploaded.get('id')))
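To fetch the file back in a later session, you can reference it by the id printed above (a minimal sketch):
# download the uploaded file again by its Drive id
downloaded = drive.CreateFile({'id': uploaded.get('id')})
downloaded.GetContentFile('filename.csv')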
Try this:
from google.colab import files
files.download("model.json")
Simply use model.save(). Below, I created a variable to store the name of the model and then saved it with model.save(). I used Google Colab, but it should work in other environments too.
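The original answer showed this only as a screenshot; a minimal sketch of what it likely contained (the variable name is an assumption):
model_name = 'my_model.h5'   # the filename stored in a variable
model.save(model_name)       # saves architecture, weights and optimizer state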
You can run the following after training.
import tensorflow as tf

# TensorFlow 1.x: save the session's variables to a checkpoint
saver = tf.train.Saver()
save_path = saver.save(session, "data/dm.ckpt")
print('done saving at', save_path)
Then check the location where the ckpt files were saved.
import os
print( os.getcwd() )
print( os.listdir('data') )
Finally, download the files with the weights:
from google.colab import files
files.download( "data/dm.ckpt.meta" )