We're currently training various neural networks using Keras, which is ideal because it has a nice interface and is relatively easy to use, but we'd like to be able to apply the trained models for inference in a C++ environment.
If you don't need GPU support in the environment you are deploying to, you could also use my library, frugally-deep. It is available on GitHub and published under the MIT License: https://github.com/Dobiasd/frugally-deep
frugally-deep allows running forward passes on already-trained Keras models directly in C++ without the need to link against TensorFlow or any other backend.
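As a rough sketch of the Keras-side workflow (the converter script path keras_export/convert_model.py and the file names are assumptions based on the project's README and may differ between versions): you save the trained model to HDF5 and run frugally-deep's converter, which produces a JSON file that the C++ side then loads.

from keras.models import Sequential
from keras.layers import Dense

# Hypothetical model, just to have something to save.
model = Sequential([Dense(4, activation='relu', input_shape=(4,)),
                    Dense(2, activation='softmax')])
model.compile(optimizer='adam', loss='categorical_crossentropy')

# Save without the optimizer state; frugally-deep only needs architecture + weights.
model.save('keras_model.h5', include_optimizer=False)

# Then, on the command line (paths are assumptions, check the README):
#   python3 keras_export/convert_model.py keras_model.h5 fdeep_model.json
# The resulting fdeep_model.json is what fdeep::load_model(...) reads in C++.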
Save your Keras model as an HDF5 file.
You can then do the conversion with the following code:
import os.path as osp
from keras.models import load_model
from keras import backend as K
from tensorflow.python.framework import graph_util
from tensorflow.python.framework import graph_io

# Load the trained model and freeze its graph: variables are converted to
# constants, so the resulting graph is self-contained and ready for inference.
weight_file_path = 'path to your keras model'
net_model = load_model(weight_file_path)
sess = K.get_session()
constant_graph = graph_util.convert_variables_to_constants(sess, sess.graph.as_graph_def(), ['name of the output node'])

# Serialize the frozen graph to a binary protobuf file.
graph_io.write_graph(constant_graph, 'output_folder_path', 'output.pb', as_text=False)
print('saved the constant graph (ready for inference) at: ', osp.join('output_folder_path', 'output.pb'))
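To sanity-check the exported graph before moving to C++, you can load the .pb back in Python (TensorFlow 1.x style, matching the code above) and run a forward pass. This is only a sketch: the tensor names 'input_1:0' and 'output_node:0', the file path, and the input shape are placeholders you would replace with the values from your own model.

import numpy as np
import tensorflow as tf

# Read the frozen graph produced above (path is a placeholder).
with tf.gfile.GFile('output_folder_path/output.pb', 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

# 'input_1:0' and 'output_node:0' are hypothetical tensor names; list the real
# ones with e.g. [n.name for n in graph_def.node].
with tf.Session(graph=graph) as sess:
    x = graph.get_tensor_by_name('input_1:0')
    y = graph.get_tensor_by_name('output_node:0')
    prediction = sess.run(y, feed_dict={x: np.zeros((1, 224, 224, 3), dtype=np.float32)})
    print(prediction.shape)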
Here is my sample code, which handles models with multiple inputs and multiple outputs: https://github.com/amir-abdi/keras_to_tensorflow
You can access the TensorFlow backend with:
import keras.backend.tensorflow_backend as K
Then you can call any TensorFlow utility or function like:
K.tf.ConfigProto
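For example, a common use is building a session configuration through K.tf and handing the session back to Keras. This is a sketch assuming the TensorFlow backend of pre-2.0 Keras, where the tensorflow_backend module re-exports the tensorflow module as tf and provides set_session.

import keras.backend.tensorflow_backend as K

# Configure TensorFlow via the backend module (pre-2.0 Keras assumption).
config = K.tf.ConfigProto()
config.gpu_options.allow_growth = True  # allocate GPU memory on demand
K.set_session(K.tf.Session(config=config))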
This seems to be answered in "Keras as a simplified interface to TensorFlow: tutorial", posted on The Keras Blog by Francois Chollet.
In particular, section II, "Using Keras models with TensorFlow".
Make sure you set the learning phase of the Keras backend so that layers whose behavior differs between training and inference (such as dropout or batch normalization) produce the proper values. Here is a discussion about it.
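As a minimal sketch of what this looks like when driving a Keras model from a raw TensorFlow session (the small model below is a hypothetical example for illustration): the learning-phase tensor is fed as 0 for inference and 1 for training, which controls dropout and batch-normalization behavior.

import numpy as np
import tensorflow as tf
from keras import backend as K
from keras.layers import Dense, Dropout
from keras.models import Sequential

sess = tf.Session()
K.set_session(sess)

# Hypothetical model, just to show the learning-phase flag in action.
model = Sequential([Dense(8, activation='relu', input_shape=(4,)),
                    Dropout(0.5),
                    Dense(2, activation='softmax')])

x = np.zeros((1, 4), dtype=np.float32)
# learning_phase=0 disables dropout and makes batch norm (if present) use its
# moving statistics, i.e. inference-mode behavior.
preds = sess.run(model.output, feed_dict={model.input: x, K.learning_phase(): 0})
print(preds)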