Following converting-trained-tensorflow-model-to-protobuf, I am trying to save/restore a TF graph, without success.
Here is the saver:
ParseFromString needs a binary serialized protocol buffer; for the human-readable representation you need to use text_format.Merge, as used here.
The GraphDef.ParseFromString() method (and, in general, the ParseFromString() method on any Python protobuf wrapper) expects a string in the binary protocol buffer format. If you pass as_text=False to tf.train.write_graph(), then the file will be in the appropriate format.
Otherwise you can do the following to read the text-based format:
from google.protobuf import text_format
# ...
graph_def = tf.GraphDef()
# Merge parses the text-format representation into graph_def.
text_format.Merge(proto_b, graph_def)
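To make the binary-vs-text distinction concrete, here is a minimal, self-contained sketch. It uses Struct, a generic protobuf message, purely as a stand-in so the example runs without TensorFlow; as the answer notes, ParseFromString behaves the same on any Python protobuf wrapper, including GraphDef.

```python
# Sketch: ParseFromString only accepts the binary wire format, while
# text_format.Merge accepts the human-readable text format.
# Struct is used as a stand-in message; GraphDef behaves the same way.
from google.protobuf import text_format
from google.protobuf.struct_pb2 import Struct

msg = Struct()
msg.fields["name"].string_value = "demo"

binary_bytes = msg.SerializeToString()        # binary wire format (bytes)
text_repr = text_format.MessageToString(msg)  # human-readable text format (str)

# Binary round trip: ParseFromString expects the binary bytes.
restored_bin = Struct()
restored_bin.ParseFromString(binary_bytes)

# Text round trip: text_format.Merge expects the text representation.
restored_txt = Struct()
text_format.Merge(text_repr, restored_txt)

assert restored_bin == msg and restored_txt == msg
```

Feeding `text_repr` to ParseFromString (or `binary_bytes` to Merge) raises a parse error, which is exactly the failure mode described in the question.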
I tried to load the model via the Java API, which accepts only the binary format, but the Python side (where we use contrib.Estimator) produces the model file in text format. I found a model file converter online, and it seems to work fine. This might also solve the original issue (use the binary model loader) if you have an existing text-format model file.