Converting Tensorflow Frozen Graph to UFF for TensorRT inference

被刻印的时光 ゝ submitted on 2020-08-10 05:32:09

Question


I want to use a TensorFlow model with TensorRT (C++) and first need to convert the .pb file to the .uff format. When executing this code:

uff_model = uff.from_tensorflow_frozen_model(
  frozen_file="my_tf_model.pb",
  output_nodes=["output_layer"],
  output_filename="tmp.uff",
  debug_mode=True,
)

I am getting this error message:

Traceback (most recent call last):
  File "/home/jodo/ILONA/object-detection/ssd/src/uff_converter/uff_converter.py", line 29, in <module>
    text=False,
  File "/home/jodo/miniconda3/envs/uff_converter/lib/python3.7/site-packages/uff/converters/tensorflow/conversion_helpers.py", line 228, in from_tensorflow_frozen_model
    graphdef.ParseFromString(frozen_pb.read())
google.protobuf.message.DecodeError: Error parsing message

The exported graph (my_tf_model.pb) was trained and saved with TensorFlow 2.0.0, but for the UFF converter I have to use TensorFlow 1.15.0. Could this be an issue, or should the .pb file be backward compatible?

Update: I also tested with a model trained with the same TensorFlow version the UFF converter uses (1.15.0) and got the same error.


Answer 1:


Answering my own question: my .pb file was not a frozen graph but part of the SavedModel format.
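
A quick way to confirm what a given .pb actually contains is to try parsing it as both protobuf types. This is a minimal sketch (not part of the original answer); the file path is a placeholder for your own model:

import tensorflow as tf
from tensorflow.core.protobuf import saved_model_pb2

# Placeholder path: the .pb file you intend to feed to the UFF converter.
data = open("my_tf_model.pb", "rb").read()

graph_def = tf.compat.v1.GraphDef()
try:
    graph_def.ParseFromString(data)
    print("Parses as a frozen GraphDef")
except Exception:
    # If it is not a GraphDef, try the SavedModel protobuf instead.
    saved_model = saved_model_pb2.SavedModel()
    saved_model.ParseFromString(data)  # raises DecodeError if it is neither
    print("Parses as a SavedModel protobuf")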

To fix this, convert it to a frozen graph and pass that frozen graph to the converter:

python -m tensorflow.python.tools.freeze_graph --input_saved_model_dir SAVED_MODEL_DIR

Then uff.from_tensorflow_frozen_model() should work.
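
If the freeze_graph CLI is inconvenient, the same freezing step can be done programmatically in TF 1.15. This is a rough sketch, assuming the SavedModel was exported with the default serving tag; the directory and output node names are placeholders:

import tensorflow as tf

SAVED_MODEL_DIR = "saved_model"   # placeholder: your SavedModel directory
OUTPUT_NODES = ["output_layer"]   # placeholder: your graph's output node(s)

with tf.compat.v1.Session(graph=tf.Graph()) as sess:
    # Load the SavedModel into the session's graph.
    tf.compat.v1.saved_model.loader.load(
        sess, [tf.saved_model.SERVING], SAVED_MODEL_DIR)
    # Replace variables with constants so the graph is self-contained.
    frozen_graph_def = tf.compat.v1.graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), OUTPUT_NODES)
    # Write a binary frozen .pb that the UFF converter accepts.
    tf.io.write_graph(frozen_graph_def, ".", "my_tf_model_frozen.pb",
                      as_text=False)

The resulting my_tf_model_frozen.pb can then be passed to uff.from_tensorflow_frozen_model() as in the question.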



Source: https://stackoverflow.com/questions/59345600/converting-tensorflow-frozen-graph-to-uff-for-tensorrt-inference
