Swap a TensorFlow Dataset input pipeline with a placeholder after training


Since you already have a trained graph saved in a checkpoint, in theory the simplest solution for you is to export the inference graph via optimize_for_inference.

This tool works both for already-frozen graphs and, as in your case, for graphs with variables still defined. Assuming you go for the frozen-graph route, the first step is to transform your graph's variables into constants via:

python freeze_graph.py \
--input_graph=temp/path/graph.pbtxt \
--input_checkpoint=temp/path/your_model_name.ckpt \
--output_graph=frozen_model.pb \
--output_node_names=name_of_the_output_tensor_you_want_to_use

This will generate a new binary file called frozen_model.pb in which the Variable operations have been replaced with Const ops holding the values loaded from the checkpoint file.
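If you want to sanity-check the result, a minimal sketch like the following (assuming TF 1.x and the file name used above) loads frozen_model.pb and confirms that no Variable ops are left:

import tensorflow as tf

# Load the frozen GraphDef produced by freeze_graph.py
graph_def = tf.GraphDef()
with tf.gfile.GFile('frozen_model.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

# After freezing, Variable ops should be gone and Const ops should hold the weights
op_types = {node.op for node in graph_def.node}
print('Variable ops left:', [t for t in op_types if 'Variable' in t])
print('Const ops present:', 'Const' in op_types)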

Then, you need to generate the inference graph with:

python optimize_for_inference.py \
--input=frozen_model.pb \
--output=inference.pb \
--frozen_graph=True \
--input_names=IteratorGetNext \
--output_names=name_of_the_output_tensor_you_want_to_use

This will replace the IteratorGetNext node with a float placeholder. You might want to choose another node, in which case just change the name. You can also change the type of the generated placeholder via the --placeholder_type_enum option. In that case, you need to provide an integer value matching the datatype you want from the DataType enum.
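For reference, you can get the integer for --placeholder_type_enum straight from TensorFlow, and once inference.pb exists the generated placeholder can be fed like any other tensor. A minimal sketch, assuming TF 1.x, the file/node names used above, and a made-up input shape you should replace with your model's:

import numpy as np
import tensorflow as tf

# Integer to pass as --placeholder_type_enum (e.g. 1 for DT_FLOAT)
print(tf.float32.as_datatype_enum)

# Load the optimized inference graph
graph_def = tf.GraphDef()
with tf.gfile.GFile('inference.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')
    # The placeholder keeps the name of the node it replaced
    input_tensor = graph.get_tensor_by_name('IteratorGetNext:0')
    output_tensor = graph.get_tensor_by_name('name_of_the_output_tensor_you_want_to_use:0')
    with tf.Session(graph=graph) as sess:
        # Hypothetical input shape; use whatever your model actually expects
        batch = np.zeros((1, 224, 224, 3), dtype=np.float32)
        result = sess.run(output_tensor, feed_dict={input_tensor: batch})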

NOTE: I said "in theory" because, inspecting the graph generated in a test I ran on Inception, there still seem to be some ops in there that are not really necessary for inference. You might have to further process your graph via NVIDIA's Graph Surgeon or TF's graph transform tool.
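As a hedged sketch of the latter (transform_graph is built from the TensorFlow source tree, and the transform list here is just a common starting point, not necessarily what your graph needs):

bazel-bin/tensorflow/tools/graph_transforms/transform_graph \
--in_graph=inference.pb \
--out_graph=inference_clean.pb \
--inputs=IteratorGetNext \
--outputs=name_of_the_output_tensor_you_want_to_use \
--transforms='strip_unused_nodes fold_constants fold_batch_norms'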
