How to optimize for inference a simple, saved TensorFlow 1.0.1 graph?

Asked by 野趣味 on 2021-01-30 11:39

I cannot successfully run the optimize_for_inference module on a simple, saved TensorFlow graph (Python 2.7; package installed via pip install tensorflow-gpu==1.0.1).

2 Answers

Answered by 小鲜肉 (OP), 2021-01-30 12:20

You are doing it wrong: the script's input is a GraphDef file, not the data part of a checkpoint. You need to either freeze the model into a .pb file, or get the graph's prototxt, and then run the optimize_for_inference script on that. From the script's documentation:

    This script takes either a frozen binary GraphDef file (where the weight variables have been converted into constants by the freeze_graph script), or a text GraphDef proto file (the weight variables are stored in a separate checkpoint file), and outputs a new GraphDef with the optimizations applied.

    1. Get the graph proto file using tf.train.write_graph (see the first sketch below).
    2. Get the frozen model with the freeze_graph tool, then run optimize_for_inference on the result (see the second sketch below).
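
    As a minimal sketch of step 1, here is how the graph proto and a checkpoint could be written in TF 1.x. The graph here (an "input" placeholder feeding an "output" matmul) and the /tmp/model paths are assumptions for illustration; substitute your own model and paths.

    ```python
    import tensorflow as tf

    # Toy graph: a placeholder feeding a variable through a matmul.
    x = tf.placeholder(tf.float32, shape=[None, 2], name="input")
    w = tf.Variable(tf.ones([2, 1]), name="w")
    y = tf.matmul(x, w, name="output")

    saver = tf.train.Saver()
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # Step 1: write the graph structure (no weights) as a text GraphDef.
        tf.train.write_graph(sess.graph_def, "/tmp/model", "graph.pbtxt", as_text=True)
        # Save the weights into a checkpoint for freeze_graph to consume later.
        saver.save(sess, "/tmp/model/model.ckpt")
    ```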
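
    And a sketch of step 2, calling the freeze_graph and optimize_for_inference tools through their Python modules instead of the command line. The node names "input"/"output" and the file paths come from the sketch above and are assumptions; they must match your actual graph.

    ```python
    import tensorflow as tf
    from tensorflow.python.tools import freeze_graph
    from tensorflow.python.tools import optimize_for_inference_lib

    # Freeze: fold the checkpoint variables into constants, producing a binary GraphDef.
    freeze_graph.freeze_graph(
        input_graph="/tmp/model/graph.pbtxt",
        input_saver="",
        input_binary=False,
        input_checkpoint="/tmp/model/model.ckpt",
        output_node_names="output",
        restore_op_name="save/restore_all",
        filename_tensor_name="save/Const:0",
        output_graph="/tmp/model/frozen.pb",
        clear_devices=True,
        initializer_nodes="")

    # Load the frozen GraphDef and optimize it for inference.
    graph_def = tf.GraphDef()
    with tf.gfile.GFile("/tmp/model/frozen.pb", "rb") as f:
        graph_def.ParseFromString(f.read())

    optimized = optimize_for_inference_lib.optimize_for_inference(
        graph_def, ["input"], ["output"], tf.float32.as_datatype_enum)

    with tf.gfile.GFile("/tmp/model/optimized.pb", "wb") as f:
        f.write(optimized.SerializeToString())
    ```

    The same two steps can be done with the freeze_graph and optimize_for_inference command-line scripts that ship with TensorFlow; the key point either way is that optimize_for_inference operates on a GraphDef (frozen .pb or .pbtxt), never on the checkpoint data files directly.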
