What is the best way to run a saved model with a different batch size in TensorFlow?

别跟我提以往 2021-02-06 01:53

I trained the CIFAR-10 example model from TensorFlow's repository with batch_size 128 and it worked fine. Then I froze the graph and managed to run it with C++ just like they do it in t…

2 Answers
谎友^ (OP) 2021-02-06 02:46

Is there a reason you need a fixed batch size in the graph?

I think a good way is to build the graph with a variable batch size, by putting None as the first dimension of the input placeholder. During training you can then pass the batch size as a flag to your data provider, so it feeds the desired number of examples in each iteration. A minimal sketch of this is shown below.
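Here is a minimal sketch of that idea in the TF 1.x API (which the CIFAR-10 example uses); the tiny model head on top of the placeholder is only illustrative, and the names `images` and `logits` are my own choices:

```python
import tensorflow as tf

# Batch dimension left as None: the same graph accepts any batch
# size at feed time. CIFAR-10 images are 32x32 with 3 channels.
images = tf.placeholder(tf.float32, shape=[None, 32, 32, 3], name="images")

# Illustrative head only; the real cifar10 example builds a conv net.
flat = tf.layers.flatten(images)
logits = tf.identity(tf.layers.dense(flat, 10), name="logits")
```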

After the model is trained, you can export it with tf.train.Saver(), whose save() call also writes the metagraph (the .meta file). To do inference, you load the exported files back and evaluate with any number of examples, even just one; see the sketch below.
Note that this is different from the frozen graph.
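Continuing the sketch above (so the graph with the None batch dimension and its variables already exist), the export/restore round trip could look like this; the path `model.ckpt` is arbitrary, and the tensor names `images:0` / `logits:0` come from the placeholder sketch:

```python
import numpy as np
import tensorflow as tf

# --- after training: save the variables; save() also writes model.ckpt.meta ---
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver = tf.train.Saver()
    saver.save(sess, "./model.ckpt")

# --- at inference time: rebuild the graph from the metagraph ---
tf.reset_default_graph()
with tf.Session() as sess:
    saver = tf.train.import_meta_graph("./model.ckpt.meta")
    saver.restore(sess, "./model.ckpt")
    graph = tf.get_default_graph()
    images = graph.get_tensor_by_name("images:0")
    logits = graph.get_tensor_by_name("logits:0")

    # The batch dimension is None, so any batch size works, even 1.
    one = sess.run(logits, feed_dict={images: np.zeros((1, 32, 32, 3), np.float32)})
    many = sess.run(logits, feed_dict={images: np.zeros((64, 32, 32, 3), np.float32)})
    print(one.shape, many.shape)  # (1, 10) (64, 10)
```

If you still need a frozen .pb for the C++ API, note that freezing only replaces variables with constants; a graph built with a None batch dimension keeps that flexibility after freezing as well.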
