TensorFlow: how to export estimator using TensorHub module?


Question


I have an estimator using a TensorHub text_embedding column, like so:

import pandas
import tensorflow as tf
import tensorflow_hub as hub

my_dataframe = pandas.DataFrame(columns=["title"])
# populate my_dataframe with title strings
labels = []
# populate labels with 0|1
embedded_text_feature_column = hub.text_embedding_column(
    key="title"
    ,module_spec="https://tfhub.dev/google/nnlm-en-dim128-with-normalization/1")


estimator = tf.estimator.LinearClassifier(
    feature_columns = [ embedded_text_feature_column ]
    ,optimizer=tf.train.FtrlOptimizer(
        learning_rate=0.1
        ,l1_regularization_strength=1.0
    )
    ,model_dir=model_dir
)

estimator.train(
    input_fn=tf.estimator.inputs.pandas_input_fn(
        x=my_dataframe
        ,y=labels
        ,batch_size=128
        ,num_epochs=None
        ,shuffle=True
        ,num_threads=5
    )
    ,steps=5000
)
export(estimator, "/tmp/my_model")

How can I export and serve the model so that it accepts raw strings as input for predictions? I have a serving_input_receiver_fn as follows (and have tried quite a few variations), but I'm confused about what it needs to look like so that I can serve the model (with saved_model_cli, say) and call it with title strings (or a simple JSON structure) as input.

def export(estimator, dir_path):
    def serving_input_receiver_fn():
        feature_spec = tf.feature_column.make_parse_example_spec([hub.text_embedding_column(
            key="title"
            ,module_spec="https://tfhub.dev/google/nnlm-en-dim128-with-normalization/1")])
        return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)

    estimator.export_savedmodel(
        export_dir_base=dir_path
        ,serving_input_receiver_fn=serving_input_receiver_fn()
    )

Answer 1:


If you want to feed raw strings, you might want to consider using the raw input receiver. This code:

feature_placeholder = {'title': tf.placeholder('string', [1], name='title_placeholder')}
serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(feature_placeholder)

estimator.export_savedmodel(dir_path, serving_input_fn)
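
To sanity-check the export from Python rather than the CLI, something like the following should also work. This is a minimal sketch assuming TensorFlow 1.x, where tf.contrib.predictor is available; note that export_savedmodel writes the model into a timestamped subdirectory of the export base, and the "inputs" key matches the serving_default signature shown below.

import os
from tensorflow.contrib import predictor

# export_savedmodel writes into a timestamped subdirectory of the
# export base; pick the most recent one.
export_base = "/tmp/my_model"
latest = os.path.join(export_base, sorted(os.listdir(export_base))[-1])

predict_fn = predictor.from_saved_model(latest)
# "inputs" is the input key of the serving_default signature
# (see the saved_model_cli output below).
print(predict_fn({"inputs": ["this is a test sentence"]}))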

will give you a SavedModel with the following input specification according to the SavedModel CLI:

saved_model_cli show --dir ./ --tag_set serve --signature_def serving_default

The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
    dtype: DT_STRING
    shape: (-1)
    name: title_placeholder_1:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
    dtype: DT_STRING
    shape: (-1, 2)
    name: linear/head/Tile:0
  outputs['scores'] tensor_info:
    dtype: DT_FLOAT
    shape: (-1, 2)
    name: linear/head/predictions/probabilities:0

You can provide a Python expression to the CLI to feed an input to the model and validate that it works:

saved_model_cli run --dir ./ --tag_set serve --signature_def \
serving_default --input_exprs "inputs=['this is a test sentence']"

Result for output key classes:
[[b'0' b'1']]
Result for output key scores:
[[0.5123377 0.4876624]]
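
If you serve the exported model with TensorFlow Serving, the same signature can be called with a simple JSON structure over the REST API. A hedged sketch, assuming the model is served under the name my_model on the default REST port 8501 (both the model name and the port are assumptions, not something established above):

import json
import requests

# POST to TensorFlow Serving's REST predict endpoint; "my_model"
# and port 8501 are assumed values for this example.
body = {"instances": ["this is a test sentence"]}
resp = requests.post(
    "http://localhost:8501/v1/models/my_model:predict",
    data=json.dumps(body))
print(resp.json())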


Source: https://stackoverflow.com/questions/51482730/tensorflow-how-to-export-estimator-using-tensorhub-module
