tensorflow-serving

Tensorflow Serving Compile Error Using Docker on OSX

只愿长相守 submitted on 2019-12-10 11:28:45

Question: I'm trying to install TensorFlow Serving on OSX El Capitan using Docker but keep running into an error. Here is the tutorial I'm following: https://tensorflow.github.io/serving/docker.html Here is the command causing the error:

bazel test tensorflow_serving/...

Here's the error I'm getting:

for (int i = 0; i < suffix.size(); ++i) {
^
ERROR: /root/.cache/bazel/_bazel_root/f8d1071c69ea316497c31e40fe01608c/external/tf/tensorflow/core/kernels/BUILD:212:1: C++ compilation of rule '@tf//tensorflow

Serving a Keras model with Tensorflow Serving

这一生的挚爱 submitted on 2019-12-09 13:38:10

Question: The TensorFlow 1.12 release notes state: "Keras models can now be directly exported to the SavedModel format (tf.contrib.saved_model.save_keras_model()) and used with Tensorflow Serving". So I gave it a shot - I exported a simple model with this op using a single line. However, TensorFlow Serving doesn't recognize the model. I guess the problem is with the docker call, and maybe with a missing 'signature_defs' in the model definition. I would be thankful for info regarding the missing
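For reference, a minimal sketch of the one-line export the question refers to (the model architecture and export path below are placeholder assumptions). Note that save_keras_model writes a timestamped version subdirectory under the given base path, and TensorFlow Serving should be pointed at the base path, not at the version directory itself:

import tensorflow as tf

# Toy model standing in for the question's "simple model".
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

# Writes a timestamped version subdirectory under ./models/my_model,
# e.g. ./models/my_model/1543392000/.
export_path = tf.contrib.saved_model.save_keras_model(model, './models/my_model')
print(export_path)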

Import Error: no module named cloud.ml

余生长醉 submitted on 2019-12-08 07:12:24

Question: I am trying to follow the instructions to use local predictions in TensorFlow as described here. Running the command

gcloud ml-engine local predict --model-dir=~/PycharmProjects/nlc/export/1/ --json-instances=test.json

gives me the error:

ERROR: (gcloud.ml-engine.local.predict) Cannot import google.cloud.ml. Please verify "python -c 'import google.cloud.ml'" works. Please verify the installed cloudml sdk version with: "python -c 'import google.cloud.ml as cloudml; print cloudml.__version__'".

For a tensorflow SavedModel in pbtxt format, where is the device placement of an operation/node defined?

你说的曾经没有我的故事 submitted on 2019-12-08 04:00:33

Question: I have a SavedModel with saved_model.pbtxt and variables\, which was pre-trained on a single GPU, from this repo: https://github.com/sthalles/deeplab_v3. I'm trying to serve this SavedModel with tensorflow-serving, but it can only utilise GPU:0 on a multi-GPU machine. I learned from https://github.com/tensorflow/serving/issues/311 that tensorflow-serving loads the graph with tensorflow, and this model was trained on a single GPU. I tried to save the model with the clear_devices=True flag but no
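To the title question: the device placement is stored in the device field of each NodeDef inside the MetaGraphDef's graph_def. A minimal sketch (file names are assumptions) that clears those fields in a text-format SavedModel so the graph is no longer pinned to the GPU it was trained on:

from google.protobuf import text_format
from tensorflow.core.protobuf import saved_model_pb2

# Load the text-format SavedModel, clear every node's device field,
# and write it back in place.
saved_model = saved_model_pb2.SavedModel()
with open('saved_model.pbtxt', 'r') as f:
    text_format.Merge(f.read(), saved_model)

for meta_graph in saved_model.meta_graphs:
    for node in meta_graph.graph_def.node:
        node.ClearField('device')

with open('saved_model.pbtxt', 'w') as f:
    f.write(text_format.MessageToString(saved_model))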

tensorflow-serving signature for an XOR

半城伤御伤魂 submitted on 2019-12-08 03:59:46

Question: I am trying to export my first XOR NN using TensorFlow Serving, but I am not getting any result when I call the gRPC. Here is the code I use to predict the XOR:

import tensorflow as tf
sess = tf.Session()
from keras import backend as K
K.set_session(sess)
K.set_learning_phase(0)  # all new operations will be in test mode from now on
from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model import tag_constants, signature_constants, signature_def
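For comparison, a minimal gRPC client sketch for such an exported model; the host/port, model name, signature name and tensor keys below are assumptions that must match the exported SignatureDef:

import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Open a channel to the model server and build a Predict request.
channel = grpc.insecure_channel('localhost:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = 'xor'
request.model_spec.signature_name = 'serving_default'
request.inputs['inputs'].CopyFrom(
    tf.make_tensor_proto(np.array([[0.0, 1.0]], dtype=np.float32)))

response = stub.Predict(request, 10.0)  # 10 second timeout
print(response.outputs['outputs'])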

TensorFlow Serving: Pass image to classifier

左心房为你撑大大i submitted on 2019-12-08 03:02:10

Question: I have built a simple classifier in TensorFlow (Python, tensorflow 1.9.0 and tensorflow-serving 1.9.0) which classifies objects into one of 5 classes. Now I would like to serve that model. I have exported it and given it a classification signature (and only a classification signature):

classification_signature = tf.saved_model.signature_def_utils.build_signature_def(
    inputs={signature_constants.CLASSIFY_INPUTS: classification_inputs},
    outputs={
        signature_constants.CLASSIFY_OUTPUT_CLASSES:
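Since the model exposes only a classification signature, the client would use the Classify RPC with a serialized tf.Example rather than Predict. A hedged sketch, where the model name, feature key and host/port are assumptions that must match how the serving input was defined at export time:

import grpc
from tensorflow_serving.apis import classification_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

channel = grpc.insecure_channel('localhost:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Build a ClassificationRequest carrying the image bytes inside a tf.Example.
request = classification_pb2.ClassificationRequest()
request.model_spec.name = 'classifier'

with open('image.jpg', 'rb') as f:
    image_bytes = f.read()

example = request.input.example_list.examples.add()
example.features.feature['image/encoded'].bytes_list.value.append(image_bytes)

response = stub.Classify(request, 10.0)
print(response.result.classifications[0].classes)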

How to serve multiple versions of model via standard tensorflow serving docker image?

不想你离开。 submitted on 2019-12-07 21:48:40

Question: I'm new to TensorFlow Serving. I just tried TensorFlow Serving via docker with this tutorial and succeeded. However, when I tried it with multiple versions, it serves only the latest version. Is it possible to serve multiple versions at once? Or do I need to try something different?

Answer 1: This requires a ModelServerConfig, which will be supported by the next docker image, tensorflow/serving release 1.11.0 (available since 5 Oct 2018). Until then, you can create your own docker image, or use tensorflow/serving:nightly
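For reference, a sketch of the model config file that mechanism uses (the model name and base path are assumptions); model_version_policy { all {} } keeps every version under the base path loaded instead of only the latest:

model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
    model_version_policy {
      all {}
    }
  }
}

With the docker image, the file can be mounted into the container and passed to the server via the --model_config_file flag.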

In Tensorflow how to freeze saved model

若如初见. submitted on 2019-12-07 12:30:31

Question: This is probably a very basic question... but how do I convert checkpoint files into a single .pb file? My goal is to serve the model, probably using C++. These are the files that I'm trying to convert. As a side note, I'm using tflearn with tensorflow.

Edit 1: I found an article that explains how to do this: https://blog.metaflow.fr/tensorflow-how-to-freeze-a-model-and-serve-it-with-a-python-api-d4f3596b3adc The problem is that I'm stuck with the following error:

KeyError: "The name 'Adam'
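A common approach for this conversion, sketched under assumptions (the checkpoint prefix and output node name below are placeholders that must match the actual graph): re-import the meta graph and fold the variables into constants, keeping only the inference outputs, which prunes training-only ops such as the Adam slots that this KeyError typically points at:

import tensorflow as tf

# Restore the checkpoint into a fresh graph and fold the variables
# into constants reachable from the named output node.
with tf.Session(graph=tf.Graph()) as sess:
    saver = tf.train.import_meta_graph('model.tflearn.meta', clear_devices=True)
    saver.restore(sess, 'model.tflearn')
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, ['FullyConnected/Softmax'])

# Serialize the frozen graph into a single .pb file.
with tf.gfile.GFile('frozen_model.pb', 'wb') as f:
    f.write(frozen_graph_def.SerializeToString())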

Export a basic Tensorflow model to Google Cloud ML

好久不见. submitted on 2019-12-07 08:46:13

Question: I am trying to export my local tensorflow model to use it on Google Cloud ML and run predictions on it. I am following the tensorflow serving example with mnist data. There is quite a bit of difference in the way they have processed and used their input/output vectors, and it is not what you find in typical examples online. I am unsure how to set the parameters of my signatures:

model_exporter.init(
    sess.graph.as_graph_def(),
    init_op=init_op,
    default_graph_signature=exporter
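A hedged alternative sketch using the newer SavedModelBuilder API, which Cloud ML Engine accepts; the toy graph, tensor keys and export path below are assumptions standing in for the real model:

import tensorflow as tf
from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import signature_def_utils
from tensorflow.python.saved_model import tag_constants

# Toy stand-in for the real graph.
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
w = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, w) + b, name='y')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Export with an explicit predict signature under the default key.
    builder = saved_model_builder.SavedModelBuilder('./export/1')
    signature = signature_def_utils.predict_signature_def(
        inputs={'inputs': x}, outputs={'scores': y})
    builder.add_meta_graph_and_variables(
        sess, [tag_constants.SERVING],
        signature_def_map={
            signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature})
    builder.save()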

Tensorflow serving: request fails with object has no attribute 'unary_unary'

蓝咒 submitted on 2019-12-07 04:44:33

Question: I'm building a CNN text classifier using TensorFlow which I want to load in tensorflow-serving and query using the serving APIs. When I call the Predict() method on the gRPC stub I receive this error:

AttributeError: 'grpc._cython.cygrpc.Channel' object has no attribute 'unary_unary'

What I've done to date: I have successfully trained and exported a model suitable for serving (i.e., the signatures are verified and using tf.Saver I can successfully return a prediction). I can also load the
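One commonly reported cause of this error is mixing the two generations of the grpc Python API: a channel created with the modern grpc.insecure_channel does not expose the interface the old beta-era stub factory expects. A minimal sketch keeping channel and stub from the same generation (host/port are assumptions):

import grpc
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Pair the modern grpc channel with the matching generated stub;
# passing this channel to the older beta_create_PredictionService_stub
# factory is one common source of the 'unary_unary' AttributeError.
channel = grpc.insecure_channel('localhost:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)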