I want to deploy a simple TensorFlow model and run it behind a REST service such as Flask. So far I have not found a good example of this on GitHub or here.
I am not ready to use TF Serving.
This GitHub project shows a working example of restoring a model checkpoint and serving it with Flask.
@app.route('/api/mnist', methods=['POST'])
def mnist():
    # invert and scale the posted pixel values, then flatten to a 1x784 vector
    input = ((255 - np.array(request.json, dtype=np.uint8)) / 255.0).reshape(1, 784)
    output1 = simple(input)
    output2 = convolutional(input)
    return jsonify(results=[output1, output2])
The online demo seems pretty quick.
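For completeness, here is a minimal sketch of calling that endpoint with the requests library, assuming the app runs locally on port 5000 (the all-zeros image is just to exercise the endpoint):

import requests
import numpy as np

# a 28x28 grayscale digit; the server flattens it to a 1x784 vector
image = np.zeros((28, 28), dtype=int).tolist()
r = requests.post('http://localhost:5000/api/mnist', json=image)
print(r.json())  # {'results': [..., ...]}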
I prefer not to put much data/model-processing code in the Flask RESTful file itself. I usually keep the TF model class and related code separate, i.e. it could look something like this:
# model init, loading data
cifar10_recognizer = Cifar10_Recognizer()
cifar10_recognizer.load('data/c10_model.ckpt')

@app.route('/tf/api/v1/SomePath', methods=['GET', 'POST'])
def upload():
    X = []
    if request.method == 'POST':
        if 'photo' in request.files:
            # upload handling goes here; obtain the input for tf
            f = request.files['photo']
            X = generate_X_c10(f)
        if len(X) != 0:
            # shape the desired result here
            answer = np.squeeze(cifar10_recognizer.predict(X))
            top3 = (-answer).argsort()[:3]
            res = ([cifar10_labels[i] for i in top3], [answer[i] for i in top3])
            # you can simply print this to the console
            # return 'Prediction answer: {}'.format(res)
            # or render some html with the result
            return fk.render_template('demos/c10_show_result.html',
                                      name=f.filename,
                                      result=res)
    if request.method == 'GET':
        # the html contains a simple form for uploading an image file
        return fk.render_template('demos/c10_classifier.html')
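generate_X_c10 is not shown above; here is a plausible sketch of it, assuming Pillow is available and the model expects a 1x32x32x3 float input in [0, 1] (the exact shape is an assumption):

from PIL import Image
import numpy as np

def generate_X_c10(f):
    # hypothetical preprocessing: resize the uploaded file to the
    # 32x32 RGB shape a CIFAR-10 model expects and scale to [0, 1]
    img = Image.open(f).convert('RGB').resize((32, 32))
    return (np.asarray(img, dtype=np.float32) / 255.0).reshape(1, 32, 32, 3)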
cifar10_recognizer.predict(X) is a simple function that runs the prediction op in a TF session:
def predict(self, image):
    logits = self.sess.run(self.model, feed_dict={self.input: image})
    return logits
P.S. Saving/restoring the model from a file is a very slow operation; try to avoid doing it while serving POST/GET requests.
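To make that concrete, here is a minimal sketch of such a wrapper class that restores the checkpoint once at startup and keeps the session open across requests; the tensor names are assumptions, and the graph is assumed to be stored as a MetaGraph next to the checkpoint:

import tensorflow as tf

class Cifar10_Recognizer(object):
    def load(self, ckpt_path):
        # restore the graph and weights once, at startup
        self.sess = tf.Session()
        saver = tf.train.import_meta_graph(ckpt_path + '.meta')
        saver.restore(self.sess, ckpt_path)
        graph = tf.get_default_graph()
        self.input = graph.get_tensor_by_name('input:0')   # assumed name
        self.model = graph.get_tensor_by_name('logits:0')  # assumed name

    def predict(self, image):
        # the session stays open, so each request only pays for the forward pass
        return self.sess.run(self.model, feed_dict={self.input: image})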
There are different ways to do this. Using pure TensorFlow is not very flexible, but it is relatively straightforward. The downside of this approach is that you have to rebuild the graph and initialize the variables in the code where you restore the model. There is a more elegant way shown in TensorFlow's skflow/contrib.learn, but it doesn't seem to be functional at the moment and its documentation is out of date.
I put together a short example on GitHub here that shows how you would send named GET or POST parameters to a Flask REST-deployed TensorFlow model.
The main code is then in a function that takes a dictionary built from the POST/GET data (a sketch of the parse_postget decorator follows the example):
@app.route('/model', methods=['GET', 'POST'])
@parse_postget
def apply_model(d):
    tf.reset_default_graph()
    with tf.Session() as session:
        n = 1
        x = tf.placeholder(tf.float32, [n], name='x')
        m = tf.Variable([1.0], name='m')
        b = tf.Variable([1.0], name='b')
        y = tf.add(tf.multiply(m, x), b)  # fit y_i = m * x_i + b
        y_act = tf.placeholder(tf.float32, [n], name='y_')
        error = tf.sqrt((y - y_act) * (y - y_act))
        # rebuilding the training op keeps the graph (including the
        # optimizer's slot variables) consistent with the checkpoint
        train_step = tf.train.AdamOptimizer(0.05).minimize(error)

        feed_dict = {x: np.array([float(d['x_in'])]),
                     y_act: np.array([float(d['y_star'])])}
        saver = tf.train.Saver()
        saver.restore(session, 'linear.chk')
        y_i, _, _ = session.run([y, m, b], feed_dict)
    return jsonify(output=float(y_i))
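The parse_postget decorator is not reproduced here; a minimal sketch of what it could look like, merging GET query arguments and POST form fields into one dict before handing it to the view (illustrative, not necessarily the exact code from the repository):

from functools import wraps
from flask import request

def parse_postget(f):
    @wraps(f)
    def wrapper():
        # flatten GET query args and POST form fields into a single dict
        d = request.args.to_dict()
        d.update(request.form.to_dict())
        return f(d)
    return wrapper

Calling the endpoint is then a one-liner, e.g. with requests against a local server:

import requests

r = requests.get('http://localhost:5000/model',
                 params={'x_in': 2.0, 'y_star': 5.0})
print(r.json())  # e.g. {'output': ...}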