tensorflow-lite

java.lang.IllegalArgumentException: Cannot copy between a TensorFlowLite tensor with shape [2] and a Java object with shape [1, 2]

Submitted by 不羁岁月 on 2020-07-23 06:28:31

Question: I trained my own image-classification model in Keras and converted it to TFLite; now I want to use that model on Android through TensorFlow Lite. For this I used a GitHub project to get hands-on quickly; the link to the project is here: https://github.com/amitshekhariitbhu/Android-TensorFlow-Lite-Example/tree/master/app/src/main/java/com/amitshekhar/tflite But I got this error in Logcat: 2020-03-30 14:50:48.747 27421-27421/com.amitshekhar.tflite E
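The answer is cut off above, but the error message itself points at the usual fix: the Java object passed to the interpreter must have exactly the shape of the output tensor, [1, 2] (batch of 1, two class scores), so the output should be declared as `float[][] output = new float[1][2]` rather than a flat `float[2]`. A small NumPy sketch of the same shape rule (names are illustrative, not from the original post):

```python
import numpy as np

# The TFLite output tensor reports shape [1, 2]: one batch, two class scores.
tensor_out = np.zeros((1, 2), dtype=np.float32)

# A flat buffer of shape [2] holds the same 2 floats, but the interpreter
# compares shapes, not element counts, so the copy is rejected.
flat_buffer = np.zeros(2, dtype=np.float32)
assert tensor_out.shape != flat_buffer.shape

# Allocating the buffer with the tensor's exact shape resolves the mismatch.
matching_buffer = np.zeros((1, 2), dtype=np.float32)
np.copyto(matching_buffer, tensor_out)  # shapes agree, copy succeeds
print(matching_buffer.shape)
```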

tf.keras HDF5 Model and Keras HDF5 Model

Submitted by 耗尽温柔 on 2020-07-22 10:11:50

Question: I want to convert a Keras model to a TensorFlow Lite model. When I examined the documentation, it stated that tf.keras HDF5 models can be used as input. Does that mean I can use my saved Keras HDF5 model as input, or are a tf.keras HDF5 model and a Keras HDF5 model different things? Documentation: https://www.tensorflow.org/lite/convert Edit: I was able to convert my Keras model to a TensorFlow Lite model using this API, but I haven't tested it yet. My code: converter = tf.lite.TFLiteConverter.from
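In practice tf.keras can load a standalone-Keras HDF5 file, so both work as converter input. A minimal end-to-end sketch, assuming TensorFlow 2.x is installed (the tiny model here is illustrative, not the asker's):

```python
import tensorflow as tf

# A toy tf.keras model standing in for the asker's classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu", input_shape=(3,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.save("model.h5")  # HDF5 format

# Load the HDF5 file back (this accepts HDF5 files saved by tf.keras,
# and generally also ones saved by standalone Keras) and convert it.
loaded = tf.keras.models.load_model("model.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(loaded)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```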

How do I know my --output_arrays in tflite_convert

Submitted by 我与影子孤独终老i on 2020-07-21 18:44:08

Question: I'm trying to convert my .pb to .tflite using tflite_convert. How do I know my --output_arrays? I'm using ssd_mobilenet_v2_coco_2018_03_29. This is my current command: tflite_convert --output_file=C:/tensorflow1/models/research/object_detection/inference_graph/detect.tflite --graph_def_file=C:/tensorflow1/models/research/object_detection/inference_graph/tflite_graph.pb --inference_type=FLOAT --inference_input_type=QUANTIZED_UINT8 --input_arrays=ImageTensor --input_shapes=1,513,513,3 --output
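The command is truncated at --output, but the underlying question is how to discover the graph's output node names. For a tflite_graph.pb produced by the object_detection export_tflite_ssd_graph.py script, the outputs are typically TFLite_Detection_PostProcess (plus :1, :2, :3) and the input is normalized_input_image_tensor rather than ImageTensor. More generally, given the parsed GraphDef (e.g. via tf.compat.v1.GraphDef), candidate outputs are the nodes that no other node consumes. A dependency-free sketch of that idea, with a toy node list standing in for the real graph:

```python
# Toy stand-in for graph_def.node: pairs of (name, list of input names).
nodes = [
    ("normalized_input_image_tensor", []),
    ("conv1", ["normalized_input_image_tensor"]),
    ("box_encodings", ["conv1"]),
    ("TFLite_Detection_PostProcess", ["box_encodings"]),
]

# Every name that appears as an input to some node (":0"-style suffixes
# stripped) is consumed; the remaining nodes are candidate output arrays.
consumed = {inp.split(":")[0] for _, inputs in nodes for inp in inputs}
outputs = [name for name, _ in nodes if name not in consumed]
print(outputs)
```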

Converting a saved model into tflite - 'image_tensor' has invalid shape '[None, None, None, 3]'

Submitted by ♀尐吖头ヾ on 2020-07-09 06:48:08

Question: I'm trying to convert a saved model into tflite. Model name: ssd_mobilenet_v1_ppn_coco https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md I tried converting the model with this command: $tflite_convert --output_file=/tmp/dec.tflite --saved_model_dir=/ppn/saved_model/ and got this error: ValueError: None is only supported in the 1st dimension. Tensor 'image_tensor' has invalid shape '[None, None, None, 3]'. In order to get more info
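The error is the whole story: TFLite needs fixed tensor shapes, with None allowed only in the first (batch) dimension, while the zoo model's image_tensor has dynamic height and width. For SSD models the usual fix is to re-export with object_detection's export_tflite_ssd_graph.py, which produces a fixed-shape input (e.g. [1, 300, 300, 3]). A tiny sketch of the shape rule the converter enforces (the function name is illustrative, not a TF API):

```python
def tflite_shape_ok(shape):
    """TFLite accepts None (dynamic) only in the first, batch, dimension."""
    return all(dim is not None for dim in shape[1:])

assert tflite_shape_ok([None, 300, 300, 3])        # dynamic batch is fine
assert not tflite_shape_ok([None, None, None, 3])  # H and W must be fixed
print("shape rule holds")
```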

How to create a model easily convertible to TensorFlow Lite?

Submitted by 谁都会走 on 2020-06-23 14:27:28

Question: How can I create a TensorFlow model that can be converted to TensorFlow Lite (tflite) and used in an Android application? Following the examples in the Google ML Crash Course, I created a classifier and trained a model. I exported the model as a saved model. I wanted to convert the model to a .tflite file and use it for inference on Android. Soon (actually later) I realized that my model uses an unsupported operation: ParseExampleV2. Here is the classifier I'm using for training the model:
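The classifier code is cut off above, but ParseExampleV2 typically comes from estimator-style serving signatures that parse serialized tf.Example protos, and TFLite has no builtin kernel for it. Two common ways out: export a signature that takes raw tensors instead (e.g. a plain Keras model), or let the converter fall back to TensorFlow ops via SELECT_TF_OPS. A sketch of the second option, assuming TensorFlow 2.x (the toy model is illustrative, not the asker's):

```python
import tensorflow as tf

# A toy model; with a real estimator export, the ParseExampleV2 op would
# sit in the serving signature rather than in a model like this.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(2, input_shape=(4,)),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Allow ops without TFLite builtins (e.g. ParseExampleV2) to run through
# the TF "Flex" delegate; this grows the binary but unblocks conversion.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()
print(len(tflite_model) > 0)
```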

Cannot convert between a TensorFlowLite buffer with 307200 bytes and a Java Buffer with 270000 bytes

Submitted by 旧城冷巷雨未停 on 2020-06-16 19:19:07

Question: I am trying to run a pre-trained object-detection TensorFlow Lite model from the TensorFlow detection model zoo. I used the ssd_mobilenet_v3_small_coco model from that site, under the Mobile Models heading. Following the instructions under 'Running our model on Android', I commented out the model-download script to keep the assets from being overwritten: // apply from:'download_model.gradle' in the build.gradle file, and replaced the detect.tflite and labelmap.txt files in the assets directory. Build was
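The two byte counts in the error decode directly to input resolutions: 307200 = 320 × 320 × 3 bytes (ssd_mobilenet_v3_small expects a 320×320 uint8 image), while 270000 = 300 × 300 × 3 (the demo app's TF_OD_API_INPUT_SIZE defaults to 300). Raising that constant to 320 in the app makes the buffers match. The arithmetic, spelled out:

```python
def input_buffer_bytes(height, width, channels=3, bytes_per_value=1):
    """Bytes in a quantized (uint8) H x W x C image buffer."""
    return height * width * channels * bytes_per_value

model_side = input_buffer_bytes(320, 320)  # what the .tflite model expects
app_side = input_buffer_bytes(300, 300)    # what the demo app allocates
print(model_side, app_side)  # 307200 270000
```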

how to set input of Tensorflow Lite C++

Submitted by 与世无争的帅哥 on 2020-06-14 04:59:07

Question: I'm trying to test simple TensorFlow Lite C++ code with a TensorFlow Lite model. It takes two floats and computes XOR. However, when I change the inputs, the output doesn't change. I suspect the line interpreter->typed_tensor<float>(0)[0] = x is wrong, so the inputs aren't being applied properly. How should I change the code to make it work? This is my code: #include <stdio.h> #include <stdlib.h> #include <string> #include <vector> #include "tensorflow/contrib/lite/kernels/register.h" #include "tensorflow/contrib/lite/model.h"
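The code is truncated, but the suspicion is likely right: typed_tensor<float>(0) indexes the interpreter's flat tensor list, and tensor 0 is not necessarily the graph input. The C++ API's interpreter->inputs() returns the tensor indices of the real inputs, and typed_input_tensor<float>(0) resolves that mapping for you (after AllocateTensors() has been called). A dependency-free sketch of why the raw index can miss (the data is illustrative):

```python
# Mimic the interpreter's flat tensor list: some tensors hold weights or
# intermediates, and the real input may sit at any index.
tensors = [0.0, 0.0, 0.0, 0.0]  # values of tensors 0..3
input_indices = [2]             # like interpreter->inputs(): input is tensor 2

# Writing to tensors[0] (typed_tensor<float>(0)) never touches the input:
tensors[0] = 1.0
assert tensors[input_indices[0]] == 0.0  # the model never saw the value

# Writing via the input mapping (typed_input_tensor<float>(0)) does:
tensors[input_indices[0]] = 1.0
print(tensors)
```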

How can I build only TensorFlow lite and not all TensorFlow from source?

Submitted by 女生的网名这么多〃 on 2020-06-12 15:29:56

Question: I am trying to use the Edge TPU USB accelerator with an Intel Atom single-board computer and the C++ API for real-time inference. The C++ API for edgetpu is based on the TensorFlow Lite C++ API. I need to include header files from the tensorflow/lite directory (e.g. tensorflow/lite/interpreter.h). My question: can I build only the Lite part of TensorFlow (without the other operations used for training)? If yes, how can I do it? Installing everything would take a long time. Answer 1: Assuming that you are using a Linux-based
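The answer is truncated here. For reference, TensorFlow Lite has historically been buildable on its own, either through the Makefile scripts under tensorflow/lite/tools/make or via a single Bazel target; neither compiles the full training stack. A sketch of both routes, assuming a Linux checkout of the tensorflow repository (paths reflect the TF 1.x/2.x layout and may have moved in later releases):

```shell
# Route 1: Makefile-based static library (no Bazel needed).
./tensorflow/lite/tools/make/download_dependencies.sh
./tensorflow/lite/tools/make/build_lib.sh
# -> tensorflow/lite/tools/make/gen/<platform>/lib/libtensorflow-lite.a

# Route 2: Bazel, building only the TFLite shared library target.
bazel build -c opt //tensorflow/lite:libtensorflowlite.so
```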
