I'm trying to get a MobileNetV2 model (last layers retrained on my data) to run on the Google Coral Edge TPU.
I've followed this tutorial https://www.tensorflow.org/li
This problem is fixed in TensorFlow 1.15-rc. Convert your model to TFLite with the new TF version; the resulting TFLite model will then work with the Edge TPU compiler.
Also add these lines, which set the TFLite model's input and output types to uint8. (I think it should be tf.int8, though.)
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
For details, see https://www.tensorflow.org/lite/performance/post_training_quantization
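To show those converter settings in context, here is a minimal end-to-end sketch of full-integer post-training quantization. It assumes the TF 2.x-style `tf.lite` API; the tiny Keras model and the random representative dataset are stand-ins for your retrained MobileNetV2 and real calibration images:

```python
import numpy as np
import tensorflow as tf

# Hypothetical tiny model standing in for the retrained MobileNetV2.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(4, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),
])

def representative_dataset():
    # Yield a few calibration batches; use real preprocessed images in practice.
    for _ in range(10):
        yield [np.random.rand(1, 32, 32, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Restrict to int8 ops and make the model's I/O uint8, as required for the Edge TPU.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()
with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `model_quant.tflite` can then be passed to the Edge TPU compiler.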