Hard-swish for TFLite

Submitted by 时光怂恿深爱的人放手 on 2021-01-03 22:24:09

Question


I have a custom neural network written in TensorFlow.Keras that uses the hard-swish function as its activation (as in the MobileNetV3 paper):

Implementation:

import tensorflow as tf

def swish(x):
    # hard-swish: x * relu6(x + 3) / 6
    return x * tf.nn.relu6(x + 3) / 6
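
For context, a minimal sketch of how such a custom activation could be wired into a tf.keras model (the input shape and layer sizes here are illustrative, not the asker's actual network):

import tensorflow as tf

def swish(x):
    return x * tf.nn.relu6(x + 3) / 6

inputs = tf.keras.Input(shape=(224, 224, 3), name="input_1")
x = tf.keras.layers.Conv2D(16, 3, padding="same")(inputs)
x = tf.keras.layers.Activation(swish)(x)  # custom hard-swish activation
outputs = tf.keras.layers.Conv2D(1, 3, activation="sigmoid")(x)
model = tf.keras.Model(inputs, outputs)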

I am running quantization-aware training and writing a protobuf file at the end. Then I use this command to convert to tflite (and finally deploy it on the EdgeTPU):

tflite_convert \
  --output_file test.tflite \
  --graph_def_file=test.pb \
  --inference_type=QUANTIZED_UINT8 \
  --input_arrays=input_1 \
  --output_arrays=conv2d_3/Sigmoid \
  --mean_values=0 \
  --std_dev_values=255 \
  --default_ranges_min=0 \
  --default_ranges_max=6

This works perfectly when I am not dividing by 6; however, when dividing by 6 I get this error:

Unimplemented: this graph contains an operator of type Div for which the quantized form is not yet implemented.

I am using TF 1.14 to train and the latest TF 1.15 nightly build to convert to TFLite. I am struggling to get TF 2.x to work because of some strange HDF5 incompatibilities, but it would be great if someone knew how to circumvent this issue. Thanks!


Answer 1:


Since it is division by a constant, you could just multiply by (a close approximation of) its inverse:

import tensorflow as tf

def swish(x):
    # Mul has a quantized kernel; Div does not
    return x * tf.nn.relu6(x + 3) * 0.16666667
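
As a quick sanity check, here is a NumPy sketch (not from the original answer) showing that the multiply-by-inverse variant matches the division variant to within float32 tolerance:

import numpy as np

def hard_swish_div(x):
    return x * np.clip(x + 3, 0, 6) / 6  # original: divide by 6

def hard_swish_mul(x):
    return x * np.clip(x + 3, 0, 6) * 0.16666667  # inverse-multiply variant

x = np.linspace(-6, 6, 1001, dtype=np.float32)
print(np.max(np.abs(hard_swish_div(x) - hard_swish_mul(x))))  # ~1e-7

The tiny constant error is far below the resolution of a uint8 quantization step, and the converter now sees a Mul op, for which a quantized implementation exists, instead of a Div.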


Source: https://stackoverflow.com/questions/60336568/hard-swish-for-tflite
