Running a Tensorflow model on Android

猫巷女王i · 2021-01-31 20:21

I'm trying to figure out the workflow for training and deploying a TensorFlow model on Android. I'm aware of the other questions similar to this one on Stack Overflow, but none …

2 Answers
  •  陌清茗 (OP) · 2021-01-31 20:35

    git clone --recurse-submodules https://github.com/tensorflow/tensorflow.git
    

    Note: --recurse-submodules is important to pull submodules.

    Install Bazel from here. Bazel is the primary build system for TensorFlow. Next, edit the WORKSPACE file, which is in the root directory of the TensorFlow repository we cloned earlier.

    # Uncomment and update the paths in these entries to build the Android demo.
    #android_sdk_repository(
    #    name = "androidsdk",
    #    api_level = 23,
    #    build_tools_version = "25.0.1",
    #    # Replace with path to Android SDK on your system
    #    path = "",
    #)
    #
    #android_ndk_repository(
    #    name="androidndk",
    #    path="",
    #    api_level=14)
    

    Uncomment these entries and fill in your own SDK and NDK paths, like so:

    android_sdk_repository(
        name = "androidsdk",
        api_level = 23,
        build_tools_version = "25.0.1",
        # Replace with path to Android SDK on your system
        path = "/Users/amitshekhar/Library/Android/sdk/",
    )
    android_ndk_repository(
        name="androidndk",
        path="/Users/amitshekhar/Downloads/android-ndk-r13/",
        api_level=14)
    

    Then build the .so file.

    bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so \
       --crosstool_top=//external:android/crosstool \
       --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
       --cpu=armeabi-v7a
    

    Replace armeabi-v7a with your desired target architecture. The library will be located at:

    bazel-bin/tensorflow/contrib/android/libtensorflow_inference.so
    

    To build the Java counterpart:

    bazel build //tensorflow/contrib/android:android_tensorflow_inference_java
    

    We can find the JAR file at:

    bazel-bin/tensorflow/contrib/android/libandroid_tensorflow_inference_java.jar
    

    Now we have both the JAR and the .so file. I have already built both, so you can use them directly from the project.

    Put libandroid_tensorflow_inference_java.jar in the libs folder, then right-click it and add it as a library.

    compile files('libs/libandroid_tensorflow_inference_java.jar')
    

    Create a jniLibs folder in the main directory and put libtensorflow_inference.so in the jniLibs/armeabi-v7a/ folder.
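
    The two steps above (adding the JAR and placing the .so under jniLibs) can also be expressed in the module's build.gradle. This is a sketch assuming the default project layout; the directory names are the ones used above:

    ```groovy
    android {
        sourceSets {
            main {
                // Tell Gradle where to find the prebuilt native library
                // (src/main/jniLibs/armeabi-v7a/libtensorflow_inference.so).
                jniLibs.srcDirs = ['src/main/jniLibs']
            }
        }
    }

    dependencies {
        // The Java API for TensorFlow inference, built earlier with Bazel.
        compile files('libs/libandroid_tensorflow_inference_java.jar')
    }
    ```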

    Now, we will be able to call TensorFlow Java API.

    The TensorFlow Java API exposes all the required methods through the TensorFlowInferenceInterface class.

    Now we can load the model by passing its path to the TensorFlow Java API and run inference on it.
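
    As a minimal sketch, loading a frozen graph from the APK's assets and running it looks roughly like this. The model file name, the input/output node names, and the output size below are hypothetical placeholders; use the names from your own exported graph:

    ```java
    import android.content.res.AssetManager;
    import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

    public class Classifier {
        // Hypothetical values -- replace with your own graph's details.
        private static final String MODEL_FILE = "file:///android_asset/tensorflow_model.pb";
        private static final String INPUT_NODE = "input";
        private static final String OUTPUT_NODE = "output";
        private static final int OUTPUT_SIZE = 10;

        private final TensorFlowInferenceInterface inferenceInterface;

        public Classifier(AssetManager assetManager) {
            // Loads the frozen graph from the APK's assets folder.
            inferenceInterface = new TensorFlowInferenceInterface(assetManager, MODEL_FILE);
        }

        public float[] predict(float[] input) {
            float[] output = new float[OUTPUT_SIZE];
            // Feed the input tensor, run the graph, and fetch the result.
            inferenceInterface.feed(INPUT_NODE, input, 1, input.length);
            inferenceInterface.run(new String[] {OUTPUT_NODE});
            inferenceInterface.fetch(OUTPUT_NODE, output);
            return output;
        }
    }
    ```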

    I have written a complete blog here.
