I want to create an Android app which uses BOW + SVM in native code (C++) for prediction. Unfortunately, I have a problem with building the native part, since the non-free module
May I add that, in order to use the new libraries in the running application, the following steps are needed:
1) In your folder libnonfree/libs/[TARGET PLATFORM]/, there are now 3 files:
- libgnustl_shared.so
- libnonfree.so
- libopencv_java.so
In your own project (my IDE is Android Studio), you have a folder src/main/ with the subfolders:
- java
- res
Create a new folder there (if it does not already exist) called "jniLibs" [this folder is picked up automatically by Gradle].
Copy the platform folders under "libnonfree/libs/" (each containing the 3 files mentioned above) into the "jniLibs" folder. You end up with a structure like this:
/app/src/main/jniLibs/[armeabi, armeabi-v7a, ...]/[libgnustl_shared.so, libopencv_java.so, libnonfree.so]
2) Somewhere in your code, you have a line like this:
OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_10, this, mLoaderCallback);
This line tells your app to dynamically load the pre-compiled library from the locally installed OpenCV Manager. In order to use the self-compiled non-free version instead, we replace the above line with the following:
if (!OpenCVLoader.initDebug()) {
    // static initialization from the packaged libraries failed; handle the error here
} else {
    // OpenCV is loaded, so we can now load our self-compiled non-free library
    System.loadLibrary("nonfree");
}
Now we have made sure that the app uses the non-free-enabled libraries we ship with it.
3) Finally, run a SURF detector:
// load the photo from disk and convert it into an OpenCV Mat
Bitmap mPhotograph = BitmapFactory.decodeFile(_image_path);
Mat real_image = new Mat();
Utils.bitmapToMat(mPhotograph, real_image);

// detect SURF keypoints; this only works with the non-free library loaded
MatOfKeyPoint keypoints_real = new MatOfKeyPoint();
FeatureDetector detector = FeatureDetector.create(FeatureDetector.SURF);
detector.detect(real_image, keypoints_real);
Whereas before the app would crash with a fatal signal, it now does its job and you can evaluate the resulting keypoints.
You will need:
- Android NDK (D:\adt-bundle-windows-x86_64-20140702\android-ndk-r10d\)
- OpenCV-2.4.10-android-sdk (D:\CODE\OpenCV-2.4.10-android-sdk\), Download link
- OpenCV-2.4.10 source code (D:\CODE\OpenCV-2.4.10\), Download link

We actually only need to copy a few files from the OpenCV-2.4.10 source code to OpenCV-2.4.10-android-sdk, namely:
Copy the nonfree folder from OpenCV-2.4.10\sources\modules\nonfree\include\opencv2\ to OpenCV-2.4.10-android-sdk\sdk\native\jni\include\opencv2\.
Create a folder to hold our new project for libnonfree.so. Here, I call it libnonfree. Create a jni folder under libnonfree. Copy the following files from OpenCV-2.4.10\sources\modules\nonfree\src to the libnonfree\jni\ folder:
Building libnonfree.so:
Create the Android.mk and Application.mk scripts. This Android.mk is used to build libnonfree.so.
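The exact scripts are not reproduced here (see the GitHub repository linked at the end for the complete, working versions), so the following is only a minimal sketch. The OPENCV_PATH value, the APP_ABI setting and the list of source files (assumed here to be nonfree_init.cpp, sift.cpp and surf.cpp, i.e. what was copied from modules\nonfree\src) are assumptions you will have to adapt.

Android.mk:

LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

# Assumption: point this at your OpenCV-2.4.10-android-sdk
OPENCV_PATH := D:/CODE/OpenCV-2.4.10-android-sdk/sdk/native/jni
OPENCV_INSTALL_MODULES := on
include $(OPENCV_PATH)/OpenCV.mk

LOCAL_MODULE := nonfree
# Assumption: the sources copied from modules\nonfree\src
LOCAL_SRC_FILES := nonfree_init.cpp sift.cpp surf.cpp
LOCAL_C_INCLUDES += $(LOCAL_PATH)
LOCAL_LDLIBS += -llog
include $(BUILD_SHARED_LIBRARY)

Application.mk:

APP_STL := gnustl_shared
APP_ABI := armeabi-v7a
APP_PLATFORM := android-8

Including OpenCV.mk with OPENCV_INSTALL_MODULES := on is what makes ndk-build copy libopencv_java.so into the libs folder, and APP_STL := gnustl_shared does the same for libgnustl_shared.so, which is why all three .so files end up next to each other.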
(Set OPENCV_PATH to where your OpenCV-2.4.10-android-sdk is.) Then cd into the project folder libnonfree and type ndk-build to build libnonfree.so.
So far, you have got libnonfree.so along with libopencv_java.so and libgnustl_shared.so in the libnonfree\libs\armeabi-v7a folder.
You can easily build any SIFT or SURF applications using those libraries. If you want to use SIFT and SURF from Java code in your Android application, you only need to write JNI interfaces for the functions you want to use.
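For example, a minimal JNI wrapper for SURF detection could look roughly like the sketch below. The Java class and method names (com.example.nonfree.NativeSurf.detectAndDraw) are made up for illustration; the two long arguments are the values of Mat.getNativeObjAddr() for a grayscale input Mat and an RGBA output Mat on the Java side.

#include <jni.h>
#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/nonfree/features2d.hpp>   // SurfFeatureDetector
#include <opencv2/nonfree/nonfree.hpp>      // initModule_nonfree()

extern "C"
JNIEXPORT jint JNICALL Java_com_example_nonfree_NativeSurf_detectAndDraw(
        JNIEnv*, jclass, jlong addrGray, jlong addrRgba)
{
    cv::Mat& gray = *(cv::Mat*) addrGray;   // input image (CV_8UC1)
    cv::Mat& rgba = *(cv::Mat*) addrRgba;   // image the keypoints are drawn onto

    cv::initModule_nonfree();               // register the SIFT/SURF implementations
    cv::SurfFeatureDetector detector(400);  // Hessian threshold
    std::vector<cv::KeyPoint> keypoints;
    detector.detect(gray, keypoints);

    // mark each keypoint with a small circle
    for (size_t i = 0; i < keypoints.size(); ++i)
        cv::circle(rgba, cv::Point(keypoints[i].pt.x, keypoints[i].pt.y), 10,
                   cv::Scalar(255, 0, 0, 255));

    return (jint) keypoints.size();
}

The matching declaration on the Java side would then be something like private static native int detectAndDraw(long matAddrGray, long matAddrRgba); in the (hypothetical) class com.example.nonfree.NativeSurf.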
Create a project folder called libnonfree_demo. Create a jni folder inside the project folder. Then copy libnonfree.so along with libopencv_java.so and libgnustl_shared.so into jni.
Create a nonfree_jni.cpp in jni. It is a simple SIFT test program: it reads an image, detects the keypoints, extracts feature descriptors and finally draws the keypoints to an output image.
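The original file is in the repository linked at the end; the following is only a rough sketch of such a test program. The image paths on the sdcard are assumptions, and the entry point run_sift_test is just a placeholder name (in the real project you would call this from a JNI function).

#include <vector>
#include <android/log.h>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/nonfree/features2d.hpp>   // SiftFeatureDetector, SiftDescriptorExtractor
#include <opencv2/nonfree/nonfree.hpp>      // initModule_nonfree()

#define LOG_TAG "nonfree_jni"
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO, LOG_TAG, __VA_ARGS__)

extern "C" void run_sift_test()
{
    cv::initModule_nonfree();  // register the non-free algorithms (SIFT/SURF)

    // Assumption: the test image has been pushed to the device beforehand
    cv::Mat image = cv::imread("/sdcard/img1.jpg", CV_LOAD_IMAGE_GRAYSCALE);
    if (image.empty()) {
        LOGI("could not read /sdcard/img1.jpg");
        return;
    }

    cv::SiftFeatureDetector detector;
    std::vector<cv::KeyPoint> keypoints;
    detector.detect(image, keypoints);
    LOGI("detected %d SIFT keypoints", (int) keypoints.size());

    cv::SiftDescriptorExtractor extractor;
    cv::Mat descriptors;
    extractor.compute(image, keypoints, descriptors);

    cv::Mat output;
    cv::drawKeypoints(image, keypoints, output);
    cv::imwrite("/sdcard/img1_keypoints.jpg", output);
}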
Create Android.mk and Application.mk inside jni:
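Again only a sketch; the module names are assumptions and the complete files are in the linked repository. The difference from the previous Android.mk is that libnonfree.so and libopencv_java.so are now pulled in as prebuilt shared libraries, while nonfree_jni.cpp is compiled into the demo library. Application.mk can stay the same as before (APP_STL := gnustl_shared, APP_ABI := armeabi-v7a).

LOCAL_PATH := $(call my-dir)

# prebuilt libopencv_java.so that was copied into jni
include $(CLEAR_VARS)
LOCAL_MODULE := opencv_java_prebuilt
LOCAL_SRC_FILES := libopencv_java.so
include $(PREBUILT_SHARED_LIBRARY)

# prebuilt libnonfree.so that was copied into jni
include $(CLEAR_VARS)
LOCAL_MODULE := nonfree_prebuilt
LOCAL_SRC_FILES := libnonfree.so
include $(PREBUILT_SHARED_LIBRARY)

# the demo library built from nonfree_jni.cpp
include $(CLEAR_VARS)
# Assumption: point this at your OpenCV-2.4.10-android-sdk
OPENCV_PATH := D:/CODE/OpenCV-2.4.10-android-sdk/sdk/native/jni
LOCAL_MODULE := nonfree_demo
LOCAL_SRC_FILES := nonfree_jni.cpp
LOCAL_C_INCLUDES += $(OPENCV_PATH)/include
LOCAL_SHARED_LIBRARIES := opencv_java_prebuilt nonfree_prebuilt
LOCAL_LDLIBS += -llog
include $(BUILD_SHARED_LIBRARY)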
(Again, set OPENCV_PATH to where your OpenCV-2.4.10-android-sdk is.) Then cd into the project folder libnonfree_demo and type ndk-build to build libnonfree_demo.so.
At this point you can easily extend the sample app with your SVMDetector. Just copy the source and include files into the folder libnonfree_demo\jni and add the cpp files to LOCAL_SRC_FILES in Android.mk.
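For example, if your detector were implemented in a (hypothetical) svm_detector.cpp, the line in Android.mk would become:

LOCAL_SRC_FILES := nonfree_jni.cpp svm_detector.cpp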
The whole source can be downloaded from: https://github.com/bkornel/opencv_android_nonfree.
Original source from: http://web.guohuiwang.com/technical-notes/sift_surf_opencv_android