This article answers two questions in detail — "android – OpenCV Manager package was not found? How to install it automatically?" and "opencv manager.apk" — and also covers the related topics Android NDK and OpenCV Development With Android Studio, Android NDK OpenCV color detection, Android NDK and OpenCV Integrated Development (2): Android NDK, and Android NDK and OpenCV Integrated Development (3): OpenCV. I hope it helps.
Contents:
- android – OpenCV Manager package was not found? How to install it automatically? (opencv manager.apk)
- Android NDK and OpenCV Development With Android Studio
- Android NDK OpenCV color detection
- Android NDK and OpenCV Integrated Development (2): Android NDK
- Android NDK and OpenCV Integrated Development (3): OpenCV
android – OpenCV Manager package was not found? How to install it automatically? (opencv manager.apk)
Solution
See this document: Application Development with Static Initialization
Android NDK and OpenCV Development With Android Studio
---------------- If you do NOT know Chinese, you can just skip this part ----------------
I have been planning to improve the original XFace, and recently I finally found some time for it. The plan: for development, switch to Android Studio as the new environment, together with the new Gradle build system; for the app, redesign the UI and substantially rework the internals, probably using slightly more advanced components such as ContentProvider and Service; for the algorithms, make the app more extensible so that different algorithms can plug in, and let the powerful Android Studio + Gradle combination enrich the project. Enough rambling — the point of this post is how to do NDK development in Android Studio (which does not yet fully support it), the hard part being that the NDK code also links against OpenCV's shared libraries. The rest of this post is in English, since it is meant to become my first answer on StackOverflow. ~O(∩_∩)O~
---------------------------- Here is the right stuff you may need --------------------------------
This post shows how to develop an Android NDK application with OpenCV included, using Android Studio and Gradle. If you're working on migrating your original Eclipse project to Android Studio, you may find this post is exactly what you want!
OK, let's start!
Section 1: Three things you must know
1.Firstly, if you are not familiar with Android Studio and Gradle, you may find these links useful. (if you already know these well, skip this part)
①Creating a new Project with Android Studio
②Building Your Project with Gradle
③Gradle Plugin User Guide or you may want to read a Chinese commented version in my blog here.
2. Secondly, if your Android NDK project is not that complicated (for example, with no OpenCV included), you may want to see ph0b's introduction here — quite a nice job, with a video recorded! (You can also follow Section 2 of this post to build a simple Android NDK demo application.)
ph0b's post: ANDROID STUDIO, GRADLE AND NDK INTEGRATION
3. Thirdly, if the two options above do not meet your needs, then I think you may want to customize the Android.mk with Gradle in Android Studio. Thanks to Gaku Ueda, who did a great job explaining how to achieve that goal. I have actually found another, cleaner solution that achieves the same goal without adding that much code. :-) Find it in the next sections.
Gaku Ueda's post: Using custom Android.mk with Gradle/Android Studio
OK, I will cover all of the above and give another nice solution at the end — have fun!
Section 2: A simple Android NDK demo application
This section shows how to create a simple Android NDK demo application; if you already know this, you can go directly to Section 3.
1. Create a new Android project named NDKDemo with a blank Activity in AS (= Android Studio).
2. Give an id to the TextView in activity_my.xml, such as android:id="@+id/textview", then add these lines to MyActivity.java:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_my);
TextView textView = (TextView) findViewById(R.id.textview);
textView.setText(hello());
}
static {
System.loadLibrary("hello");
}
public native String hello();
3. Create a new directory jni under app/src/main, so that the folder now contains java, jni and res.
4. This step is very important! You can add an external tool to run the javah command without typing so much by hand.
Open AS's Preferences, find External Tools under IDE Settings, and click + to add a tool with the following configuration. (Make sure you have added the JDK tools to your system path; if you don't know how, click here.)
With the help of this tool, each time we right-click a class file and choose Android Tools -> javah, it automatically generates a C header file for us in the target folder $ModuleFileDir$/src/main/jni — in this case, app/src/main/jni. Try it on the MyActivity.java file now! The console will print a log like:
/usr/bin/javah -v -jni -d /Users/hujiawei/AndroidStudioProjects/NDKDemo/app/src/main/jni com.android.hacks.ndkdemo.MyActivity
[Creating file RegularFileObject[/Users/hujiawei/AndroidStudioProjects/NDKDemo/app/src/main/jni/
com_android_hacks_ndkdemo_MyActivity.h]]
Then you get a com_android_hacks_ndkdemo_MyActivity.h file in the jni folder with the following content:
/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
/* Header for class com_android_hacks_ndkdemo_MyActivity */
#ifndef _Included_com_android_hacks_ndkdemo_MyActivity
#define _Included_com_android_hacks_ndkdemo_MyActivity
#ifdef __cplusplus
extern "C" {
#endif
/*
* Class: com_android_hacks_ndkdemo_MyActivity
* Method: hello
* Signature: ()Ljava/lang/String;
*/
JNIEXPORT jstring JNICALL Java_com_android_hacks_ndkdemo_MyActivity_hello
(JNIEnv *, jobject);
#ifdef __cplusplus
}
#endif
#endif
5. Write a simple C implementation file named main.c in the jni folder:
#include <jni.h>
#include "com_android_hacks_ndkdemo_MyActivity.h"
JNIEXPORT jstring JNICALL Java_com_android_hacks_ndkdemo_MyActivity_hello
(JNIEnv * env, jobject obj){
return (*env)->NewStringUTF(env, "Hello from JNI");
}
6. In the build.gradle file of the app module, add the following to configure ndk inside the defaultConfig block. Here we just give the jni module the name hello; you can find other configuration options in the Gradle Plugin User Guide.
defaultConfig {
applicationId "com.android.hacks.ndkdemo"
minSdkVersion 16
targetSdkVersion 20
versionCode 1
versionName "1.0"
ndk{
moduleName "hello"
}
}
7. In order to let Gradle run the ndk-build command (in some task, maybe the NdkCompile task), we should configure ndk.dir in the local.properties file in the project root:
sdk.dir=/Volumes/hujiawei/Users/hujiawei/Android/android_sdk
ndk.dir=/Volumes/hujiawei/Users/hujiawei/Android/android_ndk
8. OK, everything is ready. Click Run to give it a try, and you will see the result.
All right, so what's happening inside?
Since you have a jni folder, Gradle treats it as the default native code folder. When Gradle builds the app, it runs the ndk-build command (since you have configured ndk.dir, Gradle knows where to find it) with a generated Android.mk file (located at app/build/intermediates/ndk/debug/Android.mk). After compiling the native code, it puts the libs and obj folders under app/build/intermediates/ndk/debug/. Gradle then packages the libs into the final APK at app/build/outputs/apk/app-debug.apk (you can unarchive this file to check that libs is included):
app/build/intermediates/ndk/debug (lib and obj folders)
app/build/outputs/apk/app-debug.apk (and the files within it)
Section 3: Using OpenCV
If your project does not use OpenCV, then Section 2 is enough. But what if you want to use OpenCV for other stuff? Of course, we want to use OpenCV for Android rather than JavaCV here, and of course we need to package the OpenCV library for Android into our application's APK file (so that users of the app do not have to install OpenCV Manager). So how can we achieve these goals?
The simplest way has been posted by TGMCians on Stack Overflow: let the main app include the OpenCV library as a dependency, and copy all the <abi>/*.so files from the OpenCV for Android SDK into the jniLibs folder under app/src/main/. Gradle will automatically package these <abi>/*.so files into the libs folder of the final APK. This method works, but it has a few drawbacks: (1) unless you copy only the needed *.so files, you will always end up with a large APK; (2) what about building the jni files — how do you run ndk-build if those files contain OpenCV-related code?
So here comes our 'Using custom Android.mk with Gradle and Android Studio' part. For testing, we first create an Android.mk and an Application.mk file under the jni folder.
Android.mk
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_SRC_FILES := main.c
LOCAL_LDLIBS += -llog
LOCAL_MODULE := hello
include $(BUILD_SHARED_LIBRARY)
Application.mk
APP_ABI := armeabi
APP_PLATFORM := android-16
Thanks to Gaku Ueda, who did a great job explaining how to achieve that goal in his post. The core idea of his method is to run the ndk-build command in a task, zip the <abi>/*.so files under the output app/build/libs/ folder into a jar file (also placed in app/build/libs/), and then add a compile dependency on that jar. The key code of his method is listed below.
Notice 1: When using a custom Android.mk, we must first stop Gradle from building the jni folder as before, and sourceSets.main.jni.srcDirs = [] does exactly that!
Notice 2: The code is not exactly the same as Gaku Ueda's: tasks.withType(Compile) becomes tasks.withType(JavaCompile), because Compile is deprecated.
Notice 3: You can get the $ndkDir variable with project.plugins.findPlugin('com.android.application').getNdkFolder(), or you can define it in the gradle.properties file under the project root by adding ndkDir=path/to/your/ndk; if that file does not exist yet, simply create it.
android{
    ...
    sourceSets.main.jni.srcDirs = []
    task ndkBuild(type: Exec, description: 'Compile JNI source via NDK') {
        ndkDir = project.plugins.findPlugin('com.android.application').getNdkFolder()
        commandLine "$ndkDir/ndk-build",
                'NDK_PROJECT_PATH=build',
                'APP_BUILD_SCRIPT=src/main/jni/Android.mk',
                'NDK_APPLICATION_MK=src/main/jni/Application.mk'
    }
    task ndkLibsToJar(type: Zip, dependsOn: 'ndkBuild', description: 'Create a JAR of the native libs') {
        destinationDir new File(buildDir, 'libs')
        baseName 'ndk-libs'
        extension 'jar'
        from(new File(buildDir, 'libs')) { include '**/*.so' }
        into 'lib/'
    }
    tasks.withType(JavaCompile) {
        compileTask -> compileTask.dependsOn ndkLibsToJar
    }
    ...
}
dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    // add begin
    compile fileTree(dir: new File(buildDir, 'libs'), include: '*.jar')
    // add end
}
But we can still make a small improvement here. We already know that Gradle takes the jniLibs folder as its default native-libraries folder, so we can simply have the ndk-build command output the generated libs/<abi>/*.so files into the jniLibs folder; then there is no need to zip the *.so files into a jar at all.
The final build.gradle file of the app module:
apply plugin: 'com.android.application'
android {
    compileSdkVersion 20
    buildToolsVersion "20.0.0"
    defaultConfig {
        applicationId "com.android.hacks.ndkdemo"
        minSdkVersion 16
        targetSdkVersion 20
        versionCode 1
        versionName "1.0"
    }
    // add begin
    sourceSets.main.jni.srcDirs = []
    task ndkBuild(type: Exec, description: 'Compile JNI source via NDK') {
        ndkDir = project.plugins.findPlugin('com.android.application').getNdkFolder()
        commandLine "$ndkDir/ndk-build",
                'NDK_PROJECT_PATH=build/intermediates/ndk',
                'NDK_LIBS_OUT=src/main/jniLibs',
                'APP_BUILD_SCRIPT=src/main/jni/Android.mk',
                'NDK_APPLICATION_MK=src/main/jni/Application.mk'
    }
    tasks.withType(JavaCompile) {
        compileTask -> compileTask.dependsOn ndkBuild
    }
    // add end
    buildTypes {
        release {
            runProguard false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
}
dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
}
So simple, right? 'NDK_LIBS_OUT=src/main/jniLibs' does the right job for us!
For testing, you can also add some OpenCV-related lines to your Android.mk file and a line or two to your main.c to check whether everything is really working. For example, add #include <opencv2/core/core.hpp> to the main.c file (note that pulling in a C++ header means the file must be compiled as C++, e.g. renamed to main.cpp), and change Android.mk to:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
#opencv
OPENCVROOT:= /Volumes/hujiawei/Users/hujiawei/Android/opencv_sdk
OPENCV_CAMERA_MODULES:=on
OPENCV_INSTALL_MODULES:=on
OPENCV_LIB_TYPE:=SHARED
include ${OPENCVROOT}/sdk/native/jni/OpenCV.mk
LOCAL_SRC_FILES := main.c
LOCAL_LDLIBS += -llog
LOCAL_MODULE := hello
include $(BUILD_SHARED_LIBRARY)
In the Gradle Console window you can see similar lines showing that the OpenCV-related *.so files have been packaged into the final APK.
One More Thing
Of course, maybe you don't want to change your build.gradle file with that much code, and of course you also don't want to run ndk-build outside the IDE and then copy the <abi>/*.so files into the jniLibs folder every time you rebuild the native code!
At last, I came up with another, nicer solution, if you like: create an ndk-build external tool in Android Studio. Every time you want to rebuild the native code, simply run the external tool; it automatically generates the libs/<abi>/*.so files into the jniLibs folder, so everything is ready to run the app. :-)
The configuration is simple
Parameters: NDK_PROJECT_PATH=$ModuleFileDir$/build/intermediates/ndk NDK_LIBS_OUT=$ModuleFileDir$/src/main/jniLibs NDK_APPLICATION_MK=$ModuleFileDir$/src/main/jni/Application.mk APP_BUILD_SCRIPT=$ModuleFileDir$/src/main/jni/Android.mk V=1
OK, I hope it is helpful. Let me know if it really helps, or tell me what your problem is. :-)
Android NDK OpenCV Color Detection
How to solve color detection with Android NDK and OpenCV?
I am trying to detect colors on Android, using OpenCV 4.5.3 and Android Studio 4.2.2.
The code below works fine in pure C++, but when I use it with the Android NDK, only the blue part is detected — not the white and yellow parts.
How can I detect the white and yellow parts?
JNIEXPORT void JNICALL
Java_com_example_opencvapplication_MainActivity_ConvertRGBtoGray(JNIEnv *env, jobject thiz, jlong mat_addr_input, jlong mat_addr_result) {
    // TODO: implement ConvertRGBtoGray()
Mat &matInput = *(Mat *)mat_addr_input;
Mat &matResult = *(Mat *)mat_addr_result;
Mat img_hsv;
Mat white_mask,white_image;
Mat yellow_mask,yellow_image;
matInput.copyTo(matResult);
// white range
Scalar lower_white = Scalar(200,200,200);
Scalar upper_white = Scalar(255,255,255);
// yellow range
Scalar lower_yellow = Scalar(10,100,100);
Scalar upper_yellow = Scalar(40,255);
// white filter
inRange(matResult,lower_white,upper_white,white_mask);
bitwise_and(matResult,matResult,white_image,white_mask);
cvtColor(matResult,img_hsv,COLOR_BGR2HSV);
// yellow filter
inRange(img_hsv,lower_yellow,upper_yellow,yellow_mask);
bitwise_and(matResult,yellow_image,yellow_mask);
addWeighted(white_image,1.0,0.0,matResult);
// cvtColor(matResult,COLOR_RGBA2GRAY);
}
Solution
No verified solution for this problem has been found yet; if you find one, feel free to share it.
Android NDK and OpenCV Integrated Development (2): Android NDK
This part covers the core of Android NDK development plus a development summary (including solutions to many common problems), in three sections:
* JNI and the javah command
* The Android NDK Dev Guide
* Common problems in NDK development
1. JNI and the javah command — unavoidable topics
One of the cores of NDK development is JNI. In Oracle's official JNI documentation, the important parts are sections 3-4 (data types and functions), which this article will not cover; for a quick start, see this author's several well-written articles on JNI, and I also recommend the IBM DeveloperWorks article on the lifecycle of JNI objects in function calls — it goes rather deep.
- The javah command — see its detailed options; as the documentation puts it:
javah produces C header files and C source files from a Java class. These files provide the connective glue that allow your Java and C code to interact.
- Configuring the universal javah tool in Eclipse:
(1) In External Tools Configurations, create a new Program.
(2) Set Location to /usr/bin/javah [yours may differ; try ${system_path:javah}].
(3) Set Working Directory to ${project_loc}/bin/classes [suitable for Android projects].
(4) Set Arguments to -jni -verbose -d "${project_loc}${system_property:file.separator}jni" ${java_type_name}
(5) OK — from now on, just select the Java class to generate headers for and run this External Tool!
Note: because my Arguments put the exported header files into the project's jni directory, change the output path if you are not doing Android NDK development; in that case also set Working Directory to ${project_loc}/bin, without the trailing /classes. If you still have problems, see this author's JNI configuration notes.
2. The Android NDK Dev Guide, those were the days
In the NDK root directory there is an HTML file, document.html — the Android NDK Dev Guide. Open it in a browser and you will find discussions of the many configuration issues in NDK development. Different NDK versions differ quite a bit, and NDK development raises far more problems than SDK development; when something goes wrong, with luck Google has the answer, and otherwise you have to dig through these guides. A brief overview of the documents is on the Android Developer site, and this author's translations of several of them, though slightly dated, are still well worth reading. Below I briefly introduce a few of the topics:
- [1] Android NDK Overview
This document describes the goals of the NDK and a simple development workflow; the later documents essentially expand on this core. Strongly recommended. Note that the NDK only supports devices running Android 1.5 or later.
- [2] The Android.mk file
The Android.mk file describes how your source code is compiled. The ndk-build command is actually a wrapper around GNU Make, so Android.mk is written much like a Makefile [for details on Make, see the GNU Make manual].
An Android.mk file can produce a shared library or a static library, but only shared libraries are copied into the application package; static libraries are generally used to build other shared libraries. You can define one or more modules in a single Android.mk file, and different modules can be built from the same source files. You do not need to list header files, nor explicitly declare dependencies between the generated targets (both of which matter a great deal in plain GNU Make, although its implicit rules can cover some of this). The following uses the Android.mk of the hello-jni project to explain the important points.
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := hello-jni
LOCAL_SRC_FILES := hello-jni.c
include $(BUILD_SHARED_LIBRARY)
① LOCAL_PATH := $(call my-dir): the first line of an Android.mk must set LOCAL_PATH. my-dir is a macro function provided by the build system that returns the directory containing the current Android.mk file.
② include $(CLEAR_VARS): CLEAR_VARS is a build-system variable pointing to a special Makefile that clears every LOCAL_XXX variable except LOCAL_PATH.
③ LOCAL_MODULE := hello-jni: LOCAL_MODULE is required; it names the target this Android.mk builds. The name must be unique and contain no spaces. The build system normalizes it automatically — for example, foo.so and libfoo.so both produce libfoo.so! When loading the library from Java code, use the module name without the lib prefix.
④ LOCAL_SRC_FILES := hello-jni.c: the list of C/C++ source files, excluding headers. To use a custom C++ source suffix, set LOCAL_CPP_EXTENSION. Mind the syntax — in the example below, every continued line must end with a backslash, with nothing after it, or the build fails!
LOCAL_SRC_FILES := hello-jni.c \
foo.c \
boo.cpp
⑤ include $(BUILD_SHARED_LIBRARY): BUILD_SHARED_LIBRARY is a build-system Makefile that builds a shared library from the variables set above; likewise, BUILD_STATIC_LIBRARY builds a static library.
Best practice: variables prefixed LOCAL_ generally describe the local module; PRIVATE_, NDK_ and APP_ prefixes are generally internal, and so are lower-case names (internal variables or functions). If you define your own variables in an Android.mk file, prefix them with MY_!
MY_SOURCES := foo.c
ifneq ($(MY_CONFIG_BAR),)
MY_SOURCES += bar.c
endif
LOCAL_SRC_FILES += $(MY_SOURCES)
The Android.mk document goes on to describe many more build-system variables and functions, plus the variables you can set in this file; I will not repeat them here.
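Putting points ①-⑤ together, a hypothetical Android.mk that builds a static utility library and links it into a shared JNI module might look like this (module and file names are made up for illustration):

```makefile
LOCAL_PATH := $(call my-dir)

# Static utility library: built, but not copied into the APK by itself
include $(CLEAR_VARS)
LOCAL_MODULE    := mathutils
LOCAL_SRC_FILES := mathutils.c
include $(BUILD_STATIC_LIBRARY)

# Shared JNI module linking the static library; this .so ends up in the APK
include $(CLEAR_VARS)
LOCAL_MODULE           := demo-jni
LOCAL_SRC_FILES        := demo-jni.c
LOCAL_STATIC_LIBRARIES := mathutils
LOCAL_LDLIBS           += -llog
include $(BUILD_SHARED_LIBRARY)
```

Java code would then load it with System.loadLibrary("demo-jni") — without the lib prefix, as noted in ③.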
- [3] The Application.mk file
Application.mk describes which native modules your application needs. The file is optional; small projects can skip it. It can live in two places: most commonly in the jni directory next to Android.mk, or under $NDK/apps/<myapp>/ (the latter is not recommended; if you do use it, you must set APP_PROJECT_PATH explicitly).
① APP_MODULES: required before NDK r4 and optional afterwards; by default the NDK builds all modules defined in the Android.mk file.
② APP_CFLAGS: compiler options for C/C++ files, e.g. -frtti -fexceptions; APP_CPPFLAGS applies only to C++ sources.
③ APP_ABI: this parameter matters. By default, ndk-build produces libraries for the armeabi CPU architecture; you can specify another architecture, or several at once (and since NDK r7, all builds every supported architecture)! The different CPU architectures are described in the CPU Arch ABIs document, which I won't detail here. To learn a device's CPU architecture, look up its specs online or run adb shell getprop ro.product.cpu.abi. The following excerpt is from the OpenCV for Android SDK:
armeabi, armv7a-neon, arm7a-neon-android8, mips and x86 stand for platform targets:
* armeabi is for ARM v5 and ARM v6 architectures with Android API 8+,
* armv7a-neon is for NEON-optimized ARM v7 with Android API 9+,
* arm7a-neon-android8 is for NEON-optimized ARM v7 with Android API 8,
* mips is for MIPS architecture with Android API 9+,
* x86 is for Intel x86 CPUs with Android API 9+.
If using a hardware device for testing/debugging, run the following command to learn its CPU architecture:
*** adb shell getprop ro.product.cpu.abi ***
If you're using an AVD emulator, go to Window > AVD Manager to see the list of available devices. Click Edit in the context menu of the selected device. In the window which then pops up, find the CPU field.
④ APP_STL: selects the C++ STL. By default the NDK build system uses the minimal C++ runtime /system/lib/libstdc++.so, but you can choose another. For details see $NDK/docs/CPLUSPLUS-SUPPORT.html — a file that may not even be listed in document.html!
system -> Use the default minimal system C++ runtime library.
gabi++_static -> Use the GAbi++ runtime as a static library.
gabi++_shared -> Use the GAbi++ runtime as a shared library.
stlport_static -> Use the STLport runtime as a static library.
stlport_shared -> Use the STLport runtime as a shared library.
gnustl_static -> Use the GNU STL as a static library.
gnustl_shared -> Use the GNU STL as a shared library.
The table below shows how well each supports C++ language features:

            Exceptions   RTTI   C++ Standard Library
  system    no           no     no
  gabi++    yes          yes    no
  stlport   yes          yes    yes
  gnustl    yes          yes    yes
As the table shows, gnustl is quite capable, so gnustl_static is the usual choice. When using gnustl, you generally also need to add ${NDKROOT}/sources/cxx-stl/gnu-libstdc++/4.6/include and ${NDKROOT}/sources/cxx-stl/gnu-libstdc++/4.6/libs/armeabi-v7a/include to the GNU C and GNU C++ entries under Paths and Symbols in C/C++ General.
Also note: if you pick an xxx_shared runtime and load it at runtime, and other libraries depend on it, be sure to load xxx_shared first and only then the other libraries.
⑤ APP_PLATFORM: the target Android system version — note that this is an API level. It may conflict with the minSdkVersion declared in AndroidManifest.xml and cause an error; as in the previous part, the fix is to adjust APP_PLATFORM so that the two no longer conflict.
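For instance, a minimal Application.mk consistent with the settings discussed above might look like this (the values are illustrative, not prescriptive):

```makefile
APP_ABI      := armeabi armeabi-v7a   # or 'all' since NDK r7
APP_PLATFORM := android-16            # keep consistent with minSdkVersion in AndroidManifest.xml
APP_STL      := gnustl_static         # full exceptions/RTTI/STL support (see the table above)
APP_CFLAGS   += -frtti -fexceptions
```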
- [4] Stable APIs
The build system automatically links the C library, the math library and the C++ support library, so there is no need to list them in LOCAL_LDLIBS. Android defines several API levels, each corresponding to a platform release, as listed below. For the NDK, android-6 and android-7 are the same as android-5 — they provide exactly the same native ABIs. The headers for a given API level live under $NDK/platforms/android-<level>/arch-arm/usr/include, which is exactly the path configured for GNU C and GNU C++ under C/C++ General > Paths and Symbols in the project imported in the previous part.
Note that the build system automatically links the C library, the Math
library and the C++ support library to your native code, there is no
need to list them in a LOCAL_LDLIBS line.
There are several "API Levels" defined. Each API level corresponds to
a given Android system platform release. The following levels are
currently supported:
android-3 -> Official Android 1.5 system images
android-4 -> Official Android 1.6 system images
android-5 -> Official Android 2.0 system images
android-6 -> Official Android 2.0.1 system images
android-7 -> Official Android 2.1 system images
android-8 -> Official Android 2.2 system images
android-9 -> Official Android 2.3 system images
android-14 -> Official Android 4.0 system images
Note that android-6 and android-7 are the same as android-5 for the NDK,
i.e. they provide exactly the same native ABIs!
IMPORTANT:
The headers corresponding to a given API level are now located
under $NDK/platforms/android-<level>/arch-arm/usr/include
A few of the more important libraries:
(1) The C library (libc): no need to specify -lpthread or -lrt; it is linked automatically.
(2) The C++ library (libstdc++): no need to specify -lstdc++.
(3) The math library (libm): no need to specify -lm.
(4) The dynamic linker library (libdl): no need to specify -ldl.
(5) Android log (liblog): you must specify -llog.
(6) Jnigraphics (libjnigraphics): this C library provides access to Java Bitmap objects; specify -ljnigraphics. It was added in android-8; typical usage is:
Briefly, typical usage should look like:
1/ Use AndroidBitmap_getInfo() to retrieve information about a
given bitmap handle from JNI (e.g. its width/height/pixel format)
2/ Use AndroidBitmap_lockPixels() to lock the pixel buffer and
retrieve a pointer to it. This ensures the pixels will not move
until AndroidBitmap_unlockPixels() is called.
3/ Modify the pixel buffer, according to its pixel format, width,
stride, etc.., in native code.
4/ Call AndroidBitmap_unlockPixels() to unlock the buffer.
(7) The Android native application APIs: added in android-9, these APIs make it possible to write an Android app entirely in native code, although in most cases you still go through JNI. The relevant headers:
The following headers correspond to these new native APIs (see comments
inside them for more details):
<android/native_activity.h>
Activity lifecycle management (and general entry point)
<android/looper.h>
<android/input.h>
<android/keycodes.h>
<android/sensor.h>
To Listen to input events and sensors directly from native code.
<android/rect.h>
<android/window.h>
<android/native_window.h>
<android/native_window_jni.h>
Window management, including the ability to lock/unlock the pixel
buffer to draw directly into it.
<android/configuration.h>
<android/asset_manager.h>
<android/storage_manager.h>
<android/obb.h>
Direct (read-only) access to assets embedded in your .apk. or
the Opaque Binary Blob (OBB) files, a new feature of Android X.X
that allows one to distribute large amount of application data
outside of the .apk (useful for game assets, for example).
All the corresponding functions are provided by the "libandroid.so" library
version that comes with API level 9. To use it, use the following:
LOCAL_LDLIBS += -landroid
- [5] NDK Build
The ndk-build command (introduced in NDK r4) is actually a wrapper around GNU Make, equivalent to make -f $NDK/build/core/build-local.mk [args]. GNU Make 3.81 or later must be installed, or the build fails! If GNU Make 3.81 is installed but is not the default make, define the GNUMAKE variable before running ndk-build, e.g. GNUMAKE=/usr/local/bin/gmake ndk-build.
Note: for NDK development on Windows, you generally use Cygwin's Make, but the NDK's bundled awk tool is used by default, so you may hit the error: Android NDK: Host 'awk' tool is outdated. Please define HOST_AWK to point to Gawk or Nawk! The fix is to delete the awk tool bundled with the NDK (see the reference link) — which is also why the GNU Make version printed by ndk-build -v in part 1 looked different; quite a deep clue I planted there! Alternatively, you can override the environment variables directly:
NDK_HOST_AWK=<path-to-awk>
NDK_HOST_ECHO=<path-to-echo>
NDK_HOST_CMP=<path-to-cmp>
If that still fails, see the answer on StackOverflow.
One more Windows note: if you compile the native code with Cygwin, you must set NDK_USE_CYGPATH=1 before running ndk-build (though not every single time).
Below are the available parameters of the ndk-build command; the most commonly used are ndk-build NDK_DEBUG=1 and ndk-build V=1.
ndk-build --> rebuild required machine code.
ndk-build clean --> clean all generated binaries.
ndk-build NDK_DEBUG=1 --> generate debuggable native code.
ndk-build V=1 --> launch build, displaying build commands.
ndk-build -B --> force a complete rebuild.
ndk-build -B V=1 --> force a complete rebuild and display build
commands.
ndk-build NDK_LOG=1 --> display internal NDK log messages
(used for debugging the NDK itself).
ndk-build NDK_DEBUG=1 --> force a debuggable build (see below)
ndk-build NDK_DEBUG=0 --> force a release build (see below)
ndk-build NDK_HOST_32BIT=1 --> Always use toolchain in 32-bit (see below)
ndk-build NDK_APPLICATION_MK=<file>
--> rebuild, using a specific Application.mk pointed to by
the NDK_APPLICATION_MK command-line variable.
ndk-build -C <project> --> build the native code for the project
path located at <project>. Useful if you
don't want to 'cd' to it in your terminal.
[6] NDK GDB, Import Module, Prebuilts, Standalone Toolchains and the three CPU-related documents: I have not used these and do not know them well, so I am setting them aside for now and will fill them in later if needed. For setting up an NDK debugging environment, see this author's hands-on post.
[7] Tips and Tricks
Problems that used to cause headaches
- [1] Downloading the SDK via the Android SDK Manager fails or is very slow
On Windows, edit the hosts file under C:\Windows\System32\drivers\etc and add the line: 74.125.237.1 dl-ssl.google.com
- [2] Fatal signal 11 (SIGSEGV) at 0x00000004 (code=1), thread 23487 (mple)
The cause is access to an illegal memory address — typically dereferencing a null object or array, quite possibly an object passed as null from the Java layer to the native layer. Reference link 1, reference link 2.
- [3] Copying files or folders into an AVD with adb fails
By default the AVD's directory is read-only; clear the read-only flag and it works. Reference link.
- [4] Running Add Native Support on an Android project fails
When using Add Native Support, the project must not already contain a jni directory! If it does, delete it first (backing up anything important), then run Add Native Support again.
- [5] Garbled text when passing a String to the native layer
Use a custom function to convert the jstring to a char*, as follows:
static char* jstringToString(JNIEnv* env, jstring jstr) {
    char* rtn = NULL;
    jclass clsstring = env->FindClass("java/lang/String");
    jstring strencode = env->NewStringUTF("utf-8"); // or "gbk"
    jmethodID mid = env->GetMethodID(clsstring, "getBytes",
            "(Ljava/lang/String;)[B");
    jbyteArray barr = (jbyteArray) env->CallObjectMethod(jstr, mid, strencode);
    jsize alen = env->GetArrayLength(barr);
    jbyte* ba = env->GetByteArrayElements(barr, NULL); // NULL, not JNI_FALSE: the parameter is a jboolean*
    if (alen > 0) {
        rtn = (char*) malloc(alen + 1);
        memcpy(rtn, ba, alen);
        rtn[alen] = '\0';
    }
    env->ReleaseByteArrayElements(barr, ba, 0);
    return rtn;
}
- [6]To be continued
Android NDK and OpenCV Integrated Development (3): OpenCV
This part focuses on using OpenCV in Android NDK development, covering:
- How to implement Static Initialization, so an app containing the OpenCV library runs without installing OpenCV Manager
- A summary of ten papers and reports on OpenCV and Android NDK development
- How to use the camera on Android, and the common problems
- The general approaches to integrating OpenCV with the Android NDK
1. Implementing Static Initialization
Static Initialization means adding the OpenCV library to the app package, so the app runs without installing the OpenCV Manager app. The official documentation covers this, but not in detail — in particular, many people are unsure where that final line of code is supposed to go. In fact you do not need to configure things exactly as the official documentation describes; here is how to modify the FaceDetection sample's sources to achieve it. (It is best to start from a project that already contains jni code.)
- [1] Open the Android.mk under jni and modify the OpenCV part: change off to on, and set OPENCV_LIB_TYPE to SHARED, giving:
OPENCV_CAMERA_MODULES:=on
OPENCV_INSTALL_MODULES:=on
OPENCV_LIB_TYPE:=SHARED
include ${OPENCVROOT}/sdk/native/jni/OpenCV.mk
- [2] Open FdActivity.java and add a static initializer block that loads the opencv_java library. Since FaceDetection also uses another library, detection_based_tracker (for face tracking), load it in the else clause:
static {
Log.i(TAG, "OpenCV library load!");
if (!OpenCVLoader.initDebug()) {
Log.i(TAG, "OpenCV load not successfully");
} else {
System.loadLibrary("detection_based_tracker");// load other libraries
}
}
- [3] Remove the final line of FdActivity.java's onResume() so it no longer contacts OpenCV Manager.
@Override
public void onResume() {
super.onResume();
//OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_3, this, mLoaderCallback);
}
- [4] Modify FdActivity.java's onCreate(): copy the try-catch block out of the private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) block and place it right after setContentView(), then copy mOpenCVCameraView.enableView(); to just after mOpenCVCameraView = (CameraBridgeViewBase) findViewById(R.id.fd_activity_surface_view);. The modified onCreate() looks like this:
public void onCreate(Bundle savedInstanceState) {
Log.i(TAG, "called onCreate");
super.onCreate(savedInstanceState);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
setContentView(R.layout.face_detect_surface_view);
//
try {
// load cascade file from application resources
InputStream is = getResources().openRawResource(R.raw.lbpcascade_frontalface);
File cascadeDir = getDir("cascade", Context.MODE_PRIVATE);
mCascadeFile = new File(cascadeDir, "lbpcascade_frontalface.xml");
FileOutputStream os = new FileOutputStream(mCascadeFile);
byte[] buffer = new byte[4096];
int bytesRead;
while ((bytesRead = is.read(buffer)) != -1) {
os.write(buffer, 0, bytesRead);
}
is.close();
os.close();
mJavaDetector = new CascadeClassifier(mCascadeFile.getAbsolutePath());
if (mJavaDetector.empty()) {
Log.e(TAG, "Failed to load cascade classifier");
mJavaDetector = null;
} else
Log.i(TAG, "Loaded cascade classifier from " + mCascadeFile.getAbsolutePath());
mNativeDetector = new DetectionBasedTracker(mCascadeFile.getAbsolutePath(), 0);// hujiawei
cascadeDir.delete();
} catch (IOException e) {
e.printStackTrace();
Log.e(TAG, "Failed to load cascade. Exception thrown: " + e);
}
//
mOpenCVCameraView = (CameraBridgeViewBase) findViewById(R.id.fd_activity_surface_view);
mOpenCVCameraView.enableView();//
mOpenCVCameraView.setCvCameraViewListener(this);
}
- [5] OK — uninstall the previously installed OpenCV Manager, then rebuild and run FaceDetection: it now runs entirely on its own!
2. A summary of ten papers and reports on OpenCV and Android NDK development
Most of these ten documents [Baidu netdisk download link] stop at how to use the OpenCV library in Android development, without going into a specific application domain. In summary:
- Face detection on Android with OpenCV
Mainly describes detecting faces at the native level with OpenCV; the detected face positions are passed to the Java layer over JNI. It details the JNI code and the construction of the shared library. Images are passed by file path, since detection here runs only on single static images.
- Tutorial-2-OpenCV-for-Android-Setup-Macintosh-API11
Describes setting up the OpenCV and Android NDK development environment, plus a demonstration based on the Face-Detection sample. It uses the OpenCV Library Project as a library and calls the OpenCV Android API.
- Android application for Face Recognition
A detailed project report implementing several kinds of face detection and recognition on Android, including the Google API and OpenCV. Because the OpenCV route required the Library Project and the algorithms were too heavy, the author developed a custom face-detection library with six features, including glasses and mouth detection.
- ECCV-2012-OpenCV4Android
Concise but content-rich, with several key points: (1) the ways to develop Android applications with OpenCV, aimed at different audiences (Java developers / native developers); (2) the current limitations of OpenCV4Android, and what to watch for during development to improve performance and efficiency.
- Introduction to OpenCV for Android devices
Quite basic, covering OpenCV and Android environment setup; the highlight is the final 'Using C++ OpenCV code' part, which gives the key configuration for using native OpenCV code with the Android NDK.
- OpenCV-facedetection
Covers a lot of OpenCV background, plus a detailed face-detection algorithm.
- OpenCV on Android Platforms
Also fairly broad, but basic.
- BDTI_ARMTechCon_2012_OpenCV_Android
About OpenCV on embedded devices, including development on Android; notably, native Android camera support arrived in OpenCV 2.4!
- OpenCV Based Real-Time Video Processing Using Android Smartphone
Compares real-time video processing with OpenCV against pure Android libraries, finding OpenCV more accurate, faster and more power-efficient; the comparison uses basic image operations such as grayscale conversion, Gaussian blur and Sobel edge detection.
- Realtime Computer Vision with OpenCV
An interesting read — at a glance, it covers OpenCV applications on mobile devices.
3.Android的摄像头
关于如何使用Android的摄像头:Android设备一般有两个摄像头,前置摄像头和后置摄像头,在进行和摄像头相关的应用开发的时候很容易遇到各种问题,推荐以下几篇文章:
Android Developer中有对应的文档:Camera
这位作者的总结:Android相机
StackOverflow上关于如何调用前置摄像头
如何在Android中后台开启摄像头默默拍照
关于Camera的三种Callback关于保存预览图片:Android中的
BitmapFactory.decodeByteArray
只支持一定的格式,Camara默认的previewformat格式为NV21
(对于大多数的Android设备,即使修改CameraParameters的设置也还是不行),所以在获得bitmap时,需要进行转换,通过YuvImage类来转换成JPEG格式,然后再保存到文件中。
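The NV21-to-JPEG conversion just described can be sketched roughly as follows (untested Android-framework code; it assumes a Camera.PreviewCallback supplies the NV21 byte array and that the preview size comes from Camera.Parameters.getPreviewSize()):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import java.io.ByteArrayOutputStream;

// Sketch: convert one NV21 preview frame to a Bitmap by way of JPEG.
public final class Nv21ToBitmap {
    public static Bitmap convert(byte[] nv21, int previewWidth, int previewHeight) {
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, previewWidth, previewHeight, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // Compress the whole frame to JPEG (quality 90); decodeByteArray understands JPEG
        yuv.compressToJpeg(new Rect(0, 0, previewWidth, previewHeight), 90, out);
        byte[] jpeg = out.toByteArray();
        return BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
    }
}
```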
The Google Group discussion on overlaying a rectangle on the preview, as in QR-code scanners: the idea is simple — one SurfaceView for the preview plus an ImageView (or another SurfaceView) for the overlay; recommended read: adding a rectangle to the Android camera preview.
On camera development involving OpenCV: with the OpenCV library, camera work becomes far simpler — see the three tutorials in OpenCV for Android (CameraPreview, MixedProcessing and CameraControl); the sources are all in the samples directory of the OpenCV-Android SDK. Briefly:
The OpenCV library provides two cameras: the Java camera, org.opencv.android.JavaCameraView, and the native camera, org.opencv.android.NativeCameraView (run the CameraPreview sample to see the difference — it is small). Both extend the abstract class CameraBridgeViewBase, but JavaCamera uses the Android SDK's Camera, while NativeCamera uses OpenCV's VideoCapture.
On configuring the OpenCV camera in the layout file: if opencv:show_fps is set to true, the view shows the current camera frame rate and the image size; opencv:camera_id takes one of front, back or any, meaning the front camera, the back camera, or the default camera (which is in fact the back one).
On the CvCameraViewListener2 interface: it makes interacting with the camera convenient. It has only three methods, called when the camera opens (onCameraViewStarted), closes (onCameraViewStopped), and when a preview frame arrives (onCameraFrame). onCameraFrame is the important one — frame processing generally happens inside it. Its input parameter is a CvCameraViewFrame; note that you must not use this object outside the callback, because it has no state of its own and its behavior outside the callback is unpredictable! It provides two useful methods, rgba() and gray(), returning the frame as an RGBA image and as a grayscale image. The return value of onCameraFrame is an RGBA image — this matters! The processed image must be RGBA or Android cannot display it correctly. From the OpenCV documentation:
Note Do not save or use CvCameraViewFrame object out of onCameraFrame callback. This object does not have its own state and its behavior out of callback is unpredictable!
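A minimal listener-side sketch of the interface described above (class name is illustrative; assumes the OpenCV for Android library is on the classpath, so it is not runnable standalone):

```java
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;
import org.opencv.core.Mat;

// Illustrative listener: receives each preview frame and must return RGBA.
public class PreviewListener implements CvCameraViewListener2 {
    private Mat mRgba; // reused across frames to avoid per-frame allocation

    @Override
    public void onCameraViewStarted(int width, int height) {
        mRgba = new Mat(); // camera opened: allocate working buffers here
    }

    @Override
    public void onCameraViewStopped() {
        if (mRgba != null) mRgba.release(); // camera closed: free native memory
    }

    @Override
    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
        // Do NOT keep a reference to inputFrame outside this callback.
        mRgba = inputFrame.rgba(); // RGBA view of the frame
        // ... process mRgba in place here ...
        return mRgba;              // must be RGBA for correct display
    }
}
```

The listener is then attached to the JavaCameraView (or NativeCameraView) with setCvCameraViewListener.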
- On passing camera preview data to native code: this is important! I have tried many approaches; the broad options are:
①Pass an image file path: the worst option. I used it early on; it is very slow and mainly useful during initial development to verify that the Java and native layers can call each other.
②Pass the preview frame's byte array to the native layer, then convert it to RGB or RGBA (which one depends on whether your image-processing function can handle RGBA; if it can, RGBA is recommended, since the value you return is also RGBA). Many articles discuss the conversion: one way is a custom conversion function (easy to find by searching); another is to use OpenCV's Mat and cvtColor. Then call the processing function; when it finishes, store the result in an int array (effectively RGB or RGBA pixel data), and finally turn that into a Bitmap with one of Bitmap's methods and return it. This approach is also slow, but noticeably faster than the first; see the book recommended below for a concrete implementation.
③Use the OpenCV camera, either JavaCamera or NativeCamera: the advantage is that much is already wrapped for you. The preview frame's Mat can be handed to the native layer directly by passing the Mat's native memory address (a long); the native side simply wraps that address back into a Mat and processes it. The callback's return value is also a Mat, which is very convenient! This approach is fast. For details see Tutorial2-MixedProcessing among the samples of the OpenCV Android SDK.
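For approach ②, the "custom conversion function" is typically an NV21 (YUV420SP) decoder. A minimal dependency-free sketch (class and method names are illustrative; assumes BT.601 video-range coefficients, which is what most Android cameras produce):

```java
/** Decode an NV21 (YUV420SP) buffer into packed ARGB ints. */
public class Nv21Converter {
    public static int[] nv21ToArgb(byte[] nv21, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        for (int j = 0; j < height; j++) {
            // The interleaved VU plane starts after the Y plane; one VU pair
            // is shared by a 2x2 block of pixels.
            int uvp = frameSize + (j >> 1) * width;
            int u = 0, v = 0;
            for (int i = 0; i < width; i++) {
                int y = (0xff & nv21[j * width + i]) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) {        // refresh chroma every 2 columns
                    v = (0xff & nv21[uvp++]) - 128;
                    u = (0xff & nv21[uvp++]) - 128;
                }
                // Fixed-point BT.601 conversion, scaled by 1024.
                int y1192 = 1192 * y;
                int r = y1192 + 1634 * v;
                int g = y1192 - 833 * v - 400 * u;
                int b = y1192 + 2066 * u;
                r = Math.min(Math.max(r, 0), 262143);
                g = Math.min(Math.max(g, 0), 262143);
                b = Math.min(Math.max(b, 0), 262143);
                argb[j * width + i] = 0xff000000
                        | ((r << 6) & 0xff0000)
                        | ((g >> 2) & 0xff00)
                        | ((b >> 10) & 0xff);
            }
        }
        return argb;
    }
}
```

The resulting int array can be handed straight to Bitmap.createBitmap on the Java side.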
- On the rotated preview problem: quite often (typically after forcing the app into portrait mode), the OpenCV camera preview appears rotated by 90 degrees. The cause is that the OpenCV camera runs in landscape mode by default. One workable, though inelegant, fix is to modify the deliverAndDrawFrame method of the org.opencv.android.CameraBridgeViewBase class in the OpenCV library (see the linked discussion of the problem):
protected void deliverAndDrawFrame(CvCameraViewFrame frame) {
    Mat modified;

    if (mListener != null) {
        modified = mListener.onCameraFrame(frame);
    } else {
        modified = frame.rgba();
    }

    boolean bmpValid = true;
    if (modified != null) {
        try {
            Utils.matToBitmap(modified, mCacheBitmap);
        } catch (Exception e) {
            Log.e(TAG, "Mat type: " + modified);
            Log.e(TAG, "Bitmap type: " + mCacheBitmap.getWidth() + "*" + mCacheBitmap.getHeight());
            Log.e(TAG, "Utils.matToBitmap() throws an exception: " + e.getMessage());
            bmpValid = false;
        }
    }

    if (bmpValid && mCacheBitmap != null) {
        Canvas canvas = getHolder().lockCanvas();
        if (canvas != null) {
            // Stock drawing code, disabled in favor of the rotation fix below:
            // canvas.drawColor(0, android.graphics.PorterDuff.Mode.CLEAR);
            // Log.d(TAG, "mStretch value: " + mScale);
            // if (mScale != 0) {
            //     canvas.drawBitmap(mCacheBitmap,
            //             new Rect(0, 0, mCacheBitmap.getWidth(), mCacheBitmap.getHeight()),
            //             new Rect((int) ((canvas.getWidth() - mScale * mCacheBitmap.getWidth()) / 2),
            //                      (int) ((canvas.getHeight() - mScale * mCacheBitmap.getHeight()) / 2),
            //                      (int) ((canvas.getWidth() - mScale * mCacheBitmap.getWidth()) / 2 + mScale * mCacheBitmap.getWidth()),
            //                      (int) ((canvas.getHeight() - mScale * mCacheBitmap.getHeight()) / 2 + mScale * mCacheBitmap.getHeight())), null);
            // } else {
            //     canvas.drawBitmap(mCacheBitmap,
            //             new Rect(0, 0, mCacheBitmap.getWidth(), mCacheBitmap.getHeight()),
            //             new Rect((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,
            //                      (canvas.getHeight() - mCacheBitmap.getHeight()) / 2,
            //                      (canvas.getWidth() - mCacheBitmap.getWidth()) / 2 + mCacheBitmap.getWidth(),
            //                      (canvas.getHeight() - mCacheBitmap.getHeight()) / 2 + mCacheBitmap.getHeight()), null);
            // }

            // ABC: fixed for image rotation
            // TODO Why does portrait mode not open in fullscreen?
            Matrix matrix = new Matrix();
            int height_Canvas = canvas.getHeight();
            int width_Canvas = canvas.getWidth();
            int width = mCacheBitmap.getWidth();
            int height = mCacheBitmap.getHeight();
            float f1 = (width_Canvas - width) / 2f;   // float division to avoid truncation
            float f2 = (height_Canvas - height) / 2f;
            matrix.preTranslate(f1, f2);
            if (getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT)
                matrix.postRotate(270f, width_Canvas / 2, height_Canvas / 2);
            canvas.drawBitmap(mCacheBitmap, matrix, new Paint());

            if (mFpsMeter != null) {
                mFpsMeter.measure();
                mFpsMeter.draw(canvas, 20, 30);
            }
            getHolder().unlockCanvasAndPost(canvas);
        }
    }
}
3. A general approach to integrated OpenCV and NDK development
When doing this kind of development, you need to decide how OpenCV will be used within Android and, if the camera is involved, consider the following:
First, are you porting existing C/C++ code? If so, lean toward NDK development; otherwise, write Java code against OpenCV for Android, whose efficiency is not much below that of native code.
Second, if you need the OpenCV library, can you tolerate requiring users to install OpenCV Manager alongside your app? If not, plan at development time to bundle the OpenCV binaries into the app for static initialization. That said, OpenCV Manager has real advantages; the papers above and the OpenCV website document its benefits and usage.
Next, do you need the camera? If so, will you use the stock Android Camera or OpenCV's camera, and, for the latter, the Java camera or the native one?
Finally, how will images be passed around? For a single static image, a file path suffices; for live video you must pass the image data itself. That raises the questions of how to obtain the preview data in Android, how to hand it down to the native layer, how to convert it (usually YUV to RGB) so that OpenCV can process it, and how to return the processed image to the Java layer.
A recommended book is Mastering OpenCV with Practical Computer Vision Projects (the ebook can be downloaded from 皮皮书屋, and the original source code is on GitHub). Its first chapter shows how to build an Android project using OpenCV, Cartoonifier and Skin Changer for Android, which touches OpenCV on Android from every angle and uses the second data-passing approach above. The author suggests several optimizations, including:
①Prefer Mat over IplImage.
②Make sure your image-processing functions can handle RGBA input.
③Where possible, downscale the image before processing it.
④Use a noise filter to reduce image noise.
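Tip ③ can be illustrated with a minimal, dependency-free 2x box downscale (in a real project you would use OpenCV's Imgproc.resize; the class name here is illustrative):

```java
/** Halve an ARGB image with 2x2 box averaging before heavier processing. */
public class Downscaler {
    public static int[] halve(int[] src, int w, int h) {
        int nw = w / 2, nh = h / 2;
        int[] dst = new int[nw * nh];
        for (int y = 0; y < nh; y++) {
            for (int x = 0; x < nw; x++) {
                int a = 0, r = 0, g = 0, b = 0;
                // Average the 2x2 block of source pixels per channel.
                for (int dy = 0; dy < 2; dy++) {
                    for (int dx = 0; dx < 2; dx++) {
                        int p = src[(2 * y + dy) * w + (2 * x + dx)];
                        a += p >>> 24;
                        r += (p >> 16) & 0xff;
                        g += (p >> 8) & 0xff;
                        b += p & 0xff;
                    }
                }
                dst[y * nw + x] = ((a / 4) << 24) | ((r / 4) << 16) | ((g / 4) << 8) | (b / 4);
            }
        }
        return dst;
    }
}
```

A quarter of the pixels means roughly a quarter of the per-frame processing cost for most pixel-wise operations.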