I needed OpenCV for an Android project. I went through a lot of tutorials online and hit quite a few pitfalls before getting it to work, so I am recording the configuration process I actually went through, for the benefit of whoever comes next.
1 Go to the OpenCV website and download OpenCV for Android
2 Install Android Studio
Getting started with the configuration
Create a new project. Once it is created, you can download the Android NDK through Android Studio.
The Android NDK is usually not installed by default; tick it, click Apply, and wait for it to download and install.
PS: If you need this and are not sure how, refer to the following link.
Copy the native folder from the OpenCV for Android package downloaded from the OpenCV website into the root directory of the newly created project, as shown below.
Before copying:
After copying:
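As a rough sketch of the project root after the copy (assuming the default Android Studio layout with an app module, which is also what the relative include path used later in Android.mk assumes):
<project root>/
    app/
        src/main/...
        build.gradle
    native/        <- copied from the OpenCV for Android SDK; contains jni/OpenCV.mk and libs/<abi>/
    build.gradle
    settings.gradle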
Edit the gradle.properties file and add the property below so the legacy NDK integration is used (without it, the experimental NDK build tools would be used):
android.useDeprecatedNdk=true
Configure the NDK directory in the local.properties file:
ndk.dir=D:\Android\sdk\ndk-bundle
sdk.dir=D:\Android\sdk
(If you installed the Android NDK through Android Studio, this step can be skipped, because Android Studio configures it automatically when it installs the NDK.)
Edit build.gradle and add the following code inside the android block:
sourceSets.main.jni.srcDirs = []
//disable the built-in NDK integration
sourceSets.main.jniLibs.srcDirs = ['src/main/libs','src/main/jniLibs']
//redirect the .so directories to src/main/libs and src/main/jniLibs (the default is src/main/jniLibs)
task ndkBuild(type: Exec, description: 'Compile JNI source with NDK') {
Properties properties = new Properties()
properties.load(project.rootProject.file('local.properties').newDataInputStream())
def ndkDir = properties.getProperty('ndk.dir')
if (org.apache.tools.ant.taskdefs.condition.Os.isFamily(org.apache.tools.ant.taskdefs.condition.Os.FAMILY_WINDOWS)) {
commandLine "$ndkDir/ndk-build.cmd", '-C', file('src/main/jni').absolutePath
} else {
commandLine "$ndkDir/ndk-build", '-C', file('src/main/jni').absolutePath
}
}
tasks.withType(JavaCompile) {
compileTask -> compileTask.dependsOn ndkBuild
}
task ndkClean(type: Exec, description: 'Clean NDK Binaries') {
Properties properties = new Properties()
properties.load(project.rootProject.file('local.properties').newDataInputStream())
def ndkDir = properties.getProperty('ndk.dir')
if (org.apache.tools.ant.taskdefs.condition.Os.isFamily(org.apache.tools.ant.taskdefs.condition.Os.FAMILY_WINDOWS)) {
commandLine "$ndkDir/ndk-build.cmd",'clean', '-C', file('src/main/jni').absolutePath
} else {
commandLine "$ndkDir/ndk-build",'clean', '-C', file('src/main/jni').absolutePath
}
}
clean.dependsOn 'ndkClean'
Before adding:
After adding (the code is too long, so the screenshot is cut off):
Create a jni directory under main, and create Android.mk and Application.mk inside jni.
Add the following code to Android.mk:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
OPENCV_INSTALL_MODULES := on
OPENCV_CAMERA_MODULES := off
OPENCV_LIB_TYPE :=STATIC
ifeq ("$(wildcard $(OPENCV_MK_PATH))","")
include ../../../../native/jni/OpenCV.mk
else
include $(OPENCV_MK_PATH)
endif
LOCAL_MODULE := OpenCV
LOCAL_SRC_FILES :=
LOCAL_LDLIBS += -lm -llog
include $(BUILD_SHARED_LIBRARY)
Add the following code to Application.mk:
APP_STL := gnustl_static
APP_CPPFLAGS := -frtti -fexceptions
APP_ABI := armeabi armeabi-v7a
APP_PLATFORM := android-8
Now do a Gradle build. If it successfully produces the .so files, the configuration is fine. As shown below: click Gradle on the right edge of Android Studio to expand the panel, find ndkBuild under app -> other (refresh if it is not there), and run it to build.
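The same task can also be run from the Terminal window; a minimal sketch, assuming the standard Gradle wrapper in the project root and the module named app:
gradlew.bat :app:ndkBuild     (Windows)
./gradlew :app:ndkBuild       (Linux/macOS)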
In the folder holding the Java files that came with the new project, create a new Java file and declare the native method in it. Here the goal is a grayscale conversion, so the method is named gray.
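A minimal sketch of what this file can look like (the package and class name are the ones passed to javah below; the library name comes from LOCAL_MODULE := OpenCV in Android.mk):
package com.example.administrator.tryopencv;

public class OpenCVHelper {
    static {
        // load the JNI library produced by ndk-build (LOCAL_MODULE := OpenCV -> libOpenCV.so)
        System.loadLibrary("OpenCV");
    }
    // grays out the ARGB pixels of a w*h bitmap in native code and returns the new pixels
    public static native int[] gray(int[] buf, int w, int h);
}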
Generate the header file with the javah command, using the Terminal window at the bottom of AS (Android Studio).
As shown:
Watch out, there is a big pitfall here, especially if you have not used javah before: do not cd into the directory that contains the Java file you just created and run the command there, or you will get a "class XXX not found" error. You must cd into the java directory (the directory cd'd into in the screenshot above) and then run:
javah -d ../jni com.example.administrator.tryopencv.OpenCVHelper
The -d ../jni part specifies the output directory. The class name after it must be fully qualified, starting from com and listing every package level, i.e. written as com.example.administrator.tryopencv.OpenCVHelper.
Done this way it works, and the generated header file appears in the jni directory, as shown above.
The contents of the header file are as follows:
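A sketch of what javah typically generates for this class (matching the signature used in the .cpp file below):
/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
#ifndef _Included_com_example_administrator_tryopencv_OpenCVHelper
#define _Included_com_example_administrator_tryopencv_OpenCVHelper
#ifdef __cplusplus
extern "C" {
#endif
/*
 * Class:     com_example_administrator_tryopencv_OpenCVHelper
 * Method:    gray
 * Signature: ([III)[I
 */
JNIEXPORT jintArray JNICALL Java_com_example_administrator_tryopencv_OpenCVHelper_gray
  (JNIEnv *, jclass, jintArray, jint, jint);
#ifdef __cplusplus
}
#endif
#endif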
Then create the .cpp file corresponding to the .h that was just generated. Pay attention to the file name and the function name: they must match exactly.
Once it is created, write the implementation in it, which grayscales the image:
//
// Created by Administrator on 2015/12/22.
//
#include "com_example_administrator_tryopencv_OpenCVHelper.h"
#include <jni.h>
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>
using namespace cv;
extern "C" {
JNIEXPORT jintArray JNICALL Java_com_example_administrator_tryopencv_OpenCVHelper_gray(JNIEnv *env,jclass obj,jintArray buf,int w,int h);
JNIEXPORT jintArray JNICALL Java_com_example_administrator_tryopencv_OpenCVHelper_gray(JNIEnv *env,jclass obj,jintArray buf,int w,int h)
{
jint *cbuf;
cbuf = env->GetIntArrayElements(buf, NULL);
if (NULL == cbuf)
{
return 0;
}
Mat imgData(h,w,CV_8UC4,(unsigned char*) cbuf);
u_char *ptr = imgData.ptr(0);
for (int i = 0; i < w*h; ++i)
{
//in memory each ARGB_8888 pixel is laid out as B,G,R,A on a little-endian device
int grayScale = (int)(ptr[4*i+2]*0.299 + ptr[4*i+1]*0.587 + ptr[4*i+0]*0.114 );
ptr[4*i+0] = grayScale;
ptr[4*i+1] = grayScale;
ptr[4*i+2] = grayScale;
}
int size = w * h;
jintArray result = env->NewIntArray(size);
env->SetIntArrayRegion(result,0,size,cbuf);
env->ReleaseIntArrayElements(buf,cbuf,0);
return result;
}
}
Then add the following to the Android.mk file:
LOCAL_SRC_FILES :=com_example_administrator_tryopencv_OpenCVHelper.cpp
Then just write some test code in your main Java code that calls it.
Here is the layout file:
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:paddingBottom="@dimen/activity_vertical_margin"
android:paddingLeft="@dimen/activity_horizontal_margin"
android:paddingRight="@dimen/activity_horizontal_margin"
android:paddingTop="@dimen/activity_vertical_margin"
tools:context="com.example.administrator.choosepic.PicActivity">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="choose picture!"
android:id="@+id/title"/>
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="灰度图"
android:id="@+id/GiveFaceButton"
android:layout_below="@+id/title"
android:layout_alignLeft="@+id/GetFaceButton"
android:layout_alignStart="@+id/GetFaceButton"
android:layout_marginTop="71dp" />
<ImageView
android:adjustViewBounds="true"
android:maxHeight="400dp"
android:maxWidth="200dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:scaleType="fitStart"
android:id="@+id/IV01" />
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="原图"
android:id="@+id/GetFaceButton"
android:layout_centerVertical="true"
android:layout_alignParentRight="true"
android:layout_alignParentEnd="true" />
<ImageView
android:adjustViewBounds="true"
android:maxHeight="400dp"
android:maxWidth="200dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:scaleType="fitStart"
android:id="@+id/IV02"
android:layout_below="@id/IV01" />
</RelativeLayout>
Here is the code in the Java file:
package com.example.administrator.tryopencv;
import android.content.ContentResolver;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import java.io.FileNotFoundException;
import java.io.InputStream;
public class tryOpenCvActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_try_open_cv);
Button GiveFaceButton = (Button)findViewById(R.id.GiveFaceButton);
GiveFaceButton.setOnClickListener(new Button.OnClickListener()
{
public void onClick(View v)
{
Intent myIntent = new Intent();
//open the picture picker; set the type to image/*
myIntent.setType("image/*");
//use the Intent.ACTION_GET_CONTENT action
myIntent.setAction(Intent.ACTION_GET_CONTENT);
/*return to this screen once the photo has been picked*/
startActivityForResult(myIntent,1);
}
});
Button GetFaceButton = (Button)findViewById(R.id.GetFaceButton);
GetFaceButton.setOnClickListener(new Button.OnClickListener()
{
public void onClick(View v)
{
Intent myIntent = new Intent();
//open the picture picker; set the type to image/*
myIntent.setType("image/*");
//use the Intent.ACTION_GET_CONTENT action
myIntent.setAction(Intent.ACTION_GET_CONTENT);
/*return to this screen once the photo has been picked*/
startActivityForResult(myIntent,2);
}
});
}
protected void onActivityResult(int requestCode,int resultCode,Intent data)
{
switch (requestCode)
{
case 1 :
Uri uri = data.getData();
Log.e("uri",uri.toString());
ContentResolver cr = this.getContentResolver();
try
{
InputStream input = cr.openInputStream(uri);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeStream(input, null, options);
if (options.outWidth > 1024 || options.outHeight > 1024){
options.inSampleSize = Math.max(options.outWidth / 1024, options.outHeight/1024);
}
options.inJustDecodeBounds = false;
Bitmap bitmap = BitmapFactory.decodeStream(cr.openInputStream(uri), null, options);
/*grayscale processing below*/
int w = bitmap.getWidth();
int h = bitmap.getHeight();
int[] pix = new int[w * h];
bitmap.getPixels(pix, 0, w, 0, 0, w, h);
int[] resultPixels = OpenCVHelper.gray(pix, w, h);
Bitmap result = Bitmap.createBitmap(w, h, Bitmap.Config.RGB_565);
result.setPixels(resultPixels,0,w,0,0,w,h);
ImageView imageView = (ImageView)findViewById(R.id.IV01);
/*set the processed grayscale image on the ImageView*/
imageView.setImageBitmap(result);
}
catch(FileNotFoundException e)
{
Log.e("Exception",e.getMessage(),e);
}
break;
case 2:
Uri uri2 = data.getData();
Log.e("uri2",uri2.toString());
ContentResolver cr2 = this.getContentResolver();
try
{
InputStream input = cr2.openInputStream(uri2);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeStream(input, null, options);
if (options.outWidth > 1024 || options.outHeight > 1024){
options.inSampleSize = Math.max(options.outWidth / 1024, options.outHeight/1024);
}
options.inJustDecodeBounds = false;
Bitmap bitmap = BitmapFactory.decodeStream(cr2.openInputStream(uri2), null, options);
ImageView imageView = (ImageView)findViewById(R.id.IV02);
/*set the Bitmap on the ImageView*/
imageView.setImageBitmap(bitmap);
}
catch(FileNotFoundException e)
{
Log.e("Exception",e.getMessage(),e);
}
break;
default:
break;
}
super.onActivityResult(requestCode, resultCode,data);
}
}
Then you can run it.
The preparation is the same as in Method 1; in fact, follow Method 1 up to the step in "3 Configure the build environment" where Android.mk and Application.mk are edited.
Here we need to build a dynamic (shared) library, so Android.mk is slightly different.
Its contents are as follows:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
OPENCV_INSTALL_MODULES := on
OPENCV_CAMERA_MODULES := off
OPENCV_LIB_TYPE :=SHARED
ifeq ("$(wildcard $(OPENCV_MK_PATH))","")
include ../../../../native/jni/OpenCV.mk
else
include $(OPENCV_MK_PATH)
endif
LOCAL_MODULE := OpenCV
LOCAL_SRC_FILES :=
LOCAL_LDLIBS += -lm -llog
include $(BUILD_SHARED_LIBRARY)
The two main changes are:
OPENCV_LIB_TYPE :=SHARED (changed from static to shared)
LOCAL_SRC_FILES := (left empty, because there is no native C++ code of our own here)
Then add the OpenCV Java code to the project: copy the org folder from the previously downloaded OpenCV for Android SDK into the main folder.
As shown:
Create a new aidl folder under the main folder.
Copy the org/opencv/engine/XXX.aidl file, together with its directory structure, into the folder just created.
As shown:
The dynamically built libOpenCV.so also depends on libopencv_java3.so, which can be found under native/libs; adding the armeabi and armeabi-v7a versions is usually enough. Create a jniLibs folder under main and copy libopencv_java3.so from armeabi and armeabi-v7a into jniLibs, keeping the ABI folders. There is also one resource file: copy attrs.xml from sdk\java\res\values into the project's res\values directory, otherwise the camera views will not recognize some attributes.
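A rough sketch of the resulting layout (paths assume the app module and the jniLibs source set configured earlier):
app/src/main/jniLibs/armeabi/libopencv_java3.so
app/src/main/jniLibs/armeabi-v7a/libopencv_java3.so
app/src/main/res/values/attrs.xml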
As shown:
The layout file:
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:paddingBottom="@dimen/activity_vertical_margin"
android:paddingLeft="@dimen/activity_horizontal_margin"
android:paddingRight="@dimen/activity_horizontal_margin"
android:paddingTop="@dimen/activity_vertical_margin"
tools:context="com.example.administrator.tryopencv2.tryOpenCv2Activity">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="choose picture!"
android:id="@+id/title"/>
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="原图"
android:id="@+id/GiveFaceButton"
android:layout_below="@+id/title"
android:layout_alignLeft="@+id/GetFaceButton"
android:layout_alignStart="@+id/GetFaceButton"
android:layout_marginTop="71dp" />
<ImageView
android:adjustViewBounds="true"
android:maxHeight="400dp"
android:maxWidth="200dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:scaleType="fitStart"
android:id="@+id/IV01" />
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="灰度图"
android:id="@+id/GetFaceButton"
android:layout_centerVertical="true"
android:layout_alignParentRight="true"
android:layout_alignParentEnd="true" />
<ImageView
android:adjustViewBounds="true"
android:maxHeight="400dp"
android:maxWidth="200dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:scaleType="fitStart"
android:id="@+id/IV02"
android:layout_below="@id/IV01" />
</RelativeLayout>
The main Java file:
package com.example.administrator.tryopencv2;
import android.content.ContentResolver;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import java.io.FileNotFoundException;
import java.io.InputStream;
import org.opencv.android.OpenCVLoader;
import org.opencv.android.Utils;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;
public class tryOpenCv2Activity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_try_open_cv2 );
Button GiveFaceButton = (Button)findViewById(R.id.GiveFaceButton);
GiveFaceButton.setOnClickListener(new Button.OnClickListener()
{
public void onClick(View v)
{
Intent myIntent = new Intent();
//open the picture picker; set the type to image/*
myIntent.setType("image/*");
//use the Intent.ACTION_GET_CONTENT action
myIntent.setAction(Intent.ACTION_GET_CONTENT);
/*return to this screen once the photo has been picked*/
startActivityForResult(myIntent,1);
}
});
Button GetFaceButton = (Button)findViewById(R.id.GetFaceButton);
GetFaceButton.setOnClickListener(new Button.OnClickListener()
{
public void onClick(View v)
{
Intent myIntent = new Intent();
//open the picture picker; set the type to image/*
myIntent.setType("image/*");
//use the Intent.ACTION_GET_CONTENT action
myIntent.setAction(Intent.ACTION_GET_CONTENT);
/*return to this screen once the photo has been picked*/
startActivityForResult(myIntent,2);
}
});
}
protected void onActivityResult(int requestCode,int resultCode,Intent data)
{
switch (requestCode)
{
case 1 :
Uri uri = data.getData();
Log.e("uri",uri.toString());
ContentResolver cr = this.getContentResolver();
try
{
InputStream input = cr.openInputStream(uri);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeStream(input, null, options);
if (options.outWidth > 1024 || options.outHeight > 1024){
options.inSampleSize = Math.max(options.outWidth / 1024, options.outHeight/1024);
}
options.inJustDecodeBounds = false;
Bitmap bitmap = BitmapFactory.decodeStream(cr.openInputStream(uri), null, options);
/*grayscale processing (calling the native C++ function through JNI)*/
/*
int w = bitmap.getWidth();
int h = bitmap.getHeight();
int[] pix = new int[w * h];
bitmap.getPixels(pix, 0, w, 0, 0, w, h);
int[] resultPixels = OpenCVHelper.gray(pix, w, h);
Bitmap result = Bitmap.createBitmap(w, h, Bitmap.Config.RGB_565);
result.setPixels(resultPixels,0,w,0,0,w,h);
*/
ImageView imageView = (ImageView)findViewById(R.id.IV01);
/*set the bitmap on the ImageView (the original image here, since the native grayscale call above is commented out)*/
imageView.setImageBitmap(bitmap);
}
catch(FileNotFoundException e)
{
Log.e("Exception",e.getMessage(),e);
}
break;
case 2:
Uri uri2 = data.getData();
Log.e("uri2",uri2.toString());
ContentResolver cr2 = this.getContentResolver();
try
{
InputStream input = cr2.openInputStream(uri2);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeStream(input, null, options);
if (options.outWidth > 1024 || options.outHeight > 1024){
options.inSampleSize = Math.max(options.outWidth / 1024, options.outHeight/1024);
}
options.inJustDecodeBounds = false;
Bitmap bitmap = BitmapFactory.decodeStream(cr2.openInputStream(uri2), null, options);
/*grayscale processing using the OpenCV Java package*/
OpenCVLoader.initDebug();
Mat rgbMat = new Mat();
Mat grayMat = new Mat();
Bitmap grayBitmap = Bitmap.createBitmap(bitmap.getWidth(),bitmap.getHeight(),Bitmap.Config.RGB_565);
Utils.bitmapToMat(bitmap,rgbMat);
Imgproc.cvtColor(rgbMat, grayMat, Imgproc.COLOR_RGB2GRAY);
Utils.matToBitmap(grayMat,grayBitmap);
/*set grayBitmap on the ImageView*/
ImageView imageView = (ImageView)findViewById(R.id.IV02);
imageView.setImageBitmap(grayBitmap);
}
catch(FileNotFoundException e)
{
Log.e("Exception",e.getMessage(),e);
}
break;
default:
break;
}
super.onActivityResult(requestCode, resultCode, data);
}
}
Now just run it. While running you may hit an error in one of the OpenCV source files saying the R package cannot be found; when that happens, open that file and import your own R package.
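For example, at the top of the offending file (a sketch; the package below is this project's, adjust it to your own application package):
import com.example.administrator.tryopencv2.R;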
The result:
Finally finished writing, what a slog!! For this project I also dug into Android JNI calls in Android Studio; the material online is scarce and unclear, so I will write that tutorial a bit later!!