Please credit the original source when reposting. Thanks!
How does the camera service actually work? To answer that, we first need to understand how the classes that make up the Service Framework are connected through Binder RPC. Figure 1 shows the relationships between these classes and Binder RPC across three different parts.
(a) The Camera class inherits from ICameraClient and is responsible for passing Binder RPC data between the application and the camera service.
(b) The CameraService class inherits from ICameraService and is responsible for the connection between the application and the camera service.
(c) The CameraService::Client class inherits from ICamera and is responsible for camera device settings, control, and events coming from the camera device.
The specific relationships between these classes and Binder RPC are:
(1) The native methods of android.hardware.Camera call member functions of the native Camera class through JNI.
(2) When the application connects to the camera service, Camera performs Binder RPC operations through BpCameraService (the service proxy) and BnCameraService (the service stub), i.e. Binder RPC interactions over the ICameraService interface.
(3) When the application requests the camera device or the preview function, Camera performs Binder RPC operations through BpCamera (the service proxy) and BnCamera, i.e. Binder RPC interactions over the ICamera interface.
(4) When an event occurs on the camera device, CameraService::Client performs Binder RPC operations through BpCameraClient and BnCameraClient, i.e. Binder RPC interactions over the ICameraClient interface.
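To make that pairing concrete, here is a minimal, non-verbatim sketch of how the three interfaces line up on either side of the Binder boundary (class names follow the AOSP Bp&lt;Interface&gt;/Bn&lt;Interface&gt; convention; the member names are illustrative only):

// Sketch only -- not the actual framework headers.
// Application-side native library (libcamera_client):
class Camera : public BnCameraClient {        // stub side of ICameraClient: receives callbacks
    sp<ICameraService> mCameraService;        // proxy (BpCameraService) to CameraService
    sp<ICamera>        mCamera;               // proxy (BpCamera) to CameraService::Client
};

// Service side (lives in the mediaserver process):
class CameraService : public BnCameraService {    // stub side of ICameraService
    class Client : public BnCamera {              // stub side of ICamera
        sp<ICameraClient> mCameraClient;          // proxy (BpCameraClient) back to the app's Camera
    };
};

The rest of this article walks through how these pieces come into being, starting with how CameraService itself is brought up inside mediaserver: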
/frameworks/base/media/mediaserver/main_mediaserver.cpp
int main(int argc, char** argv)
{
    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm = defaultServiceManager();
    LOGI("ServiceManager: %p", sm.get());
    // add for coredump, check only in debug mode
    {
        char value[PROPERTY_VALUE_MAX];
        property_get("ro.debuggable", value, "0");
        if (value[0] == '1')
        {
            struct rlimit rl;
            rl.rlim_cur = -1;
            rl.rlim_max = -1;
            setrlimit(4, &rl);    // 4 == RLIMIT_CORE: allow unlimited core dumps
        }
    }
    VolumeManager::instantiate();    // VolumeManager has to be started before AudioFlinger
    AudioFlinger::instantiate();
    MediaPlayerService::instantiate();
    CameraService::instantiate();
    AudioPolicyService::instantiate();
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
}
But when we go looking for the source of CameraService::instantiate(), there is a problem: CameraService.cpp does not define an instantiate() method at all. So where does it come from?
The clue is in frameworks/base/media/mediaserver/main_mediaserver.cpp:
#include <binder/IPCThreadState.h>
#include <binder/ProcessState.h>
#include <binder/IServiceManager.h>
#include <utils/Log.h>
...
#include <AudioFlinger.h>
#include <CameraService.h>
#include <MediaPlayerService.h>
#include <AudioPolicyService.h>
...
So let's go to frameworks/base/services/camera/libcameraservice/CameraService.h and take a look:
class CameraService :
    public BinderService<CameraService>,
    public BnCameraService
{
    class Client;
    friend class BinderService<CameraService>;
public:
    static char const* getServiceName() { return "media.camera"; }

    CameraService();
    virtual ~CameraService();

    virtual int32_t getNumberOfCameras();
    virtual status_t getCameraInfo(int cameraId,
    ...
From this definition we can see that CameraService inherits from both BinderService<CameraService> and BnCameraService, so CameraService::instantiate() is most likely inherited from a parent class. Sure enough, the parent class BinderService defines an instantiate() method, and it is a static method.
frameworks/base/include/binder/BinderService.h
namespace android {

template<typename SERVICE>
class BinderService
{
public:
    static status_t publish() {
        sp<IServiceManager> sm(defaultServiceManager());
        return sm->addService(String16(SERVICE::getServiceName()), new SERVICE());
    }

    static void publishAndJoinThreadPool() {
        sp<ProcessState> proc(ProcessState::self());
        sp<IServiceManager> sm(defaultServiceManager());
        sm->addService(String16(SERVICE::getServiceName()), new SERVICE());
        ProcessState::self()->startThreadPool();
        IPCThreadState::self()->joinThreadPool();
    }

    static void instantiate() { publish(); }

    static status_t shutdown() {
        return NO_ERROR;
    }
};

}; // namespace android
Notice that publish() is where CameraService registers itself with the service manager. The SERVICE used here is explained by the template declaration in the source:
template<typename SERVICE>
So SERVICE is a template parameter. Since the service being registered here is CameraService, we can substitute CameraService for SERVICE, and the addService() line becomes:
return sm->addService(String16(CameraService::getServiceName()), new CameraService());
With that, the camera service has been registered with the ServiceManager and is available to clients at any time.
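As a quick sanity check, any native process can look the service up by the same name. A minimal sketch, assuming the ICS-era header paths used throughout this article (error handling kept to a minimum):

#define LOG_TAG "CameraServiceCheck"
#include <utils/Log.h>
#include <utils/String16.h>
#include <binder/IServiceManager.h>
#include <camera/ICameraService.h>

using namespace android;

// Confirm from any native process that "media.camera" has been registered.
// checkService() is a non-blocking lookup; getService() would retry/wait instead.
void checkCameraServiceRegistered()
{
    sp<IServiceManager> sm = defaultServiceManager();
    sp<IBinder> binder = sm->checkService(String16("media.camera"));
    if (binder != 0) {
        sp<ICameraService> cs = interface_cast<ICameraService>(binder);
        LOGI("media.camera is up, %d camera(s) reported", cs->getNumberOfCameras());
    } else {
        LOGW("media.camera is not registered yet");
    }
}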
After compilation, main_mediaserver.cpp produces out/target/product/sp8825ea/system/bin/mediaserver.
The mediaserver main() is started by init.rc at boot, so the camera service is registered for Binder communication as soon as the device boots:
service media /system/bin/mediaserver
class main
user media
group system audio camera graphics inet net_bt net_bt_admin net_bw_acct drmrpc radio
ioprio rt 4
Before using the camera, an application must first connect to the camera service (from the application's point of view, connecting simply means calling the open() method). During this connection, the ICameraService Binder RPC acts as the bridge. Once the connection is established, the corresponding Binder RPC relationships are fixed; they are then used to configure the camera device, send commands, and receive events.
(1) The application calls the open() method of android.hardware.Camera.
(2) open() calls native_setup().
(3) native_setup() calls the android_hardware_Camera_native_setup() function through JNI.
(4) android_hardware_Camera_native_setup() calls the connect() member function of the native Camera class.
(5) Camera::connect() obtains the camera service information from the Context Manager (service manager), creates the service proxy (BpCameraService), and connects to the camera service stub (BnCameraService) over Binder RPC.
(6) The actual connection is handled by CameraService's connect() method.
For example, in my demo:
public void surfaceCreated(SurfaceHolder holder)
{
// int nCameras = Camera.getNumberOfCameras();
mCamera = Camera.open(0);
try {
Log.i(TAG, "SurfaceHolder.Callback:surface Created");
mCamera.setPreviewDisplay(mSurfaceHolder);// set the surface to be
// used for live preview
mCamera.setPreviewCallback(previewCallback);
mCamera.setErrorCallback(errorCallback);
mCamera.startPreview();
.......................................
frameworks/base/core/java/android/hardware/Camera.java
public static Camera open() {
int numberOfCameras = getNumberOfCameras();
CameraInfo cameraInfo = new CameraInfo();
for (int i = 0; i < numberOfCameras; i++) {
getCameraInfo(i, cameraInfo);
if (cameraInfo.facing == CameraInfo.CAMERA_FACING_BACK) {
return new Camera(i);
}
}
return null;
}
Camera(int cameraId) {
mShutterCallback = null;
mRawImageCallback = null;
mJpegCallback = null;
mPreviewCallback = null;
mPostviewCallback = null;
mZoomListener = null;
Looper looper;
if ((looper = Looper.myLooper()) != null) {
mEventHandler = new EventHandler(this, looper);
} else if ((looper = Looper.getMainLooper()) != null) {
mEventHandler = new EventHandler(this, looper);
} else {
mEventHandler = null;
}
native_setup(new WeakReference<Camera>(this), cameraId);
}
frameworks/base/core/jni/android_hardware_Camera.cpp
static JNINativeMethod camMethods[] = {
{ "getNumberOfCameras",
"()I",
(void *)android_hardware_Camera_getNumberOfCameras },
{ "getCameraInfo",
"(ILandroid/hardware/Camera$CameraInfo;)V",
(void*)android_hardware_Camera_getCameraInfo },
{ "native_setup",
"(Ljava/lang/Object;I)V",
(void*)android_hardware_Camera_native_setup },
{ "native_release",
"()V",
(void*)android_hardware_Camera_release },
{ "setPreviewDisplay",
"(Landroid/view/Surface;)V",
(void *)android_hardware_Camera_setPreviewDisplay },
Now look at android_hardware_Camera_native_setup():
static void android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
    jobject weak_this, jint cameraId)
{
    sp<Camera> camera = Camera::connect(cameraId);
    if (camera == NULL) {
        jniThrowRuntimeException(env, "Fail to connect to camera service");
        return;
    }
    // make sure camera hardware is alive
    if (camera->getStatus() != NO_ERROR) {
        jniThrowRuntimeException(env, "Camera initialization failed");
        return;
    }
    jclass clazz = env->GetObjectClass(thiz);
    if (clazz == NULL) {
        jniThrowRuntimeException(env, "Can't find android/hardware/Camera");
        return;
    }
    // We use a weak reference so the Camera object can be garbage collected.
    // The reference is only used as a proxy for callbacks.
    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
    context->incStrong(thiz);
    camera->setListener(context);
    // save context in opaque field
    env->SetIntField(thiz, fields.context, (int)context.get());
}
frameworks/base/libs/camera/Camera.cpp
sp<Camera> Camera::connect(int cameraId)
{
    LOGV("connect");
    sp<Camera> c = new Camera();
    const sp<ICameraService>& cs = getCameraService();
    if (cs != 0) {
        c->mCamera = cs->connect(c, cameraId);
    }
    if (c->mCamera != 0) {
        c->mCamera->asBinder()->linkToDeath(c);
        c->mStatus = NO_ERROR;
    } else {
        c.clear();
    }
    return c;
}
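The getCameraService() call above is where step (5) actually happens. A simplified sketch of what it does, condensed from the same era of Camera.cpp (treat the details as approximate):

// Simplified sketch of Camera::getCameraService() -- not a verbatim copy.
const sp<ICameraService>& Camera::getCameraService()
{
    Mutex::Autolock _l(mLock);
    if (mCameraService.get() == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            // Look up the service registered by CameraService::instantiate().
            binder = sm->getService(String16("media.camera"));
            if (binder != 0) break;
            LOGW("CameraService not published, waiting...");
            usleep(500000);    // retry every 0.5 s until mediaserver is up
        } while (true);
        if (mDeathNotifier == NULL) {
            mDeathNotifier = new DeathNotifier();
        }
        // Be notified if mediaserver dies, then turn the binder into BpCameraService.
        binder->linkToDeath(mDeathNotifier);
        mCameraService = interface_cast<ICameraService>(binder);
    }
    return mCameraService;
}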
frameworks/base/libs/camera/ICameraService.cpp
virtual sp<ICamera> connect(const sp<ICameraClient>& cameraClient, int cameraId)
{
    Parcel data, reply;
    data.writeInterfaceToken(ICameraService::getInterfaceDescriptor());
    data.writeStrongBinder(cameraClient->asBinder());
    data.writeInt32(cameraId);
    remote()->transact(BnCameraService::CONNECT, data, &reply);
    return interface_cast<ICamera>(reply.readStrongBinder());
}
frameworks/base/libs/camera/ICameraService.cpp
status_t BnCameraService::onTransact(
uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
switch(code) {
case GET_NUMBER_OF_CAMERAS: {
CHECK_INTERFACE(ICameraService, data, reply);
reply->writeInt32(getNumberOfCameras());
return NO_ERROR;
} break;
case GET_CAMERA_INFO: {
CHECK_INTERFACE(ICameraService, data, reply);
CameraInfo cameraInfo;
memset(&cameraInfo, 0, sizeof(cameraInfo));
status_t result = getCameraInfo(data.readInt32(), &cameraInfo);
reply->writeInt32(cameraInfo.facing);
reply->writeInt32(cameraInfo.orientation);
reply->writeInt32(result);
return NO_ERROR;
} break;
case CONNECT: {
CHECK_INTERFACE(ICameraService, data, reply);
sp<ICameraClient> cameraClient = interface_cast<ICameraClient>(data.readStrongBinder());
sp<ICamera> camera = connect(cameraClient, data.readInt32());
reply->writeStrongBinder(camera->asBinder());
return NO_ERROR;
} break;
default:
return BBinder::onTransact(code, data, reply, flags);
}
}
frameworks/base/services/camera/libcameraservice/CameraService.cpp
sp<ICamera> CameraService::connect(
const sp<ICameraClient>& cameraClient, int cameraId) {
int callingPid = getCallingPid();
sp<CameraHardwareInterface> hardware = NULL;
LOG1("CameraService::connect E (pid %d, id %d)", callingPid, cameraId);
if (!mModule) {
LOGE("Camera HAL module not loaded");
return NULL;
}
sp<Client> client;
if (cameraId < 0 || cameraId >= mNumberOfCameras) {
LOGE("CameraService::connect X (pid %d) rejected (invalid cameraId %d).",
callingPid, cameraId);
return NULL;
}
char value[PROPERTY_VALUE_MAX];
property_get("sys.secpolicy.camera.disabled", value, "0");
if (strcmp(value, "1") == 0) {
// Camera is disabled by DevicePolicyManager.
LOGI("Camera is disabled. connect X (pid %d) rejected", callingPid);
return NULL;
}
Mutex::Autolock lock(mServiceLock);
if (mClient[cameraId] != 0) {
client = mClient[cameraId].promote();
if (client != 0) {
if (cameraClient->asBinder() == client->getCameraClient()->asBinder()) {
LOG1("CameraService::connect X (pid %d) (the same client)",
callingPid);
return client;
} else {
LOGW("CameraService::connect X (pid %d) rejected (existing client).",
callingPid);
return NULL;
}
}
mClient[cameraId].clear();
}
if (mBusy[cameraId]) {
LOGW("CameraService::connect X (pid %d) rejected"
" (camera %d is still busy).", callingPid, cameraId);
return NULL;
}
struct camera_info info;
if (mModule->get_camera_info(cameraId, &info) != OK) {
LOGE("Invalid camera id %d", cameraId);
return NULL;
}
char camera_device_name[10];
snprintf(camera_device_name, sizeof(camera_device_name), "%d", cameraId);
hardware = new CameraHardwareInterface(camera_device_name);
if (hardware->initialize(&mModule->common) != OK) {
hardware.clear();
return NULL;
}
client = new Client(this, cameraClient, hardware, cameraId, info.facing, callingPid);
mClient[cameraId] = client;
LOG1("CameraService::connect X");
return client;
}
mModule is instantiated the first time the CameraService object is referenced;
onFirstRef() overrides the parent class's method:
void CameraService::onFirstRef()
{
BnCameraService::onFirstRef();
if (hw_get_module(CAMERA_HARDWARE_MODULE_ID,
(const hw_module_t **)&mModule) < 0) {
LOGE("Could not load camera HAL module");
mNumberOfCameras = 0;
}
else {
mNumberOfCameras = mModule->get_number_of_cameras();
LOGE("CameraService::onFirstRef mNumberOfCameras=%d",mNumberOfCameras);
if (mNumberOfCameras > MAX_CAMERAS) {
LOGE("Number of cameras(%d) > MAX_CAMERAS(%d).",
mNumberOfCameras, MAX_CAMERAS);
mNumberOfCameras = MAX_CAMERAS;
}
//for (int i = 0; i < mNumberOfCameras; i++) {
for (int i = 0; i < MAX_CAMERAS; i++) {
setCameraFree(i);
}
}
// Read the system property to determine if we have to use the
// AUDIO_STREAM_ENFORCED_AUDIBLE type.
char value[PROPERTY_VALUE_MAX];
property_get("ro.camera.sound.forced", value, "0");
if (strcmp(value, "0") != 0) {
mAudioStreamType = AUDIO_STREAM_ENFORCED_AUDIBLE;
} else {
mAudioStreamType = AUDIO_STREAM_MUSIC;
}
//set AudioStreamType as "AUDIO_STREAM_ENFORCED_AUDIBLE"
mAudioStreamType = AUDIO_STREAM_ENFORCED_AUDIBLE;
}
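The mModule loaded here is the camera HAL module (camera.&lt;platform&gt;.so). The standalone sketch below shows roughly how such a module is used to open a camera device; this is essentially what CameraHardwareInterface::initialize() does internally (the snippet is illustrative, not framework code):

#include <hardware/hardware.h>
#include <hardware/camera.h>

// Illustrative sketch of the camera HAL v1 usage pattern.
void openFirstCamera()
{
    camera_module_t *module = NULL;
    if (hw_get_module(CAMERA_HARDWARE_MODULE_ID,
                      (const hw_module_t **)&module) != 0) {
        return;                                  // HAL module not found
    }
    int n = module->get_number_of_cameras();     // same call as in onFirstRef()
    if (n <= 0) return;

    hw_device_t *device = NULL;                  // open camera "0" via the common entry point
    if (module->common.methods->open(&module->common, "0", &device) != 0) {
        return;
    }
    camera_device_t *cam = (camera_device_t *)device;
    // cam->ops now exposes set_parameters(), start_preview(), take_picture(), ...
    cam->common.close(&cam->common);             // release the device when done
}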
frameworks/base/services/camera/libcameraservice/CameraService.cpp
CameraService::Client::Client(const sp<CameraService>& cameraService,
const sp<ICameraClient>& cameraClient,
const sp<CameraHardwareInterface>& hardware,
int cameraId, int cameraFacing, int clientPid) {
int callingPid = getCallingPid();
LOG1("Client::Client E (pid %d)", callingPid);
mCameraService = cameraService;
mCameraClient = cameraClient;
mHardware = hardware;
mCameraId = cameraId;
mCameraFacing = cameraFacing;
mClientPid = clientPid;
mMsgEnabled = 0;
mSurface = 0;
mPreviewWindow = 0;
mHardware->setCallbacks(notifyCallback,
dataCallback,
dataCallbackTimestamp,
(void *)cameraId);
// Enable zoom, error, focus, and metadata messages by default
enableMsgType(CAMERA_MSG_ERROR | CAMERA_MSG_ZOOM | CAMERA_MSG_FOCUS |
CAMERA_MSG_PREVIEW_METADATA);
// Callback is disabled by default
mPreviewCallbackFlag = CAMERA_FRAME_CALLBACK_FLAG_NOOP;
mOrientation = getOrientation(0, mCameraFacing == CAMERA_FACING_FRONT);
mPlayShutterSound = true;
cameraService->setCameraBusy(cameraId);
cameraService->loadSound();
LOG1("Client::Client X (pid %d)", callingPid);
}
(It is also worth looking at the disconnect process triggered from the destructor; a sketch follows below.)
Through the process above, Camera and CameraService::Client now stand in a service/client relationship.
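For reference, here is a simplified sketch of that disconnect path, condensed from CameraService.cpp of the same era (the real function has more locking and preview-window teardown details):

// Simplified sketch of CameraService::Client::disconnect() -- not a verbatim copy.
void CameraService::Client::disconnect() {
    int callingPid = getCallingPid();
    LOG1("disconnect E (pid %d)", callingPid);
    Mutex::Autolock lock(mLock);

    if (checkPid() != NO_ERROR) return;   // only the owning client may disconnect
    if (mHardware == 0) return;           // already torn down

    // Put the hardware into an idle state before releasing it.
    disableMsgType(CAMERA_MSG_ALL_MSGS);
    mHardware->stopPreview();
    mHardware->cancelPicture();
    mHardware->release();
    mHardware.clear();

    // Let the service forget this client and mark the camera id as free again.
    mCameraService->removeClient(mCameraClient);
    mCameraService->setCameraFree(mCameraId);
    LOG1("disconnect X (pid %d)", callingPid);
}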
The application changes camera settings or the preview by means of ICamera Binder RPC: when the application calls setParameters(), the settings are passed down to the camera device over Binder RPC.
Flow analysis:
(1) The application layer changes camera settings through the setParameters() method of android.hardware.Camera.
(2) JNI routes the call to android_hardware_Camera_setParameters().
(3) android_hardware_Camera_setParameters() calls the setParameters() member function of the native Camera class.
(4) Camera::setParameters() uses BpCamera to issue a Binder RPC request to BnCamera asking for the settings change.
(5) CameraService::Client::setParameters() calls CameraHardwareInterface::setParameters().
(6) CameraHardwareInterface applies the new settings to the camera device.
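To make the flow concrete, here is a minimal sketch of driving the same path from the native Camera class (the camera id and preview size are arbitrary example values):

#include <camera/Camera.h>
#include <camera/CameraParameters.h>

using namespace android;

// Change a setting through the native Camera class.
// This exercises steps (4)-(6): Camera::setParameters() -> BpCamera -> BnCamera
// -> CameraService::Client::setParameters() -> CameraHardwareInterface.
void changePreviewSize()
{
    sp<Camera> camera = Camera::connect(0);             // same as the Java Camera.open(0)
    if (camera == 0) return;

    CameraParameters params(camera->getParameters());   // parse the flattened String8
    params.setPreviewSize(640, 480);                     // arbitrary example value
    camera->setParameters(params.flatten());             // flatten and send over Binder RPC
    camera->disconnect();
}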
The application-layer code is the same as in the demo above, so it is not examined again here.
frameworks/base/core/java/android/hardware/Camera.java
public void setParameters(Parameters params) {
if ("invalid".equals(params.getFocusMode())) {
throw new RuntimeException("Throw exception for invalid parameters,Because FocusMode parameters was invalid.");
}
if ("invalid".equals(params.getFlashMode())) {
throw new RuntimeException("Throw exception for invalid parameters,Because FlashMode parameters was invalid.");
}
native_setParameters(params.flatten());
}
frameworks/base/core/jni/android_hardware_Camera.cpp
................
{ "native_takePicture",
"(I)V",
(void *)android_hardware_Camera_takePicture },
{ "native_setParameters",
"(Ljava/lang/String;)V",
(void *)android_hardware_Camera_setParameters },
{ "native_getParameters",
"()Ljava/lang/String;",
(void *)android_hardware_Camera_getParameters },
{ "reconnect",
....................
frameworks/base/core/jni/android_hardware_Camera.cpp
static void android_hardware_Camera_setParameters(JNIEnv *env, jobject thiz, jstring params)
{
LOGV("setParameters");
sp<Camera> camera = get_native_camera(env, thiz, NULL);
if (camera == 0) return;
const jchar* str = env->GetStringCritical(params, 0);
String8 params8;
if (params) {
params8 = String8(str, env->GetStringLength(params));
env->ReleaseStringCritical(params, str);
}
if (camera->setParameters(params8) != NO_ERROR) {
jniThrowRuntimeException(env, "setParameters failed");
return;
}
}
sp<Camera> get_native_camera(JNIEnv *env, jobject thiz, JNICameraContext** pContext)
{
sp<Camera> camera;
Mutex::Autolock _l(sLock);
JNICameraContext* context = reinterpret_cast<JNICameraContext*>(env->GetIntField(thiz, fields.context));
if (context != NULL) {
camera = context->getCamera();
}
LOGV("get_native_camera: context=%p, camera=%p", context, camera.get());
if (camera == 0) {
jniThrowRuntimeException(env, "Method called after release()");
}
if (pContext != NULL) *pContext = context;
return camera;
}
frameworks/base/libs/camera/Camera.cpp
status_t Camera::setParameters(const String8& params)
{
LOGV("setParameters");
sp<ICamera> c = mCamera;
if (c == 0) return NO_INIT;
return c->setParameters(params);
}
frameworks/base/libs/camera/ICamera.cpp
// set preview/capture parameters - key/value pairs
status_t setParameters(const String8& params)
{
LOGV("setParameters");
Parcel data, reply;
data.writeInterfaceToken(ICamera::getInterfaceDescriptor());
data.writeString8(params);
remote()->transact(SET_PARAMETERS, data, &reply);
return reply.readInt32();
}
frameworks/base/libs/camera/ICamera.cpp
status_t BnCamera::onTransact(
uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
switch(code) {
//some code omitted here
case SET_PARAMETERS: {
LOGV("SET_PARAMETERS");
CHECK_INTERFACE(ICamera, data, reply);
String8 params(data.readString8());
reply->writeInt32(setParameters(params));
return NO_ERROR;
} break;
//some code omitted here
frameworks/base/services/camera/libcameraservice/CameraService.cpp
status_t CameraService::Client::setParameters(const String8& params) {
LOG1("setParameters (pid %d) (%s)", getCallingPid(), params.string());
Mutex::Autolock lock(mLock);
status_t result = checkPidAndHardware();
if (result != NO_ERROR) return result;
CameraParameters p(params);
return mHardware->setParameters(p);
}
mHardware holds the CameraHardwareInterface instance; this statement applies the settings to the device through it.
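Inside CameraHardwareInterface, setParameters() is little more than a thin wrapper that flattens the parameters again and hands them to the camera HAL; roughly (simplified from CameraHardwareInterface.h of the same era):

// Simplified sketch of CameraHardwareInterface::setParameters() -- not a verbatim copy.
// mDevice is the camera_device_t obtained from the HAL module in initialize().
status_t CameraHardwareInterface::setParameters(const CameraParameters &params)
{
    if (mDevice->ops->set_parameters) {
        // The HAL takes the parameters as a flattened "key=value;key=value" C string.
        return mDevice->ops->set_parameters(mDevice, params.flatten().string());
    }
    return INVALID_OPERATION;
}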
When an event occurs on the camera device, the camera service uses the ICameraClient Binder RPC to deliver it to the application. For example, when the application calls takePicture() to capture a still image, the camera device notifies the application asynchronously once the image is ready. During this process a shutter event occurs first, followed by the events for the RAW image and the JPEG image.
When the application calls android.hardware.Camera's takePicture() to request a still image, CameraService::Client::takePicture() is invoked (the call path from the application layer down to the native library is the same as described above for camera settings and control, so it is not repeated here).
frameworks/base/services/camera/libcameraservice/CameraService.cpp
status_t CameraService::Client::takePicture(int msgType) {
LOG1("takePicture (pid %d): 0x%x", getCallingPid(), msgType);
Mutex::Autolock lock(mLock);
status_t result = checkPidAndHardware();
if (result != NO_ERROR) return result;
if ((msgType & CAMERA_MSG_RAW_IMAGE) &&
(msgType & CAMERA_MSG_RAW_IMAGE_NOTIFY)) {
LOGE("CAMERA_MSG_RAW_IMAGE and CAMERA_MSG_RAW_IMAGE_NOTIFY"
" cannot be both enabled");
return BAD_VALUE;
}
// We only accept picture related message types
// and ignore other types of messages for takePicture().
int picMsgType = msgType
& (CAMERA_MSG_SHUTTER |
CAMERA_MSG_POSTVIEW_FRAME |
CAMERA_MSG_RAW_IMAGE |
CAMERA_MSG_RAW_IMAGE_NOTIFY |
CAMERA_MSG_COMPRESSED_IMAGE);
enableMsgType(picMsgType);
return mHardware->takePicture();
}
CameraService::Client::takePicture() first enables the relevant message types. When the camera device then raises events such as CAMERA_MSG_SHUTTER, CAMERA_MSG_POSTVIEW_FRAME, or CAMERA_MSG_RAW_IMAGE, they are caught by the callbacks registered in the Client constructor and dispatched to the corresponding handler functions.
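A simplified sketch of that dispatch, condensed from CameraService.cpp (the real code also handles data callbacks and timestamps):

// Simplified sketch of CameraService::Client::notifyCallback() -- not a verbatim copy.
// The HAL calls this static function registered via mHardware->setCallbacks() in the
// Client constructor; "user" is the cookie (the camera id) passed there.
void CameraService::Client::notifyCallback(int32_t msgType, int32_t ext1,
        int32_t ext2, void* user) {
    sp<Client> client = getClientFromCookie(user);     // recover the Client instance
    if (client == 0) return;
    if (!client->lockIfMessageWanted(msgType)) return; // drop messages that were not enabled

    switch (msgType) {
        case CAMERA_MSG_SHUTTER:
            client->handleShutter();                   // shutter event
            break;
        default:
            client->handleGenericNotify(msgType, ext1, ext2);
            break;
    }
}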
Take CameraService::Client::handleShutter() as an example:
frameworks/base/services/camera/libcameraservice/CameraService.cpp
void CameraService::Client::handleShutter(void) {
if (mPlayShutterSound) {
mCameraService->playSound(SOUND_SHUTTER);
}
sp<ICameraClient> c = mCameraClient;
if (c != 0) {
mLock.unlock();
c->notifyCallback(CAMERA_MSG_SHUTTER, 0, 0);
if (!lockIfMessageWanted(CAMERA_MSG_SHUTTER)) return;
}
disableMsgType(CAMERA_MSG_SHUTTER);
mLock.unlock();
}
mCameraClient holds the BpCameraClient instance; through Binder RPC it invokes Camera::notifyCallback() to handle the event, and disableMsgType() then disables the CAMERA_MSG_SHUTTER message.
frameworks/base/libs/camera/Camera.cpp
void Camera::notifyCallback(int32_t msgType, int32_t ext1, int32_t ext2)
{
sp<CameraListener> listener;
{
Mutex::Autolock _l(mLock);
listener = mListener;
}
if (listener != NULL) {
listener->notify(msgType, ext1, ext2);
}
}
In Camera::notifyCallback(), the CameraListener's notify() function is called to deliver the event received over Binder RPC to the application. The mListener member holds the JNICameraContext instance, which passes data between the application and the Service Framework through JNI.
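To close the loop, here is a simplified sketch of how JNICameraContext::notify() pushes the event back up to the Java layer, condensed from android_hardware_Camera.cpp (the event ultimately lands in the EventHandler created in the Camera(int) constructor, via Camera.postEventFromNative()):

// Simplified sketch of JNICameraContext::notify() -- not a verbatim copy.
void JNICameraContext::notify(int32_t msgType, int32_t ext1, int32_t ext2)
{
    Mutex::Autolock _l(mLock);
    if (mCameraJObjectWeak == NULL) {
        LOGW("callback on dead camera object");
        return;
    }
    JNIEnv *env = AndroidRuntime::getJNIEnv();
    // Calls the static Java method android.hardware.Camera.postEventFromNative(),
    // which posts a message to the EventHandler created in the Camera(int) constructor.
    env->CallStaticVoidMethod(mCameraJClass, fields.post_event,
            mCameraJObjectWeak, msgType, ext1, ext2, NULL);
}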