A Full Walkthrough of the Android Camera Data Flow


I spent quite some time analyzing this data flow. Since I haven't done much Android work, I'm recording my own understanding here; any misunderstandings are open to correction from those who know better, and I'll keep revising this over time.
The whole analysis starts from the app's onCreate: packages/apps/OMAPCamera/src/com/ti/omap4/android/camera/Camera.java
onCreate does a lot of initialization, but the statements we really care about are the following:
    // Don't set mSurfaceHolder here. We have it set ONLY within
    // surfaceChanged / surfaceDestroyed, other parts of the code
    // assume that when it is set, the surface is also set.
    SurfaceView preview = (SurfaceView) findViewById(R.id.camera_preview);
    SurfaceHolder holder = preview.getHolder();
    holder.addCallback(this);
Here we obtain a SurfaceView instance, get the SurfaceHolder from it, and register this class as the holder's callback via addCallback.
SurfaceView is defined at: frameworks/base/core/java/android/view/SurfaceView.java
SurfaceHolder is defined at: frameworks/base/core/java/android/view/SurfaceHolder.java

The explanation below comes from a well-written article worth reading: http://blog.chinaunix.net/uid-9863638-id-1996383.html

SurfaceFlinger is part of Android multimedia. In the Android implementation it is a service providing the system-wide surface composer function: it composites the 2D and 3D surfaces of all applications.
Before getting into SurfaceFlinger proper, let's go over some display basics.
[Figure 1: four surfaces on screen — the home surface plus red, green and blue surfaces]
Each application may own one or more UI layers, and each layer is called a surface (or window). In the figure above there are four surfaces: the home screen plus the red, green and blue surfaces; the two buttons are actually content of the home surface. This illustrates the two problems a display system must solve:
a. Each surface has a position and size on screen, plus the content to display. Content, size and position can all change as applications change; how should such changes be handled?
b. Surfaces can overlap. In the simplified figure above, green covers blue, and red covers green, blue and home, possibly with some transparency. How should this layering relationship be described?
Take the second problem first. Imagine a Z axis perpendicular to the screen plane: every surface gets a coordinate on this axis, which determines which surface sits in front of which. This ordering along the Z axis is known, in graphics terminology, as the Z-order.
For the first problem, we need a structure recording the position and size of the application's UI, plus a buffer holding the content to display. That is exactly the concept of a surface: a container carrying the control information of the application's UI (size, position, and so on) together with a dedicated buffer storing the content to be displayed.
One problem remains: what happens where surfaces overlap, possibly with transparency? This is what SurfaceFlinger solves. It composes (merges) the individual surfaces into one main surface and sends the main surface's content to the FB/V4L2 output, producing the desired result on screen.
In practice this merge can be done in two ways: in software, which is SurfaceFlinger, or in hardware, which is the overlay.
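The back-to-front, Z-ordered composition described above can be sketched as follows. This is a toy model, not SurfaceFlinger's actual code: the Surface struct, the single-pixel "framebuffer", and the blend rule are all illustrative assumptions.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Toy surface: a solid color with a Z coordinate and an alpha value.
// Real surfaces also carry position, size and a content buffer.
struct Surface {
    int z;         // Z-order: larger means closer to the viewer
    float r, g, b; // solid content color
    float alpha;   // 1.0 = fully opaque
};

struct Pixel { float r = 0, g = 0, b = 0; };

// Compose back-to-front: sort by ascending z, then blend each surface over
// the result so far -- the "merge" SurfaceFlinger performs in software.
Pixel compose(std::vector<Surface> surfaces) {
    std::sort(surfaces.begin(), surfaces.end(),
              [](const Surface& a, const Surface& b) { return a.z < b.z; });
    Pixel out;
    for (const Surface& s : surfaces) {
        out.r = s.alpha * s.r + (1 - s.alpha) * out.r;
        out.g = s.alpha * s.g + (1 - s.alpha) * out.g;
        out.b = s.alpha * s.b + (1 - s.alpha) * out.b;
    }
    return out;
}
```

An opaque surface with the highest Z completely hides everything below it; a half-transparent one lets the lower surfaces show through, exactly the red-over-green-over-blue situation in the figure.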

First, subclass SurfaceView and implement the SurfaceHolder.Callback interface.
Why the interface is needed: SurfaceView imposes one rule — all drawing must begin only after the Surface has been created (a Surface can roughly be understood as a mapping of video memory; content written to the Surface can be copied directly into video memory and displayed, which makes display very fast), and must finish before the Surface is destroyed. So surfaceCreated and surfaceDestroyed in the Callback become the boundaries of the drawing code.
Methods to override:
 (1) public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {} // fired when the size of the surface changes
 (2) public void surfaceCreated(SurfaceHolder holder) {} // fired on creation; the drawing thread is usually started here
 (3) public void surfaceDestroyed(SurfaceHolder holder) {} // fired on destruction; the drawing thread is usually stopped and released here
The app already reimplements all of these; the one to focus on is surfaceChanged:
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        // Make sure we have a surface in the holder before proceeding.
        if (holder.getSurface() == null) {
            Log.d(TAG, "holder.getSurface() == null");
            return;
        }

        Log.v(TAG, "surfaceChanged. w=" + w + ". h=" + h);

        // We need to save the holder for later use, even when the mCameraDevice
        // is null. This could happen if onResume() is invoked after this
        // function.
        mSurfaceHolder = holder;

        // The mCameraDevice will be null if it fails to connect to the camera
        // hardware. In this case we will show a dialog and then finish the
        // activity, so it's OK to ignore it.
        if (mCameraDevice == null) return;

        // Sometimes surfaceChanged is called after onPause or before onResume.
        // Ignore it.
        if (mPausing || isFinishing()) return;

        setSurfaceLayout();

        // Set preview display if the surface is being created. Preview was
        // already started. Also restart the preview if display rotation has
        // changed. Sometimes this happens when the device is held in portrait
        // and camera app is opened. Rotation animation takes some time and
        // display rotation in onCreate may not be what we want.
        if (mCameraState == PREVIEW_STOPPED) { // check whether the camera has already started; a first open and a re-entry with the camera open take different paths
            startPreview(true);
            startFaceDetection();
        } else {
            if (Util.getDisplayRotation(this) != mDisplayRotation) {
                setDisplayOrientation();
            }
            if (holder.isCreating()) {
                // Set preview display if the surface is being created and preview
                // was already started. That means preview display was set to null
                // and we need to set it now.
                setPreviewDisplay(holder);
            }
        }

        // If first time initialization is not finished, send a message to do
        // it later. We want to finish surfaceChanged as soon as possible to let
        // user see preview first.
        if (!mFirstTimeInitialized) {
            mHandler.sendEmptyMessage(FIRST_TIME_INIT);
        } else {
            initializeSecondTime();
        }

        SurfaceView preview = (SurfaceView) findViewById(R.id.camera_preview);
        CameraInfo info = CameraHolder.instance().getCameraInfo()[mCameraId];
        boolean mirror = (info.facing == CameraInfo.CAMERA_FACING_FRONT);
        int displayRotation = Util.getDisplayRotation(this);
        int displayOrientation = Util.getDisplayOrientation(displayRotation, mCameraId);

        mTouchManager.initialize(preview.getHeight() / 3, preview.getHeight() / 3,
               preview, this, mirror, displayOrientation);

    }
The branch logic above is the key part. Now let's analyze the startPreview method directly. It is the handler for the first camera open and performs some initialization; when the camera is already open, startPreview is not needed — the other branch above simply resumes the display.
private void startPreview(boolean updateAll) {
        if (mPausing || isFinishing()) return;

        mFocusManager.resetTouchFocus();

        mCameraDevice.setErrorCallback(mErrorCallback);

        // If we're previewing already, stop the preview first (this will blank
        // the screen).
        if (mCameraState != PREVIEW_STOPPED) stopPreview();

        setPreviewDisplay(mSurfaceHolder);
        setDisplayOrientation();

        if (!mSnapshotOnIdle) {
            // If the focus mode is continuous autofocus, call cancelAutoFocus to
            // resume it because it may have been paused by autoFocus call.
            if (Parameters.FOCUS_MODE_CONTINUOUS_PICTURE.equals(mFocusManager.getFocusMode())) {
                mCameraDevice.cancelAutoFocus();
            }
            mFocusManager.setAeAwbLock(false); // Unlock AE and AWB.
        }

        if ( updateAll ) {
            Log.v(TAG, "Updating all parameters!");
            setCameraParameters(UPDATE_PARAM_INITIALIZE | UPDATE_PARAM_ZOOM | UPDATE_PARAM_PREFERENCE);
        } else {
            setCameraParameters(UPDATE_PARAM_MODE);
        }

        //setCameraParameters(UPDATE_PARAM_ALL);

        // Inform the mainthread to go on the UI initialization.
        if (mCameraPreviewThread != null) {
            synchronized (mCameraPreviewThread) {
                mCameraPreviewThread.notify();
            }
        }

        try {
            Log.v(TAG, "startPreview");
            mCameraDevice.startPreview();
        } catch (Throwable ex) {
            closeCamera();
            throw new RuntimeException("startPreview failed", ex);
        }

        mZoomState = ZOOM_STOPPED;
        setCameraState(IDLE);
        mFocusManager.onPreviewStarted();
        if ( mTempBracketingEnabled ) {
            mFocusManager.setTempBracketingState(FocusManager.TempBracketingStates.ACTIVE);
        }

        if (mSnapshotOnIdle) {
            mHandler.post(mDoSnapRunnable);
        }
    }
The idea here: setPreviewDisplay first hands the surface to the camera as its preview window. This call reaches all the way down into the HAL layer, performs some very important initialization, and sets up the data callbacks.

I must emphasize this point. I had long been searching for what decides whether overlay is used or not, and this setPreviewDisplay method is the "culprit".
The parameter passed into setPreviewDisplay is the SurfaceView. As it travels down to the HAL the parameter changes form, but as I understand it this is like a person changing clothes: Zhang San in today's outfit is still the same Zhang San as yesterday in a different one. At the HAL layer this parameter takes the form preview_stream_ops, as you will see below.
In CameraHal's setPreviewWindow method, whether overlay is used or not is decided by checking whether the preview_stream_ops passed down is null. This is very important,
but this article only mentions it in passing. Overlay is not discussed below; by default the whole data flow is analyzed assuming overlay is NOT used, so be careful not to mix the two up.
The overlay data-return path will be covered in its own chapter, along with a detailed look at what ultimately decides between using and not using overlay.
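The decision just described can be sketched like this. It is a simplified model of the null check in CameraHal::setPreviewWindow shown later; the helper name and return strings are hypothetical, used only to make the branch explicit.

```cpp
#include <cassert>
#include <string>

struct preview_stream_ops; // opaque window handle passed down from CameraService

// Hypothetical helper mirroring the branch in CameraHal::setPreviewWindow:
// a null window tears down the display adapter (nothing to render into),
// while a valid window sets up the ANativeWindow-based display path.
std::string choosePreviewPath(const preview_stream_ops* window) {
    if (window == nullptr) {
        return "destroy-display-adapter"; // NULL window: free the DisplayAdapter
    }
    return "anativewindow-display";       // valid window: render via ANativeWindow
}
```

The real method (quoted in full further down) does considerably more bookkeeping, but this is the pivot the text refers to.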

The flow is: app --> frameworks --> (via JNI) --> camera client --> camera service --> (via the hardware interface) --> hal_module --> HAL.
It is well worth looking at the call on the camera service side:
// set the Surface that the preview will use
status_t CameraService::Client::setPreviewDisplay(const sp<Surface>& surface) {
    LOG1("setPreviewDisplay(%p) (pid %d)", surface.get(), getCallingPid());

    sp<IBinder> binder(surface != 0 ? surface->asBinder() : 0);
    sp<ANativeWindow> window(surface);
    return setPreviewWindow(binder, window);
}
I still don't understand this part completely: the surface passed down from the app is converted into an IBinder and an ANativeWindow, and these two variables are then passed as arguments to the overloaded setPreviewWindow:
status_t CameraService::Client::setPreviewWindow(const sp<IBinder>& binder,
        const sp<ANativeWindow>& window) {
    Mutex::Autolock lock(mLock);
    status_t result = checkPidAndHardware();
    if (result != NO_ERROR) return result;

    // return if no change in surface.
    if (binder == mSurface) {
        return NO_ERROR;
    }

    if (window != 0) {
        result = native_window_api_connect(window.get(), NATIVE_WINDOW_API_CAMERA);
        if (result != NO_ERROR) {
            LOGE("native_window_api_connect failed: %s (%d)", strerror(-result),
                    result);
            return result;
        }
    }

    // If preview has been already started, register preview buffers now.
    if (mHardware->previewEnabled()) {
        if (window != 0) {
            native_window_set_scaling_mode(window.get(),
                    NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);
            native_window_set_buffers_transform(window.get(), mOrientation);
            result = mHardware->setPreviewWindow(window);
        }
    }

    if (result == NO_ERROR) {
        // Everything has succeeded. Disconnect the old window and remember the
        // new window.
        disconnectWindow(mPreviewWindow);
        mSurface = binder;
        mPreviewWindow = window;
    } else {
        // Something went wrong after we connected to the new window, so
        // disconnect here.
        disconnectWindow(window);
    }

    return result;
}
The code above then calls into the setPreviewWindow method of CameraHardwareInterface:

status_t setPreviewWindow(const sp<ANativeWindow>& buf)
    {
        LOGV("%s(%s) buf %p", __FUNCTION__, mName.string(), buf.get());

        if (mDevice->ops->set_preview_window) {
            mPreviewWindow = buf;
#ifdef OMAP_ENHANCEMENT_CPCAM
            mHalPreviewWindow.user = mPreviewWindow.get();
#else
            mHalPreviewWindow.user = this;
#endif
            LOGV("%s &mHalPreviewWindow %p mHalPreviewWindow.user %p", __FUNCTION__,
                    &mHalPreviewWindow, mHalPreviewWindow.user);
            return mDevice->ops->set_preview_window(mDevice,
                    buf.get() ? &mHalPreviewWindow.nw : 0);
        }
        return INVALID_OPERATION;
    }
At this point the parameter has gone from the original Surface to ANativeWindow to preview_stream_ops; what gets passed to the bottom layer has changed fundamentally in form. We will meet this variable again during the data callback, so keep it in mind for now.
Saying "fundamentally" is perhaps too strong: dig deeper and this preview_stream_ops is really just the surface in yet another form.
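That "same object in a different guise" pattern can be sketched as a C-style ops table that still points back at the original window object. The struct layout below is illustrative, in the spirit of preview_stream_ops and mHalPreviewWindow.user, not the real definitions.

```cpp
#include <cassert>

// Illustrative stand-in for the underlying ANativeWindow.
struct FakeWindow {
    int buffersDequeued = 0;
};

// A C-style ops table, in the spirit of preview_stream_ops: plain function
// pointers plus a user/context pointer that still refers to the original
// window object (compare mHalPreviewWindow.user in the code above).
struct stream_ops {
    void* user; // points back at the underlying window object
    int (*dequeue_buffer)(stream_ops* ops);
};

int fake_dequeue(stream_ops* ops) {
    // The "new form" merely forwards to the same old window object.
    auto* w = static_cast<FakeWindow*>(ops->user);
    return ++w->buffersDequeued;
}
```

Calling through the ops table mutates the very window object we started with, which is why the transformation is only a change of clothes, not a change of identity.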
This is how the call travels through hardware into the HAL module and then on into the HAL layer:
int camera_set_preview_window(struct camera_device * device,
        struct preview_stream_ops *window)
{
    int rv = -EINVAL;
    ti_camera_device_t* ti_dev = NULL;

    LOGV("%s", __FUNCTION__);

    if(!device)
        return rv;

    ti_dev = (ti_camera_device_t*) device;

    rv = gCameraHals[ti_dev->cameraid]->setPreviewWindow(window);

    return rv;
}
The HAL layer then handles the call:
status_t CameraHal::setPreviewWindow(struct preview_stream_ops *window)
{
    status_t ret = NO_ERROR;
    CameraAdapter::BuffersDescriptor desc;

    LOG_FUNCTION_NAME;
    mSetPreviewWindowCalled = true;

    //If the Camera service passes a null window, we destroy existing window and free the DisplayAdapter
    if(!window)
    {
        if(mDisplayAdapter.get() != NULL)
        {
            ///NULL window passed, destroy the display adapter if present
            CAMHAL_LOGD("NULL window passed, destroying display adapter");
            mDisplayAdapter.clear();
            ///@remarks If there was a window previously existing, we usually expect another valid window to be passed by the client
            ///@remarks so, we will wait until it passes a valid window to begin the preview again
            mSetPreviewWindowCalled = false;
        }
        CAMHAL_LOGD("NULL ANativeWindow passed to setPreviewWindow");
        return NO_ERROR;
    }else if(mDisplayAdapter.get() == NULL)
    {
        // Need to create the display adapter since it has not been created
        // Create display adapter
        mDisplayAdapter = new ANativeWindowDisplayAdapter();
        ret = NO_ERROR;
        if(!mDisplayAdapter.get() || ((ret=mDisplayAdapter->initialize())!=NO_ERROR))
        {
            if(ret!=NO_ERROR)
            {
                mDisplayAdapter.clear();
                CAMHAL_LOGEA("DisplayAdapter initialize failed");
                LOG_FUNCTION_NAME_EXIT;
                return ret;
            }
            else
            {
                CAMHAL_LOGEA("Couldn't create DisplayAdapter");
                LOG_FUNCTION_NAME_EXIT;
                return NO_MEMORY;
            }
        }

        // DisplayAdapter needs to know where to get the CameraFrames from inorder to display
        // Since CameraAdapter is the one that provides the frames, set it as the frame provider for DisplayAdapter
        mDisplayAdapter->setFrameProvider(mCameraAdapter);

        // Any dynamic errors that happen during the camera use case has to be propagated back to the application
        // via CAMERA_MSG_ERROR. AppCallbackNotifier is the class that notifies such errors to the application
        // Set it as the error handler for the DisplayAdapter
        mDisplayAdapter->setErrorHandler(mAppCallbackNotifier.get());

        // Update the display adapter with the new window that is passed from CameraService
        ret = mDisplayAdapter->setPreviewWindow(window);
        if(ret!=NO_ERROR)
            {
            CAMHAL_LOGEB("DisplayAdapter setPreviewWindow returned error %d", ret);
            }

        if(mPreviewStartInProgress)
        {
            CAMHAL_LOGDA("setPreviewWindow called when preview running");
            // Start the preview since the window is now available
            ret = startPreview();
        }
    } else {
        // Update the display adapter with the new window that is passed from CameraService
        ret = mDisplayAdapter->setPreviewWindow(window);
        if ( (NO_ERROR == ret) && previewEnabled() ) {
            restartPreview();
        } else if (ret == ALREADY_EXISTS) {
            // ALREADY_EXISTS should be treated as a noop in this case
            ret = NO_ERROR;
        }
    }
    LOG_FUNCTION_NAME_EXIT;

    return ret;

}
This wires up the source of the display data, the display target, and the error-callback handler; finally, preview is started:

status_t CameraHal::startPreview() {
    LOG_FUNCTION_NAME;

    // When tunneling is enabled during VTC, startPreview happens in 2 steps:
    // When the application sends the command CAMERA_CMD_PREVIEW_INITIALIZATION,
    // cameraPreviewInitialization() is called, which in turn causes the CameraAdapter
    // to move from loaded to idle state. And when the application calls startPreview,
    // the CameraAdapter moves from idle to executing state.
    //
    // If the application calls startPreview() without sending the command
    // CAMERA_CMD_PREVIEW_INITIALIZATION, then the function cameraPreviewInitialization()
    // AND startPreview() are executed. In other words, if the application calls
    // startPreview() without sending the command CAMERA_CMD_PREVIEW_INITIALIZATION,
    // then the CameraAdapter moves from loaded to idle to executing state in one shot.
    status_t ret = cameraPreviewInitialization(); // this call is critical; analyzed in detail below

    // The flag mPreviewInitializationDone is set to true at the end of the function
    // cameraPreviewInitialization(). Therefore, if everything goes alright, then the
    // flag will be set. Sometimes, the function cameraPreviewInitialization() may
    // return prematurely if all the resources are not available for starting preview.
    // For example, if the preview window is not set, then it would return NO_ERROR.
    // Under such circumstances, one should return from startPreview as well and should
    // not continue execution. That is why, we check the flag and not the return value.
    if (!mPreviewInitializationDone) return ret;

    // Once startPreview is called, there is no need to continue to remember whether
    // the function cameraPreviewInitialization() was called earlier or not. And so
    // the flag mPreviewInitializationDone is reset here. Plus, this preserves the
    // current behavior of startPreview under the circumstances where the application
    // calls startPreview twice or more.
    mPreviewInitializationDone = false;

    //Enable the display adapter if present, actual overlay enable happens when we post the buffer
    // (this is the overlay spot I had been looking for; more on it in a later chapter)
    if(mDisplayAdapter.get() != NULL) {
        CAMHAL_LOGDA("Enabling display");
        int width, height;
        mParameters.getPreviewSize(&width, &height);

#if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS
        ret = mDisplayAdapter->enableDisplay(width, height, &mStartPreview);
#else
        ret = mDisplayAdapter->enableDisplay(width, height, NULL);
#endif

        if ( ret != NO_ERROR ) {
            CAMHAL_LOGEA("Couldn't enable display");

            // FIXME: At this stage mStateSwitchLock is locked and unlock is supposed to be called
            // only from mCameraAdapter->sendCommand(CameraAdapter::CAMERA_START_PREVIEW)
            // below. But this will never happen because of goto error. Thus at next
            // startPreview() call CameraHAL will be deadlocked.
            // Need to revisit mStateSwitch lock, for now just abort the process.
            CAMHAL_ASSERT_X(false,
                "At this stage mCameraAdapter->mStateSwitchLock is still locked, "
                "deadlock is guaranteed");

            goto error;
        }

    }

    CAMHAL_LOGDA("Starting CameraAdapter preview mode");
    //Send START_PREVIEW command to adapter
    ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_START_PREVIEW); // from here we call into BaseCameraAdapter

    if(ret!=NO_ERROR) {
        CAMHAL_LOGEA("Couldn't start preview w/ CameraAdapter");
        goto error;
    }
    CAMHAL_LOGDA("Started preview");

    mPreviewEnabled = true;
    mPreviewStartInProgress = false;
    return ret;

    error:

        CAMHAL_LOGEA("Performing cleanup after error");

        //Do all the cleanup
        freePreviewBufs();
        mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);
        if(mDisplayAdapter.get() != NULL) {
            mDisplayAdapter->disableDisplay(false);
        }
        mAppCallbackNotifier->stop();
        mPreviewStartInProgress = false;
        mPreviewEnabled = false;
        LOG_FUNCTION_NAME_EXIT;

        return ret;
}
BaseCameraAdapter implements the parent class's sendCommand method; the relevant case is:

case CameraAdapter::CAMERA_START_PREVIEW:
            {

                CAMHAL_LOGDA("Start Preview");

            if ( ret == NO_ERROR )
                {
                ret = setState(operation);
                }

            if ( ret == NO_ERROR )
                {
                ret = startPreview();
                }

            if ( ret == NO_ERROR )
                {
                ret = commitState();
                }
            else
                {
                ret |= rollbackState();
                }

            break;

            }
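The setState / commitState / rollbackState pattern above is a simple two-phase state transition: tentatively enter the next state, run the operation, then either commit or roll back. A minimal sketch of that idea (simplified, with hypothetical state names; the real adapter tracks many more states):

```cpp
#include <cassert>
#include <string>

// Two-phase state change in the spirit of BaseCameraAdapter::sendCommand:
// provisionally record the next state, run the operation, then either
// commit the transition or roll back to the previous state on failure.
struct Adapter {
    std::string state = "LOADED";
    std::string pending;

    void setState(const std::string& next) { pending = next; }
    void commitState()   { state = pending; pending.clear(); }
    void rollbackState() { pending.clear(); }

    // Models the CAMERA_START_PREVIEW case above; opSucceeds stands in for
    // the result of the real startPreview() call.
    bool sendStartPreview(bool opSucceeds) {
        setState("PREVIEW");
        if (opSucceeds) { commitState(); return true; }
        rollbackState();
        return false;
    }
};
```

The point of the split is that a failed startPreview leaves the adapter's state exactly as it was, so a later retry starts from a consistent state.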
Next we follow startPreview. As analyzed in an earlier article, the startPreview called here is not the one in BaseCameraAdapter but the one in V4LCameraAdapter:

status_t V4LCameraAdapter::startPreview()
{
    status_t ret = NO_ERROR;

    LOG_FUNCTION_NAME;
    Mutex::Autolock lock(mPreviewBufsLock);

    if(mPreviewing) {
        ret = BAD_VALUE;
        goto EXIT;
    }

    for (int i = 0; i < mPreviewBufferCountQueueable; i++) {

        mVideoInfo->buf.index = i;
        mVideoInfo->buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        mVideoInfo->buf.memory = V4L2_MEMORY_MMAP;

        ret = v4lIoctl(mCameraHandle, VIDIOC_QBUF, &mVideoInfo->buf); // queue the buffer to the driver
        if (ret < 0) {
            CAMHAL_LOGEA("VIDIOC_QBUF Failed");
            goto EXIT;
        }
        nQueued++;
    }

    ret = v4lStartStreaming();

    // Create and start preview thread for receiving buffers from V4L Camera
    if(!mCapturing) {
        mPreviewThread = new PreviewThread(this); // start the preview thread
        CAMHAL_LOGDA("Created preview thread");
    }

    //Update the flag to indicate we are previewing
    mPreviewing = true;
    mCapturing = false;

EXIT:
    LOG_FUNCTION_NAME_EXIT;
    return ret;
}

status_t V4LCameraAdapter::v4lStartStreaming () {
    status_t ret = NO_ERROR;
    enum v4l2_buf_type bufType;

    if (!mVideoInfo->isStreaming) {
        bufType = V4L2_BUF_TYPE_VIDEO_CAPTURE;

        ret = v4lIoctl (mCameraHandle, VIDIOC_STREAMON, &bufType); // start the preview stream
        if (ret < 0) {
            CAMHAL_LOGEB("StartStreaming: Unable to start capture: %s", strerror(errno));
            return ret;
        }
        mVideoInfo->isStreaming = true;
    }
    return ret;
}
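The QBUF-then-STREAMON sequence above follows the standard V4L2 capture protocol: request buffers, queue each one, then turn streaming on. A sketch of that ordering, with the ioctl injected as a callback so the sequence can be checked without a real device (everything except the V4L2 request names is hypothetical):

```cpp
#include <cassert>
#include <functional>
#include <vector>

// Stand-ins for the V4L2 request codes used in the adapter code above.
enum Request { REQBUFS, QBUF, STREAMON, DQBUF };

// Drive the standard V4L2 streaming start. The ioctl is injected so the
// call sequence can be recorded and verified without hardware.
std::vector<Request> startStreaming(int bufferCount,
                                    const std::function<int(Request)>& ioctlFn) {
    std::vector<Request> issued;
    auto call = [&](Request r) { issued.push_back(r); return ioctlFn(r); };

    if (call(REQBUFS) < 0) return issued;   // VIDIOC_REQBUFS: allocate driver buffers
    for (int i = 0; i < bufferCount; ++i) {
        if (call(QBUF) < 0) return issued;  // VIDIOC_QBUF: as in startPreview()
    }
    call(STREAMON);                         // VIDIOC_STREAMON: as in v4lStartStreaming()
    return issued;
}
```

After STREAMON, the driver starts filling the queued buffers, and the preview thread drains them with VIDIOC_DQBUF, which is the next piece of the walkthrough.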
Now let's look at what the newly started preview thread is doing:

int V4LCameraAdapter::previewThread()
{
    status_t ret = NO_ERROR;
    int width, height;
    CameraFrame frame;
    void *y_uv[2];
    int index = 0;
    int stride = 4096;
    char *fp = NULL;

    mParams.getPreviewSize(&width, &height);

    if (mPreviewing) {

        fp = this->GetFrame(index);
        if(!fp) {
            ret = BAD_VALUE;
            goto EXIT;
        }
        CameraBuffer *buffer = mPreviewBufs.keyAt(index);
        CameraFrame *lframe = (CameraFrame *)mFrameQueue.valueFor(buffer);
        if (!lframe) {
            ret = BAD_VALUE;
            goto EXIT;
        }

        debugShowFPS();

        if ( mFrameSubscribers.size() == 0 ) {
            ret = BAD_VALUE;
            goto EXIT;
        }
        // As I understand it, data conversion and saving start from here
        y_uv[0] = (void*) lframe->mYuv[0];
        //y_uv[1] = (void*) lframe->mYuv[1];
        //y_uv[1] = (void*) (lframe->mYuv[0] + height*stride);
        convertYUV422ToNV12Tiler ( (unsigned char*)fp, (unsigned char*)y_uv[0], width, height);
        CAMHAL_LOGVB("##...index= %d.;camera buffer= 0x%x; y= 0x%x; UV= 0x%x.",index, buffer, y_uv[0], y_uv[1] );

#ifdef SAVE_RAW_FRAMES
        unsigned char* nv12_buff = (unsigned char*) malloc(width*height*3/2);
        //Convert yuv422i to yuv420sp(NV12) & dump the frame to a file
        convertYUV422ToNV12 ( (unsigned char*)fp, nv12_buff, width, height);
        saveFile( nv12_buff, ((width*height)*3/2) );
        free (nv12_buff);
#endif

        frame.mFrameType = CameraFrame::PREVIEW_FRAME_SYNC;
        frame.mBuffer = buffer;
        frame.mLength = width*height*3/2;
        frame.mAlignment = stride;
        frame.mOffset = 0;
        frame.mTimestamp = systemTime(SYSTEM_TIME_MONOTONIC);
        frame.mFrameMask = (unsigned int)CameraFrame::PREVIEW_FRAME_SYNC;

        if (mRecording)
        {
            frame.mFrameMask |= (unsigned int)CameraFrame::VIDEO_FRAME_SYNC;
            mFramesWithEncoder++;
        }

        ret = setInitFrameRefCount(frame.mBuffer, frame.mFrameMask);
        if (ret != NO_ERROR) {
            CAMHAL_LOGDB("Error in setInitFrameRefCount %d", ret);
        } else {
            ret = sendFrameToSubscribers(&frame);
        }
    }
EXIT:

    return ret;
}
A few notes on the code above. As I see it, this is the hub of the whole data-return path: the buffer fetched above holds the video data coming back from the underlying driver.
What I do not fully understand yet is how the driver's video data gets associated with mPreviewBufs and with index, such that buffer = mPreviewBufs.keyAt(index) yields the right CameraBuffer; we will dig into this in a moment.
Moving on: after the video data is obtained, it is converted and, if required, saved to a file for later use.
Finally, the CameraBuffer obtained is used to fill in a CameraFrame. This structure is crucial: as I understand it, the data ultimately flows back out through sendFrameToSubscribers(&frame).
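On the index question: in a typical V4L2 adapter, GetFrame wraps VIDIOC_DQBUF, and the driver reports in v4l2_buffer.index which of the previously mmap'ed buffers was just filled. Because the HAL registered its CameraBuffers and mmap'ed the driver buffers in the same order, that one index links both tables. A sketch of the bookkeeping (the table and field names are illustrative, not V4LCameraAdapter's actual members):

```cpp
#include <cassert>
#include <vector>

// Illustrative stand-ins: the HAL-side buffer and the driver-side mmap'ed area.
struct CameraBuffer { int id; };
struct MappedArea  { void* start; };

// At buffer-registration time (CAMERA_USE_BUFFERS_PREVIEW), both tables are
// filled in the SAME order, so a single index i links driver memory i to
// HAL buffer i. That shared ordering is the whole association.
struct PreviewBufferTable {
    std::vector<CameraBuffer*> halBuffers; // plays the role of mPreviewBufs
    std::vector<MappedArea>    mapped;     // mmap'ed driver buffers

    // DQBUF reports an index; look up the matching HAL buffer with it.
    CameraBuffer* keyAt(int dqbufIndex) const { return halBuffers[dqbufIndex]; }
};
```

So mPreviewBufs.keyAt(index) works not because the driver knows about CameraBuffer, but because both sides agreed on the same buffer ordering when the buffers were registered.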

Let's now trace how the driver's video data becomes associated with mPreviewBufs and index.
That brings us back to a very important method already mentioned above. Let's look at it:
it is the first step of startPreview, cameraPreviewInitialization.
status_t CameraHal::cameraPreviewInitialization()
{

    status_t ret = NO_ERROR;
    CameraAdapter::BuffersDescriptor desc;
    CameraFrame frame;
    unsigned int required_buffer_count;
    unsigned int max_queueble_buffers;

#if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS
        gettimeofday(&mStartPreview, NULL);
#endif

    LOG_FUNCTION_NAME;

    if (mPreviewInitializationDone) {
        return NO_ERROR;
    }

    if ( mPreviewEnabled ){
      CAMHAL_LOGDA("Preview already running");
      LOG_FUNCTION_NAME_EXIT;
      return ALREADY_EXISTS;
    }

    if ( NULL != mCameraAdapter ) {
      ret = mCameraAdapter->setParameters(mParameters); // push the parameters down to the CameraAdapter
    }

    if ((mPreviewStartInProgress == false) && (mDisplayPaused == false)){
      ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_RESOLUTION_PREVIEW,( int ) &frame); // query the preview resolution via this command
      if ( NO_ERROR != ret ){
        CAMHAL_LOGEB("Error: CAMERA_QUERY_RESOLUTION_PREVIEW %d", ret);
        return ret;
      }

      ///Update the current preview width and height
      mPreviewWidth = frame.mWidth;
      mPreviewHeight = frame.mHeight;
    }

    ///If we don't have the preview callback enabled and display adapter,
    if(!mSetPreviewWindowCalled || (mDisplayAdapter.get() == NULL)){
      CAMHAL_LOGD("Preview not started. Preview in progress flag set");
      mPreviewStartInProgress = true;
      ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_SWITCH_TO_EXECUTING);
      if ( NO_ERROR != ret ){
        CAMHAL_LOGEB("Error: CAMERA_SWITCH_TO_EXECUTING %d", ret);
        return ret;
      }
      return NO_ERROR;
    }

    if( (mDisplayAdapter.get() != NULL) && ( !mPreviewEnabled ) && ( mDisplayPaused ) )
        {
        CAMHAL_LOGDA("Preview is in paused state");

        mDisplayPaused = false;
        mPreviewEnabled = true;
        if ( NO_ERROR == ret )
            {
            ret = mDisplayAdapter->pauseDisplay(mDisplayPaused);

            if ( NO_ERROR != ret )
                {
                CAMHAL_LOGEB("Display adapter resume failed %x", ret);
                }
            }
        //restart preview callbacks
        if(mMsgEnabled & CAMERA_MSG_PREVIEW_FRAME)
        {
            mAppCallbackNotifier->enableMsgType (CAMERA_MSG_PREVIEW_FRAME);
        }

        signalEndImageCapture();
        return ret;
        }

    required_buffer_count = atoi(mCameraProperties->get(CameraProperties::REQUIRED_PREVIEW_BUFS));

    ///Allocate the preview buffers
    ret = allocPreviewBufs(mPreviewWidth, mPreviewHeight, mParameters.getPreviewFormat(), required_buffer_count,max_queueble_buffers);

    if ( NO_ERROR != ret )
        {
        CAMHAL_LOGEA("Couldn't allocate buffers for Preview");
        goto error;
        }

    if ( mMeasurementEnabled )
        {

        ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_BUFFER_SIZE_PREVIEW_DATA,
                                          ( int ) &frame,
                                          required_buffer_count);
        if ( NO_ERROR != ret )
            {
            return ret;
            }

         ///Allocate the preview data buffers
        ret = allocPreviewDataBufs(frame.mLength, required_buffer_count);
        if ( NO_ERROR != ret ) {
            CAMHAL_LOGEA("Couldn't allocate preview data buffers");
            goto error;
           }

        if ( NO_ERROR == ret )
            {
            desc.mBuffers = mPreviewDataBuffers;
            desc.mOffsets = mPreviewDataOffsets;
            desc.mFd = mPreviewDataFd;
            desc.mLength = mPreviewDataLength;
            desc.mCount = ( size_t ) required_buffer_count;
            desc.mMaxQueueable = (size_t) required_buffer_count;

            mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW_DATA,
                                        ( int ) &desc);
            }

        }

    ///Pass the buffers to Camera Adapter
    desc.mBuffers = mPreviewBuffers;
    desc.mOffsets = mPreviewOffsets;
    desc.mFd = mPreviewFd;
    desc.mLength = mPreviewLength;
    desc.mCount = ( size_t ) required_buffer_count;
    desc.mMaxQueueable = (size_t) max_queueble_buffers;

    ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW,( int ) &desc);

    if ( NO_ERROR != ret )
        {
        CAMHAL_LOGEB("Failed to register preview buffers: 0x%x", ret);
        freePreviewBufs();
        return ret;
        }

    mAppCallbackNotifier->startPreviewCallbacks(mParameters, mPreviewBuffers, mPreviewOffsets, mPreviewFd, mPreviewLength, required_buffer_count);
    ///Start the callback notifier
    ret = mAppCallbackNotifier->start();

    if( ALREADY_EXISTS == ret )
        {
        //Already running, do nothing
        CAMHAL_LOGDA("AppCallbackNotifier already running");
        ret = NO_ERROR;
        }
    else if ( NO_ERROR == ret ) {
        CAMHAL_LOGDA("Started AppCallbackNotifier..");
        mAppCallbackNotifier->setMeasurements(mMeasurementEnabled);
        }
    else
        {
        CAMHAL_LOGDA("Couldn't start AppCallbackNotifier");
        goto error;
        }

    if (ret == NO_ERROR) mPreviewInitializationDone = true;
    return ret;

    error:

        CAMHAL_LOGEA("Performing cleanup after error");

        //Do all the cleanup
        freePreviewBufs();
        mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);
        if(mDisplayAdapter.get() != NULL)
            {
            mDisplayAdapter->disableDisplay(false);
            }
        mAppCallbackNotifier->stop();
        mPreviewStartInProgress = false;
        mPreviewEnabled = false;
        LOG_FUNCTION_NAME_EXIT;

        return ret;
}
startPreview() first allocates memory for the preview buffers, then hands them over to the camera adapter via mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW, (int)&desc).
Inside sendCommand() this is handled as follows:

  1.             case CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW:
  2.                 CAMHAL_LOGDA("Use buffers for preview");
  3.                 desc = ( BuffersDescriptor * ) value1;

  4.                 if ( NULL == desc )
  5.                     {
  6.                     CAMHAL_LOGEA("Invalid preview buffers!");
  7.                     return -EINVAL;
  8.                     }

  9.                 if ( ret == NO_ERROR )
  10.                     {
  11.                     ret = setState(operation);
  12.                     }

  13.                 if ( ret == NO_ERROR )
  14.                     {
  15.                     Mutex::Autolock lock(mPreviewBufferLock);
  16.                     mPreviewBuffers = desc->mBuffers;
  17.                     mPreviewBuffersLength = desc->mLength;
  18.                     mPreviewBuffersAvailable.clear();
  19.                     mSnapshotBuffersAvailable.clear();
  20.                     for ( uint32_t i = 0 ; i < desc->mMaxQueueable ; i++ )
  21.                         {
  22.                         mPreviewBuffersAvailable.add(&mPreviewBuffers[i], 0); // here mPreviewBuffersAvailable is associated with mPreviewBuffers
  23.                         }
  24.                     // initial ref count for undeqeueued buffers is 1 since buffer provider
  25.                     // is still holding on to it
  26.                     for ( uint32_t i = desc->mMaxQueueable ; i < desc->mCount ; i++ )
  27.                         {
  28.                         mPreviewBuffersAvailable.add(&mPreviewBuffers[i], 1);
  29.                         }
  30.                     }

  31.                 if ( NULL != desc )
  32.                     {
  33.                     ret = useBuffers(CameraAdapter::CAMERA_PREVIEW,
  34.                                      desc->mBuffers,
  35.                                      desc->mCount,
  36.                                      desc->mLength,
  37.                                      desc->mMaxQueueable);
  38.                     }

  39.                 if ( ret == NO_ERROR )
  40.                     {
  41.                     ret = commitState();
  42.                     }
  43.                 else
  44.                     {
  45.                     ret |= rollbackState();
  46.                     }

  47.                 break;
This invokes the useBuffers() method of V4LCameraAdapter, which in turn calls UseBuffersPreview():

  1. status_t V4LCameraAdapter::UseBuffersPreview(CameraBuffer *bufArr, int num)
  2. {
  3.     int ret = NO_ERROR;
  4.     LOG_FUNCTION_NAME;

  5.     if(NULL == bufArr) {
  6.         ret = BAD_VALUE;
  7.         goto EXIT;
  8.     }

  9.     ret = v4lInitMmap(num);
  10.     if (ret == NO_ERROR) {
  11.         for (int i = 0; i < num; i++) {
  12.             //Associate each Camera internal buffer with the one from Overlay
  13.             mPreviewBufs.add(&bufArr[i], i); // here mPreviewBufs is associated with desc->mBuffers
  14.             CAMHAL_LOGDB("Preview- buff [%d] = 0x%x ",i, mPreviewBufs.keyAt(i));
  15.         }

  16.         // Update the preview buffer count
  17.         mPreviewBufferCount = num;
  18.     }
  19. EXIT:
  20.     LOG_FUNCTION_NAME_EXIT;
  21.     return ret;
  22. }
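The mPreviewBufs.add(&bufArr[i], i) call above keys each camera buffer to its V4L2 buffer index, so that when a dequeue later yields an index the adapter can recover the matching buffer, and vice versa. A self-contained sketch of that association (hypothetical names, simplified in place of KeyedVector):

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Illustrative model of the mPreviewBufs mapping in UseBuffersPreview():
// each buffer pointer is paired with the V4L2 index it was registered under.
struct BufferIndexMap {
    std::vector<std::pair<void*, int>> entries;  // (buffer, v4l2 index)

    void add(void* buf, int index) { entries.push_back({buf, index}); }

    // e.g. after VIDIOC_DQBUF returns an index, find the buffer to deliver
    void* bufferForIndex(int index) const {
        for (const auto& e : entries)
            if (e.second == index) return e.first;
        return nullptr;
    }
    // e.g. when the app returns a buffer, find the index to re-queue
    int indexForBuffer(void* buf) const {
        for (const auto& e : entries)
            if (e.first == buf) return e.second;
        return -1;
    }
};
```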
At this point it is worth digging into how the mAppCallbackNotifier member gets initialized, since it determines how many of the callbacks are set up.
So where does that initialization happen? In CameraHal::initialize():

  1. /**
  2.    @brief Initialize the Camera HAL

  3.    Creates CameraAdapter, AppCallbackNotifier, DisplayAdapter and MemoryManager

  4.    @param None
  5.    @return NO_ERROR - On success
  6.          NO_MEMORY - On failure to allocate memory for any of the objects
  7.    @remarks Camera Hal internal function

  8.  */

  9. status_t CameraHal::initialize(CameraProperties::Properties* properties)
  10. {
  11.     LOG_FUNCTION_NAME;

  12.     int sensor_index = 0;
  13.     const char* sensor_name = NULL;

  14.     ///Initialize the event mask used for registering an event provider for AppCallbackNotifier
  15.     ///Currently, registering all events as to be coming from CameraAdapter
  16.     int32_t eventMask = CameraHalEvent::ALL_EVENTS;

  17.     // Get my camera properties
  18.     mCameraProperties = properties;

  19.     if(!mCameraProperties)
  20.     {
  21.         goto fail_loop;
  22.     }

  23.     // Dump the properties of this Camera
  24.     // will only print if DEBUG macro is defined
  25.     mCameraProperties->dump();

  26.     if (strcmp(CameraProperties::DEFAULT_VALUE, mCameraProperties->get(CameraProperties::CAMERA_SENSOR_INDEX)) != 0 )
  27.         {
  28.         sensor_index = atoi(mCameraProperties->get(CameraProperties::CAMERA_SENSOR_INDEX));
  29.         }

  30.     if (strcmp(CameraProperties::DEFAULT_VALUE, mCameraProperties->get(CameraProperties::CAMERA_NAME)) != 0 ) {
  31.         sensor_name = mCameraProperties->get(CameraProperties::CAMERA_NAME);
  32.     }
  33.     CAMHAL_LOGDB("Sensor index= %d; Sensor name= %s", sensor_index, sensor_name);

  34.     if (strcmp(sensor_name, V4L_CAMERA_NAME_USB) == 0) {
  35. #ifdef V4L_CAMERA_ADAPTER
  36.         mCameraAdapter = V4LCameraAdapter_Factory(sensor_index);
  37. #endif
  38.     }
  39.     else {
  40. #ifdef OMX_CAMERA_ADAPTER
  41.         mCameraAdapter = OMXCameraAdapter_Factory(sensor_index);
  42. #endif
  43.     }

  44.     if ( ( NULL == mCameraAdapter ) || (mCameraAdapter->initialize(properties)!=NO_ERROR))
  45.         {
  46.         CAMHAL_LOGEA("Unable to create or initialize CameraAdapter");
  47.         mCameraAdapter = NULL;
  48.         goto fail_loop;
  49.         }

  50.     mCameraAdapter->incStrong(mCameraAdapter);
  51.     mCameraAdapter->registerImageReleaseCallback(releaseImageBuffers, (void *) this);
  52.     mCameraAdapter->registerEndCaptureCallback(endImageCapture, (void *)this);

  53.     if(!mAppCallbackNotifier.get())
  54.         {
  55.         /// Create the callback notifier
  56.         mAppCallbackNotifier = new AppCallbackNotifier();
  57.         if( ( NULL == mAppCallbackNotifier.get() ) || ( mAppCallbackNotifier->initialize() != NO_ERROR))
  58.             {
  59.             CAMHAL_LOGEA("Unable to create or initialize AppCallbackNotifier");
  60.             goto fail_loop;
  61.             }
  62.         }

  63.     if(!mMemoryManager.get())
  64.         {
  65.         /// Create Memory Manager
  66.         mMemoryManager = new MemoryManager();
  67.         if( ( NULL == mMemoryManager.get() ) || ( mMemoryManager->initialize() != NO_ERROR))
  68.             {
  69.             CAMHAL_LOGEA("Unable to create or initialize MemoryManager");
  70.             goto fail_loop;
  71.             }
  72.         }

  73.     ///Setup the class dependencies...

  74.     ///AppCallbackNotifier has to know where to get the Camera frames and the events like auto focus lock etc from.
  75.     ///CameraAdapter is the one which provides those events
  76.     ///Set it as the frame and event providers for AppCallbackNotifier
  77.     ///@remarks setEventProvider API takes in a bit mask of events for registering a provider for the different events
  78.     /// That way, if events can come from DisplayAdapter in future, we will be able to add it as provider
  79.     /// for any event
  80.     mAppCallbackNotifier->setEventProvider(eventMask, mCameraAdapter);
  81.     mAppCallbackNotifier->setFrameProvider(mCameraAdapter);

  82.     ///Any dynamic errors that happen during the camera use case has to be propagated back to the application
  83.     ///via CAMERA_MSG_ERROR. AppCallbackNotifier is the class that notifies such errors to the application
  84.     ///Set it as the error handler for CameraAdapter
  85.     mCameraAdapter->setErrorHandler(mAppCallbackNotifier.get());

  86.     ///Start the callback notifier
  87.     if(mAppCallbackNotifier->start() != NO_ERROR)
  88.       {
  89.         CAMHAL_LOGEA("Couldn't start AppCallbackNotifier");
  90.         goto fail_loop;
  91.       }

  92.     CAMHAL_LOGDA("Started AppCallbackNotifier..");
  93.     mAppCallbackNotifier->setMeasurements(mMeasurementEnabled);

  94.     ///Initialize default parameters
  95.     initDefaultParameters();


  96.     if ( setParameters(mParameters) != NO_ERROR )
  97.         {
  98.         CAMHAL_LOGEA("Failed to set default parameters?!");
  99.         }

  100.     // register for sensor events
  101.     mSensorListener = new SensorListener();
  102.     if (mSensorListener.get()) {
  103.         if (mSensorListener->initialize() == NO_ERROR) {
  104.             mSensorListener->setCallbacks(orientation_cb, this);
  105.             mSensorListener->enableSensor(SensorListener::SENSOR_ORIENTATION);
  106.         } else {
  107.             CAMHAL_LOGEA("Error initializing SensorListener. not fatal, continuing");
  108.             mSensorListener.clear();
  109.             mSensorListener = NULL;
  110.         }
  111.     }

  112.     LOG_FUNCTION_NAME_EXIT;

  113.     return NO_ERROR;

  114.     fail_loop:

  115.         ///Free up the resources because we failed somewhere up
  116.         deinitialize();
  117.         LOG_FUNCTION_NAME_EXIT;

  118.         return NO_MEMORY;

  119. }
Several objects are instantiated here; the one we really care about is mAppCallbackNotifier: it is created, initialize()d, and then wired up with setEventProvider and setFrameProvider.
Let's look at what the setFrameProvider method actually does:

  1. void AppCallbackNotifier::setFrameProvider(FrameNotifier *frameNotifier)
  2. {
  3.     LOG_FUNCTION_NAME;
  4.     ///@remarks There is no NULL check here. We will check
  5.     ///for NULL when we get the start command from CameraAdapter
  6.     mFrameProvider = new FrameProvider(frameNotifier, this, frameCallbackRelay);
  7.     if ( NULL == mFrameProvider )
  8.         {
  9.         CAMHAL_LOGEA("Error in creating FrameProvider");
  10.         }
  11.     else
  12.         {
  13.         //Register only for captured images and RAW for now
  14.         //TODO: Register for and handle all types of frames
  15.         mFrameProvider->enableFrameNotification(CameraFrame::IMAGE_FRAME);
  16.         mFrameProvider->enableFrameNotification(CameraFrame::RAW_FRAME);
  17.         }

  18.     LOG_FUNCTION_NAME_EXIT;
  19. }
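enableFrameNotification treats frame types as subscription flags: only frame types that a consumer has registered for are forwarded to it. A hedged sketch of that idea, using illustrative bit values rather than the real CameraFrame constants:

```cpp
#include <cassert>
#include <cstdint>

// Illustrative flag values only; the real CameraFrame enum differs.
enum FrameType : uint32_t {
    PREVIEW_FRAME_SYNC = 1u << 0,
    IMAGE_FRAME        = 1u << 1,
    RAW_FRAME          = 1u << 2,
};

// Minimal model of FrameProvider's notification mask: frames are only
// forwarded for types whose bit has been enabled by a subscriber.
class FrameProviderSketch {
public:
    void enableFrameNotification(uint32_t type)  { mMask |= type; }
    void disableFrameNotification(uint32_t type) { mMask &= ~type; }
    bool shouldNotify(uint32_t type) const       { return (mMask & type) != 0; }
private:
    uint32_t mMask = 0;
};
```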
It instantiates a FrameProvider object and enables the corresponding notifications. One of the constructor arguments, frameCallbackRelay, is a callback function. Before following it, let's briefly go back to the sendFrameToSubscribers method used by the preview thread;
that method just delegates to the implementation below:
  1. status_t BaseCameraAdapter::__sendFrameToSubscribers(CameraFrame* frame,
  2.                                                      KeyedVector<int, frame_callback> *subscribers,
  3.                                                      CameraFrame::FrameType frameType)
  4. {
  5.     size_t refCount = 0;
  6.     status_t ret = NO_ERROR;
  7.     frame_callback callback = NULL;

  8.     frame->mFrameType = frameType;

  9.     if ( (frameType == CameraFrame::PREVIEW_FRAME_SYNC) ||
  10.          (frameType == CameraFrame::VIDEO_FRAME_SYNC) ||
  11.          (frameType == CameraFrame::SNAPSHOT_FRAME) ){
  12.         if (mFrameQueue.size() > 0){
  13.           CameraFrame *lframe = (CameraFrame *)mFrameQueue.valueFor(frame->mBuffer);
  14.           frame->mYuv[0] = lframe->mYuv[0];
  15.           frame->mYuv[1] = frame->mYuv[0] + (frame->mLength + frame->mOffset)*2/3;
  16.         }
  17.         else{
  18.           CAMHAL_LOGDA("Empty Frame Queue");
  19.           return -EINVAL;
  20.         }
  21.       }

  22.     if (NULL != subscribers) {
  23.         refCount = getFrameRefCount(frame->mBuffer, frameType);

  24.         if (refCount == 0) {
  25.             CAMHAL_LOGDA("Invalid ref count of 0");
  26.             return -EINVAL;
  27.         }

  28.         if (refCount > subscribers->size()) {
  29.             CAMHAL_LOGEB("Invalid ref count for frame type: 0x%x", frameType);
  30.             return -EINVAL;
  31.         }

  32.         CAMHAL_LOGVB("Type of Frame: 0x%x address: 0x%x refCount start %d",
  33.                      frame->mFrameType,
  34.                      ( uint32_t ) frame->mBuffer,
  35.                      refCount);

  36.         for ( unsigned int i = 0 ; i < refCount; i++ ) {
  37.             frame->mCookie = ( void * ) subscribers->keyAt(i);
  38.             callback = (frame_callback) subscribers->valueAt(i);

  39.             if (!callback) {
  40.                 CAMHAL_LOGEB("callback not set for frame type: 0x%x", frameType);
  41.                 return -EINVAL;
  42.             }

  43.             callback(frame);
  44.         }
  45.     } else {
  46.         CAMHAL_LOGEA("Subscribers is null??");
  47.         return -EINVAL;
  48.     }

  49.     return ret;
  50. }
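The fan-out loop above stamps each subscriber's key into frame->mCookie just before invoking that subscriber's callback, so the static callback can later recover its owning object. A simplified, self-contained model of that dispatch (names are illustrative, std::map stands in for KeyedVector):

```cpp
#include <cassert>
#include <map>
#include <vector>

// Minimal stand-ins for CameraFrame and frame_callback.
struct Frame { void* cookie = nullptr; int id = 0; };
using FrameCallback = void (*)(Frame*);

// Sketch of __sendFrameToSubscribers(): set the cookie, then call each
// subscriber's callback; an empty subscriber map is an error (-1).
inline int sendFrameToSubscribers(Frame* frame,
                                  const std::map<void*, FrameCallback>& subscribers) {
    if (subscribers.empty()) return -1;  // "Subscribers is null??"
    int delivered = 0;
    for (const auto& s : subscribers) {
        frame->cookie = s.first;  // lets the callback find its owner
        s.second(frame);
        delivered++;
    }
    return delivered;
}

// Tiny test hook: records the cookies it was invoked with.
inline std::vector<void*>& deliveredCookies() {
    static std::vector<void*> v;
    return v;
}
inline void recordingCallback(Frame* f) { deliveredCookies().push_back(f->cookie); }
```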
The key part is shown above: through the subscribers KeyedVector, the code looks up the matching frame->mCookie and callback for each subscriber.
The callback retrieved here is the frameCallbackRelay function that was passed in during setFrameProvider; let's look at its implementation:

  1. void AppCallbackNotifier::frameCallbackRelay(CameraFrame* caFrame)
  2. {
  3.     LOG_FUNCTION_NAME;
  4.     AppCallbackNotifier *appcbn = (AppCallbackNotifier*) (caFrame->mCookie);
  5.     appcbn->frameCallback(caFrame);
  6.     LOG_FUNCTION_NAME_EXIT;
  7. }

  8. void AppCallbackNotifier::frameCallback(CameraFrame* caFrame)
  9. {
  10.     ///Post the event to the event queue of AppCallbackNotifier
  11.     TIUTILS::Message msg;
  12.     CameraFrame *frame;

  13.     LOG_FUNCTION_NAME;

  14.     if ( NULL != caFrame )
  15.         {

  16.         frame = new CameraFrame(*caFrame);
  17.         if ( NULL != frame )
  18.             {
  19.               msg.command = AppCallbackNotifier::NOTIFIER_CMD_PROCESS_FRAME;
  20.               msg.arg1 = frame;
  21.               mFrameQ.put(&msg);
  22.             }
  23.         else
  24.             {
  25.             CAMHAL_LOGEA("Not enough resources to allocate CameraFrame");
  26.             }

  27.         }

  28.     LOG_FUNCTION_NAME_EXIT;
  29. }
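The relay above is the standard trampoline for handing a C-style function pointer to a C++ object: the object's address rides along as the frame's mCookie, and the static function casts it back and forwards to the member. A minimal sketch of the pattern (illustrative names):

```cpp
#include <cassert>

struct CaFrame { void* mCookie; int value; };

// Static-to-member trampoline, as in AppCallbackNotifier::frameCallbackRelay:
// mCookie carries the object pointer across the C callback boundary.
class NotifierSketch {
public:
    int lastValue = -1;

    static void frameCallbackRelay(CaFrame* f) {
        static_cast<NotifierSketch*>(f->mCookie)->frameCallback(f);
    }
private:
    void frameCallback(CaFrame* f) { lastValue = f->value; }
};
```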
This callback simply packs the frame into the msg message structure and puts that message onto the mFrameQ message queue; the app-side path later picks the data up from there.
When AppCallbackNotifier is created, its initialize() method is called to do the initial setup:

  1. /**
  2.   * NotificationHandler class
  3.   */

  4. ///Initialization function for AppCallbackNotifier
  5. status_t AppCallbackNotifier::initialize()
  6. {
  7.     LOG_FUNCTION_NAME;

  8.     mPreviewMemory = 0;

  9.     mMeasurementEnabled = false;

  10.     mNotifierState = NOTIFIER_STOPPED;

  11.     ///Create the app notifier thread
  12.     mNotificationThread = new NotificationThread(this);
  13.     if(!mNotificationThread.get())
  14.         {
  15.         CAMHAL_LOGEA("Couldn't create Notification thread");
  16.         return NO_MEMORY;
  17.         }

  18.     ///Start the display thread
  19.     status_t ret = mNotificationThread->run("NotificationThread", PRIORITY_URGENT_DISPLAY);
  20.     if(ret!=NO_ERROR)
  21.         {
  22.         CAMHAL_LOGEA("Couldn't run NotificationThread");
  23.         mNotificationThread.clear();
  24.         return ret;
  25.         }

  26.     mUseMetaDataBufferMode = true;
  27.     mRawAvailable = false;

  28.     mRecording = false;
  29.     mPreviewing = false;

  30.     LOG_FUNCTION_NAME_EXIT;

  31.     return ret;
  32. }
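The create-then-run pattern above boils down to a worker thread blocked on a message queue, with a sentinel command to ask the loop to exit. A hedged, self-contained sketch of that structure using the standard library (all names are illustrative, not the real TIUTILS API):

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

// Simplified model of NotificationThread: post() enqueues a command and
// wakes the loop; EXIT plays the role of "shouldLive = false".
class NotificationThreadSketch {
public:
    enum Command { PROCESS_FRAME, EXIT };

    void start() { mThread = std::thread([this] { loop(); }); }
    void post(Command c) {
        { std::lock_guard<std::mutex> l(mLock); mQueue.push(c); }
        mCond.notify_one();
    }
    void stop() { post(EXIT); mThread.join(); }

    int framesProcessed = 0;  // safe to read after stop() joins

private:
    void loop() {
        for (;;) {
            std::unique_lock<std::mutex> l(mLock);
            mCond.wait(l, [this] { return !mQueue.empty(); });  // waitForMsg()
            Command c = mQueue.front();
            mQueue.pop();
            if (c == EXIT) return;  // thread exits cleanly
            framesProcessed++;      // stands in for notifyFrame()
        }
    }
    std::thread mThread;
    std::mutex mLock;
    std::condition_variable mCond;
    std::queue<Command> mQueue;
};
```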
The most important thing this initialize() method does is start a thread that listens for every message coming from the HAL layer and forwards those messages or data to the app. Here is the thread's implementation:

  1. bool AppCallbackNotifier::notificationThread()
  2. {
  3.     bool shouldLive = true;
  4.     status_t ret;

  5.     LOG_FUNCTION_NAME;

  6.     //CAMHAL_LOGDA("Notification Thread waiting for message");
  7.     ret = TIUTILS::MessageQueue::waitForMsg(&mNotificationThread->msgQ(),
  8.                                             &mEventQ,
  9.                                             &mFrameQ,
  10.                                             AppCallbackNotifier::NOTIFIER_TIMEOUT);

  11.     //CAMHAL_LOGDA("Notification Thread received message");

  12.     if (mNotificationThread->msgQ().hasMsg()) {
  13.         ///Received a message from CameraHal, process it
  14.         CAMHAL_LOGDA("Notification Thread received message from Camera HAL");
  15.         shouldLive = processMessage();
  16.         if(!shouldLive) {
  17.           CAMHAL_LOGDA("Notification Thread exiting.");
  18.           return shouldLive;
  19.         }
  20.     }

  21.     if(mEventQ.hasMsg()) {
  22.         ///Received an event from one of the event providers
  23.         CAMHAL_LOGDA("Notification Thread received an event from event provider (CameraAdapter)");
  24.         notifyEvent();
  25.      }

  26.     if(mFrameQ.hasMsg()) {
  27.        ///Received a frame from one of the frame providers
  28.        //CAMHAL_LOGDA("Notification Thread received a frame from frame provider (CameraAdapter)");
  29.        notifyFrame();
  30.     }

  31.     LOG_FUNCTION_NAME_EXIT;
  32.     return shouldLive;
  33. }
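Note the shape of the loop: one waitForMsg() call covers several queues at once, and after waking the thread checks each queue's hasMsg() in turn. A rough model of that multiplexed wait, simplified from the real TIUTILS::MessageQueue interface:

```cpp
#include <cassert>
#include <chrono>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <utility>

// One condition variable guards both queues; a single wait wakes on either,
// mirroring waitForMsg(&cmdQ, &mEventQ, &mFrameQ, timeout).
struct MultiQueueWait {
    std::mutex lock;
    std::condition_variable cond;
    std::queue<int> eventQ, frameQ;

    void putEvent(int m) {
        { std::lock_guard<std::mutex> l(lock); eventQ.push(m); }
        cond.notify_one();
    }
    void putFrame(int m) {
        { std::lock_guard<std::mutex> l(lock); frameQ.push(m); }
        cond.notify_one();
    }

    // One iteration of the notification loop: wait, then drain each queue
    // that hasMsg(). Returns (eventsHandled, framesHandled).
    std::pair<int, int> waitAndDrain(std::chrono::milliseconds timeout) {
        std::unique_lock<std::mutex> l(lock);
        cond.wait_for(l, timeout,
                      [this] { return !eventQ.empty() || !frameQ.empty(); });
        int ev = 0, fr = 0;
        while (!eventQ.empty()) { eventQ.pop(); ev++; }  // notifyEvent()
        while (!frameQ.empty()) { frameQ.pop(); fr++; }  // notifyFrame()
        return {ev, fr};
    }
};
```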
The thread blocks here until a message arrives. Focusing on the path we care about, preview: when a message is detected in mFrameQ, the notifyFrame method is called:

  1. void AppCallbackNotifier::notifyFrame()
  2. {
  3.     ///Receive and send the frame notifications to app
  4.     TIUTILS::Message msg;
  5.     CameraFrame *frame;
  6.     MemoryHeapBase *heap;
  7.     MemoryBase *buffer = NULL;
  8.     sp<MemoryBase> memBase;
  9.     void *buf = NULL;

  10.     LOG_FUNCTION_NAME;

  11.     {
  12.         Mutex::Autolock lock(mLock);
  13.         if(!mFrameQ.isEmpty()) {
  14.             mFrameQ.get(&msg);
  15.         } else {
  16.             return;
  17.         }
  18.     }

  19.     bool ret = true;

  20.     frame = NULL;
  21.     switch(msg.command)
  22.         {
  23.         case AppCallbackNotifier::NOTIFIER_CMD_PROCESS_FRAME:

  24.                 frame = (CameraFrame *) msg.arg1;
  25.                 if(!frame)
  26.                     {
  27.                     break;
  28.                     }

  29.                 if ( (CameraFrame::RAW_FRAME == frame->mFrameType )&&
  30.                     ( NULL != mCameraHal ) &&
  31.                     ( NULL != mDataCb) &&
  32.                     ( NULL != mNotifyCb ) )
  33.                     {

  34.                     if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE) )
  35.                         {
  36. #ifdef COPY_IMAGE_BUFFER
  37.                         copyAndSendPictureFrame(frame, CAMERA_MSG_RAW_IMAGE);
  38. #else
  39.                         //TODO: Find a way to map a Tiler buffer to a MemoryHeapBase
  40. #endif
  41.                         }
  42.                     else {
  43.                         if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE_NOTIFY) ) {
  44.                             mNotifyCb(CAMERA_MSG_RAW_IMAGE_NOTIFY, 0, 0, mCallbackCookie);
  45.                         }
  46.                         mFrameProvider->returnFrame(frame->mBuffer,
  47.                                                     (CameraFrame::FrameType) frame->mFrameType);
  48.                     }

  49.                     mRawAvailable = true;

  50.                     }
  51.                 else if ( (CameraFrame::IMAGE_FRAME == frame->mFrameType) &&
  52.                           (NULL != mCameraHal) &&
  53.                           (NULL != mDataCb) &&
  54.                           (CameraFrame::ENCODE_RAW_YUV422I_TO_JPEG & frame->mQuirks) )
  55.                     {

  56.                     int encode_quality = 100, tn_quality = 100;
  57.                     int tn_width, tn_height;
  58.                     unsigned int current_snapshot = 0;
  59.                     Encoder_libjpeg::params *main_jpeg = NULL, *tn_jpeg = NULL;
  60.                     void* exif_data = NULL;
  61.                     const char *previewFormat = NULL;
  62.                     camera_memory_t* raw_picture = mRequestMemory(-1, frame->mLength, 1, NULL);

  63.                     if(raw_picture) {
  64.                         buf = raw_picture->data;
  65.                     }

  66.                     CameraParameters parameters;
  67.                     char *params = mCameraHal->getParameters();
  68.                     const String8 strParams(params);
  69.                     parameters.unflatten(strParams);

  70.                     encode_quality = parameters.getInt(CameraParameters::KEY_JPEG_QUALITY);
  71.                     if (encode_quality < 0 || encode_quality > 100) {
  72.                         encode_quality = 100;
  73.                     }

  74.                     tn_quality = parameters.getInt(CameraParameters::KEY_JPEG_THUMBNAIL_QUALITY);
  75.                     if (tn_quality < 0 || tn_quality > 100) {
  76.                         tn_quality = 100;
  77.                     }

  78.                     if (CameraFrame::HAS_EXIF_DATA & frame->mQuirks) {
  79.                         exif_data = frame->mCookie2;
  80.                     }

  81.                     main_jpeg = (Encoder_libjpeg::params*)
  82.                                     malloc(sizeof(Encoder_libjpeg::params));

  83.                     // Video snapshot with LDCNSF on adds a few bytes start offset
  84.                     // and a few bytes on every line. They must be skipped.
  85.                     int rightCrop = frame->mAlignment/2 - frame->mWidth;

  86.                     CAMHAL_LOGDB("Video snapshot right crop = %d", rightCrop);
  87.                     CAMHAL_LOGDB("Video snapshot offset = %d", frame->mOffset);

  88.                     if (main_jpeg) {
  89.                         main_jpeg->src = (uint8_t *)frame->mBuffer->mapped;
  90.                         main_jpeg->src_size = frame->mLength;
  91.                         main_jpeg->dst = (uint8_t*) buf;
  92.                         main_jpeg->dst_size = frame->mLength;
  93.                         main_jpeg->quality = encode_quality;
  94.                         main_jpeg->in_width = frame->mAlignment/2; // use stride here
  95.                         main_jpeg->in_height = frame->mHeight;
  96.                         main_jpeg->out_width = frame->mAlignment/2;
  97.                         main_jpeg->out_height = frame->mHeight;
  98.                         main_jpeg->right_crop = rightCrop;
  99.                         main_jpeg->start_offset = frame->mOffset;
  100.                         if ( CameraFrame::FORMAT_YUV422I_UYVY & frame->mQuirks) {
  101.                             main_jpeg->format = TICameraParameters::PIXEL_FORMAT_YUV422I_UYVY;
  102.                         }
  103.                         else { //if ( CameraFrame::FORMAT_YUV422I_YUYV & frame->mQuirks)
  104.                             main_jpeg->format = CameraParameters::PIXEL_FORMAT_YUV422I;
  105.                         }
  106.                     }

  107.                     tn_width = parameters.getInt(CameraParameters::KEY_JPEG_THUMBNAIL_WIDTH);
  108.                     tn_height = parameters.getInt(CameraParameters::KEY_JPEG_THUMBNAIL_HEIGHT);
  109.                     previewFormat = parameters.getPreviewFormat();

  110.                     if ((tn_width > 0) && (tn_height > 0) && ( NULL != previewFormat )) {
  111.                         tn_jpeg = (Encoder_libjpeg::params*)
  112.                                       malloc(sizeof(Encoder_libjpeg::params));
  113.                         // if malloc fails just keep going and encode main jpeg
  114.                         if (!tn_jpeg) {
  115.                             tn_jpeg = NULL;
  116.                         }
  117.                     }

  118.                     if (tn_jpeg) {
  119.                         int width, height;
  120.                         parameters.getPreviewSize(&width,&height);
  121.                         current_snapshot = (mPreviewBufCount + MAX_BUFFERS - 1) % MAX_BUFFERS;
  122.                         tn_jpeg->src = (uint8_t *)mPreviewBuffers[current_snapshot].mapped;
  123.                         tn_jpeg->src_size = mPreviewMemory->size / MAX_BUFFERS;
  124.                         tn_jpeg->dst_size = calculateBufferSize(tn_width,
  125.                                                                 tn_height,
  126.                                                                 previewFormat);
  127.                         tn_jpeg->dst = (uint8_t*) malloc(tn_jpeg->dst_size);
  128.                         tn_jpeg->quality = tn_quality;
  129.                         tn_jpeg->in_width = width;
  130.                         tn_jpeg->in_height = height;
  131.                         tn_jpeg->out_width = tn_width;
  132.                         tn_jpeg->out_height = tn_height;
  133.                         tn_jpeg->right_crop = 0;
  134.                         tn_jpeg->start_offset = 0;
  135.                         tn_jpeg->format = CameraParameters::PIXEL_FORMAT_YUV420SP;
  136.                     }

  137.                     sp<Encoder_libjpeg> encoder = new Encoder_libjpeg(main_jpeg,
  138.                                                       tn_jpeg,
  139.                                                       AppCallbackNotifierEncoderCallback,
  140.                                                       (CameraFrame::FrameType)frame->mFrameType,
  141.                                                       this,
  142.                                                       raw_picture,
  143.                                                       exif_data, frame->mBuffer);
  144.                     gEncoderQueue.add(frame->mBuffer->mapped, encoder);
  145.                     encoder->run();
  146.                     encoder.clear();
  147.                     if (params != NULL)
  148.                       {
  149.                         mCameraHal->putParameters(params);
  150.                       }
  151.                     }
  152.                 else if ( ( CameraFrame::IMAGE_FRAME == frame->mFrameType ) &&
  153.                              ( NULL != mCameraHal ) &&
  154.                              ( NULL != mDataCb) )
  155.                     {

  156.                     // CTS, MTS requirements: Every 'takePicture()' call
  157.                     // who registers a raw callback should receive one
  158.                     // as well. This is not always the case with
  159.                     // CameraAdapters though.
  160.                     if (!mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE)) {
  161.                         dummyRaw();
  162.                     } else {
  163.                         mRawAvailable = false;
  164.                     }

  165. #ifdef COPY_IMAGE_BUFFER
  166.                     {
  167.                         Mutex::Autolock lock(mBurstLock);
  168. #if defined(OMAP_ENHANCEMENT)
  169.                         if ( mBurst )
  170.                         {
  171.                             copyAndSendPictureFrame(frame, CAMERA_MSG_COMPRESSED_BURST_IMAGE);
  172.                         }
  173.                         else
  174. #endif
  175.                         {
  176.                             copyAndSendPictureFrame(frame, CAMERA_MSG_COMPRESSED_IMAGE);
  177.                         }
  178.                     }
  179. #else
  180.                      //TODO: Find a way to map a Tiler buffer to a MemoryHeapBase
  181. #endif
  182.                     }
  183.                 else if ( ( CameraFrame::VIDEO_FRAME_SYNC == frame->mFrameType ) &&
  184.                              ( NULL != mCameraHal ) &&
  185.                              ( NULL != mDataCb) &&
  186.                              ( mCameraHal->msgTypeEnabled(CAMERA_MSG_VIDEO_FRAME) ) )
  187.                     {
  188.                     AutoMutex locker(mRecordingLock);
  189.                     if(mRecording)
  190.                         {
  191.                         if(mUseMetaDataBufferMode)
  192.                             {
  193.                             camera_memory_t *videoMedatadaBufferMemory =
  194.                                              mVideoMetadataBufferMemoryMap.valueFor(frame->mBuffer->opaque);
  195.                             video_metadata_t *videoMetadataBuffer = (video_metadata_t *) videoMedatadaBufferMemory->data;

  196.                             if( (NULL == videoMedatadaBufferMemory) || (NULL == videoMetadataBuffer) || (NULL == frame->mBuffer) )
  197.                                 {
  198.                                 CAMHAL_LOGEA("Error! One of the video buffers is NULL");
  199.                                 break;
  200.                                 }

  201.                             if ( mUseVideoBuffers )
  202.                               {
  203.                                 CameraBuffer *vBuf = mVideoMap.valueFor(frame->mBuffer->opaque);
  204.                                 GraphicBufferMapper &mapper = GraphicBufferMapper::get();
  205.                                 Rect bounds;
  206.                                 bounds.left = 0;
  207.                                 bounds.top = 0;
  208.                                 bounds.right = mVideoWidth;
  209.                                 bounds.bottom = mVideoHeight;

  210.                                 void *y_uv[2];
  211.                                 mapper.lock((buffer_handle_t)vBuf, CAMHAL_GRALLOC_USAGE, bounds, y_uv);
  212.                                 y_uv[1] = y_uv[0] + mVideoHeight*4096;

  213.                                 structConvImage input = {frame->mWidth,
  214.                                                           frame->mHeight,
  215.                                                           4096,
  216.                                                           IC_FORMAT_YCbCr420_lp,
  217.                                                           (mmByte *)frame->mYuv[0],
  218.                                                           (mmByte *)frame->mYuv[1],
  219.                                                           frame->mOffset};

  220.                                 structConvImage output = {mVideoWidth,
  221.                                                           mVideoHeight,
  222.                                                           4096,
  223.                                                           IC_FORMAT_YCbCr420_lp,
  224.                                                           (mmByte *)y_uv[0],
  225.                                                           (mmByte *)y_uv[1],
  226.                                                           0};

  227.                                 VT_resizeFrame_Video_opt2_lp(&input, &output, NULL, 0);
  228.                                 mapper.unlock((buffer_handle_t)vBuf->opaque);
  229.                                 videoMetadataBuffer->metadataBufferType = (int) kMetadataBufferTypeCameraSource;
  230.                                 /* FIXME remove cast */
  231.                                 videoMetadataBuffer->handle = (void *)vBuf->opaque;
  232.                                 videoMetadataBuffer->offset = 0;
  233.                               }
  234.                             else
  235.                               {
  236.                                 videoMetadataBuffer->metadataBufferType = (int) kMetadataBufferTypeCameraSource;
  237.                                 videoMetadataBuffer->handle = camera_buffer_get_omx_ptr(frame->mBuffer);
  238.                                 videoMetadataBuffer->offset = frame->mOffset;
  239.                               }

  240.                             CAMHAL_LOGVB("mDataCbTimestamp : frame->mBuffer=0x%x, videoMetadataBuffer=0x%x, videoMedatadaBufferMemory=0x%x",
  241.                                             frame->mBuffer->opaque, videoMetadataBuffer, videoMedatadaBufferMemory);

  242.                             mDataCbTimestamp(frame->mTimestamp, CAMERA_MSG_VIDEO_FRAME,
  243.                                                 videoMedatadaBufferMemory, 0, mCallbackCookie);
  244.                             }
  245.                         else
  246.                             {
  247.                             //TODO: Need to revisit this, should ideally be mapping the TILER buffer using mRequestMemory
  248.                             camera_memory_t* fakebuf = mRequestMemory(-1, sizeof(buffer_handle_t), 1, NULL);
  249.                             if( (NULL == fakebuf) || ( NULL == fakebuf->data) || ( NULL == frame->mBuffer))
  250.                                 {
  251.                                 CAMHAL_LOGEA("Error! One of the video buffers is NULL");
  252.                                 break;
  253.                                 }

  254.                             *reinterpret_cast<buffer_handle_t*>(fakebuf->data) = reinterpret_cast<buffer_handle_t>(frame->mBuffer->mapped);
  255.                             mDataCbTimestamp(frame->mTimestamp, CAMERA_MSG_VIDEO_FRAME, fakebuf, 0, mCallbackCookie);
  256.                             fakebuf->release(fakebuf);
  257.                             }
  258.                         }
  259.                     }
  260.                 else if(( CameraFrame::SNAPSHOT_FRAME == frame->mFrameType ) &&
  261.                              ( NULL != mCameraHal ) &&
  262.                              ( NULL != mDataCb) &&
  263.                              ( NULL != mNotifyCb)) {
  264.                     //When enabled, measurement data is sent instead of video data
  265.                     if ( !mMeasurementEnabled ) {
  266.                         copyAndSendPreviewFrame(frame, CAMERA_MSG_POSTVIEW_FRAME);
  267.                     } else {
  268.                         mFrameProvider->returnFrame(frame->mBuffer,
  269.                                                     (CameraFrame::FrameType) frame->mFrameType);
  270.                     }
  271.                 }
  272.                 else if ( ( CameraFrame::PREVIEW_FRAME_SYNC== frame->mFrameType ) &&
  273.                             ( NULL != mCameraHal ) &&
  274.                             ( NULL != mDataCb) &&
  275.                             ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME)) ) {
  276.                     //When enabled, measurement data is sent instead of video data
  277.                     if ( !mMeasurementEnabled ) {
  278.                         copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
  279.                     } else {
  280.                          mFrameProvider->returnFrame(frame->mBuffer,
  281.                                                      (CameraFrame::FrameType) frame->mFrameType);
  282.                     }
  283.                 }
  284.                 else if ( ( CameraFrame::FRAME_DATA_SYNC == frame->mFrameType ) &&
  285.                             ( NULL != mCameraHal ) &&
  286.                             ( NULL != mDataCb) &&
  287.                             ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME)) ) {
  288.                     copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
  289.                 } else {
  290.                     mFrameProvider->returnFrame(frame->mBuffer,
  291.                                                 ( CameraFrame::FrameType ) frame->mFrameType);
  292.                     CAMHAL_LOGDB("Frame type 0x%x is still unsupported!", frame->mFrameType);
  293.                 }

  294.                 break;

  295.         default:

  296.             break;

  297.         };

  298. exit:

  299.     if ( NULL != frame )
  300.         {
  301.         delete frame;
  302.         }

  303.     LOG_FUNCTION_NAME_EXIT;
  304. }
Each frame type is dispatched differently here. Since we are following the preview path, the branch of interest is the one annotated above; let's look at the implementation of copyAndSendPreviewFrame:

  1. void AppCallbackNotifier::copyAndSendPreviewFrame(CameraFrame* frame, int32_t msgType)
  2. {
  3.     camera_memory_t* picture = NULL;
  4.     CameraBuffer * dest = NULL;

  5.     // scope for lock
  6.     {
  7.         Mutex::Autolock lock(mLock);
