virtual camera

        The moment I typed this title I felt a little thrill. After all, this virtual camera took several weeks of work, getting around restriction after restriction to pull off something the stock Android system simply doesn't support. For the next few days I can finally get some proper sleep.

        OK, enough chit-chat; let's look at what a virtual camera is. When people hear "virtual camera", the first scene that comes to mind is probably this: some creep is happily video-chatting with a pretty girl online, drooling over the lovely face in the video. In reality, the person on the other end of the call is not the beauty in the video at all, but a guy picking his feet with a cigarette in his mouth. So how does that guy turn into a drop-dead-gorgeous girl? The trick is a virtual camera. On his phone, a virtual camera feeds a pre-recorded video of the girl into the camera interface. What the camera interface reads is not live sensor data but the frames of that pre-recorded video, so what shows up on the other side is, naturally, the recorded video.

        That is one use case for a virtual camera. There is another: several apps opening the same camera at the same time and previewing the same camera data. Note that this does not mean creating several surfaces inside a single app and showing the same camera data on all of them; distributing camera data to different surfaces within one app is easy and already supported by the stock camera framework (see my earlier blog post for details).

        Letting several apps open the same camera at the same time and show the same data is more troublesome. First, the camera framework does not allow multiple apps to use the camera simultaneously, even if they open cameras with different ids. That restriction is easy to lift: modify the ClientManager constructor in services/camera/libcameraservice/utils/ClientManager.h as follows:

//Stock code: mMaxCost is how many clients may have a camera open at the same time
template<class KEY, class VALUE, class LISTENER>
ClientManager<KEY, VALUE, LISTENER>::ClientManager(int32_t totalCost) : mMaxCost(totalCost) {}


//Work around the multi-process open conflict: *10 allows up to 10 cameras to be open at once
template<class KEY, class VALUE, class LISTENER>
ClientManager<KEY, VALUE, LISTENER>::ClientManager(int32_t totalCost) : mMaxCost(totalCost*10) {}

        The reasoning behind this change: when a camera is opened, connectHelper in CameraService.cpp gets called, and it calls handleEvictionsLocked to check for conflicts. Follow handleEvictionsLocked and you'll see it calls mActiveClientManager.wouldEvict; once that returns the conflicting clients, the code below runs right away and disconnects them.

    for (auto& i : evictedClients) {
        // Disconnect is blocking, and should only have returned when HAL has cleaned up
        i->getValue()->disconnect(); // Clients will remove themselves from the active client list
    }

        wouldEvict is defined in frameworks\av\services\camera\libcameraservice\utils\ClientManager.h and calls wouldEvictLocked. That function checks the priority of the calling app and the number of cameras currently open.

    for (const auto& i : mClients) {
        const KEY& curKey = i->getKey();
        int32_t curCost = i->getCost();
        ClientPriority curPriority = i->getPriority();
        int32_t curOwner = i->getOwnerId();

        bool conflicting = (curKey == key || i->isConflicting(key) ||
                client->isConflicting(curKey));
         // If conflicting is forced to false here, CameraService::handleEvictionsLocked will no longer
         // reject an open just because this camera is already in use
        conflicting = false;          
        ALOGD("wouldEvictLocked totalCost=%" PRId64 ", mMaxCost=%d, conflicting=%d", totalCost, mMaxCost, conflicting);
        if (!returnIncompatibleClients) {
            // Find evicted clients

            if (conflicting && curPriority < priority) {
                // Pre-existing conflicting client with higher priority exists
                evictList.clear();
                evictList.push_back(client);
                return evictList;
            } else if (conflicting || ((totalCost > mMaxCost && curCost > 0) &&
                    (curPriority >= priority) &&
                    !(highestPriorityOwner == owner && owner == curOwner))) {
                // Add a pre-existing client to the eviction list if:
                // - We are adding a client with higher priority that conflicts with this one.
                // - The total cost including the incoming client's is more than the allowable
                //   maximum, and the client has a non-zero cost, lower priority, and a different
                //   owner than the incoming client when the incoming client has the
                //   highest priority.
                evictList.push_back(i);
                totalCost -= curCost;
            }
        } else {
            // Find clients preventing the incoming client from being added

            if (curPriority < priority && (conflicting || (totalCost > mMaxCost && curCost > 0))) {
                // Pre-existing conflicting client with higher priority exists
                evictList.push_back(i);
            }
        }
    }

        Once the number of open cameras exceeds mMaxCost, clients are treated as conflicting, pushed onto the eviction list, and then disconnected back in cameraService.cpp.

        That's all I'll say here about the limit on how many apps can have cameras open; there is plenty of more detailed material about it online.

        That solves the first problem: multiple apps can now open different cameras at the same time. Opening the same camera still fails, though. For the same camera id, if the incoming app's priority is lower than that of the app already holding the camera, the incoming app gets shut down; if its priority is higher, the camera held by the existing app gets closed instead. This is also visible in the code above, and the workaround is simply to force conflicting to false.

        With conflicting forced to false, cameraService no longer checks whether the current camera id is already open; it keeps going and calls makeClient. Eventually, when the real camera gets opened a second time, the HAL reports an "already opened" error.

        To solve this and let multiple apps open the same camera, the HAL must not raise that error either. But following the existing flow and calling open on the camera again is obviously not going to work: it has already been opened once, so how could it be opened a second time? So how do we handle this?

        The answer: a virtual camera.

        We can add a new camera HAL module at the HAL layer. It defines a camera_module_t:

camera_module_t HAL_MODULE_INFO_SYM = {
    .common = {
        .tag                = HARDWARE_MODULE_TAG,
        .module_api_version = CAMERA_MODULE_API_VERSION_2_3,
        .hal_api_version    = HARDWARE_HAL_API_VERSION,
        //.id                 = CAMERA_HARDWARE_MODULE_ID, 
        .id                 = "virtual_camera",
        .name               = "virtual_camera", 
        .author             = "Antmicro Ltd.",
        .methods            = &android::HalModule::moduleMethods,
        .dso                = NULL,
        .reserved           = {0}
    },
    .get_number_of_cameras  = android::HalModule::getNumberOfCameras,
    .get_camera_info        = android::HalModule::getCameraInfo,
    .set_callbacks          = android::HalModule::setCallbacks,
};

        It implements the open entry point:

static struct hw_module_methods_t moduleMethods = {
    .open = openDevice
};
static int openDevice(const hw_module_t *module, const char *name, hw_device_t **device) {
    ALOGI("%s: lihb openDevice, name=%s", __FUNCTION__, name);
    if (module != &HAL_MODULE_INFO_SYM.common) {
        ALOGI("%s: invalid module (%p != %p)", __FUNCTION__, module, &HAL_MODULE_INFO_SYM.common);
        return -EINVAL;
    }
    if (name == NULL) {
        ALOGI("%s: NULL name", __FUNCTION__);
        return -EINVAL;
    }
    errno = 0;
    int cameraId = (int)strtol(name, NULL, 10);
    ALOGI("%s: cameraId: %d, getNumberOfCameras: %d", __FUNCTION__, cameraId, getNumberOfCameras());
    if(errno || cameraId < 0 || cameraId >= getNumberOfCameras()) {
        ALOGI("%s: invalid camera ID (%s)", __FUNCTION__, name);
        return -EINVAL;
    }
    if(!cams[cameraId]->isValid()) {
        ALOGI("%s: camera %d is not initialized", __FUNCTION__, cameraId);
        *device = NULL;
        return -ENODEV;
    }

    return cams[cameraId]->openDevice(device);
}

        It implements getCameraInfo, setCallbacks and the other standard HAL entry points. Its Camera class derives from camera3_device and implements all of the standard camera HAL3 interfaces:

class Camera: public camera3_device {
public:
    Camera();
    virtual ~Camera();

    bool isValid() { return mValid; }

    virtual status_t cameraInfo(struct camera_info *info);

    virtual int openDevice(hw_device_t **device);
    virtual int closeDevice();
    void YV12ToI420(uint8_t *YV12,char *I420, int w,int h);
protected:
    virtual camera_metadata_t * staticCharacteristics();
    virtual int initialize(const camera3_callback_ops_t *callbackOps);
    virtual int configureStreams(camera3_stream_configuration_t *streamList);
    virtual const camera_metadata_t * constructDefaultRequestSettings(int type);
    virtual int registerStreamBuffers(const camera3_stream_buffer_set_t *bufferSet);
    virtual int processCaptureRequest(camera3_capture_request_t *request);

    /* HELPERS/SUBPROCEDURES */

    void notifyShutter(uint32_t frameNumber, uint64_t timestamp);
    void processCaptureResult(uint32_t frameNumber, const camera_metadata_t *result, const Vector<camera3_stream_buffer> &buffers);

    camera_metadata_t *mStaticCharacteristics;
    camera_metadata_t *mDefaultRequestSettings[CAMERA3_TEMPLATE_COUNT];
    CameraMetadata mLastRequestSettings;

    bool mValid;
    const camera3_callback_ops_t *mCallbackOps;

    size_t mJpegBufferSize;

private:
    ImageConverter mConverter;
    Mutex mMutex;
    uint8_t* mFrameBuffer;
    uint8_t* rszbuffer;

    /* STATIC WRAPPERS */

    static int sClose(hw_device_t *device);
    static int sInitialize(const struct camera3_device *device, const camera3_callback_ops_t *callback_ops);
    static int sConfigureStreams(const struct camera3_device *device, camera3_stream_configuration_t *stream_list);
    static int sRegisterStreamBuffers(const struct camera3_device *device, const camera3_stream_buffer_set_t *buffer_set);
    static const camera_metadata_t * sConstructDefaultRequestSettings(const struct camera3_device *device, int type);
    static int sProcessCaptureRequest(const struct camera3_device *device, camera3_capture_request_t *request);
    static void sGetMetadataVendorTagOps(const struct camera3_device *device, vendor_tag_query_ops_t* ops);
    static void sDump(const struct camera3_device *device, int fd);
    static int sFlush(const struct camera3_device *device);

    static camera3_device_ops_t sOps;
};

}; /* namespace android */

        In short, you can treat it just like a real camera, except that its open function doesn't open a physical camera but mmaps a block of shared memory, and its processCaptureRequest doesn't pull live frames from a real sensor but reads them out of that shared memory. We'll simply call it the virtual camera from here on. That's roughly all of its logic: the HAL interfaces are completely standard, only the data it serves comes from shared memory. And who feeds that shared memory? The answer: the real camera that is already open.
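        To make that concrete, here is a minimal sketch of the consumer side, i.e. what the virtual camera's openDevice and processCaptureRequest boil down to. It is only a sketch under assumptions: mapSharedFrameBuffer() and copyLatestSharedFrame() are hypothetical helpers standing in for the shared-memory HIDL service described in the summary further down, and the 1280x720 YV12 frame size is just an example.

static uint8_t *sSharedBase = NULL;   // consumer-side mapping of the shared frame block (hypothetical)

int Camera::openDevice(hw_device_t **device) {
    // "Opening" the virtual camera powers up no sensor; it only maps the shared
    // block that the real camera HAL keeps refilling with preview frames.
    sSharedBase = mapSharedFrameBuffer(1280 * 720 * 3 / 2);   // hypothetical helper, one YV12 frame
    if (sSharedBase == NULL) {
        return -ENODEV;
    }
    *device = &common;   // camera3_device::common, handed back to the framework
    return 0;
}

int Camera::processCaptureRequest(camera3_capture_request_t *request) {
    // Instead of waiting on an ISP, grab the newest frame out of shared memory,
    // convert it (YV12 -> I420 -> ABGR, shown further down) and complete the request.
    uint8_t *yv12 = copyLatestSharedFrame(sSharedBase);   // hypothetical: newest YV12 frame
    notifyShutter(request->frame_number, systemTime());
    // ... convert yv12, fill request->output_buffers, then call processCaptureResult(...) ...
    return 0;
}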

        I'm currently debugging on an MTK 6762 / Android 8.1 platform. In the MTK camera HAL, once a frame has been through 3A and the rest of the processing, it is first handed to handleReturnBuffers in DisplayClient.BufOps.cpp. In that file we can grab the current frame with uint8_t * srcBuf = (uint8_t *)pStreamImgBuf->getVirAddr() and then write it into the shared memory.
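        The producer side is just as small. A sketch of the hook around the getVirAddr() line above (the helper name and the producer-side mapping are assumptions; the real plumbing goes through the same shared-memory service):

static uint8_t *sProducerBase = NULL;   // producer-side mapping of the same shared block (hypothetical)

// Hypothetical hook, called from DisplayClient::handleReturnBuffers() once the frame
// has been through the normal 3A/ISP path.
static void publishFrameToSharedMemory(const uint8_t *srcBuf, size_t frameBytes)
{
    if (sProducerBase == NULL || srcBuf == NULL) return;
    memcpy(sProducerBase, srcBuf, frameBytes);   // hand the raw YV12 frame to the virtual camera
}

// Usage inside handleReturnBuffers():
//     uint8_t *srcBuf = (uint8_t *)pStreamImgBuf->getVirAddr();
//     publishFrameToSharedMemory(srcBuf, previewWidth * previewHeight * 3 / 2);   // YV12 frame size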

        Other platforms such as Qualcomm or Spreadtrum work on the same principle: find the place where the frame data leaves the HAL, write it into shared memory, and read it back in the virtual camera's processCaptureRequest. That way the virtual camera gets exactly the same data as the real camera.

        One thing to watch: the data coming out of the real camera's HAL is YUV; on my platform this sensor's raw output is YV12. Inside the virtual camera's processCaptureRequest you cannot just hand that to the process_capture_result callback, because the preview wants ABGR data. What an Android surface displays is backed by GraphicBuffers, and the system's default GraphicBuffer format is generally RGB. So after pulling a frame out of shared memory, it still has to be converted.

        And YV12 can't be converted straight to RGB here; it has to become I420 first. The conversion function is below:

/*
I420: YYYYYYYY UU VV    => YUV420P
YV12: YYYYYYYY VV UU    => YUV420P
Converting YV12 to I420 only requires swapping the U and V planes.
*/
void Camera::YV12ToI420(uint8_t *YV12, char *I420, int w, int h)
{
    memcpy(I420, YV12, w*h);                   // Y plane
    memcpy(I420+w*h, YV12+w*h+w*h/4, w*h/4);   // U plane of I420 <- U plane of YV12 (stored last)
    memcpy(I420+w*h+w*h/4, YV12+w*h, w*h/4);   // V plane of I420 <- V plane of YV12 (stored right after Y)
}

        Once the data is I420, libyuv's I420ToABGR can be called, and the converted output can be sent straight to the surface for display. As for how the shared memory side is handled on Android 8.0 and later, see my earlier blog post.
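        For reference, the whole conversion step inside processCaptureRequest can be sketched as follows. I420ToABGR is the stock libyuv call; the helper name fillPreviewBuffer, and the use of mFrameBuffer (the scratch pointer declared in the class above) as the I420 staging buffer, are assumptions of this sketch. rgbaOut stands for the locked pixels of the output stream's gralloc buffer.

#include <libyuv.h>   // libyuv::I420ToABGR

// yv12    : frame copied out of the shared memory block
// rgbaOut : mapped pixels of the output stream's gralloc buffer (RGBA_8888)
void Camera::fillPreviewBuffer(uint8_t *yv12, uint8_t *rgbaOut, int w, int h)
{
    // 1. Swap the chroma planes: YV12 -> I420 (mFrameBuffer holds the I420 copy)
    YV12ToI420(yv12, (char *)mFrameBuffer, w, h);

    // 2. I420 -> 32-bit ABGR, which matches the RGBA_8888 GraphicBuffer byte order
    uint8_t *y = mFrameBuffer;
    uint8_t *u = y + w * h;
    uint8_t *v = u + w * h / 4;
    libyuv::I420ToABGR(y, w,             // Y plane and stride
                       u, w / 2,         // U plane and stride
                       v, w / 2,         // V plane and stride
                       rgbaOut, w * 4,   // destination and stride in bytes
                       w, h);
}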

        All in all, adding a virtual camera so that multiple apps can open the same camera comes down to these pieces:

        1.) In cameraService, remove the restriction on multiple apps opening cameras (not limited to the same camera id)

        2.) Add a shared-memory HIDL service module (a minimal allocation sketch follows right after this list)

        3.) Add a virtual camera HAL (for the detailed steps, see my earlier post on adding a camera)

        4.) In the real camera HAL, copy the data read from the driver into the shared memory

        5.) In the virtual camera's processCaptureRequest, read that data back out and convert it to the required RGBA format.
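        About step 2: on Android 8.0 and later the shared block itself can be handed out through the stock "ashmem" HIDL allocator (android.hidl.allocator@1.0); the full service plumbing is in my earlier post on shared memory. A minimal allocation/mapping sketch, using the example frame size from above:

#include <android/hidl/allocator/1.0/IAllocator.h>
#include <android/hidl/memory/1.0/IMemory.h>
#include <hidlmemory/mapping.h>

using ::android::sp;
using ::android::hardware::hidl_memory;
using ::android::hidl::allocator::V1_0::IAllocator;
using ::android::hidl::memory::V1_0::IMemory;

// Allocate one shared frame buffer through the standard "ashmem" HIDL allocator.
// The size (one 1280x720 YV12 frame) is just the example value used in this post.
static sp<IMemory> allocSharedFrame(hidl_memory *outMem)
{
    sp<IAllocator> allocator = IAllocator::getService("ashmem");
    if (allocator == nullptr) return nullptr;

    sp<IMemory> mapped;
    allocator->allocate(1280 * 720 * 3 / 2, [&](bool ok, const hidl_memory &mem) {
        if (!ok) return;
        *outMem = mem;                                 // pass this hidl_memory to the other side over HIDL
        mapped = ::android::hardware::mapMemory(mem);  // local mapping for this process
    });
    return mapped;   // mapped->getPointer() gives the base address to read/write frames
}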

        With all of the above in place, the main work of adding the virtual camera is done. There are still plenty of details to work through one by one.

        For instance, we added a virtual camera, but apps know nothing about it. All we tell app developers is that several apps may now open the same camera id at the same time. How the virtual camera is handled in the HAL and the framework is not their concern, and shouldn't be. All they do is call Camera.open(0) in app1 to open camera id 0, perhaps call Camera.open(0) again in app2 to open the same camera, and maybe do the same in app3, app4, and so on.

        So here is the question: if every app opens the camera with the same id, how does the framework end up reaching our virtual camera? That is handled in connectHelper: inside that function, right before the call to handleEvictionsLocked, we add the following code:

            auto current = mActiveClientManager.get(cameraId);
            if (current != nullptr) 
            {
                char c_camera_ref[PROPERTY_VALUE_MAX] = {'\0'};
                int  i_camera_ref = 0;
                property_get(YOV_VIRTUAL_CAMERA_REF, c_camera_ref, "0");
                i_camera_ref = atoi(c_camera_ref);

                cameraId = "2";
                auto tempClient = mActiveClientManager.get(cameraId);
                if(tempClient != nullptr)
                {
                    cameraId = "3";
                }
            }

        The platform I'm debugging has a front and a back camera, so my virtual cameras get ids starting from 2. (The MTK camera HAL is HAL1 while my virtual camera HAL is HAL3; when the camera counts for HAL1 and HAL3 are put together, neither side knows how many cameras the other has, so I have to hard-code the ids here.) If you want several virtual cameras, their ids simply continue on: 2, 3, 4, and so on.

        The code above means: if the requested camera id is already open, I quietly switch the id to virtual camera 2; if virtual camera 2 is also open, I switch to virtual camera 3. With 10 or 20 virtual cameras, the same logic applies.

        Every time a camera is opened, connectHelper makeClients a client and immediately calls finishConnectLocked(client, partial); inside finishConnectLocked, the line mActiveClientManager.addAndEvict(clientDescriptor); adds that client to the mActiveClientManager list. That list is exactly how we tell whether the camera behind a given id has already been opened.

        At this point we can basically have two apps open one camera at the same time. But if the main app (the first app to open the camera, the one that actually holds the real camera) exits, the secondary apps (the second, third, ... apps that opened an already-open id and got routed to a virtual camera) stop previewing immediately. When the main app exits it closes the real camera down in the HAL, and once the real camera is closed the virtual camera gets no fresh data from the shared memory, so its preview naturally stops.

        To fix this we introduce reference counting. First, where the real camera gets opened, bump the count by 1. On the MTK 6762, camera HAL1's open path is vendor\mediatek\proprietary\hardware\mtkcam\main\hal\device\1.x\device\CameraDevice1Base.cpp; in that file's open function we track the count for the corresponding camera id like this:

Return<Status>
CameraDevice1Base::
open(const sp<ICameraDeviceCallback>& callback)
{
    ........
    // the real device can only ever be opened once, so this count is only 0 or 1
    if(mInstanceId == 0)
    {
        property_set(CAMERA0_REF, "1");
    }
    else if(mInstanceId == 1)
    {
        property_set(CAMERA1_REF, "1");
    }
    ........
}

           And in close, decrement it:

Return<void>
CameraDevice1Base::
close()
{
    ........
    if(mInstanceId == 0)
    {
        property_set(CAMERA0_REF, "0");
    }
    else if(mInstanceId == 1)
    {
        property_set(CAMERA1_REF, "0");
    }
    ........
}

         In the virtual camera's open path, handle the count like this:

int Camera::openDevice(hw_device_t **device) 
{
    ........
    // several virtual cameras can be open at the same time, so this reference count just keeps growing
    char c_camera_ref[PROPERTY_VALUE_MAX] = {'\0'};
    int  i_camera_ref = 0;
    property_get(VIRTUAL_CAMERA_REF, c_camera_ref, "0");
    i_camera_ref = atoi(c_camera_ref);    
    ++i_camera_ref;
    memset(c_camera_ref, '\0', PROPERTY_VALUE_MAX);
    sprintf(c_camera_ref, "%d", i_camera_ref);
    property_set(VIRTUAL_CAMERA_REF, c_camera_ref);
    ........
}

        Note that the virtual camera's decrement must not go into the virtual camera's close function, because we still need the count later on in cameraService.cpp.

        Next, when the app holding the real camera exits and calls camera.stopPreview() and camera.release(), the reference counts above are used to intercept the stop/release that would otherwise reach the camera driver. On MTK 6762 / 8.1 the camera HAL is HAL1, and whether the app uses camera API1 or API2, makeClient always makes a CameraClient, so every app operation on the camera ends up passing through it. That makes it the place to intercept.

        For example, when the user calls stopPreview, we handle it like this in frameworks\av\services\camera\libcameraservice\api1\CameraClient.cpp:

// stop preview mode
void CameraClient::stopPreview() {
    LOG1("stopPreview (pid %d), mCameraId=%d", getCallingPid(), mCameraId);
    Mutex::Autolock lock(mLock);

    if(mCameraId != 2 && mCameraId != 3)
    {
        //add by xuhui
        char c_camera_ref[PROPERTY_VALUE_MAX] = {'\0'};
        int  i_camera_ref = 0;
        property_get(VIRTUAL_CAMERA_REF, c_camera_ref, "0");
        i_camera_ref = atoi(c_camera_ref);

        // A virtual camera is currently open, so don't stop the real preview now. Wait until the virtual camera closes first, then stop here.
        if(i_camera_ref > 0)
        {
            ALOGE("CameraClient::stopPreview, virtual camera exist, don't stopPreview");
            return;
        }
    }

    
    if (checkPidAndHardware() != NO_ERROR) return;


    disableMsgType(CAMERA_MSG_PREVIEW_FRAME);
    mHardware->stopPreview();
    sCameraService->updateProxyDeviceState(
        hardware::ICameraServiceProxy::CAMERA_STATE_IDLE,
        mCameraIdStr, mCameraFacing, mClientPackageName);
    mPreviewBuffer.clear();
}

        Then, when the user calls release, handle it in disconnect like this:

binder::Status CameraClient::disconnect() {
    int callingPid = getCallingPid();
    Mutex::Autolock lock(mLock); 
    binder::Status res = binder::Status::ok(); 
    if(mCameraId != 2 && mCameraId != 3 )
    {
        //add by xuhui
        sp<CameraService::BasicClient> client = NULL;
        char c_camera0_ref[PROPERTY_VALUE_MAX] = {'\0'};
        char c_camera1_ref[PROPERTY_VALUE_MAX] = {'\0'};
        char c_camera_virtual_ref[PROPERTY_VALUE_MAX] = {'\0'};
        int  i_camera_virtual_ref = 0;
        int  i_camera0_ref = 0;
        int  i_camera1_ref = 0;
        property_get(CAMERA0_REF, c_camera0_ref, "0");
        i_camera0_ref = atoi(c_camera0_ref);
        property_get(CAMERA1_REF, c_camera1_ref, "0");
        i_camera1_ref = atoi(c_camera1_ref);
        property_get(VIRTUAL_CAMERA_REF, c_camera_virtual_ref, "0");
        i_camera_virtual_ref = atoi(c_camera_virtual_ref);    
        if(i_camera_virtual_ref > 0)
        {
            ALOGE("CameraClient::disconnect, mCameraId=%d, virtual camera exist, don't disconnect", mCameraId);
            // -1 means the real camera wanted to close, but a virtual camera is still open, so it
            // can't close yet; it will be closed when the virtual camera closes
            if(mCameraId == 0)
            {
                property_set(CAMERA0_REF, "-1");
            }
            else if(mCameraId == 1)
            {
                property_set(CAMERA1_REF, "-1");
            }
            return res;
        }
        
    }        
    // Allow both client and the cameraserver to disconnect at all times
    if (callingPid != mClientPid && callingPid != mServicePid) {
        ALOGE("different client - don't disconnect");
        // if this check isn't disabled, another process can't close a camera that was opened earlier
       // return res;
    }
    // Make sure disconnect() is done once and once only, whether it is called
    // from the user directly, or called by the destructor.
    if (mHardware == 0) return res;

    LOG1("CameraClient::disconnect, hardware teardown");
    // Before destroying mHardware, we must make sure it's in the
    // idle state.
    // Turn off all messages.
    disableMsgType(CAMERA_MSG_ALL_MSGS);
//!++
    disableMsgType(MTK_CAMERA_MSG_ALL_MSGS);
//!--
    mHardware->stopPreview();
    sCameraService->updateProxyDeviceState(
            hardware::ICameraServiceProxy::CAMERA_STATE_IDLE,
            mCameraIdStr, mCameraFacing, mClientPackageName);
    mHardware->cancelPicture();
    // Release the hardware resources.
    mHardware->release();
    // Release the held ANativeWindow resources.
    if (mPreviewWindow != 0) {
        disconnectWindow(mPreviewWindow);
        mPreviewWindow = 0;
        mHardware->setPreviewWindow(mPreviewWindow);
    }
    mHardware.clear();

    CameraService::Client::disconnect();

    LOG1("CameraClient::disconnect end,  (pid %d)", callingPid);

    return res;
}

        So now the real camera's stopPreview and release toward the HAL are intercepted; where does it actually get released, then? Normally, once the main app has exited, nobody has control over the camera object it created, and other apps certainly have no permission to operate on it. So how do we make the real camera really close in the end?

        Remember from above that every client makeClient'ed in cameraService.cpp gets saved into mActiveClientManager. The client made there is a BasicClient, a grandparent class of CameraClient, so whenever a camera is closed, BasicClient::disconnect is guaranteed to run. That function is where we really close the real camera.

        

binder::Status CameraService::BasicClient::disconnect() {
    binder::Status res = Status::ok();
        
    if (mDisconnected) {
        return res;
    }
    mDisconnected = true;

    sCameraService->removeByClient(this);
    sCameraService->logDisconnected(mCameraIdStr, mClientPid,
            String8(mClientPackageName));

    sp<IBinder> remote = getRemote();
    if (remote != nullptr) {
        remote->unlinkToDeath(sCameraService);
    }

    finishCameraOps();
    // Notify flashlight that a camera device is closed.
    sCameraService->mFlashlight->deviceClosed(mCameraIdStr);
    ALOGI("%s: Disconnected client for camera %s for PID %d", __FUNCTION__, mCameraIdStr.string(),
            mClientPid);

    // client shouldn't be able to call into us anymore
    mClientPid = 0;

    
        int id = cameraIdToInt(mCameraIdStr);
    
        if(id == 2 || id == 3)
        {
            //add by xuhui
            sp<BasicClient> client = NULL;
            
            char c_camera0_ref[PROPERTY_VALUE_MAX] = {'\0'};
            char c_camera1_ref[PROPERTY_VALUE_MAX] = {'\0'};
            char c_camera_virtual_ref[PROPERTY_VALUE_MAX] = {'\0'};
            int  i_camera_virtual_ref = 0;
            int  i_camera0_ref = 0;
            int  i_camera1_ref = 0;
            
            property_get(CAMERA0_REF, c_camera0_ref, "0");
            i_camera0_ref = atoi(c_camera0_ref);
            
            property_get(CAMERA1_REF, c_camera1_ref, "0");
            i_camera1_ref = atoi(c_camera1_ref);
    
            property_get(VIRTUAL_CAMERA_REF, c_camera_virtual_ref, "0");
            i_camera_virtual_ref = atoi(c_camera_virtual_ref);  
            --i_camera_virtual_ref;
            if(i_camera_virtual_ref < 0)
            {
                i_camera_virtual_ref = 0;
            }
            sprintf(c_camera_virtual_ref, "%d", i_camera_virtual_ref);
            property_set(VIRTUAL_CAMERA_REF, c_camera_virtual_ref);
    
            // -1 means the app holding the real camera already called close. If the virtual camera
            // count has now dropped to 0, nobody references the virtual camera any more, so the
            // real camera can finally be closed as well.
            if(i_camera_virtual_ref == 0)
            {        
                if(i_camera0_ref == -1)
                {
                    String8 cameraId0("0");
                    client = sCameraService->mActiveClientManager.getCameraClient(cameraId0);
                }
                else if(i_camera1_ref == -1)
                {
                    String8 cameraId1("1");
                    client = sCameraService->mActiveClientManager.getCameraClient(cameraId1);
                }      
                if(client != NULL)
                {
                    client->disconnect();
                }
            }
        }            

    return res;
}

        With the function above, the real camera finally does get closed once the last virtual camera is gone.

        At this point the real camera and the virtual camera can be open at the same time, and when the real camera is "closed" its driver is not actually shut down; that only happens once the virtual camera is closed.

        But this is still not complete. For example, if the main app, on exit, calls something like windowManager.removeView(surfaceView) and removes the surface its camera was using, the HAL loses its ANativeWindow (i.e. the surface) and with it the dequeue_buffer/enqueue_buffer calls, so the camera HAL can no longer deliver frames. On the app side the preview suddenly goes black, or freezes on the last frame. In the log you'll see errors like these:

05-25 18:22:20.292  3542 15151 E BufferQueueProducer: [SurfaceTexture-1-3542-0](this:0xc35b5000,id:1,api:4,p:442,c:-1) queueBuffer: BufferQueue has been abandoned
05-25 18:22:20.293   442   663 E Surface : queueBuffer: error queuing buffer to SurfaceTexture, -19
05-25 18:22:20.293   461 16924 E MtkCam/DisplayClient: (16924)[enquePrvOps] mpStreamOps->enqueue_buffer failed: status[Function not implemented(38)], rpImgBuf(0xdd60c510,0xd5bbd000) (enquePrvOps){#589:vendor/mediatek/proprietary/hardware/mtkcam/middleware/v1/client/DisplayClient/DisplayClient.Stream.cpp}

        By the way, a word on how the app-side surface relates to the MTK camera HAL. The app's surface corresponds to mpStreamOps, declared in DisplayClient.h:

preview_stream_ops*             mpStreamOps;

        The preview_stream_ops structure is defined as follows:

typedef struct preview_stream_ops {
    int (*dequeue_buffer)(struct preview_stream_ops* w,
                          buffer_handle_t** buffer, int *stride);
    int (*enqueue_buffer)(struct preview_stream_ops* w,
                buffer_handle_t* buffer);
    int (*cancel_buffer)(struct preview_stream_ops* w,
                buffer_handle_t* buffer);
    int (*set_buffer_count)(struct preview_stream_ops* w, int count);
    int (*set_buffers_geometry)(struct preview_stream_ops* pw,
                int w, int h, int format);
    int (*set_crop)(struct preview_stream_ops *w,
                int left, int top, int right, int bottom);
    int (*set_usage)(struct preview_stream_ops* w, int usage);
    int (*set_swap_interval)(struct preview_stream_ops *w, int interval);
    int (*get_min_undequeued_buffer_count)(const struct preview_stream_ops *w,
                int *count);
    int (*lock_buffer)(struct preview_stream_ops* w,
                buffer_handle_t* buffer);
    // Timestamps are measured in nanoseconds, and must be comparable
    // and monotonically increasing between two frames in the same
    // preview stream. They do not need to be comparable between
    // consecutive or parallel preview streams, cameras, or app runs.
    int (*set_timestamp)(struct preview_stream_ops *w, int64_t timestamp);
} preview_stream_ops_t;

        setWindow in DisplayClient.cpp calls set_preview_stream_ops in DisplayClient.Stream.cpp to take the surface handed down from the app and assign it to mpStreamOps. Tracing the chain upward, you can start from setPreviewTarget in CameraClient.cpp:

// set the buffer consumer that the preview will use
status_t CameraClient::setPreviewTarget(
        const sp<IGraphicBufferProducer>& bufferProducer) {
    sp<IBinder> binder;
    sp<ANativeWindow> window;     
    if (bufferProducer != 0) {
        binder = IInterface::asBinder(bufferProducer);
        window = new Surface(bufferProducer, /*controlledByApp*/ true);
    }
    return setPreviewWindow(binder, window);
}

status_t CameraClient::setPreviewWindow(const sp<IBinder>& binder,
        const sp<ANativeWindow>& window) {
    ......
    result = mHardware->setPreviewWindow(window);
    ......
}


        Once the app removes its surface, the call err = mpStreamOps->dequeue_buffer(mpStreamOps, &phBuffer, &stride); inside dequePrvOps in DisplayClient.Stream.cpp starts failing because no buffer can be dequeued. That failure propagates up to DisplayClient::prepareOneTodoBuffer in DisplayClient.BufOps.cpp, where dequePrvOps(pStreamImgBuf) errors out, then to DisplayClient::prepareAllTodoBuffers in the same file, where prepareOneTodoBuffer(rpBufQueue) errors out, and finally to the loop in DisplayClient::onThreadLoop, which keeps spinning without ever getting data, so the preview goes black.

        To solve this we would want the app-side surface never to be released, but that's impossible: even if the app doesn't release it explicitly, the system eventually reclaims the resources of an exited process. So another approach is needed.

        That other approach is simple: just new another Surface and hand it down to the HAL.

void CameraClient::setNewPreviewWindow()
{
    ALOGE("CameraClient::setNewPreviewWindow() start");
    BufferQueue::createBufferQueue(&mNewProducer, &mNewConsumer);
    ALOGE("CameraClient::setNewPreviewWindow() 1");    
    GLuint texName;
    glGenTextures(1, &texName);
    mNewSurfaceTexture = new GLConsumer(mNewConsumer, texName,
                    GL_TEXTURE_EXTERNAL_OES, true, true);
    if (mNewSurfaceTexture == 0) {
        ALOGE("CameraClient::setNewPreviewWindow() 2, Unable to create native SurfaceTexture");
        return;
    }
    ALOGE("CameraClient::setNewPreviewWindow() 3"); 
    mNewSurfaceTexture->setName(String8::format("SurfaceTexture-%d-%d-%d",
            texName,
            getpid(),
            createProcessUniqueId()));    
                    
    sp<SurfaceTextureListener> stListener(new SurfaceTextureListener());
    stListener->mtListener=stListener;
    mNewSurfaceTexture->setFrameAvailableListener(stListener);  

    // check whether this next line can actually be dropped
    mNewSurfaceTexture->setDefaultBufferSize(1280, 720);    
    bool useAsync = false;
    status_t res;
    int32_t consumerUsage;
    if ((res = mNewProducer->query(NATIVE_WINDOW_CONSUMER_USAGE_BITS,
            &consumerUsage)) != OK) {
        ALOGE("CameraClient::setNewPreviewWindow() 5: Camera : Failed to query mNewConsumer usage");
        return;
    }
    if (consumerUsage & GraphicBuffer::USAGE_HW_TEXTURE) {
        ALOGE("CameraClient::setNewPreviewWindow() 6: Camera : Forcing asynchronous mode for stream");
        useAsync = true;
    }
    ALOGE("CameraClient::setNewPreviewWindow() 7"); 
    mNewSurface = new Surface(mNewProducer, useAsync);
    
    //ANativeWindow *anw = surface.get();

    ALOGE("CameraClient::setNewPreviewWindow() 8");  
    sp<IBinder> binder;
  //  sp<ANativeWindow> window;     
    if (mNewProducer != 0) {
        ALOGE("CameraClient::setNewPreviewWindow() 9"); 
        binder = IInterface::asBinder(mNewProducer);
    }
    setPreviewWindow(binder, mNewSurface);    
    ALOGE("CameraClient::setNewPreviewWindow() end"); 
}

        Where is this function called? In CameraClient::disconnect(): when the app releases the camera, we manually set a new surface down to the HAL, and that's it.

binder::Status CameraClient::disconnect() {
    ......
    if(mCameraId != 2 && mCameraId != 3 )
    {
        //add by xuhui
        sp<CameraService::BasicClient> client = NULL;
        
        char c_camera0_ref[PROPERTY_VALUE_MAX] = {'\0'};
        char c_camera1_ref[PROPERTY_VALUE_MAX] = {'\0'};
        char c_camera_virtual_ref[PROPERTY_VALUE_MAX] = {'\0'};
        int  i_camera_virtual_ref = 0;
        int  i_camera0_ref = 0;
        int  i_camera1_ref = 0;
        ALOGE("CameraClient::disconnect 3");
        property_get(CAMERA0_REF, c_camera0_ref, "0");
        i_camera0_ref = atoi(c_camera0_ref);
        
        property_get(CAMERA1_REF, c_camera1_ref, "0");
        i_camera1_ref = atoi(c_camera1_ref);

        ALOGE("CameraClient::disconnect 4");
        property_get(VIRTUAL_CAMERA_REF, c_camera_virtual_ref, "0");
        i_camera_virtual_ref = atoi(c_camera_virtual_ref);    
        ALOGE("CameraClient::disconnect, i_camera0_ref=%d, i_camera1_ref=%d, , i_camera_virtual_ref=%d", 
               i_camera0_ref, i_camera1_ref, i_camera_virtual_ref);
        if(i_camera_virtual_ref > 0)
        {
            ALOGE("CameraClient::disconnect, mCameraId=%d, virtual camera exist, don't disconnect", mCameraId);
            // -1 means the real camera wanted to close, but a virtual camera is open, so it can't
            // close for now; it will be closed when the virtual camera closes
            if(mCameraId == 0)
            {
                property_set(CAMERA0_REF, "-1");
            }
            else if(mCameraId == 1)
            {
                property_set(CAMERA1_REF, "-1");
            }
            ALOGE("CameraClient::disconnect setNewPreviewWindow");
            setNewPreviewWindow();
            return res;
        }
        
    }    
    ......   
}

        That wraps up the whole scheme for letting multiple apps open the same camera. On my build I can now have one app recording in the background while another app in the foreground has the same camera open.

        To make that background-recording case work, two more places need to change:

        1.)vendor\mediatek\proprietary\hardware\mtkcam\utils\hw\CamManager.cpp

       

bool
CamManager::
getPermission() const
{
    Mutex::Autolock _l(mLockMtx);
    //
    MY_LOGE("OpenId.size(%zu), mbRecord(%d), mbAvailable(%d), mbStereo(%d), 0:fps(%d); 1:fps(%d)",
            mvOpenId.size(), mbRecord, mbAvailable, mbStereo, getFrameRate(0), getFrameRate(1));
    // Allow opening a camera while recording is in progress; otherwise, while one app is recording, another app can't open a camera.
#ifdef ENABLE_VIRTUAL_CAMERA    
    return mbAvailable && !mbStereo;
#else
    return !mbRecord && mbAvailable && !mbStereo;
#endif    
}

        Without the change above, if a recording is running in the background and another app tries to open a camera, it fails with "Cannot start preview ... Permission denied". That's because, while a recording is in progress, when another app opens the camera, startPreview() in vendor\mediatek\proprietary\hardware\mtkcam\main\hal\device\1.x\device\CameraDevice1Base.cpp performs the following check:

Return<Status>
CameraDevice1Base::
startPreview()
{
    ......
        //  Check Permission.
        if ( ! pCamMgr->getPermission() )
        {
            MY_LOGE("Cannot start preview ... Permission denied");
            sp<IFrameworkCBThread> spFrameworkCBThread = IFrameworkCBThread::createInstance(getInstanceId(),mpCamMsgCbInfo);
            spFrameworkCBThread->init();
            IFrameworkCBThread::callback_data cbData;
            cbData.callbackType = IFrameworkCBThread::CALLBACK_TYPE_NOTIFY;
            cbData.type         = CAMERA_MSG_ERROR;
            cbData.ext1         = CAMERA_ERROR_SERVER_DIED;
            cbData.ext2         = 0;
            spFrameworkCBThread->postCB(cbData);
            spFrameworkCBThread->uninit();
            spFrameworkCBThread = NULL;
            status = OK;
            goto lbExit;
        }
    ......
}

        That lands in getPermission in vendor\mediatek\proprietary\hardware\mtkcam\utils\hw\CamManager.cpp. When a preview is about to start, it checks whether something else is currently recording, and if so it refuses to start the preview. To let one app record while another app opens a camera and uses it normally, this check has to be bypassed.

        2.)vendor\mediatek\proprietary\hardware\mtkcam\main\hal\device\1.x\device\depend\DefaultCameraDevice1.cpp

bool
DefaultCameraDevice1::
onStartRecording()
{
#if '1'== MTKCAM_HAVE_CAM_MANAGER
    //  (1) Check Permission.
#ifdef ENABLE_VIRTUAL_CAMERA
    return true;
#else
    if ( CamManager::getInstance()->isMultiDevice() )
    {
        MY_LOGE("Cannot start recording ... Permission denied");
        sp<IFrameworkCBThread> spFrameworkCBThread = IFrameworkCBThread::createInstance(getInstanceId(),mpCamMsgCbInfo);
        spFrameworkCBThread->init();
        IFrameworkCBThread::callback_data cbData;
        cbData.callbackType = IFrameworkCBThread::CALLBACK_TYPE_NOTIFY;
        cbData.type         = CAMERA_MSG_ERROR;
        cbData.ext1         = CAMERA_ERROR_SERVER_DIED;
        cbData.ext2         = 0;
        spFrameworkCBThread->postCB(cbData);
        spFrameworkCBThread->uninit();
        spFrameworkCBThread = NULL;
        return false;
    }
#endif    
#endif
    return true;
}

        onStartRecording() is called by startRecording() in vendor\mediatek\proprietary\hardware\mtkcam\main\hal\device\1.x\device\CameraDevice1Base.cpp:

Return<Status>
CameraDevice1Base::
startRecording()
{
    ....
    if  ( ! onStartRecording() )
    {
        MY_LOGE("onStartRecording() fail");
        status = INVALID_OPERATION;
        goto lbExit;
    }
    ....
}

         What this means: when an app starts recording, the HAL checks whether another camera is already open, and if so it refuses to record and reports an error. Leaving this unchanged doesn't affect the virtual camera itself: with the virtual camera open, recording on the already-open physical camera still works, because my virtual camera uses HAL3 while the physical camera uses MTK's own HAL1, so the two don't see each other. What is affected is the other physical camera. Say the physical camera feeding the virtual camera is the rear one and the other physical camera is the front one. The rear camera is open and so is the virtual camera; now another app opens the front camera, and when the app holding the rear camera starts recording, this check finds that another HAL1 camera (the front one) is already open and bails out with an error right here.

        Note that these two checks only apply to API1; API2 never hits them. API1 and API2 record through different paths: with API2, the buffers coming up from the HAL are drawn straight onto a surface, and recording usually takes a Surface as its source rather than a CameraSource, so it never sets the recording state down in the HAL.

        
