http://guoh.org/lifelog/2013/07/glance-at-camera-hal-2-0/
Android 4.2 made fairly large changes to the Camera HAL: the old CameraHardwareInterface was essentially abandoned and a new interface was introduced in its place. Both implementations are therefore available, and the Camera Service layer automatically loads the framework HAL matching the HAL version the vendor implemented. Documentation on this is still scarce, and few vendors have shipped it; roughly speaking, Qualcomm and Samsung have implementations on their platforms.
The notes below are based on the Qualcomm platform and on reading the AOSP code. Please point out any errors; I will keep correcting this post as I learn more.
I watched a talk introducing all this (as an aside, I could barely follow the speaker's English). You may need to find your own way to view it; since the video cannot be downloaded, I took some screenshots and put them on Flickr.
www.youtube.com/watch?v=Lald5txnoHw
Here is a short description of the talk and of HAL 2.0:
The Linux Foundation
Android Builders Summit 2013
Camera 2.0: The New Camera Hardware Interface in Android 4.2
By Balwinder Kaur & Ashutosh Gupta
San Francisco, California. Android 4.2 was released with a new Camera Hardware Abstraction Layer (HAL), Camera 2.0. Camera 2.0 places a big emphasis on collecting and providing metadata associated with each frame. It also provides the ability to re-process streams. Although the SDK level has yet to expose any new APIs to the end user, the Camera HAL and the Camera Service architecture have been revamped into a different architecture. This presentation provides an insight into the new architecture as well as covering some of the challenges faced in building production-quality Camera HAL implementations.
The intended audience for this conference session is engineers who want to learn more about the Android Camera Internals. This talk should facilitate engineers wanting to integrate, improve or innovate using the Camera subsystem.
What follows is my own understanding.
How does HAL 2.0 talk to the driver?
The HAL 2.0 framework and the vendor implementation are bound together in /path/to/aosp/hardware/qcom/camera/QCamera/HAL2/wrapper/QualcommCamera.cpp, which initializes the struct named HMI, just as before.
```cpp
static hw_module_methods_t camera_module_methods = {
    open: camera_device_open,
};

static hw_module_t camera_common = {
    tag: HARDWARE_MODULE_TAG,
    module_api_version: CAMERA_MODULE_API_VERSION_2_0,
    hal_api_version: HARDWARE_HAL_API_VERSION,
    id: CAMERA_HARDWARE_MODULE_ID,
    name: "Qcamera",
    author: "Qcom",
    methods: &camera_module_methods,
    dso: NULL,
    reserved: {0},
};

camera_module_t HAL_MODULE_INFO_SYM = {
    common: camera_common,
    get_number_of_cameras: get_number_of_cameras,
    get_camera_info: get_camera_info,
};

camera2_device_ops_t camera_ops = {
    set_request_queue_src_ops: android::set_request_queue_src_ops,
    notify_request_queue_not_empty: android::notify_request_queue_not_empty,
    set_frame_queue_dst_ops: android::set_frame_queue_dst_ops,
    get_in_progress_count: android::get_in_progress_count,
    flush_captures_in_progress: android::flush_captures_in_progress,
    construct_default_request: android::construct_default_request,
    allocate_stream: android::allocate_stream,
    register_stream_buffers: android::register_stream_buffers,
    release_stream: android::release_stream,
    allocate_reprocess_stream: android::allocate_reprocess_stream,
    allocate_reprocess_stream_from_stream: android::allocate_reprocess_stream_from_stream,
    release_reprocess_stream: android::release_reprocess_stream,
    trigger_action: android::trigger_action,
    set_notify_callback: android::set_notify_callback,
    get_metadata_vendor_tag_ops: android::get_metadata_vendor_tag_ops,
    dump: android::dump,
};
```
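These tables are nothing more than structs of function pointers: the framework never links against vendor symbols, it only calls through HMI and the ops table it is handed. A minimal, self-contained sketch of the same pattern (all names here are illustrative, not from the Qualcomm source):

```c
#include <assert.h>

/* A stripped-down "ops" table, analogous to camera2_device_ops_t. */
typedef struct demo_ops {
    int (*open_dev)(int id);
    int (*close_dev)(int id);
} demo_ops_t;

/* Vendor-side implementations the table points at. */
static int demo_open(int id)  { return id >= 0 ? 0 : -1; }
static int demo_close(int id) { (void)id; return 0; }

/* Bound once, at load time, exactly like camera_ops above. */
static demo_ops_t demo_ops = {
    .open_dev  = demo_open,
    .close_dev = demo_close,
};

/* "Framework" side: every call goes through the pointer table,
 * so the implementation behind it can be swapped per platform. */
int call_open(int id)  { return demo_ops.open_dev(id); }
int call_close(int id) { return demo_ops.close_dev(id); }
```

The vendor .so only has to export one well-known symbol (HMI); everything else is reached through pointers like these.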
In QCameraHWI:
```cpp
QCameraHardwareInterface::QCameraHardwareInterface(int cameraId, int mode)
    : mCameraId(cameraId)
{
    cam_ctrl_dimension_t mDimension;

    /* Open camera stack! */
    memset(&mMemHooks, 0, sizeof(mm_camear_mem_vtbl_t));
    mMemHooks.user_data = this;
    mMemHooks.get_buf = get_buffer_hook;
    mMemHooks.put_buf = put_buffer_hook;

    mCameraHandle = camera_open(mCameraId, &mMemHooks);
    ALOGV("Cam open returned %p", mCameraHandle);
    if (mCameraHandle == NULL) {
        ALOGE("startCamera: cam_ops_open failed: id = %d", mCameraId);
        return;
    }
    mCameraHandle->ops->sync(mCameraHandle->camera_handle);

    mChannelId = mCameraHandle->ops->ch_acquire(mCameraHandle->camera_handle);
    if (mChannelId <= 0) {
        ALOGE("%s: Channel acquire failed", __func__);
        mCameraHandle->ops->camera_close(mCameraHandle->camera_handle);
        return;
    }

    /* Initialize # of frame requests the HAL is handling to zero */
    mPendingRequests = 0;
}
```
This calls camera_open() in mm_camera_interface, which in turn calls mm_camera_open() in mm_camera; from there, communication with the driver still goes through V4L2.
But why did Qualcomm leave some operations in the wrapper unimplemented?
```cpp
int trigger_action(const struct camera2_device *,
        uint32_t trigger_id,
        int32_t ext1,
        int32_t ext2)
{
    return INVALID_OPERATION;
}
```
In theory, operations such as auto focus should be triggered through this, yet it is only a stub. Are these operations not mandatory, or is the real implementation hidden in some corner I have not found?
Which library is currently used, libmmcamera_interface or libmmcamera_interface2? From the code it appears to be libmmcamera_interface.
Let's look at another example: what does the start preview path look like?
Camera2Client::startPreview(...)
 Camera2Client::startPreviewL(...)
  StreamingProcessor::updatePreviewStream(...)
   Camera2Device::createStream(...)
    Camera2Device::StreamAdapter::connectToDevice(...)
     camera2_device_t->ops->allocate_stream(...)
This allocate_stream is implemented by the vendor; for Qualcomm cameras it lives in /path/to/aosp/hardware/qcom/camera/QCamera/HAL2/wrapper/QualcommCamera.cpp:
android::allocate_stream(...)
 QCameraHardwareInterface::allocate_stream(...)
  QCameraStream::createInstance(...)
```cpp
QCameraStream* QCameraStream_preview::createInstance(uint32_t CameraHandle,
        uint32_t ChannelId,
        uint32_t Width,
        uint32_t Height,
        int requestedFormat,
        mm_camera_vtbl_t *mm_ops,
        camera_mode_t mode)
{
    QCameraStream* pme = new QCameraStream_preview(CameraHandle,
            ChannelId,
            Width,
            Height,
            requestedFormat,
            mm_ops,
            mode);
    return pme;
}
```
Although this rework is still fairly "new", it is clearly the direction things are going, so it does no harm to understand it.
P.S. Android 4.3 was released just today and I am still studying 4.2, heh.
Qualcomm Camera HAL 2.0
We know that the vendor side of the HAL dynamically loads a file named camera.$platform$.so and binds the methods defined by the Android HAL. Here we look at Camera HAL 2.0 on Qualcomm msm8960 as the example, building on the earlier post (http://guoh.org/lifelog/2013/07/glance-at-camera-hal-2-0/).
(Note: this post sat in drafts for quite a while. I have no device that can run this version of the code, so I cannot verify it is entirely correct; at least some parts are stub implementations. If you find mistakes, please point them out, and I will keep hunting for and fixing errors myself.)
camera2.h defines many methods; on msm8960 the corresponding HAL code lives in
/path/to/qcam-hal/QCamera/HAL2
which builds into camera.$platform$.so. Let's look at the implementation, starting with HAL2/wrapper/QualcommCamera.h|cpp:
```cpp
/**
 * The functions need to be provided by the camera HAL.
 *
 * If getNumberOfCameras() returns N, the valid cameraId for getCameraInfo()
 * and openCameraHardware() is 0 to N-1.
 */
static hw_module_methods_t camera_module_methods = {
    open: camera_device_open,
};

static hw_module_t camera_common = {
    tag: HARDWARE_MODULE_TAG,
    module_api_version: CAMERA_MODULE_API_VERSION_2_0,
    // with this version Camera Service will bring up the Camera2Client family
    hal_api_version: HARDWARE_HAL_API_VERSION,
    id: CAMERA_HARDWARE_MODULE_ID,
    name: "Qcamera",
    author: "Qcom",
    methods: &camera_module_methods,
    dso: NULL,
    reserved: {0},
};

camera_module_t HAL_MODULE_INFO_SYM = { // the HMI symbol every HAL module must export
    common: camera_common,
    get_number_of_cameras: get_number_of_cameras,
    get_camera_info: get_camera_info,
};

camera2_device_ops_t camera_ops = { // note the functions bound here
    set_request_queue_src_ops: android::set_request_queue_src_ops,
    notify_request_queue_not_empty: android::notify_request_queue_not_empty,
    set_frame_queue_dst_ops: android::set_frame_queue_dst_ops,
    get_in_progress_count: android::get_in_progress_count,
    flush_captures_in_progress: android::flush_captures_in_progress,
    construct_default_request: android::construct_default_request,
    allocate_stream: android::allocate_stream,
    register_stream_buffers: android::register_stream_buffers,
    release_stream: android::release_stream,
    allocate_reprocess_stream: android::allocate_reprocess_stream,
    allocate_reprocess_stream_from_stream: android::allocate_reprocess_stream_from_stream,
    release_reprocess_stream: android::release_reprocess_stream,
    trigger_action: android::trigger_action,
    set_notify_callback: android::set_notify_callback,
    get_metadata_vendor_tag_ops: android::get_metadata_vendor_tag_ops,
    dump: android::dump,
};

typedef struct { // note: a wrapper struct defined by Qualcomm itself
    camera2_device_t hw_dev; // this member is the standard framework struct
    QCameraHardwareInterface *hardware;
    int camera_released;
    int cameraId;
} camera_hardware_t;

/* HAL should return NULL if it fails to open camera hardware. */
extern "C" int camera_device_open(
    const struct hw_module_t* module, const char* id,
    struct hw_device_t** hw_device)
{
    int rc = -1;
    int mode = 0;
    camera2_device_t *device = NULL;
    if (module && id && hw_device) {
        int cameraId = atoi(id);
        if (!strcmp(module->name, camera_common.name)) {
            camera_hardware_t *camHal =
                (camera_hardware_t *) malloc(sizeof (camera_hardware_t));
            if (!camHal) {
                *hw_device = NULL;
                ALOGE("%s: end in no mem", __func__);
                return rc;
            }
            /* we have the camera_hardware obj malloced */
            memset(camHal, 0, sizeof (camera_hardware_t));
            camHal->hardware = new QCameraHardwareInterface(cameraId, mode);
            if (camHal->hardware && camHal->hardware->isCameraReady()) {
                camHal->cameraId = cameraId;
                device = &camHal->hw_dev;                   // the camera2_device_t
                device->common.close = close_camera_device; // initialize camera2_device_t
                device->common.version = CAMERA_DEVICE_API_VERSION_2_0;
                device->ops = &camera_ops;
                device->priv = (void *)camHal;
                rc = 0;
            } else {
                if (camHal->hardware) {
                    delete camHal->hardware;
                    camHal->hardware = NULL;
                }
                free(camHal);
                device = NULL;
            }
        }
    }
    /* pass actual hw_device ptr to framework. This makes it so we can
       actually use the memberof() macro */
    *hw_device = (hw_device_t*)&device->common;
    // a trick commonly used in the kernel and Android native frameworks
    return rc;
}
```
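The reason that final cast works is that camera2_device_t is the first member of camera_hardware_t, so the pointer handed to the framework can later be cast straight back to the vendor wrapper (the priv field stores it redundantly as well). A self-contained sketch of this container-of idiom, using hypothetical names rather than the real HAL types:

```c
#include <assert.h>
#include <stdlib.h>

/* Framework-visible "device" header, analogous to camera2_device_t. */
typedef struct fake_device {
    int version;
    void *priv;
} fake_device_t;

/* Vendor wrapper, analogous to camera_hardware_t: the framework struct
 * is the FIRST member, so a pointer to it is also a pointer to the wrapper. */
typedef struct {
    fake_device_t hw_dev;
    int camera_id;
} fake_hardware_t;

fake_device_t *open_fake(int id) {
    fake_hardware_t *hal = calloc(1, sizeof(*hal));
    if (!hal) return NULL;
    hal->camera_id = id;
    hal->hw_dev.version = 2;
    hal->hw_dev.priv = hal;  /* same trick QualcommCamera.cpp uses with priv */
    return &hal->hw_dev;     /* the framework only ever sees this */
}

/* Vendor side recovers its wrapper from the framework pointer. */
int fake_get_id(const fake_device_t *dev) {
    const fake_hardware_t *hal = (const fake_hardware_t *)dev;
    return hal->camera_id;
}
```

Because the framework struct sits at offset zero, casting back is well defined; if it were not the first member, the code would need an offsetof-based containerof macro instead.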
Now look at allocate_stream:
```cpp
int allocate_stream(const struct camera2_device *device,
        uint32_t width,
        uint32_t height,
        int format,
        const camera2_stream_ops_t *stream_ops,
        uint32_t *stream_id,
        uint32_t *format_actual,
        uint32_t *usage,
        uint32_t *max_buffers)
{
    QCameraHardwareInterface *hardware = util_get_Hal_obj(device);
    int rc = hardware->allocate_stream(width, height, format, stream_ops,
            stream_id, format_actual, usage, max_buffers);
    return rc;
}
```
Note that QCameraHardwareInterface is defined in QCameraHWI.h|cpp:
```cpp
int QCameraHardwareInterface::allocate_stream(
    uint32_t width,
    uint32_t height,
    int format,
    const camera2_stream_ops_t *stream_ops,
    uint32_t *stream_id,
    uint32_t *format_actual,
    uint32_t *usage,
    uint32_t *max_buffers)
{
    int ret = OK;
    QCameraStream *stream = NULL;
    camera_mode_t myMode = (camera_mode_t)(CAMERA_MODE_2D | CAMERA_NONZSL_MODE);

    stream = QCameraStream_preview::createInstance(
        mCameraHandle->camera_handle,
        mChannelId,
        width,
        height,
        format,
        mCameraHandle,
        myMode);

    stream->setPreviewWindow(stream_ops);
    // so every stream created through this path gets a corresponding ANativeWindow
    *stream_id = stream->getStreamId();
    *max_buffers = stream->getMaxBuffers(); // obtained from the HAL
    *usage = GRALLOC_USAGE_HW_CAMERA_WRITE | CAMERA_GRALLOC_HEAP_ID
        | CAMERA_GRALLOC_FALLBACK_HEAP_ID;
    /* Set to an arbitrary format SUPPORTED by gralloc */
    *format_actual = HAL_PIXEL_FORMAT_YCrCb_420_SP;

    return ret;
}
```
QCameraStream_preview::createInstance simply calls its own constructor, shown below (the related classes are in QCameraStream.h|cpp and QCameraStream_Preview.cpp):
```cpp
QCameraStream_preview::QCameraStream_preview(uint32_t CameraHandle,
        uint32_t ChannelId,
        uint32_t Width,
        uint32_t Height,
        int requestedFormat,
        mm_camera_vtbl_t *mm_ops,
        camera_mode_t mode) :
    QCameraStream(CameraHandle,
            ChannelId,
            Width,
            Height,
            mm_ops,
            mode),
    mLastQueuedFrame(NULL),
    mDisplayBuf(NULL),
    mNumFDRcvd(0)
{
    mStreamId = allocateStreamId(); // allocate a stream id (from mStreamTable)

    switch (requestedFormat) { // max buffer number
    case CAMERA2_HAL_PIXEL_FORMAT_OPAQUE:
        mMaxBuffers = 5;
        break;
    case HAL_PIXEL_FORMAT_BLOB:
        mMaxBuffers = 1;
        break;
    default:
        ALOGE("Unsupported requested format %d", requestedFormat);
        mMaxBuffers = 1;
        break;
    }
    /*TODO: There has to be a better way to do this*/
}
```
Next, look at mm_camera_interface.h under /path/to/qcam-hal/QCamera/stack/mm-camera-interface/:
```c
typedef struct {
    uint32_t camera_handle;        /* camera object handle */
    mm_camera_info_t *camera_info; /* reference pointer of camera info */
    mm_camera_ops_t *ops;          /* API call table */
} mm_camera_vtbl_t;
```
And in mm_camera_interface.c:
```c
/* camera ops v-table */
static mm_camera_ops_t mm_camera_ops = {
    .sync = mm_camera_intf_sync,
    .is_event_supported = mm_camera_intf_is_event_supported,
    .register_event_notify = mm_camera_intf_register_event_notify,
    .qbuf = mm_camera_intf_qbuf,
    .camera_close = mm_camera_intf_close,
    .query_2nd_sensor_info = mm_camera_intf_query_2nd_sensor_info,
    .is_parm_supported = mm_camera_intf_is_parm_supported,
    .set_parm = mm_camera_intf_set_parm,
    .get_parm = mm_camera_intf_get_parm,
    .ch_acquire = mm_camera_intf_add_channel,
    .ch_release = mm_camera_intf_del_channel,
    .add_stream = mm_camera_intf_add_stream,
    .del_stream = mm_camera_intf_del_stream,
    .config_stream = mm_camera_intf_config_stream,
    .init_stream_bundle = mm_camera_intf_bundle_streams,
    .destroy_stream_bundle = mm_camera_intf_destroy_bundle,
    .start_streams = mm_camera_intf_start_streams,
    .stop_streams = mm_camera_intf_stop_streams,
    .async_teardown_streams = mm_camera_intf_async_teardown_streams,
    .request_super_buf = mm_camera_intf_request_super_buf,
    .cancel_super_buf_request = mm_camera_intf_cancel_super_buf_request,
    .start_focus = mm_camera_intf_start_focus,
    .abort_focus = mm_camera_intf_abort_focus,
    .prepare_snapshot = mm_camera_intf_prepare_snapshot,
    .set_stream_parm = mm_camera_intf_set_stream_parm,
    .get_stream_parm = mm_camera_intf_get_stream_parm
};
```
Take start stream as an example:
```
mm_camera_intf_start_streams       (mm_camera_interface)
 mm_camera_start_streams           (mm_camera)
  mm_channel_fsm_fn                (mm_camera_channel)
   mm_channel_fsm_fn_active        (mm_camera_channel)
    mm_channel_start_streams       (mm_camera_channel)
     mm_stream_fsm_fn              (mm_camera_stream)
      mm_stream_fsm_reg            (mm_camera_stream)
       mm_camera_cmd_thread_launch (mm_camera_data)
       mm_stream_streamon          (mm_camera_stream)
```
Note: in this post, staggered indentation like the above indicates a call relationship; entries at the same indentation are called from the same enclosing function.
```c
int32_t mm_stream_streamon(mm_stream_t *my_obj)
{
    int32_t rc;
    enum v4l2_buf_type buf_type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;

    /* Add fd to data poll thread */
    rc = mm_camera_poll_thread_add_poll_fd(&my_obj->ch_obj->poll_thread[0],
            my_obj->my_hdl,
            my_obj->fd,
            mm_stream_data_notify,
            (void*)my_obj);
    if (rc < 0) {
        return rc;
    }
    rc = ioctl(my_obj->fd, VIDIOC_STREAMON, &buf_type);
    if (rc < 0) {
        CDBG_ERROR("%s: ioctl VIDIOC_STREAMON failed: rc=%d\n",
                __func__, rc);
        /* remove fd from data poll thread in case of failure */
        mm_camera_poll_thread_del_poll_fd(&my_obj->ch_obj->poll_thread[0],
                my_obj->my_hdl);
    }
    return rc;
}
```
Seeing ioctl with VIDIOC_STREAMON is good news: this is exactly how user space talks to kernel space under the V4L2 spec. V4L2 (Video for Linux Two) is a classic and mature video API, the successor to V4L; if it is unfamiliar, download the specification. "The Video4Linux2 API" (http://lwn.net/Articles/203924/) is also good reading material.
A quick overview:
open(VIDEO_DEVICE_NAME, ...) // open the video device, usually during program initialization
ioctl(...) // control operations that move only small amounts of data
Many request codes are available, and a typical capture session uses them in roughly this order:
VIDIOC_QUERYCAP  // query what the device can do
VIDIOC_CROPCAP   // query the device's cropping capabilities
VIDIOC_S_* / VIDIOC_G_* // set/get parameters
VIDIOC_REQBUFS   // allocate buffers (several memory models are possible)
VIDIOC_QUERYBUF  // query information about the allocated buffers
VIDIOC_QBUF      // QUEUE BUFFER: push a buffer into the driver's queue (the buffer is empty at this point)
VIDIOC_STREAMON  // start video data transfer
VIDIOC_DQBUF     // DEQUEUE BUFFER: take a buffer back out of the driver's queue (now holding data) [0...n]
QBUF -> DQBUF    // this cycle can be repeated indefinitely
VIDIOC_STREAMOFF // stop video data transfer
close(VIDEO_DEVICE_FD) // close the device
Those are the main calls and their basic ordering. A few more functions also matter:
select() // wait for an event; mainly used after we push frame buffers to the driver, to wait for its response
mmap/munmap // map the buffers we requested; needed when the buffers live in the device's memory space
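The select()-then-DQBUF interplay can be illustrated without a real video device: the sketch below uses a pipe as a hypothetical stand-in for the driver's fd, in the same spirit as mm_camera's poll thread waiting on the stream fd before dequeuing. None of the names here come from V4L2 or the Qualcomm code:

```c
#include <string.h>
#include <sys/select.h>
#include <unistd.h>

/* Block until `fd` has data, like the poll thread waiting before DQBUF. */
int wait_for_frame(int fd, int timeout_sec) {
    fd_set rfds;
    struct timeval tv = { .tv_sec = timeout_sec, .tv_usec = 0 };
    FD_ZERO(&rfds);
    FD_SET(fd, &rfds);
    return select(fd + 1, &rfds, NULL, NULL, &tv);  /* >0 means fd is readable */
}

/* Simulate one producer/consumer round trip of the streaming loop. */
int demo_one_frame(void) {
    int fds[2];
    char buf[8] = {0};
    if (pipe(fds) != 0) return -1;
    if (write(fds[1], "frame", 5) != 5)  /* "driver" fills a buffer */
        return -1;
    if (wait_for_frame(fds[0], 1) <= 0)  /* consumer blocks in select() */
        return -1;
    read(fds[0], buf, sizeof(buf) - 1);  /* analogous to VIDIOC_DQBUF */
    close(fds[0]);
    close(fds[1]);
    return strcmp(buf, "frame") == 0 ? 0 : -1;
}
```

The real loop differs only in that the fd comes from opening /dev/videoN and the data arrives via ioctl(VIDIOC_DQBUF) rather than read(), but the readiness-wait structure is the same.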
And looking at the mm_camera_stream file, it is implemented in exactly this way.
With that covered, let's return to the QCam HAL. Its details are of course not as simple as the start stream path listed above, but they are not that complex either; what matters most are the states and the structures involved.
First, the channel state. Currently only one channel is supported, but it can carry multiple streams (introduced later; at most 8 streams at present):
```c
/* mm_channel */
typedef enum {
    MM_CHANNEL_STATE_NOTUSED = 0, /* not used */
    MM_CHANNEL_STATE_STOPPED,     /* stopped */
    MM_CHANNEL_STATE_ACTIVE,      /* active, at least one stream active */
    MM_CHANNEL_STATE_PAUSED,      /* paused */
    MM_CHANNEL_STATE_MAX
} mm_channel_state_type_t;
```
The events it can handle:
```c
typedef enum {
    MM_CHANNEL_EVT_ADD_STREAM,
    MM_CHANNEL_EVT_DEL_STREAM,
    MM_CHANNEL_EVT_START_STREAM,
    MM_CHANNEL_EVT_STOP_STREAM,
    MM_CHANNEL_EVT_TEARDOWN_STREAM,
    MM_CHANNEL_EVT_CONFIG_STREAM,
    MM_CHANNEL_EVT_PAUSE,
    MM_CHANNEL_EVT_RESUME,
    MM_CHANNEL_EVT_INIT_BUNDLE,
    MM_CHANNEL_EVT_DESTROY_BUNDLE,
    MM_CHANNEL_EVT_REQUEST_SUPER_BUF,
    MM_CHANNEL_EVT_CANCEL_REQUEST_SUPER_BUF,
    MM_CHANNEL_EVT_START_FOCUS,
    MM_CHANNEL_EVT_ABORT_FOCUS,
    MM_CHANNEL_EVT_PREPARE_SNAPSHOT,
    MM_CHANNEL_EVT_SET_STREAM_PARM,
    MM_CHANNEL_EVT_GET_STREAM_PARM,
    MM_CHANNEL_EVT_DELETE,
    MM_CHANNEL_EVT_MAX
} mm_channel_evt_type_t;
```
```c
/* mm_stream */
typedef enum { // watch these states carefully: each operation advances the state
    MM_STREAM_STATE_NOTUSED = 0,       /* not used */
    MM_STREAM_STATE_INITED,            /* inited */
    MM_STREAM_STATE_ACQUIRED,          /* acquired, fd opened */
    MM_STREAM_STATE_CFG,               /* fmt & dim configured */
    MM_STREAM_STATE_BUFFED,            /* buf allocated */
    MM_STREAM_STATE_REG,               /* buf regged, stream off */
    MM_STREAM_STATE_ACTIVE_STREAM_ON,  /* active with stream on */
    MM_STREAM_STATE_ACTIVE_STREAM_OFF, /* active with stream off */
    MM_STREAM_STATE_MAX
} mm_stream_state_type_t;
```
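The comment above is the key point: each mm_stream operation is only legal in a particular state and advances the object to the next one, which is exactly how mm_stream_fsm_fn dispatches. A toy state machine in the same spirit (states and events abbreviated and purely illustrative, not the real mm_camera tables):

```c
#include <assert.h>

/* Simplified stream states, loosely mirroring mm_stream_state_type_t. */
typedef enum {
    ST_NOTUSED, ST_INITED, ST_ACQUIRED, ST_CFG,
    ST_BUFFED, ST_REG, ST_STREAM_ON
} demo_state_t;

typedef enum {
    EVT_ACQUIRE, EVT_CONFIG, EVT_ALLOC_BUF, EVT_REG_BUF, EVT_STREAM_ON
} demo_evt_t;

/* Each event is only valid in one state; on success the state advances,
 * otherwise the event is rejected, as the per-state fsm functions do. */
int demo_fsm(demo_state_t *st, demo_evt_t evt) {
    switch (*st) {
    case ST_INITED:   if (evt == EVT_ACQUIRE)   { *st = ST_ACQUIRED;  return 0; } break;
    case ST_ACQUIRED: if (evt == EVT_CONFIG)    { *st = ST_CFG;       return 0; } break;
    case ST_CFG:      if (evt == EVT_ALLOC_BUF) { *st = ST_BUFFED;    return 0; } break;
    case ST_BUFFED:   if (evt == EVT_REG_BUF)   { *st = ST_REG;       return 0; } break;
    case ST_REG:      if (evt == EVT_STREAM_ON) { *st = ST_STREAM_ON; return 0; } break;
    default: break;
    }
    return -1; /* event not valid in the current state */
}
```

Reading the real code with this model in mind makes the long switch statements in mm_camera_stream.c much easier to follow.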
Similarly, the events a stream can handle: