Android camera system introduction (kernel part)

1. Preface

Recently I have been working on a project on the Freescale iMX6Q platform. The project uses a TV-signal input device: an external TV signal comes in, and our app has to display the picture and play the sound. That sounds just like an Android smart TV, and you guessed it: this is an industrial video device. Before starting, I already knew that Android itself has no support for something like a TV tuner card. Thinking it through, though, TV output splits into two parts, audio and video. Audio is simple: it is just an I2S input that we then play back. Video is the hard part, but since it is very similar to a camera preview, it should be possible to implement it with some small modifications on top of the existing camera stack. With that rough plan, once the hardware arrived we started with the hard part, the video. This article walks through the kernel-side code flow of the camera system.

2. Android camera system overview

As a standard Android subsystem, the camera system cannot avoid the basic Android structure. The figure below shows the layered structure of the Android camera stack.

There is nothing special about this structure, so I will not describe it in detail; this article focuses on the hardware and driver parts.

3. Hardware

On the hardware side, from the CPU's point of view the TV card is simply a video input device with a 16-bit parallel interface. The interface diagram is shown below.


The interface has 20 signal lines in total:

PCLK: pixel clock. It synchronizes the individual pixels; each PCLK cycle corresponds to one pixel.

HSYNC: horizontal (line) sync. While HSYNC is active, everything the sensor outputs belongs to the same line.

VSYNC: vertical (frame) sync. While VSYNC is active, everything the sensor outputs belongs to the same frame.

DATA_EN: data enable
data0~data15: the 16 data lines

For more detail you can search online; I will not go into it here. Also, the TV card outputs video in UYVY format; for a description of this format, here is a Baidu Baike link you can study on your own:

http://baike.baidu.com/view/189685.htm
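For quick reference, UYVY is a packed 4:2:2 YUV format: every 4 bytes carry two horizontally adjacent pixels that share one U and one V sample, stored in the order U0 Y0 V0 Y1. The small sketch below is illustration only (not part of the driver) and shows how a single pixel can be pulled out of a UYVY buffer:

#include <stdint.h>

struct yuv_pixel {
	uint8_t y;
	uint8_t u;
	uint8_t v;
};

/* Fetch pixel (x, y) from a UYVY frame of the given width.
 * Each pixel pair occupies 4 bytes: U0 Y0 V0 Y1. */
static struct yuv_pixel uyvy_get_pixel(const uint8_t *frame,
				       int width, int x, int y)
{
	/* Start of the 4-byte group holding this pixel pair. */
	const uint8_t *pair = frame + (y * width + (x & ~1)) * 2;
	struct yuv_pixel p;

	p.u = pair[0];
	p.v = pair[2];
	/* Even pixels take Y0 (byte 1), odd pixels take Y1 (byte 3). */
	p.y = (x & 1) ? pair[3] : pair[1];
	return p;
}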


4. Driver

From the kernel's point of view the camera goes through the V4L2 capture interface, so our main job is to fill in the V4L2 device framework. Every V4L2 capture device here is split into two parts: the master, which is the driver for the video capture interface, i.e. the CPU's parallel port; and the slave, which is the driver for the video device itself, i.e. the camera. Both parts have to be written or modified by us.

1. Slave device

The chip that connects our device to the CPU is the GS2971, so that is the chip we need to write the driver for. According to the GS2971 datasheet it supports 11 video formats, and V4L2 requires us to provide timing data for each of them. The timing is defined by the CEA-861 standard, for which documentation is easy to find online. Let us take one format as an example.

The figure below shows the CEA-861 standard timing for 1280x720P at 50 Hz.


A frame of TV picture is not just the image itself; during transmission it carries blanking borders. As the figure above shows, the total frame is actually 1980x750, of which 1280x720 is the real picture. HSYNC is the line sync signal, and our CPU latches on its rising edge, so from the figure the left border is 220+40 = 260 pixels and the right border is 440 pixels (260 + 1280 + 440 = 1980). VSYNC is the frame sync signal, also rising-edge triggered, so the top border is 25 lines and the bottom border is 5 lines (25 + 720 + 5 = 750). The resulting entry is:

{
	 .v4l2_id = V4L2_STD_ALL,
	 .name = "1280x720P50", 	//1280x720P50 
	 .raw_width = 1980,	
	 .raw_height = 750,
	 .act_width = 1280,	
	 .act_height = 720,
	 .win_top = 20+5,
	 .win_left = 220+40,
	 .hs_inv = 0,
	 .vs_inv = 0,
	 .is_interlace = 0,
	},

These values are written according to what the CPU needs; the right and bottom borders are not used, so they are not filled in. The other formats can be written in the same way, so I will not go through them one by one.
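As an illustration only, here is what a 1920x1080P50 entry might look like when filled in the same way. The totals and blanking are taken from the CEA-861 standard (2640x1125 total; HSYNC 44 plus horizontal back porch 148 on the left, VSYNC 5 plus vertical back porch 36 on top); treat it as a sketch and verify the numbers against the standard and against what the CPU actually latches:

{
	 .v4l2_id = V4L2_STD_ALL,
	 .name = "1920x1080P50",
	 .raw_width = 2640,		/* total pixels per line */
	 .raw_height = 1125,		/* total lines per frame */
	 .act_width = 1920,		/* active picture width */
	 .act_height = 1080,		/* active picture height */
	 .win_top = 36+5,		/* vertical back porch + VSYNC width */
	 .win_left = 148+44,		/* horizontal back porch + HSYNC width */
	 .hs_inv = 0,
	 .vs_inv = 0,
	 .is_interlace = 0,
	},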

I will skip the rest of the GS2971-specific driver code. Next, let us look at the registration of the v4l2 device. It mainly gives the device an initial state and provides the ioctls that V4L2 needs, and it is actually quite simple:

static struct v4l2_int_slave gs2971_slave = {
	.ioctls = gs2971_ioctl_desc,
	.num_ioctls = ARRAY_SIZE(gs2971_ioctl_desc),
};

static struct v4l2_int_device gs2971_int_device = {
	.module = THIS_MODULE,
	.name = "gs2971",
	.type = v4l2_int_type_slave,
	.u = {
		.slave = &gs2971_slave,
	},
};


	memset(&gs2971_data, 0, sizeof(gs2971_data));
	gs2971_data.sen.platform_data =  gs2971_tvin_plat;
	gs2971_data.sen.mclk = gs2971_tvin_plat->mclk;
	gs2971_data.sen.mclk_source = gs2971_tvin_plat->mclk_source;
	gs2971_data.sen.csi = gs2971_tvin_plat->csi;
	if(gs2971_tvin_plat->sdi_connect){
		gs2971_data.irq = gpio_to_irq(gs2971_tvin_plat->sdi_connect);
		gs2971_data.sdi_connect = gs2971_tvin_plat->sdi_connect;
	}
	
	gs2971_data.sen.plat_device = pdev;
	gs2971_data.sen.streamcap.capability = V4L2_MODE_HIGHQUALITY |
					   V4L2_CAP_TIMEPERFRAME;
	gs2971_data.sen.streamcap.capturemode = 0;
	gs2971_data.sen.streamcap.timeperframe.denominator = DEFAULT_FPS;	// frames per second

	gs2971_data.sen.streamcap.timeperframe.numerator = 1;
	
	gs2971_data.std_id = V4L2_STD_ALL;

	gs2971_data.sen.pix.width = gs2971_video_fmts[current_video_idx].act_width;
	gs2971_data.sen.pix.height = gs2971_video_fmts[current_video_idx].act_height;
	gs2971_data.sen.pix.pixelformat = DEFAULT_PIX_FORMAT;  /* YUV422 */
	gs2971_data.sen.pix.priv = 1;  /* 1 is used to indicate TV in */
	gs2971_data.sen.on = true;
	
	gpio_sensor_active();
	
	pr_debug("   type is %d (expect %d)\n",
		 gs2971_int_device.type, v4l2_int_type_slave);
	pr_debug("   num ioctls is %d\n",
		 gs2971_int_device.u.slave->num_ioctls);

	/* This function attaches this structure to the /dev/video0 device.
	 * The pointer in priv points to the mt9v111_data structure here.*/
	gs2971_int_device.priv = &gs2971_data;
	ret = v4l2_int_device_register(&gs2971_int_device);	// register the slave device

I will not go into the ioctls yet, because they depend on the master driver. Let us look at the master driver next.

2. Master device

For the master, Freescale already provides a driver; we only need to make some modifications. The driver source is:

android\kernel_imx\drivers\media\video\mxc\capture\mxc_v4l2_capture.c

Let us look at the device registration in the probe function:

	/* Set up the v4l2 device and register it*/
	cam->self->priv = cam;
	v4l2_int_device_register(cam->self);

	/* register v4l video device */
	if (video_register_device(cam->video_dev, VFL_TYPE_GRABBER, video_nr)
	    == -1) {
		kfree(cam);
		cam = NULL;
		pr_err("ERROR: v4l2 capture: video_register_device failed\n");
		return -1;
	}

Two registrations happen here:

v4l2_int_device_register() is used at initialization time to pair the master with the slave device.

video_register_device() registers the video device that exposes the access interface to the upper layers, i.e. the /dev/video node and its operations.
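To make the pairing concrete, here is a rough sketch of the master side of the v4l2-int-device handshake in mxc_v4l2_capture.c. It is an outline of the idea rather than the exact driver code: once a slave such as our gs2971 registers, the framework matches it with the master and calls attach(), which is where the capture driver records which slave feeds it.

/* Sketch only: simplified from mxc_v4l2_capture.c. */
static int mxc_v4l2_master_attach(struct v4l2_int_device *slave)
{
	cam_data *cam = slave->u.slave->master->priv;

	/* Remember which slave feeds this capture interface; the real
	 * driver also queries the slave's format here so the CSI side
	 * starts out consistent with it. */
	cam->sensor = slave;
	return 0;
}

static void mxc_v4l2_master_detach(struct v4l2_int_device *slave)
{
	cam_data *cam = slave->u.slave->master->priv;

	cam->sensor = NULL;
}

static struct v4l2_int_master mxc_v4l2_master = {
	.attach = mxc_v4l2_master_attach,
	.detach = mxc_v4l2_master_detach,
};

/* cam->self, registered in the probe code above, is the master-type
 * counterpart of the gs2971_int_device shown earlier:
 *	cam->self->module   = THIS_MODULE;
 *	cam->self->type     = v4l2_int_type_master;
 *	cam->self->u.master = &mxc_v4l2_master;
 */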


After opening the video device you can set its properties, such as cropping and scaling; this step is optional. In Linux programming, ioctl() is generally used to manage a device's I/O channels, so here is a brief summary of the V4L2 ioctls.

The following command identifiers are commonly used in V4L2 development (a minimal user-space capture sequence using them is sketched right after this list):
VIDIOC_REQBUFS: allocate capture buffers
VIDIOC_QUERYBUF: query the buffers allocated by VIDIOC_REQBUFS (offset and length) so they can be memory-mapped
VIDIOC_QUERYCAP: query the driver's capabilities
VIDIOC_ENUM_FMT: enumerate the video formats the driver supports
VIDIOC_S_FMT: set the driver's current video capture format
VIDIOC_G_FMT: read the driver's current video capture format
VIDIOC_TRY_FMT: validate a capture format without actually applying it
VIDIOC_CROPCAP: query the driver's cropping capabilities
VIDIOC_S_CROP: set the cropping rectangle of the video signal
VIDIOC_G_CROP: read the cropping rectangle of the video signal
VIDIOC_QBUF: queue a buffer back to the driver
VIDIOC_DQBUF: dequeue a filled buffer from the driver
VIDIOC_STREAMON: start video capture
VIDIOC_STREAMOFF: stop video capture
VIDIOC_QUERYSTD: query which video standards the device supports, e.g. PAL or NTSC.
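The following minimal user-space sketch shows the typical order in which these ioctls are used against /dev/video0. Error handling and cleanup are omitted, and the device node, resolution and buffer count are assumptions for illustration:

#include <fcntl.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
	int fd = open("/dev/video0", O_RDWR);
	struct v4l2_capability cap;
	struct v4l2_format fmt;
	struct v4l2_requestbuffers req;
	struct v4l2_buffer buf;
	void *mem[4];
	enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	int i;

	ioctl(fd, VIDIOC_QUERYCAP, &cap);		/* query driver capabilities */

	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	fmt.fmt.pix.width = 1280;			/* assumed resolution */
	fmt.fmt.pix.height = 720;
	fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_UYVY;
	ioctl(fd, VIDIOC_S_FMT, &fmt);			/* set the capture format */

	memset(&req, 0, sizeof(req));
	req.count = 4;
	req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	req.memory = V4L2_MEMORY_MMAP;
	ioctl(fd, VIDIOC_REQBUFS, &req);		/* allocate buffers in the driver */

	for (i = 0; i < 4; i++) {
		memset(&buf, 0, sizeof(buf));
		buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
		buf.memory = V4L2_MEMORY_MMAP;
		buf.index = i;
		ioctl(fd, VIDIOC_QUERYBUF, &buf);	/* get buffer offset/length */
		mem[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
			      MAP_SHARED, fd, buf.m.offset);
		ioctl(fd, VIDIOC_QBUF, &buf);		/* queue it for capture */
	}

	ioctl(fd, VIDIOC_STREAMON, &type);		/* start capturing */

	memset(&buf, 0, sizeof(buf));
	buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	buf.memory = V4L2_MEMORY_MMAP;
	ioctl(fd, VIDIOC_DQBUF, &buf);			/* wait for one filled frame */
	/* ... process mem[buf.index], which holds UYVY data ... */
	ioctl(fd, VIDIOC_QBUF, &buf);			/* give the buffer back */

	ioctl(fd, VIDIOC_STREAMOFF, &type);		/* stop capturing */
	close(fd);
	return 0;
}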
V4L2 itself is fairly involved and a thorough introduction would take a lot of space, so here are two links instead:
http://baike.baidu.com/view/5494174.htm?fr=aladdin
http://www.cnblogs.com/emouse/archive/2013/03/04/2943243.html

With the two links above as background, let us look at a few important functions.

The open() function

It mainly performs some initialization.

static int mxc_v4l_open(struct file *file)
{
	struct v4l2_ifparm ifparm;
	struct v4l2_format cam_fmt;
	ipu_csi_signal_cfg_t csi_param;
	struct video_device *dev = video_devdata(file);
	cam_data *cam = video_get_drvdata(dev);
	int err = 0;


	pr_debug("\nIn MVC: mxc_v4l_open\n");
	pr_debug("   device name is %s\n", dev->name);


	if (!cam) {
		pr_err("ERROR: v4l2 capture: Internal error, "
			"cam_data not found!\n");
		return -EBADF;
	}


	if (cam->sensor == NULL ||
	    cam->sensor->type != v4l2_int_type_slave) {
		pr_err("ERROR: v4l2 capture: slave not found15!\n");
		return -EAGAIN;
	}


	down(&cam->busy_lock);
	err = 0;
	if (signal_pending(current))
		goto oops;


	if (cam->open_count++ == 0) {
		wait_event_interruptible(cam->power_queue,
					 cam->low_power == false);


		if (strcmp(mxc_capture_inputs[cam->current_input].name,
			   "CSI MEM") == 0) {
#if defined(CONFIG_MXC_IPU_CSI_ENC) || defined(CONFIG_MXC_IPU_CSI_ENC_MODULE)
			err = csi_enc_select(cam);	// select the encoding path, i.e. how captured data is handed up to the upper layers
#endif
		} else if (strcmp(mxc_capture_inputs[cam->current_input].name,
				  "CSI IC MEM") == 0) {
#if defined(CONFIG_MXC_IPU_PRP_ENC) || defined(CONFIG_MXC_IPU_PRP_ENC_MODULE)
			err = prp_enc_select(cam);
#endif
		}


		cam->enc_counter = 0;
		INIT_LIST_HEAD(&cam->ready_q);
		INIT_LIST_HEAD(&cam->working_q);
		INIT_LIST_HEAD(&cam->done_q);


		vidioc_int_g_ifparm(cam->sensor, &ifparm);


		csi_param.sens_clksrc = 0;


		csi_param.clk_mode = IPU_CSI_CLK_MODE_GATED_CLK;	// PCLK mode: gated clock, i.e. the CEA-style mode
		csi_param.data_pol = 0;		// data pin polarity: normal
		csi_param.ext_vsync = 1;	// use the external VSYNC and HSYNC
		csi_param.data_en_pol = 0;	// DATA_EN pin polarity: normal
		csi_param.pixclk_pol = ifparm.u.bt656.latch_clk_inv;	// PCLK pin polarity, taken from the slave's ifparm


		if (ifparm.u.bt656.mode
				== V4L2_IF_TYPE_BT656_MODE_NOBT_8BIT)
			csi_param.data_width = IPU_CSI_DATA_WIDTH_8;	// data width: 8 bits
		else if (ifparm.u.bt656.mode
				== V4L2_IF_TYPE_BT656_MODE_NOBT_10BIT)
			csi_param.data_width = IPU_CSI_DATA_WIDTH_10;
		else
			csi_param.data_width = IPU_CSI_DATA_WIDTH_8;

		csi_param.Vsync_pol = ifparm.u.bt656.nobt_vs_inv;	// VSYNC polarity (active high for our signal)
		csi_param.Hsync_pol = ifparm.u.bt656.nobt_hs_inv;	// HSYNC polarity (active low for our signal)


		csi_param.csi = cam->csi;	// use the IPU0 CSI0 interface


		cam_fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
		vidioc_int_g_fmt_cap(cam->sensor, &cam_fmt);	// get the format data from the gs2971 driver
		
		csi_param.data_fmt = cam_fmt.fmt.pix.pixelformat;

		/* This also is the max crop size for this device. */
		cam->crop_defrect.top = cam->crop_defrect.left = 0;
		cam->crop_defrect.width = cam_fmt.fmt.pix.width;
		cam->crop_defrect.height = cam_fmt.fmt.pix.height;


		/* At this point, this is also the current image size. */
		cam->crop_current.width = cam_fmt.fmt.pix.width;
		cam->crop_current.height = cam_fmt.fmt.pix.height;


		cam_fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
		vidioc_int_g_fmt_cap(cam->sensor, &cam_fmt);
		cam->crop_bounds.width = cam_fmt.fmt.win.w.width;
		cam->crop_bounds.height = cam_fmt.fmt.win.w.height; 
		cam->crop_current.left = cam_fmt.fmt.win.w.left;
		cam->crop_current.top = cam_fmt.fmt.win.w.top;


		pr_err("End of %s: v2f pix widthxheight %d x %d\n",
			__func__,
			cam->v2f.fmt.pix.width, cam->v2f.fmt.pix.height);
		pr_err("End of %s: crop_bounds widthxheight %d x %d\n",
			__func__,
			cam->crop_bounds.width, cam->crop_bounds.height);
		pr_err("End of %s: crop_defrect widthxheight %d x %d\n",
			__func__,
			cam->crop_defrect.width, cam->crop_defrect.height);
		pr_err("End of %s: crop_current widthxheight %d x %d\n",
			__func__,
			cam->crop_current.width, cam->crop_current.height);


		
		pr_err("On Open: Input to ipu size is %d x %d\n",
				cam_fmt.fmt.pix.width, cam_fmt.fmt.pix.height);
		ipu_csi_set_window_size(cam->ipu, cam->crop_current.width,	// set the active image size (the 1280x720 above)
					cam->crop_current.height,
					cam->csi);
		ipu_csi_set_window_pos(cam->ipu, cam->crop_current.left,	// set the image position (the 260 and 25 above)
					cam->crop_current.top,
					cam->csi);
		ipu_csi_init_interface(cam->ipu, cam->crop_bounds.width,	// set the raw frame size (the 1980x750 above)
					cam->crop_bounds.height,
					csi_param.data_fmt,
					csi_param);


		if (!cam->mclk_on[cam->mclk_source]) {
			ipu_csi_enable_mclk_if(cam->ipu, CSI_MCLK_I2C,
					       cam->mclk_source,
					       true, true);
			cam->mclk_on[cam->mclk_source] = true;
		}
		vidioc_int_s_power(cam->sensor, 1);
		vidioc_int_init(cam->sensor);
		vidioc_int_dev_init(cam->sensor);
	}


	file->private_data = dev;


      oops:
	up(&cam->busy_lock);
	return err;
}

One thing to point out:

vidioc_int_g_fmt_cap() fetches its data from the gs2971 driver, which contains the following ioctl table:

static struct v4l2_int_ioctl_desc gs2971_ioctl_desc[] = {

	{vidioc_int_dev_init_num, (v4l2_int_ioctl_func*)ioctl_dev_init},

	/*!
	 * Delinitialise the dev. at slave detach.
	 * The complement of ioctl_dev_init.
	 */
	{vidioc_int_dev_exit_num, (v4l2_int_ioctl_func *)ioctl_dev_exit}, 

	{vidioc_int_s_power_num, (v4l2_int_ioctl_func*)ioctl_s_power},
	{vidioc_int_g_ifparm_num, (v4l2_int_ioctl_func*)ioctl_g_ifparm},
/*	{vidioc_int_g_needs_reset_num,
				(v4l2_int_ioctl_func *)ioctl_g_needs_reset}, */
/*	{vidioc_int_reset_num, (v4l2_int_ioctl_func *)ioctl_reset}, */
	{vidioc_int_init_num, (v4l2_int_ioctl_func*)ioctl_init},

	/*!
	 * VIDIOC_ENUM_FMT ioctl for the CAPTURE buffer type.
	 */
	{vidioc_int_enum_fmt_cap_num,
				(v4l2_int_ioctl_func *)ioctl_enum_fmt_cap}, 

	/*!
	 * VIDIOC_TRY_FMT ioctl for the CAPTURE buffer type.
	 * This ioctl is used to negotiate the image capture size and
	 * pixel format without actually making it take effect.
	 */
/*	{vidioc_int_try_fmt_cap_num,
				(v4l2_int_ioctl_func *)ioctl_try_fmt_cap}, */

	{vidioc_int_g_fmt_cap_num, (v4l2_int_ioctl_func*)ioctl_g_fmt_cap},

	/*!
	 * If the requested format is supported, configures the HW to use that
	 * format, returns error code if format not supported or HW can't be
	 * correctly configured.
	 */
/*	{vidioc_int_s_fmt_cap_num, (v4l2_int_ioctl_func *)ioctl_s_fmt_cap}, */

	{vidioc_int_g_parm_num, (v4l2_int_ioctl_func*)ioctl_g_parm},
	{vidioc_int_s_parm_num, (v4l2_int_ioctl_func*)ioctl_s_parm},
	/*{vidioc_int_queryctrl_num, (v4l2_int_ioctl_func*)ioctl_queryctrl},*/
	{vidioc_int_g_ctrl_num, (v4l2_int_ioctl_func*)ioctl_g_ctrl},
	{vidioc_int_s_ctrl_num, (v4l2_int_ioctl_func*)ioctl_s_ctrl},
	{vidioc_int_enum_framesizes_num,
				(v4l2_int_ioctl_func *)ioctl_enum_framesizes},
	{vidioc_int_g_chip_ident_num,
				(v4l2_int_ioctl_func *)ioctl_g_chip_ident},
};

vidioc_int_g_fmt_cap() ends up calling ioctl_g_fmt_cap() from this table. The dispatch mechanism lives in \android\kernel_imx\include\media\v4l2-int-device.h; you can study it there if you need the details.
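Conceptually the dispatch works like the simplified sketch below; this is illustration only, not the literal kernel code, so check v4l2-int-device.h/.c for the real implementation. The framework looks up the requested command number in the slave's ioctls table and calls the registered handler:

/* Simplified sketch of the v4l2-int-device dispatch (illustration only). */
static int v4l2_int_call(struct v4l2_int_device *d, int cmd, void *arg)
{
	struct v4l2_int_slave *slave = d->u.slave;
	int i;

	/* Scan the slave's ioctl table for the matching command number. */
	for (i = 0; i < slave->num_ioctls; i++) {
		if (slave->ioctls[i].num == cmd)
			return ((int (*)(struct v4l2_int_device *, void *))
				slave->ioctls[i].func)(d, arg);
	}

	return -EINVAL;		/* the slave does not implement this ioctl */
}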

Now let us look at a few more important ioctls.

Enumerating the supported video formats:

	case VIDIOC_ENUM_FRAMESIZES: {
		struct v4l2_frmsizeenum *fsize = arg;
		pr_err("   case VIDIOC_ENUM_FRAMESIZES\n");
		if (cam->sensor)
			retval = vidioc_int_enum_framesizes(cam->sensor, fsize);	// returns the supported frame sizes; this is how the upper layers learn every format the lower layers support
		else {
			pr_err("ERROR: v4l2 capture: slave not found20!\n");
			retval = -ENODEV;
		}
		break;
	}
Setting parameters: configure the hardware before starting video capture, for example focusing the camera before taking a picture.
static int mxc_v4l2_s_param(cam_data *cam, struct v4l2_streamparm *parm)
{
	struct v4l2_ifparm ifparm;
	struct v4l2_format cam_fmt;
	struct v4l2_streamparm currentparm;
	ipu_csi_signal_cfg_t csi_param;
	u32 current_fps, parm_fps;
	int err = 0;

	pr_err("In mxc_v4l2_s_param\n");

	...
	....
	...

	/* This essentially loses the data at the left and bottom of the image
	 * giving a digital zoom image, if crop_current is less than the full
	 * size of the image. */
	 ipu_csi_set_window_size(cam->ipu, cam->crop_current.width,	// the gs2971 itself needs no configuration; we only program the incoming video format into the CPU
				cam->crop_current.height, cam->csi);
	ipu_csi_set_window_pos(cam->ipu, cam->crop_current.left,
			       cam->crop_current.top,
			       cam->csi);
	ipu_csi_init_interface(cam->ipu, cam->crop_bounds.width,
			       cam->crop_bounds.height,
			       csi_param.data_fmt, csi_param);

exit:
	if (cam->overlay_on == true)
		start_preview(cam);

	return err;
}
Configuring the format: this mainly sets the video output format. The camera captures UYVY images, and a format conversion may be needed when they are shown on the screen; the output format can be configured here.

static int mxc_v4l2_s_fmt(cam_data *cam, struct v4l2_format *f)
{
	int retval = 0;
	int size = 0;
	int bytesperline = 0;
	int *width, *height;

	pr_err("In MVC: mxc_v4l2_s_fmt\n");

	switch (f->type) {
	case V4L2_BUF_TYPE_VIDEO_CAPTURE:
		pr_err("   type=V4L2_BUF_TYPE_VIDEO_CAPTURE\n");
	...
		}
		break;
	case V4L2_BUF_TYPE_VIDEO_OVERLAY:
		pr_err("   type=V4L2_BUF_TYPE_VIDEO_OVERLAY\n");
		retval = verify_preview(cam, &f->fmt.win);
		cam->win = f->fmt.win;
		break;
	default:
		retval = -EINVAL;
	}

	pr_err("End of %s: v2f pix widthxheight %d x %d\n",
		 __func__,
		 cam->v2f.fmt.pix.width, cam->v2f.fmt.pix.height);
	pr_err("End of %s: crop_bounds widthxheight %d x %d\n",
		 __func__,
		 cam->crop_bounds.width, cam->crop_bounds.height);
	pr_err("End of %s: crop_defrect widthxheight %d x %d\n",
		 __func__,
		 cam->crop_defrect.width, cam->crop_defrect.height);
	pr_err("End of %s: crop_current widthxheight %d x %d\n",
		 __func__,
		 cam->crop_current.width, cam->crop_current.height);

	return retval;
}
Starting data capture:

static int mxc_streamon(cam_data *cam)
{
	struct mxc_v4l_frame *frame;
	unsigned long lock_flags;
	int err = 0;

	pr_err("In MVC:mxc_streamon\n");

	if (NULL == cam) {
		pr_err("ERROR! cam parameter is NULL\n");
		return -1;
	}

	if (cam->capture_on) {
		pr_err("ERROR: v4l2 capture: Capture stream has been turned "
		       " on\n");
		return -1;
	}

	if (list_empty(&cam->ready_q)) {
		pr_err("ERROR: v4l2 capture: mxc_streamon buffer has not been "
			"queued yet\n");
		return -EINVAL;
	}
	if (cam->enc_update_eba &&
		cam->ready_q.prev == cam->ready_q.next) {
		pr_err("ERROR: v4l2 capture: mxc_streamon buffer need ping pong "
			"at least two buffers\n");
		return -EINVAL;
	}

	cam->capture_pid = current->pid;

	if (cam->overlay_on == true)
		stop_preview(cam);

	if (cam->enc_enable) {
		err = cam->enc_enable(cam);	
		if (err != 0) {
			return err;
		}
	}

	spin_lock_irqsave(&cam->queue_int_lock, lock_flags);
	cam->ping_pong_csi = 0;
	cam->local_buf_num = 0;
	if (cam->enc_update_eba) {
		frame =
		    list_entry(cam->ready_q.next, struct mxc_v4l_frame, queue);	// move frames from the ready queue to the working queue
		list_del(cam->ready_q.next);
		list_add_tail(&frame->queue, &cam->working_q);
		frame->ipu_buf_num = cam->ping_pong_csi;
		err = cam->enc_update_eba(cam->ipu, frame->buffer.m.offset,
					  &cam->ping_pong_csi);

		frame =
		    list_entry(cam->ready_q.next, struct mxc_v4l_frame, queue);
		list_del(cam->ready_q.next);
		list_add_tail(&frame->queue, &cam->working_q);
		frame->ipu_buf_num = cam->ping_pong_csi;
		err |= cam->enc_update_eba(cam->ipu, frame->buffer.m.offset,
					   &cam->ping_pong_csi);
		spin_unlock_irqrestore(&cam->queue_int_lock, lock_flags);
	} else {
		spin_unlock_irqrestore(&cam->queue_int_lock, lock_flags);
		return -EINVAL;
	}

	if (cam->overlay_on == true)
		start_preview(cam);

	if (cam->enc_enable_csi) {
		err = cam->enc_enable_csi(cam);	// enable the CSI/DMA path so captured data starts being copied into the buffers
		if (err != 0)
			return err;
	}

	cam->capture_on = true;

	return err;
}
That is all for this article. It is only a quick walkthrough; to see how this code is actually used, it has to be read together with the HAL layer, which the next article will cover so that the whole path can be tied together.












