Translation of the Linux kernel document V4L2-framework.txt / V4L2 API Specification

http://blog.csdn.net/jmq_0000/article/details/7530575

Overview of the V4L2 driver framework
=====================================

This text documents the various structures provided by the V4L2
framework and their relationships.


Introduction
------------

The V4L2 drivers tend to be very complex due to the complexity of the
hardware: most devices have multiple ICs, export multiple device nodes
in /dev, and also create non-V4L2 devices such as DVB, ALSA, FB, I2C
and input (IR) devices.

Especially the fact that V4L2 drivers have to setup supporting ICs to
do audio/video muxing/encoding/decoding makes it more complex than
most. Usually these ICs are connected to the main bridge driver through
one or more I2C busses, but other busses can also be used. Such devices
are called 'sub-devices'.

For a long time the framework was limited to the video_device struct
for creating V4L device nodes and video_buf for handling the video
buffers (note that this document does not discuss the video_buf
framework).

This meant that all drivers had to do the setup of device instances and
connecting to sub-devices themselves. Part of this is quite complicated
to do right and many drivers never did do it correctly.

There is also a lot of common code that could never be refactored due
to the lack of a framework.

So this framework sets up the basic building blocks that all drivers
need and this same framework should make it much easier to refactor
common code into utility functions shared by all drivers.


Structure of a driver
---------------------

All drivers have the following structure:

1) An instance struct for each device containing the device state.

2) A way of initializing and commanding sub-devices (if any).

3) Creating V4L2 device nodes (/dev/videoX, /dev/vbiX and /dev/radioX)
   and keeping track of device-node specific data.

4) Filehandle-specific structs containing per-filehandle data;

5) video buffer handling.








This is a rough schematic of how it all relates:

    device instances
      |
      +-sub-device instances
      |
      \-V4L2 device nodes
	  |
	  \-filehandle instances


Structure of the framework
--------------------------
The framework closely resembles the driver structure: it has a
v4l2_device struct for the device instance data, a v4l2_subdev struct
to refer to sub-device instances, a video_device struct to store the
V4L2 device node data, and in the future a v4l2_fh struct will keep
track of filehandle instances (this is not yet implemented).

The V4L2 framework also optionally integrates with the media framework.
If a driver sets the struct v4l2_device mdev field, sub-devices and
video nodes will automatically appear in the media framework as
entities.

struct v4l2_device
------------------
Each device instance is represented by a struct v4l2_device
(v4l2-device.h). Very simple devices can just allocate this struct, but
most of the time you would embed this struct in a larger struct.

You must register the device instance:
	v4l2_device_register(struct device *dev, struct v4l2_device *v4l2_dev);

Registration will initialize the v4l2_device struct. If the
dev->driver_data field is NULL, it will be linked to v4l2_dev.

Drivers that want integration with the media device framework need to
set dev->driver_data manually to point to the driver-specific device
structure that embeds the struct v4l2_device instance. This is achieved
by a dev_set_drvdata() call before registering the V4L2 device
instance. They must also set the struct v4l2_device mdev field to point
to a properly initialized and registered media_device instance.

If v4l2_dev->name is empty then it will be set to a value derived from
dev (driver name followed by the bus_id, to be precise). If you set it
up before calling v4l2_device_register then it will be untouched. If
dev is NULL, then you *must* setup v4l2_dev->name before calling
v4l2_device_register.

You can use v4l2_device_set_name() to set the name based on a driver
name and a driver-global atomic_t instance. This will generate names
like ivtv0, ivtv1, etc. If the name ends with a digit, then it will
insert a dash: cx18-0, cx18-1, etc. This function returns the instance
number.

The first 'dev' argument is normally the struct device pointer of a
pci_dev. It is rare for dev to be NULL, but it happens with ISA devices
or when one device creates multiple PCI devices, thus making it
impossible to associate v4l2_dev with a particular parent.

You can also supply a notify() callback that can be called by
sub-devices to notify you of events. Whether you need to set this
depends on the sub-device. Any notifications a sub-device supports must
be defined in a header in include/media/<subdevice>.h.

You unregister with:

	v4l2_device_unregister(struct v4l2_device *v4l2_dev);

If the dev->driver_data field points to v4l2_dev, it will be reset to
NULL. Unregistering will also automatically unregister all subdevs from
the device.

If you have a hotpluggable device (e.g. a USB device), then when a
disconnect happens the parent device becomes invalid. Since v4l2_device
has a pointer to that parent device it has to be cleared as well to
mark that the parent is gone. To do this call:

	v4l2_device_disconnect(struct v4l2_device *v4l2_dev);

This does *not* unregister the subdevs, so you still need to call the
v4l2_device_unregister() function for that. If your driver is not
hotpluggable, then there is no need to call v4l2_device_disconnect().

Sometimes you need to iterate over all devices registered by a specific
driver. This is usually the case if multiple device drivers use the
same hardware. E.g. the ivtvfb driver is a framebuffer driver that uses
the ivtv hardware. The same is true for ALSA drivers for example.

You can iterate over all registered devices as follows:

static int callback(struct device *dev, void *p)
{
    struct v4l2_device *v4l2_dev = dev_get_drvdata(dev);

    /* test if this device was inited */
    if (v4l2_dev == NULL)
        return 0;
    ...
    return 0;
}

int iterate(void *p)
{
    struct device_driver *drv;
    int err;

    /* Find driver 'ivtv' on the PCI bus.
       pci_bus_type is a global. For USB busses use usb_bus_type. */
    drv = driver_find("ivtv", &pci_bus_type);
    /* iterate over all ivtv device instances */
    err = driver_for_each_device(drv, NULL, p, callback);
    put_driver(drv);
    return err;
}

Sometimes you need to keep a running counter of the device instance.
This is commonly used to map a device instance to an index of a module
option array.

The recommended approach is as follows:

static atomic_t drv_instance = ATOMIC_INIT(0);

static int __devinit drv_probe(struct pci_dev *pdev,
                const struct pci_device_id *pci_id)
{
    ...
    state->instance = atomic_inc_return(&drv_instance) - 1;
}
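The same idiom can be exercised outside the kernel with C11 atomics.
The following is a hypothetical userspace sketch (the names
drv_instance and next_instance are illustrative, not kernel API):

```c
#include <stdatomic.h>

/* Userspace sketch of the instance-counter idiom shown above:
 * atomic_inc_return(&drv_instance) - 1 hands out 0, 1, 2, ... even
 * when probe() runs concurrently for several devices. */
static atomic_int drv_instance = 0;

/* Returns a unique, zero-based instance number per call. */
int next_instance(void)
{
	/* atomic_fetch_add returns the *old* value, which is exactly
	 * the kernel's atomic_inc_return(...) - 1. */
	return atomic_fetch_add(&drv_instance, 1);
}
```

The fetch-and-add is what makes the numbering safe: two concurrent
probes can never observe the same value.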

If you have multiple device nodes then it can be hard to know when it
is safe to unregister v4l2_device. For this purpose v4l2_device has
refcounting support. The refcount is increased whenever
video_register_device is called and it is decreased whenever a device
node is released. When the refcount reaches zero, then the v4l2_device
release() callback is called. You can do your final cleanup there.

If other device nodes (e.g. ALSA) are created, then you can increase
and decrease the refcount manually as well by calling:

void v4l2_device_get(struct v4l2_device *v4l2_dev);

or:

int v4l2_device_put(struct v4l2_device *v4l2_dev);
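As a rough illustration of this refcounting scheme, here is a
hypothetical userspace sketch; struct mydev and its helpers are
invented stand-ins for v4l2_device and the real get/put helpers, not
the actual implementation:

```c
#include <stdatomic.h>
#include <stdbool.h>

/* Userspace sketch of the v4l2_device refcounting idea: get() bumps
 * the count, put() drops it and fires release() when it hits zero. */
struct mydev {
	atomic_int ref;
	bool released;		/* set by release() for demonstration */
};

static void mydev_release(struct mydev *dev)
{
	dev->released = true;	/* final cleanup would go here */
}

void mydev_get(struct mydev *dev)
{
	atomic_fetch_add(&dev->ref, 1);
}

void mydev_put(struct mydev *dev)
{
	/* fetch_sub returns the old value: old == 1 means this was
	 * the last reference. */
	if (atomic_fetch_sub(&dev->ref, 1) == 1)
		mydev_release(dev);
}
```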

struct v4l2_subdev
------------------

Many drivers need to communicate with sub-devices. These devices can do
all sorts of tasks, but most commonly they handle audio and/or video
muxing, encoding or decoding. For webcams common sub-devices are
sensors and camera controllers.

Usually these are I2C devices, but not necessarily. In order to provide
the driver with a consistent interface to these sub-devices the
v4l2_subdev struct (v4l2-subdev.h) was created.

Each sub-device driver must have a v4l2_subdev struct. This struct can
be stand-alone for simple sub-devices or it might be embedded in a
larger struct if more state information needs to be stored. Usually
there is a low-level device struct (e.g. i2c_client) that contains the
device data as setup by the kernel.









It is recommended to store that pointer in the private data of
v4l2_subdev using v4l2_set_subdevdata(). That makes it easy to go from
a v4l2_subdev to the actual low-level bus-specific device data.

You also need a way to go from the low-level struct to v4l2_subdev. For
the common i2c_client struct the i2c_set_clientdata() call is used to
store a v4l2_subdev pointer, for other busses you may have to use other
methods.

Bridges might also need to store per-subdev private data, such as a
pointer to bridge-specific per-subdev private data. The v4l2_subdev
structure provides host private data for that purpose that can be
accessed with v4l2_get_subdev_hostdata() and
v4l2_set_subdev_hostdata().

From the bridge driver perspective you load the sub-device module and
somehow obtain the v4l2_subdev pointer. For i2c devices this is easy:
you call i2c_get_clientdata(). For other busses something similar needs
to be done. Helper functions exist for sub-devices on an I2C bus that
do most of this tricky work for you.

Each v4l2_subdev contains function pointers that sub-device drivers can
implement (or leave NULL if it is not applicable). Since sub-devices
can do so many different things and you do not want to end up with a
huge ops struct of which only a handful of ops are usually implemented,
the function pointers are sorted according to category and each
category has its own ops struct. The top-level ops struct contains
pointers to the category ops structs, which may be NULL if the subdev
driver does not support anything from that category.

It looks like this:

struct v4l2_subdev_core_ops {
    int (*g_chip_ident)(struct v4l2_subdev *sd, struct v4l2_dbg_chip_ident *chip);
    int (*log_status)(struct v4l2_subdev *sd);
    int (*init)(struct v4l2_subdev *sd, u32 val);
    ...
};

struct v4l2_subdev_tuner_ops {
    ...
};

struct v4l2_subdev_audio_ops {
    ...
};

struct v4l2_subdev_video_ops {
    ...
};

struct v4l2_subdev_ops {
    const struct v4l2_subdev_core_ops  *core;
    const struct v4l2_subdev_tuner_ops *tuner;
    const struct v4l2_subdev_audio_ops *audio;
    const struct v4l2_subdev_video_ops *video;
};

The core ops are common to all subdevs, the other categories are
implemented depending on the sub-device. E.g. a video device is
unlikely to support the audio ops and vice versa.

This setup limits the number of function pointers while still making it
easy to add new ops and categories.

A sub-device driver initializes the v4l2_subdev struct using:

	v4l2_subdev_init(sd, &ops);

Afterwards you need to initialize subdev->name with a unique name and
set the module owner. This is done for you if you use the i2c helper
functions.

If integration with the media framework is needed, you must initialize
the media_entity struct embedded in the v4l2_subdev struct (entity
field) by calling media_entity_init():

	struct media_pad *pads = &my_sd->pads;
	int err;

	err = media_entity_init(&sd->entity, npads, pads, 0);

The pads array must have been previously initialized. There is no need
to manually set the struct media_entity type and name fields, but the
revision field must be initialized if needed.










A reference to the entity will be automatically acquired/released when
the subdev device node (if any) is opened/closed.

Don't forget to cleanup the media entity before the sub-device is
destroyed:

	media_entity_cleanup(&sd->entity);

The bridge driver also needs to register the v4l2_subdev with the
v4l2_device:

	int err = v4l2_device_register_subdev(v4l2_dev, sd);

This can fail if the subdev module disappeared before it could be
registered. After this function was called successfully the
subdev->v4l2_dev field points to the v4l2_device.

If the parent device of the v4l2_device has a non-NULL mdev field, the
sub-device entity will be automatically registered with the media
device as well.

You can unregister a sub-device using:

	v4l2_device_unregister_subdev(sd);

Afterwards the subdev module can be unloaded and sd->v4l2_dev == NULL.

You can call an ops function directly:

	err = sd->ops->core->g_chip_ident(sd, &chip);

but it is better and easier to use this macro:

	err = v4l2_subdev_call(sd, core, g_chip_ident, &chip);

The macro will do the right NULL pointer checks and return -ENODEV if
subdev is NULL, -ENOIOCTLCMD if either subdev->core or
subdev->core->g_chip_ident is NULL, or the actual result of the
subdev->ops->core->g_chip_ident ops.
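The NULL-checking behaviour of such a call macro can be sketched in
plain C. Everything below (toy_subdev, toy_subdev_call, log_status) is
an invented stand-in, not the real v4l2_subdev_call implementation;
ENOIOCTLCMD is a kernel-internal errno (515) that userspace headers do
not define, so it is defined locally:

```c
#include <errno.h>
#include <stddef.h>

#define ENOIOCTLCMD 515	/* kernel-internal value, absent from userspace */

struct toy_subdev;

struct toy_core_ops {
	int (*log_status)(struct toy_subdev *sd);
};

struct toy_ops {
	const struct toy_core_ops *core;
};

struct toy_subdev {
	const struct toy_ops *ops;
};

/* NULL-check the subdev, the category and the op before calling,
 * mirroring the error contract described in the text above. */
#define toy_subdev_call(sd, cat, op)                                   \
	(!(sd) ? -ENODEV :                                             \
	 (!(sd)->ops->cat || !(sd)->ops->cat->op ? -ENOIOCTLCMD :      \
	  (sd)->ops->cat->op(sd)))

static int my_log_status(struct toy_subdev *sd)
{
	(void)sd;
	return 0;	/* pretend the status was logged */
}

static const struct toy_core_ops core_ops = { .log_status = my_log_status };
static const struct toy_ops ops_with_core = { .core = &core_ops };
static const struct toy_ops ops_without_core = { .core = NULL };
```

The conditional operator short-circuits, so for a NULL subdev the ops
pointers are never dereferenced.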

It is also possible to call all or a subset of the sub-devices:

	v4l2_device_call_all(v4l2_dev, 0, core, g_chip_ident, &chip);

Any subdev that does not support this ops is skipped and error results
are ignored. If you want to check for errors use this:

	err = v4l2_device_call_until_err(v4l2_dev, 0, core, g_chip_ident, &chip);

Any error except -ENOIOCTLCMD will exit the loop with that error. If no
errors (except -ENOIOCTLCMD) occurred, then 0 is returned.

这两个调用的第二个参数是一组ID。如果为0,然后所有subdevs被调用。如果不为零,









那么只有那些其组ID匹配值将被调用。前桥驱动注册一个subdev它可以设置SD-> grp_id

任何想要(这是默认为0)。这个值是由桥式驱动器和拥有子设备驱动程序将不会修改或使用它。

这组ID给桥驱动器控制如何调用回调。例如,可能有多个音频芯片板,每一个有改变音量。但通常只

有一个实际上将被用来当用户想改变音量。


你可以设置组ID,subdev例如AUDIO_CONTROLLER和指定组ID值时调用v4l2_device_call_all()。

以确保它只会去的subdev需要。如果子设备需要通知其事件v4l2_device父母,那么它可以调用

v4l2_subdev_notify(SD,notification,ARG)。


这个宏检查是否有一个notify()的回调定义,如果没有返回-ENODEV。否则结果的通知()调用返回。

使用v4l2_subdev优势是它是一个通用结构,不包含任何对底层硬件知识。因此驱动程序可能包含几个subdevs使用I2C总线,

但也通过GPIO引脚控制subdev。这种区别是设置设备时只有关,但一旦subdev注册是完全透明的。


V4L2 sub-device userspace API
-----------------------------

Beside exposing a kernel API through the v4l2_subdev_ops structure,
V4L2 sub-devices can also be controlled directly by userspace
applications.

Device nodes named v4l-subdevX can be created in /dev to access
sub-devices directly. If a sub-device supports direct userspace
configuration it must set the V4L2_SUBDEV_FL_HAS_DEVNODE flag before
being registered.

After registering sub-devices, the v4l2_device driver can create device
nodes for all registered sub-devices marked with
V4L2_SUBDEV_FL_HAS_DEVNODE by calling
v4l2_device_register_subdev_nodes(). Those device nodes will be
automatically removed when sub-devices are unregistered.

The device node handles a subset of the V4L2 API:

VIDIOC_QUERYCTRL
VIDIOC_QUERYMENU
VIDIOC_G_CTRL
VIDIOC_S_CTRL
VIDIOC_G_EXT_CTRLS
VIDIOC_S_EXT_CTRLS
VIDIOC_TRY_EXT_CTRLS

The controls ioctls are identical to the ones defined in V4L2. They
behave identically, with the only exception that they deal only with
controls implemented in the sub-device. Depending on the driver, those
controls can also be accessed through one (or several) V4L2 device
nodes.

VIDIOC_DQEVENT
VIDIOC_SUBSCRIBE_EVENT
VIDIOC_UNSUBSCRIBE_EVENT

The events ioctls are identical to the ones defined in V4L2. They
behave identically, with the only exception that they deal only with
events generated by the sub-device. Depending on the driver, those
events can also be reported by one (or several) V4L2 device nodes.


Sub-device drivers that want to use events need to set the
V4L2_SUBDEV_USES_EVENTS v4l2_subdev::flags and initialize
v4l2_subdev::nevents to the events queue depth before registering the
sub-device. After registration events can be queued as usual on the
v4l2_subdev::devnode device node.

To properly support events, the poll() file operation is also
implemented.

Private ioctls: all ioctls not in the above list are passed directly to
the sub-device driver through the core::ioctl operation.


I2C sub-device drivers
----------------------

Since these drivers are so common, special helper functions are
available to ease the use of these drivers (v4l2-common.h).

The recommended method of adding v4l2_subdev support to an I2C driver
is to embed the v4l2_subdev struct into the state struct that is
created for each I2C device instance. Very simple devices have no state
struct and in that case you can just create a v4l2_subdev directly.

A typical state struct would look like this (where 'chipname' is
replaced by the name of the chip):

struct chipname_state {
	struct v4l2_subdev sd;
	...  /* additional state fields */
};

Initialize the v4l2_subdev struct as follows:

	v4l2_i2c_subdev_init(&state->sd, client, subdev_ops);

This function will fill in all the fields of v4l2_subdev and ensure
that the v4l2_subdev and i2c_client both point to one another.

You should also add a helper inline function to go from a v4l2_subdev
pointer to a chipname_state struct:


static inline struct chipname_state *to_state(struct v4l2_subdev *sd)
{
    return container_of(sd, struct chipname_state, sd);
}

Use this to go from the v4l2_subdev struct to the i2c_client struct:

    struct i2c_client *client = v4l2_get_subdevdata(sd);

And this to go from an i2c_client to a v4l2_subdev struct:

    struct v4l2_subdev *sd = i2c_get_clientdata(client);
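The embedding pattern can be demonstrated with a self-contained
userspace sketch; toy_subdev stands in for struct v4l2_subdev, and
container_of is reimplemented locally since linux/kernel.h is not
available in userspace:

```c
#include <stddef.h>

/* Minimal container_of: walk back from a member pointer to the
 * struct that embeds it. */
#define container_of(ptr, type, member) \
	((type *)((char *)(ptr) - offsetof(type, member)))

struct toy_subdev {
	int dummy;
};

/* Mirrors the chipname_state layout shown in the text above. */
struct chipname_state {
	struct toy_subdev sd;
	int gain;		/* additional state field */
};

static inline struct chipname_state *to_state(struct toy_subdev *sd)
{
	return container_of(sd, struct chipname_state, sd);
}
```

Given only the embedded sd pointer (which is all the framework hands
back in callbacks), to_state() recovers the full driver state.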

Make sure to call v4l2_device_unregister_subdev(sd) when the remove()
callback is called. This will unregister the sub-device from the bridge
driver. It is safe to call this even if the sub-device was never
registered.

You need to do this because when the bridge driver destroys the i2c
adapter the remove() callbacks are called for the i2c devices on that
adapter. After that the corresponding v4l2_subdev structures are
invalid, so they have to be unregistered first. Calling
v4l2_device_unregister_subdev(sd) from the remove() callback ensures
that this is always done correctly.



The bridge driver also has some helper functions it can use:

struct v4l2_subdev *sd = v4l2_i2c_new_subdev(v4l2_dev, adapter,
	       "module_foo", "chipid", 0x36, NULL);

This loads the given module (can be NULL if no module needs to be
loaded) and calls i2c_new_device() with the given i2c_adapter and
chip/address arguments. If all goes well, then it registers the subdev
with the v4l2_device.

You can also use the last argument of v4l2_i2c_new_subdev() to pass an
array of possible I2C addresses that it should probe. These probe
addresses are only used if the previous argument is 0. A non-zero
argument means that you know the exact i2c address so in that case no
probing will take place.


Both functions return NULL if something went wrong.

Note that the chipid you pass to v4l2_i2c_new_subdev() is usually the
same as the module name. It allows you to specify a chip variant, e.g.
"saa7114" or "saa7115". In general though the i2c driver autodetects
this. The use of chipid is something that needs to be looked at more
closely at a later date. It differs between i2c drivers and as such can
be confusing. To see which chip variants are supported you can look in
the i2c driver code for the i2c_device_id table. This lists all the
possibilities.

There are two more helper functions:

v4l2_i2c_new_subdev_cfg: this function adds new irq and platform_data
arguments and has both 'addr' and 'probed_addrs' arguments: if addr is not
0 then that will be used (non-probing variant), otherwise the probed_addrs
are probed.

For example: this will probe for address 0x10:

struct v4l2_subdev *sd = v4l2_i2c_new_subdev_cfg(v4l2_dev, adapter,
           "module_foo", "chipid", 0, NULL, 0, I2C_ADDRS(0x10));

v4l2_i2c_new_subdev_board uses an i2c_board_info struct which is passed
to the i2c driver and replaces the irq, platform_data and addr arguments.

If the subdev supports the s_config core ops, then that op is called with
the irq and platform_data arguments after the subdev was setup. The older
v4l2_i2c_new_(probed_)subdev functions will call s_config as well, but with
irq set to 0 and platform_data set to NULL.


struct video_device
-------------------

The actual device nodes in the /dev directory are created using the
video_device struct (v4l2-dev.h). This struct can either be allocated
dynamically or embedded in a larger struct.

To allocate it dynamically use:

	struct video_device *vdev = video_device_alloc();

	if (vdev == NULL)
		return -ENOMEM;

	vdev->release = video_device_release;

If you embed it in a larger struct, then you must set the release()
callback to your own function:

	struct video_device *vdev = &my_vdev->vdev;

	vdev->release = my_vdev_release;

The release callback must be set and is called when the last user of
the video device exits. The default video_device_release() callback
just calls kfree to free the allocated memory.

You should also set these fields:

- v4l2_dev: set to the v4l2_device parent device.
- name: set to something descriptive and unique.
- fops: set to the v4l2_file_operations struct.
- ioctl_ops: if you use the v4l2_ioctl_ops to simplify ioctl maintenance
  (highly recommended to use this and it might become compulsory in the
  future!), then set this to your v4l2_ioctl_ops struct.
- lock: leave to NULL if you want to do all the locking in the driver.
  Otherwise you give it a pointer to a struct mutex and before any of
  the v4l2_file_operations is called this lock will be taken by the
  core and released afterwards.
- prio: keeps track of the priorities. Used to implement VIDIOC_G/S_PRIORITY.
  If left to NULL, then it will use the struct v4l2_prio_state in v4l2_device.
  If you want to have a separate priority state per (group of) device node(s),
  then you can point it to your own struct v4l2_prio_state.
- parent: you only set this if v4l2_device was registered with NULL as
  the parent device struct. This only happens in cases where one hardware
  device has multiple PCI devices that all share the same v4l2_device core.

  The cx88 driver is an example of this: one core v4l2_device struct,
  but it is used by both a raw video PCI device (cx8800) and a MPEG PCI
  device (cx8802). Since the v4l2_device cannot be associated with a
  particular PCI device it is setup without a parent device. But when
  the struct video_device is setup you do know which parent PCI device
  to use.
- flags: optional. Set to V4L2_FL_USE_FH_PRIO if you want to let the framework
  handle the VIDIOC_G/S_PRIORITY ioctls. This requires that you use struct
  v4l2_fh. Eventually this flag will disappear once all drivers use the core
  priority handling. But for now it has to be set explicitly.


If you use v4l2_ioctl_ops, then you should set .unlocked_ioctl to
video_ioctl2 in your v4l2_file_operations struct.

Do not use .ioctl! It is deprecated and will go away in the future.

The v4l2_file_operations struct is a subset of file_operations. The
main difference is that the inode argument is ignored since it is never
used.

If integration with the media framework is needed, you must initialize
the media_entity struct embedded in the video_device struct (entity
field) by calling media_entity_init():

	struct media_pad *pad = &my_vdev->pad;
	int err;

	err = media_entity_init(&vdev->entity, 1, pad, 0);

The pads array must have been previously initialized. There is no need
to manually set the struct media_entity type and name fields.

A reference to the entity will be automatically acquired/released when the
video device is opened/closed.


v4l2_file_operations and locking
--------------------------------

You can set a pointer to a struct mutex in the video_device struct.
Usually this will be either a top-level mutex or a mutex per device
node. If you want finer-grained locking then you have to set it to NULL
and do your own locking.

If a lock is specified then all file operations will be serialized on
that lock. If you use videobuf then you must pass the same lock to the
videobuf queue initialize function: if videobuf has to wait for a frame
to arrive, then it will temporarily unlock the lock and relock it
afterwards. If your driver also waits in the code, then you should do
the same to allow other processes to access the device node while the
first process is waiting for something.

The implementation of a hotplug disconnect should also take the lock
before calling v4l2_device_disconnect.


video_device registration
-------------------------

Next you register the video device: this will create the character
device for you.

	err = video_register_device(vdev, VFL_TYPE_GRABBER, -1);
	if (err) {
		video_device_release(vdev); /* or kfree(my_vdev); */
		return err;
	}

If the v4l2_device parent device has a non-NULL mdev field, the video
device entity will be automatically registered with the media device.

Which device is registered depends on the type argument. The following
types exist:

VFL_TYPE_GRABBER: videoX for video input/output devices
VFL_TYPE_VBI: vbiX for vertical blank data (i.e. closed captions,
	teletext)
VFL_TYPE_RADIO: radioX for radio tuners
VFL_TYPE_VTX: vtxX for teletext devices (deprecated, don't use)

The last argument gives you a certain amount of control over the device
node number used (i.e. the X in videoX). Normally you will pass -1 to
let the v4l2 framework pick the first free number. But sometimes users
want to select a specific node number. It is common that drivers allow
the user to select a specific device node number through a driver
module option. That number is then passed to this function and
video_register_device will attempt to select that device node number.
If that number was already in use, then the next free device node
number will be selected and it will send a warning to the kernel log.

Another use-case is if a driver creates many devices. In that case it
can be useful to place different video devices in separate ranges. For
example, video capture devices start at 0, video output devices start
at 16. So you can use the last argument to specify a minimum device
node number and the v4l2 framework will try to pick the first free
number that is equal or higher to what you passed. If that fails, then
it will just pick the first free number.

Since in this case you do not care about a warning about not being able
to select the specified device node number, you can call the function
video_register_device_no_warn() instead.

Whenever a device node is created some attributes are also created for
you. If you look in /sys/class/video4linux you see the devices. Go into
e.g. video0 and you will see 'name' and 'index' attributes. The 'name'
attribute is the 'name' field of the video_device struct. The 'index'
attribute is the device node index: for each call to
video_register_device() the index is just increased by 1. The first
video device node you register always starts with index 0.

Users can setup udev rules that utilize the index attribute to make
fancy device names (e.g. 'mpegX' for MPEG video capture device nodes).

After the device was successfully registered, then you can use these
fields:

- vfl_type: the device type passed to video_register_device.
- minor: the assigned device minor number.
- num: the device node number (i.e. the X in videoX).
- index: the device index number.

If the registration failed, then you need to call
video_device_release() to free the allocated video_device struct, or
free your own struct if the video_device was embedded in it. The
vdev->release() callback will never be called if the registration
failed, nor should you ever attempt to unregister the device if the
registration failed.
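The node-number selection policy described above can be sketched as a
plain C function; used[] and MAX_NODES are toy stand-ins for the
framework's internal bookkeeping, not actual V4L2 code:

```c
#include <stdbool.h>

#define MAX_NODES 64

/* Try to honour a requested minimum node number; fall back to the
 * first free number overall, mirroring what video_register_device()
 * is described to do. Returns -1 if no node number is free. */
static int pick_node(const bool used[MAX_NODES], int minimum)
{
	int i;

	/* First pass: first free number at or above the minimum. */
	for (i = minimum; i < MAX_NODES; i++)
		if (!used[i])
			return i;
	/* Fall back: first free number anywhere. */
	for (i = 0; i < minimum; i++)
		if (!used[i])
			return i;
	return -1;
}
```

Passing minimum = 0 gives the default "first free number" behaviour of
the -1 argument; a driver reserving a range would pass e.g. 16.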


video_device cleanup
--------------------

When the video device nodes have to be removed, either during the
unload of the driver or because the USB device was disconnected, then
you should unregister them:

	video_unregister_device(vdev);

This will remove the device nodes from sysfs (causing udev to remove them
from /dev).

After video_unregister_device() returns no new opens can be done. However,
in the case of USB devices some application might still have one of these
device nodes open. So after the unregister all file operations (except
release, of course) will return an error as well.

When the last user of the video device node exits, then the
vdev->release() callback is called and you can do the final cleanup
there.

Don't forget to cleanup the media entity associated with the video
device if it has been initialized:

	media_entity_cleanup(&vdev->entity);

This can be done from the release callback.


video_device helper functions
-----------------------------

There are a few useful helper functions:

- file/video_device private data

You can set/get driver private data in the video_device struct using:

void *video_get_drvdata(struct video_device *vdev);
void video_set_drvdata(struct video_device *vdev, void *data);

Note that you can safely call video_set_drvdata() before calling
video_register_device().

And this function:

struct video_device *video_devdata(struct file *file);

returns the video_device belonging to the file struct.

The video_drvdata function combines video_get_drvdata with video_devdata:

void *video_drvdata(struct file *file);

You can go from a video_device struct to the v4l2_device struct using:

struct v4l2_device *v4l2_dev = vdev->v4l2_dev;

- Device node name

The video_device node kernel name can be retrieved using

const char *video_device_node_name(struct video_device *vdev);


The device node name can be used as a hint by userspace tools such as
udev. The function should be used instead of accessing the
video_device::num and video_device::minor fields.


video buffer helper functions
-----------------------------

The v4l2 core API provides a set of standard methods (called "videobuf")
for dealing with video buffers. Those methods allow a driver to implement
read(), mmap() and overlay() in a consistent way.  There are currently
methods for using video buffers on devices that supports DMA with
scatter/gather method (videobuf-dma-sg), DMA with linear access
(videobuf-dma-contig), and vmalloced buffers, mostly used on USB drivers
(videobuf-vmalloc).

Please see Documentation/video4linux/videobuf for more information on how
to use the videobuf layer.


struct v4l2_fh
--------------

struct v4l2_fh provides a way to easily keep file handle specific data
that is used by the V4L2 framework. New drivers must use struct v4l2_fh
since it is also used to implement priority handling
(VIDIOC_G/S_PRIORITY) if the video_device flag V4L2_FL_USE_FH_PRIO is
also set.

The users of v4l2_fh (in the V4L2 framework, not the driver) know
whether a driver uses v4l2_fh as its file->private_data pointer by
testing the V4L2_FL_USES_V4L2_FH bit in video_device->flags. This bit
is set whenever v4l2_fh_init() is called.

struct v4l2_fh is allocated as a part of the driver's own file handle
structure and file->private_data is set to it in the driver's open
function by the driver.

In many cases the struct v4l2_fh will be embedded in a larger structure.
In that case you should call v4l2_fh_init+v4l2_fh_add in open() and
v4l2_fh_del+v4l2_fh_exit in release().

Drivers can extract their own file handle structure by using the container_of
macro. Example:

struct my_fh {
    int blah;
    struct v4l2_fh fh;
};

...

int my_open(struct file *file)
{
    struct my_fh *my_fh;
    struct video_device *vfd;
    int ret;

    ...

    my_fh = kzalloc(sizeof(*my_fh), GFP_KERNEL);

    ...

    ret = v4l2_fh_init(&my_fh->fh, vfd);
    if (ret) {
        kfree(my_fh);
        return ret;
    }

    ...

    file->private_data = &my_fh->fh;
    v4l2_fh_add(&my_fh->fh);
    return 0;
}

int my_release(struct file *file)
{
    struct v4l2_fh *fh = file->private_data;
    struct my_fh *my_fh = container_of(fh, struct my_fh, fh);

    ...
    v4l2_fh_del(&my_fh->fh);
    v4l2_fh_exit(&my_fh->fh);
    kfree(my_fh);
    return 0;
}

Below is a short description of the v4l2_fh functions used:

int v4l2_fh_init(struct v4l2_fh *fh, struct video_device *vdev)

  Initialise the file handle. This *MUST* be performed in the driver's
  v4l2_file_operations->open() handler.

void v4l2_fh_add(struct v4l2_fh *fh)

  Add the v4l2_fh to the video_device file handle list. Must be called
  once the file handle is completely initialized.

void v4l2_fh_del(struct v4l2_fh *fh)

  Unassociate the file handle from video_device(). The file handle
  exit function may now be called.

void v4l2_fh_exit(struct v4l2_fh *fh)

  Uninitialise the file handle. After uninitialisation the v4l2_fh
  memory can be freed.

Several drivers need to do something when the first file handle is
opened and when the last file handle closes. Two helper functions were
added to check whether the v4l2_fh struct is the only open filehandle
of the associated device node:

int v4l2_fh_is_singular(struct v4l2_fh *fh)

  Returns 1 if the file handle is the only open file handle, else 0.

int v4l2_fh_is_singular_file(struct file *filp)

  Same, but it calls v4l2_fh_is_singular with filp->private_data.

V4L2 events
-----------

The V4L2 events provide a generic way to pass events to user space.
The driver must use v4l2_fh to be able to support V4L2 events.

Useful functions:

- v4l2_event_alloc()

  To use events, the driver must allocate events for the file handle.
  By calling the function more than once, the driver may assure that at
  least n events in total have been allocated. The function may not be
  called in atomic context.

- v4l2_event_queue()

  Queue events to video device. The driver's only responsibility is to
  fill in the type and the data fields. The other fields will be filled
  in by V4L2.

- v4l2_event_subscribe()

  The video_device->ioctl_ops->vidioc_subscribe_event must check the
  driver is able to produce events with specified event id. Then it
  calls v4l2_event_subscribe() to subscribe the event.

- v4l2_event_unsubscribe()

  vidioc_unsubscribe_event in struct v4l2_ioctl_ops. A driver may use
  v4l2_event_unsubscribe() directly unless it wants to be involved in
  the unsubscription process.

  The special type V4L2_EVENT_ALL may be used to unsubscribe all
  events. The drivers may want to handle this in a special way.

- v4l2_event_pending()

  Returns the number of pending events. Useful when implementing poll.

Events are delivered to user space through the poll system call. The driver
can use v4l2_fh->events->wait wait_queue_head_t as the argument for
poll_wait().

There are standard and private events. New standard events must use the
smallest available event type. The drivers must allocate their events from
their own class starting from class base. Class base is
V4L2_EVENT_PRIVATE_START + n * 1000 where n is the lowest available number.
The first event type in the class is reserved for future use, so the first
available event type is 'class base + 1'.
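The numbering rule above can be written down directly;
V4L2_EVENT_PRIVATE_START below matches the value in videodev2.h, while
the helper names are illustrative:

```c
/* Private-event numbering rule: class base is
 * V4L2_EVENT_PRIVATE_START + n * 1000, and type 'base + 0' is
 * reserved, so the first usable event type is base + 1. */
#define V4L2_EVENT_PRIVATE_START 0x08000000	/* as in videodev2.h */
#define EVENT_CLASS_STEP 1000

static unsigned int event_class_base(unsigned int n)
{
	return V4L2_EVENT_PRIVATE_START + n * EVENT_CLASS_STEP;
}

static unsigned int first_event_type(unsigned int n)
{
	/* 'class base + 0' is reserved for future use */
	return event_class_base(n) + 1;
}
```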

An example on how the V4L2 events may be used can be found in the OMAP
3 ISP driver available at as of
writing this.



The source document V4L2-framework.txt lives in the kernel's
/Documentation/video4linux directory.

The Documentation directory that ships with the kernel is a very useful
reference and learning resource. Highly recommended reading!

Video for Linux Two API Specification

Revision 0.24

Michael H Schimek

Bill Dirks

Hans Verkuil

Martin Rubli

This document is copyrighted © 1999-2008 by Bill Dirks, Michael H. Schimek, Hans Verkuil and Martin Rubli.

Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.1 or any later version published by the Free Software Foundation; with no Invariant Sections, with no Front-Cover Texts, and with no Back-Cover Texts. A copy of the license is included in the appendix entitled "GNU Free Documentation License".

Programming examples can be used and distributed without restrictions.


Table of Contents

Introduction

1.  Common API Elements
1.1.  Opening and Closing Devices
1.1.1.  Device Naming 1.1.2.  Related Devices 1.1.3.  Multiple Opens 1.1.4.  Shared Data Streams 1.1.5.  Functions
1.2.  Querying Capabilities 1.3.  Application Priority 1.4.  Video Inputs and Outputs 1.5.  Audio Inputs and Outputs 1.6.  Tuners and Modulators
1.6.1.  Tuners 1.6.2.  Modulators 1.6.3.  Radio Frequency 1.6.4.  Satellite Receivers
1.7.  Video Standards 1.8.  User Controls 1.9.  Extended Controls
1.9.1.  Introduction 1.9.2.  The Extended Control API 1.9.3.  Enumerating Extended Controls 1.9.4.  Creating Control Panels 1.9.5.  MPEG Control Reference 1.9.6.  Camera Control Reference
1.10.  Data Formats
1.10.1.  Data Format Negotiation 1.10.2.  Image Format Enumeration
1.11.  Image Cropping, Insertion and Scaling
1.11.1.  Cropping Structures 1.11.2.  Scaling Adjustments 1.11.3.  Examples
1.12.  Streaming Parameters
2.  Image Formats
2.1.  Standard Image Formats 2.2.  Colorspaces 2.3.  Indexed Format 2.4.  RGB Formats
Packed RGB formats -- Packed RGB formats V4L2_PIX_FMT_SBGGR8 ('BA81') -- Bayer RGB format V4L2_PIX_FMT_SBGGR16 ('BA82') -- Bayer RGB format
2.5.  YUV Formats
Packed YUV formats -- Packed YUV formats V4L2_PIX_FMT_GREY ('GREY') -- Grey-scale image V4L2_PIX_FMT_Y16 ('Y16 ') -- Grey-scale image V4L2_PIX_FMT_YUYV ('YUYV') -- Packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2 V4L2_PIX_FMT_UYVY ('UYVY') -- Variation of V4L2_PIX_FMT_YUYV with different order of samples in memory V4L2_PIX_FMT_Y41P ('Y41P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1 V4L2_PIX_FMT_YVU420 ('YV12'), V4L2_PIX_FMT_YUV420 ('YU12') -- Planar formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0 V4L2_PIX_FMT_YVU410 ('YVU9'), V4L2_PIX_FMT_YUV410 ('YUV9') -- Planar formats with ¼ horizontal and vertical chroma resolution, also known as YUV 4:1:0 V4L2_PIX_FMT_YUV422P ('422P') -- Format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Planar layout as opposed to V4L2_PIX_FMT_YUYV V4L2_PIX_FMT_YUV411P ('411P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1. Planar layout as opposed to V4L2_PIX_FMT_Y41P V4L2_PIX_FMT_NV12 ('NV12'), V4L2_PIX_FMT_NV21 ('NV21') -- Formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0. One luminance and one chrominance plane with alternating chroma samples as opposed to V4L2_PIX_FMT_YVU420
2.6.  Compressed Formats 2.7.  Reserved Format Identifiers
3.  Input/Output
3.1.  Read/Write 3.2.  Streaming I/O (Memory Mapping) 3.3.  Streaming I/O (User Pointers) 3.4.  Asynchronous I/O 3.5.  Buffers
3.5.1.  Timecodes
3.6.  Field Order
4.  Interfaces
4.1.  Video Capture Interface
4.1.1.  Querying Capabilities 4.1.2.  Supplemental Functions 4.1.3.  Image Format Negotiation 4.1.4.  Reading Images
4.2.  Video Overlay Interface
4.2.1.  Querying Capabilities 4.2.2.  Supplemental Functions 4.2.3.  Setup 4.2.4.  Overlay Window 4.2.5.  Enabling Overlay
4.3.  Video Output Interface
4.3.1.  Querying Capabilities 4.3.2.  Supplemental Functions 4.3.3.  Image Format Negotiation 4.3.4.  Writing Images
4.4.  Video Output Overlay Interface
4.4.1.  Querying Capabilities 4.4.2.  Framebuffer 4.4.3.  Overlay Window and Scaling 4.4.4.  Enabling Overlay
4.5.  Codec Interface 4.6.  Effect Devices Interface 4.7.  Raw VBI Data Interface
4.7.1.  Querying Capabilities 4.7.2.  Supplemental Functions 4.7.3.  Raw VBI Format Negotiation 4.7.4.  Reading and writing VBI images
4.8.  Sliced VBI Data Interface
4.8.1.  Querying Capabilities 4.8.2.  Supplemental Functions 4.8.3.  Sliced VBI Format Negotiation 4.8.4.  Reading and writing sliced VBI data
4.9.  Teletext Interface 4.10.  Radio Interface
4.10.1.  Querying Capabilities 4.10.2.  Supplemental Functions 4.10.3.  Programming
4.11.  RDS Interface
I.  Function Reference
V4L2 close() -- Close a V4L2 device V4L2 ioctl() -- Program a V4L2 device ioctl VIDIOC_CROPCAP -- Information about the video cropping and scaling abilities ioctl VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER -- Read or write hardware registers ioctl VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD -- Execute an encoder command ioctl VIDIOC_ENUMAUDIO -- Enumerate audio inputs ioctl VIDIOC_ENUMAUDOUT -- Enumerate audio outputs ioctl VIDIOC_ENUM_FMT -- Enumerate image formats ioctl VIDIOC_ENUM_FRAMESIZES -- Enumerate frame sizes ioctl VIDIOC_ENUM_FRAMEINTERVALS -- Enumerate frame intervals ioctl VIDIOC_ENUMINPUT -- Enumerate video inputs ioctl VIDIOC_ENUMOUTPUT -- Enumerate video outputs ioctl VIDIOC_ENUMSTD -- Enumerate supported video standards ioctl VIDIOC_G_AUDIO, VIDIOC_S_AUDIO -- Query or select the current audio input and its attributes ioctl VIDIOC_G_AUDOUT, VIDIOC_S_AUDOUT -- Query or select the current audio output ioctl VIDIOC_G_CHIP_IDENT -- Identify the chips on a TV card ioctl VIDIOC_G_CROP, VIDIOC_S_CROP -- Get or set the current cropping rectangle ioctl VIDIOC_G_CTRL, VIDIOC_S_CTRL -- Get or set the value of a control ioctl VIDIOC_G_ENC_INDEX -- Get meta data about a compressed video stream ioctl VIDIOC_G_EXT_CTRLS, VIDIOC_S_EXT_CTRLS, VIDIOC_TRY_EXT_CTRLS -- Get or set the value of several controls, try control values ioctl VIDIOC_G_FBUF, VIDIOC_S_FBUF -- Get or set frame buffer overlay parameters ioctl VIDIOC_G_FMT, VIDIOC_S_FMT, VIDIOC_TRY_FMT -- Get or set the data format, try a format ioctl VIDIOC_G_FREQUENCY, VIDIOC_S_FREQUENCY -- Get or set tuner or modulator radio frequency ioctl VIDIOC_G_INPUT, VIDIOC_S_INPUT -- Query or select the current video input ioctl VIDIOC_G_JPEGCOMP, VIDIOC_S_JPEGCOMP -- ioctl VIDIOC_G_MODULATOR, VIDIOC_S_MODULATOR -- Get or set modulator attributes ioctl VIDIOC_G_OUTPUT, VIDIOC_S_OUTPUT -- Query or select the current video output ioctl VIDIOC_G_PARM, VIDIOC_S_PARM -- Get or set streaming parameters ioctl VIDIOC_G_PRIORITY, VIDIOC_S_PRIORITY -- Query or request the access priority associated with a file descriptor ioctl VIDIOC_G_SLICED_VBI_CAP -- Query sliced VBI capabilities ioctl VIDIOC_G_STD, VIDIOC_S_STD -- Query or select the video standard of the current input ioctl VIDIOC_G_TUNER, VIDIOC_S_TUNER -- Get or set tuner attributes ioctl VIDIOC_LOG_STATUS -- Log driver status information ioctl VIDIOC_OVERLAY -- Start or stop video overlay ioctl VIDIOC_QBUF, VIDIOC_DQBUF -- Exchange a buffer with the driver ioctl VIDIOC_QUERYBUF -- Query the status of a buffer ioctl VIDIOC_QUERYCAP -- Query device capabilities ioctl VIDIOC_QUERYCTRL, VIDIOC_QUERYMENU -- Enumerate controls and menu control items ioctl VIDIOC_QUERYSTD -- Sense the video standard received by the current input ioctl VIDIOC_REQBUFS -- Initiate Memory Mapping or User Pointer I/O ioctl VIDIOC_STREAMON, VIDIOC_STREAMOFF -- Start or stop streaming I/O V4L2 mmap() -- Map device memory into application address space V4L2 munmap() -- Unmap device memory V4L2 open() -- Open a V4L2 device V4L2 poll() -- Wait for some event on a file descriptor V4L2 read() -- Read from a V4L2 device V4L2 select() -- Synchronous I/O multiplexing V4L2 write() -- Write to a V4L2 device

Introduction

Video For Linux Two is the second version of the Video For Linux API, a kernel interface for analog radio and video capture and output drivers.

Early drivers used ad-hoc interfaces. These were replaced in Linux 2.2 by Alan Cox' V4L API, based on the interface of the bttv driver. In 1999 Bill Dirks started the development of V4L2 to fix some shortcomings of V4L and to support a wider range of devices. The API was revised again in 2002 prior to its inclusion in Linux 2.5/2.6, and work continues on improvements and additions while maintaining compatibility with existing drivers and applications. In 2006/2007 efforts began on FreeBSD drivers with a V4L2 interface.

This book documents the V4L2 API. The intended audience is driver and application writers.

If you have questions or ideas regarding the API, please write to the Video4Linux mailing list: https://listman.redhat.com/mailman/listinfo/video4linux-list. For inquiries about the V4L2 specification contact [email protected].

The latest version of this document and the DocBook SGML sources are hosted at http://v4l2spec.bytesex.org and http://linuxtv.org/downloads/video4linux/API/V4L2_API.


Chapter 1. Common API Elements

Programming a V4L2 device consists of these steps:

  • Opening the device

  • Changing device properties, selecting a video and audio input, video standard, picture brightness a. o.

  • Negotiating a data format

  • Negotiating an input/output method

  • The actual input/output loop

  • Closing the device

In practice most steps are optional and can be executed out of order. It depends on the V4L2 device type; you can read about the details in Chapter 4. In this chapter we will discuss the basic concepts applicable to all devices.


1.1. Opening and Closing Devices

1.1.1. Device Naming

V4L2 drivers are implemented as kernel modules, loaded manually by the system administrator or automatically when a device is first opened. The driver modules plug into the "videodev" kernel module. It provides helper functions and a common application interface specified in this document.

Each driver thus loaded registers one or more device nodes with major number 81 and a minor number between 0 and 255. Assigning minor numbers to V4L2 devices is entirely up to the system administrator; this is primarily intended to solve conflicts between devices.[1] The module options to select minor numbers are named after the device special file with a "_nr" suffix. For example "video_nr" for /dev/video video capture devices. The number is an offset to the base minor number associated with the device type.[2] When the driver supports multiple devices of the same type more than one minor number can be assigned, separated by commas:

> insmod mydriver.o video_nr=0,1 radio_nr=0,1

In /etc/modules.conf this may be written as:

alias char-major-81-0 mydriver
alias char-major-81-1 mydriver
alias char-major-81-64 mydriver              
options mydriver video_nr=0,1 radio_nr=0,1   
          
When an application attempts to open a device special file with major number 81 and minor number 0, 1, or 64, load "mydriver" (and the "videodev" module it depends upon).
Register the first two video capture devices with minor number 0 and 1 (base number is 0), the first two radio devices with minor number 64 and 65 (base 64).
When no minor number is given as module option the driver supplies a default. Chapter 4 recommends the base minor numbers to be used for the various device types. Obviously minor numbers must be unique. When the number is already in use the offending device will not be registered.

By convention system administrators create various character device special files with these major and minor numbers in the /dev directory. The names recommended for the different V4L2 device types are listed in Chapter 4.

The creation of character special files (with mknod) is a privileged operation and devices cannot be opened by major and minor number. That means applications cannot reliably scan for loaded or installed drivers. The user must enter a device name, or the application can try the conventional device names.

Under the device filesystem (devfs) the minor number options are ignored. V4L2 drivers (or by proxy the "videodev" module) automatically create the required device files in the /dev/v4l directory using the conventional device names above.


1.1.2. Related Devices

Devices can support several related functions. For example video capturing, video overlay and VBI capturing are related because these functions share, amongst others, the same video input and tuner frequency. V4L and earlier versions of V4L2 used the same device name and minor number for video capturing and overlay, but different ones for VBI. Experience showed this approach has several problems[3], and to make things worse the V4L videodev module used to prohibit multiple opens of a device.

As a remedy the present version of the V4L2 API relaxed the concept of device types with specific names and minor numbers. For compatibility with old applications drivers must still register different minor numbers to assign a default function to the device. But if related functions are supported by the driver they must be available under all registered minor numbers. The desired function can be selected after opening the device as described in Chapter 4.

Imagine a driver supporting video capturing, video overlay, raw VBI capturing, and FM radio reception. It registers three devices with minor number 0, 64 and 224 (this numbering scheme is inherited from the V4L API). Regardless if /dev/video (81, 0) or /dev/vbi (81, 224) is opened the application can select any one of the video capturing, overlay or VBI capturing functions. Without programming (e. g. reading from the device with dd or cat), /dev/video captures video images, while /dev/vbi captures raw VBI data. /dev/radio (81, 64) is invariably a radio device, unrelated to the video functions. Being unrelated does not imply the devices can be used at the same time, however. The open() function may very well return an EBUSY error code.

Besides video input or output the hardware may also support audio sampling or playback. If so, these functions are implemented as OSS or ALSA PCM devices and eventually an OSS or ALSA audio mixer. The V4L2 API makes no provisions yet to find these related devices. If you have an idea please write to the Video4Linux mailing list: https://listman.redhat.com/mailman/listinfo/video4linux-list.


1.1.3. Multiple Opens

In general, V4L2 devices can be opened more than once. When this is supported by the driver, users can for example start a "panel" application to change controls like brightness or audio volume, while another application captures video and audio. In other words, panel applications are comparable to an OSS or ALSA audio mixer application. When a device supports multiple functions like capturing and overlay simultaneously, multiple opens allow concurrent use of the device by forked processes or specialized applications.

Multiple opens are optional, although drivers should permit at least concurrent accesses without data exchange, i. e. panel applications. This implies open() can return an EBUSY error code when the device is already in use, as well as ioctl() functions initiating data exchange (namely the VIDIOC_S_FMT ioctl), and the read() and write() functions.
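A minimal sketch of how an application might distinguish a busy device from other open() failures; the helper name and the handling shown are ours, not mandated by the API:

```c
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

/* Open a V4L2 device node, treating EBUSY separately because it may
 * merely indicate another application currently owns the data stream. */
int open_v4l2_device (const char *path)
{
        int fd = open (path, O_RDWR);

        if (fd == -1) {
                if (errno == EBUSY)
                        fprintf (stderr, "%s is in use by another application\n", path);
                else
                        perror (path);
        }

        return fd;
}
```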

Merely opening a V4L2 device does not grant exclusive access.[4] Initiating data exchange however assigns the right to read or write the requested type of data, and to change related properties, to this file descriptor. Applications can request additional access privileges using the priority mechanism described in Section 1.3.


1.1.4. Shared Data Streams

V4L2 drivers should not support multiple applications reading or writing the same data stream on a device by copying buffers, time multiplexing or similar means. This is better handled by a proxy application in user space. When the driver supports stream sharing anyway it must be implemented transparently. The V4L2 API does not specify how conflicts are solved.


1.1.5. Functions

To open and close V4L2 devices applications use the open() and close() functions, respectively. Devices are programmed using the ioctl() function as explained in the following sections.


1.2. Querying Capabilities

Because V4L2 covers a wide variety of devices not all aspects of the API are equally applicable to all types of devices. Furthermore devices of the same type have different capabilities and this specification permits the omission of a few complicated and less important parts of the API.

The VIDIOC_QUERYCAP ioctl is available to check if the kernel device is compatible with this specification, and to query the functions and I/O methods supported by the device. Other features can be queried by calling the respective ioctl, for example VIDIOC_ENUMINPUT to learn about the number, types and names of video connectors on the device. Although abstraction is a major objective of this API, the ioctl also allows driver specific applications to reliably identify the driver.

All V4L2 drivers must support VIDIOC_QUERYCAP. Applications should always call this ioctl after opening the device.


1.3. Application Priority

When multiple applications share a device it may be desirable to assign them different priorities. Contrary to the traditional "rm -rf /" school of thought a video recording application could for example block other applications from changing video controls or switching the current TV channel. Another objective is to permit low priority applications working in background, which can be preempted by user controlled applications and automatically regain control of the device at a later time.

Since these features cannot be implemented entirely in user space V4L2 defines the VIDIOC_G_PRIORITY and VIDIOC_S_PRIORITY ioctls to request and query the access priority associated with a file descriptor. Opening a device assigns a medium priority, compatible with earlier versions of V4L2 and drivers not supporting these ioctls. Applications requiring a different priority will usually call VIDIOC_S_PRIORITY after verifying the device with the VIDIOC_QUERYCAP ioctl.

Ioctls changing driver properties, such as VIDIOC_S_INPUT, return an EBUSY error code after another application obtained higher priority. An event mechanism to notify applications about asynchronous property changes has been proposed but not added yet.


1.4. Video Inputs and Outputs

Video inputs and outputs are physical connectors of a device. These can be for example RF connectors (antenna/cable), CVBS a.k.a. Composite Video, S-Video or RGB connectors. Only video and VBI capture devices have inputs, output devices have outputs, at least one each. Radio devices have no video inputs or outputs.

To learn about the number and attributes of the available inputs and outputs applications can enumerate them with the VIDIOC_ENUMINPUT and VIDIOC_ENUMOUTPUT ioctl, respectively. The struct v4l2_input returned by the VIDIOC_ENUMINPUT ioctl also contains signal status information applicable when the current video input is queried.

The VIDIOC_G_INPUT and VIDIOC_G_OUTPUT ioctl return the index of the current video input or output. To select a different input or output applications call the VIDIOC_S_INPUT and VIDIOC_S_OUTPUT ioctl. Drivers must implement all the input ioctls when the device has one or more inputs, all the output ioctls when the device has one or more outputs.

Example 1-1. Information about the current video input

struct v4l2_input input;
int index;

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &index)) {
        perror ("VIDIOC_G_INPUT");
        exit (EXIT_FAILURE);
}

memset (&input, 0, sizeof (input));
input.index = index;

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
        perror ("VIDIOC_ENUMINPUT");
        exit (EXIT_FAILURE);
}

printf ("Current input: %s\n", input.name);
      

Example 1-2. Switching to the first video input

int index;

index = 0;

if (-1 == ioctl (fd, VIDIOC_S_INPUT, &index)) {
        perror ("VIDIOC_S_INPUT");
        exit (EXIT_FAILURE);
}
      

1.5. Audio Inputs and Outputs

Audio inputs and outputs are physical connectors of a device. Video capture devices have inputs, output devices have outputs, zero or more each. Radio devices have no audio inputs or outputs. They have exactly one tuner which in fact is an audio source, but this API associates tuners with video inputs or outputs only, and radio devices have none of these.[5] A connector on a TV card to loop back the received audio signal to a sound card is not considered an audio output.

Audio and video inputs and outputs are associated. Selecting a video source also selects an audio source. This is most evident when the video and audio source is a tuner. Further audio connectors can combine with more than one video input or output. Assuming two composite video inputs and two audio inputs exist, there may be up to four valid combinations. The relation of video and audio connectors is defined in the audioset field of the respective struct v4l2_input or struct v4l2_output, where each bit represents the index number, starting at zero, of one audio input or output.
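The audioset bitmask can be tested as sketched below; the helper name is ours, not part of the API:

```c
#include <stdint.h>

/* Each bit in the audioset field of struct v4l2_input or
 * struct v4l2_output stands for the audio connector with that
 * index; bit 0 is audio input (or output) zero. */
int input_has_audio (uint32_t audioset, unsigned int audio_index)
{
        if (audio_index >= 32)
                return 0;

        return (audioset & (1U << audio_index)) != 0;
}
```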

To learn about the number and attributes of the available inputs and outputs applications can enumerate them with the VIDIOC_ENUMAUDIO and VIDIOC_ENUMAUDOUT ioctl, respectively. The struct v4l2_audio returned by the VIDIOC_ENUMAUDIO ioctl also contains signal status information applicable when the current audio input is queried.

The VIDIOC_G_AUDIO and VIDIOC_G_AUDOUT ioctl report the current audio input and output, respectively. Note that, unlike VIDIOC_G_INPUT and VIDIOC_G_OUTPUT these ioctls return a structure as VIDIOC_ENUMAUDIO and VIDIOC_ENUMAUDOUT do, not just an index.

To select an audio input and change its properties applications call the VIDIOC_S_AUDIO ioctl. To select an audio output (which presently has no changeable properties) applications call the VIDIOC_S_AUDOUT ioctl.

Drivers must implement all input ioctls when the device has one or more inputs, all output ioctls when the device has one or more outputs. When the device has any audio inputs or outputs the driver must set the V4L2_CAP_AUDIO flag in the struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl.

Example 1-3. Information about the current audio input

struct v4l2_audio audio;

memset (&audio, 0, sizeof (audio));

if (-1 == ioctl (fd, VIDIOC_G_AUDIO, &audio)) {
        perror ("VIDIOC_G_AUDIO");
        exit (EXIT_FAILURE);
}

printf ("Current input: %s\n", audio.name);
      

Example 1-4. Switching to the first audio input

struct v4l2_audio audio;

memset (&audio, 0, sizeof (audio)); /* clear audio.mode, audio.reserved */

audio.index = 0;

if (-1 == ioctl (fd, VIDIOC_S_AUDIO, &audio)) {
        perror ("VIDIOC_S_AUDIO");
        exit (EXIT_FAILURE);
}
      

1.6. Tuners and Modulators

1.6.1. Tuners

Video input devices can have one or more tuners demodulating an RF signal. Each tuner is associated with one or more video inputs, depending on the number of RF connectors on the tuner. The type field of the respective struct v4l2_input returned by the VIDIOC_ENUMINPUT ioctl is set to V4L2_INPUT_TYPE_TUNER and its tuner field contains the index number of the tuner.

Radio devices have exactly one tuner with index zero, and no video inputs.

To query and change tuner properties applications use the VIDIOC_G_TUNER and VIDIOC_S_TUNER ioctl, respectively. The struct v4l2_tuner returned by VIDIOC_G_TUNER also contains signal status information applicable when the tuner of the current video input, or a radio tuner is queried. Note that VIDIOC_S_TUNER does not switch the current tuner, when there is more than one at all. The tuner is solely determined by the current video input. Drivers must support both ioctls and set the V4L2_CAP_TUNER flag in the struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl when the device has one or more tuners.


1.6.2. Modulators

Video output devices can have one or more modulators, uh, modulating a video signal for radiation or connection to the antenna input of a TV set or video recorder. Each modulator is associated with one or more video outputs, depending on the number of RF connectors on the modulator. The type field of the respective struct v4l2_output returned by the VIDIOC_ENUMOUTPUT ioctl is set to V4L2_OUTPUT_TYPE_MODULATOR and its modulator field contains the index number of the modulator. This specification does not define radio output devices.

To query and change modulator properties applications use the VIDIOC_G_MODULATOR and VIDIOC_S_MODULATOR ioctl. Note that VIDIOC_S_MODULATOR does not switch the current modulator, when there is more than one at all. The modulator is solely determined by the current video output. Drivers must support both ioctls and set the V4L2_CAP_TUNER (sic) flag in the struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl when the device has one or more modulators.


1.6.3. Radio Frequency

To get and set the tuner or modulator radio frequency applications use the VIDIOC_G_FREQUENCY and VIDIOC_S_FREQUENCY ioctl which both take a pointer to a struct v4l2_frequency. These ioctls are used for TV and radio devices alike. Drivers must support both ioctls when the tuner or modulator ioctls are supported, or when the device is a radio device.


1.6.4. Satellite Receivers

To be discussed. See also proposals by Peter Schlaf, [email protected] on 23 Oct 2002, subject: "Re: [V4L] Re: v4l2 api".


1.7. Video Standards

Video devices typically support one or more different video standards or variations of standards. Each video input and output may support another set of standards. This set is reported by the std field of struct v4l2_input and struct v4l2_output returned by the VIDIOC_ENUMINPUT and VIDIOC_ENUMOUTPUT ioctl, respectively.

V4L2 defines one bit for each analog video standard currently in use worldwide, and sets aside bits for driver defined standards, e. g. hybrid standards to watch NTSC video tapes on PAL TVs and vice versa. Applications can use the predefined bits to select a particular standard, although presenting the user a menu of supported standards is preferred. To enumerate and query the attributes of the supported standards applications use the VIDIOC_ENUMSTD ioctl.

Many of the defined standards are actually just variations of a few major standards. The hardware may in fact not distinguish between them, or do so internally and switch automatically. Therefore enumerated standards also contain sets of one or more standard bits.

Assume a hypothetical tuner capable of demodulating B/PAL, G/PAL and I/PAL signals. The first enumerated standard is a set of B and G/PAL, switched automatically depending on the selected radio frequency in UHF or VHF band. Enumeration gives a "PAL-B/G" or "PAL-I" choice. Similarly a Composite input may collapse standards, enumerating "PAL-B/G/H/I", "NTSC-M" and "SECAM-D/K".[6]

To query and select the standard used by the current video input or output applications call the VIDIOC_G_STD and VIDIOC_S_STD ioctl, respectively. The received standard can be sensed with the VIDIOC_QUERYSTD ioctl. Note the parameter of all these ioctls is a pointer to a v4l2_std_id type (a standard set), not an index into the standard enumeration.[7] Drivers must implement all video standard ioctls when the device has one or more video inputs or outputs.

Special rules apply to USB cameras where the notion of video standards makes little sense. More generally any capture device, output devices accordingly, which is

  • incapable of capturing fields or frames at the nominal rate of the video standard, or

  • where timestamps refer to the instant the field or frame was received by the driver, not the capture time, or

  • where sequence numbers refer to the frames received by the driver, not the captured frames.

Here the driver shall set the std field of struct v4l2_input and struct v4l2_output to zero, the VIDIOC_G_STD, VIDIOC_S_STD, VIDIOC_QUERYSTD and VIDIOC_ENUMSTD ioctls shall return the EINVAL error code.[8]

Example 1-5. Information about the current video standard

v4l2_std_id std_id;
struct v4l2_standard standard;

if (-1 == ioctl (fd, VIDIOC_G_STD, &std_id)) {
        /* Note when VIDIOC_ENUMSTD always returns EINVAL this
           is no video device or it falls under the USB exception,
           and VIDIOC_G_STD returning EINVAL is no error. */

        perror ("VIDIOC_G_STD");
        exit (EXIT_FAILURE);
}

memset (&standard, 0, sizeof (standard));
standard.index = 0;

while (0 == ioctl (fd, VIDIOC_ENUMSTD, &standard)) {
        if (standard.id & std_id) {
               printf ("Current video standard: %s\n", standard.name);
               exit (EXIT_SUCCESS);
        }

        standard.index++;
}

/* EINVAL indicates the end of the enumeration, which cannot be
   empty unless this device falls under the USB exception. */

if (errno == EINVAL || standard.index == 0) {
        perror ("VIDIOC_ENUMSTD");
        exit (EXIT_FAILURE);
}
      

Example 1-6. Listing the video standards supported by the current input

struct v4l2_input input;
struct v4l2_standard standard;

memset (&input, 0, sizeof (input));

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &input.index)) {
        perror ("VIDIOC_G_INPUT");
        exit (EXIT_FAILURE);
}

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
        perror ("VIDIOC_ENUM_INPUT");
        exit (EXIT_FAILURE);
}

printf ("Current input %s supports:\n", input.name);

memset (&standard, 0, sizeof (standard));
standard.index = 0;

while (0 == ioctl (fd, VIDIOC_ENUMSTD, &standard)) {
        if (standard.id & input.std)
                printf ("%s\n", standard.name);

        standard.index++;
}

/* EINVAL indicates the end of the enumeration, which cannot be
   empty unless this device falls under the USB exception. */

if (errno != EINVAL || standard.index == 0) {
        perror ("VIDIOC_ENUMSTD");
        exit (EXIT_FAILURE);
}
      

Example 1-7. Selecting a new video standard

struct v4l2_input input;
v4l2_std_id std_id;

memset (&input, 0, sizeof (input));

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &input.index)) {
        perror ("VIDIOC_G_INPUT");
        exit (EXIT_FAILURE);
}

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
        perror ("VIDIOC_ENUM_INPUT");
        exit (EXIT_FAILURE);
}

if (0 == (input.std & V4L2_STD_PAL_BG)) {
        fprintf (stderr, "Oops. B/G PAL is not supported.\n");
        exit (EXIT_FAILURE);
}

/* Note this is also supposed to work when only B
   or G/PAL is supported. */

std_id = V4L2_STD_PAL_BG;

if (-1 == ioctl (fd, VIDIOC_S_STD, &std_id)) {
        perror ("VIDIOC_S_STD");
        exit (EXIT_FAILURE);
}
      

1.8. User Controls

Devices typically have a number of user-settable controls such as brightness, saturation and so on, which would be presented to the user on a graphical user interface. But, different devices will have different controls available, and furthermore, the range of possible values, and the default value will vary from device to device. The control ioctls provide the information and a mechanism to create a nice user interface for these controls that will work correctly with any device.

All controls are accessed using an ID value. V4L2 defines several IDs for specific purposes. Drivers can also implement their own custom controls using V4L2_CID_PRIVATE_BASE and higher values. The pre-defined control IDs have the prefix V4L2_CID_, and are listed in Table 1-1. The ID is used when querying the attributes of a control, and when getting or setting the current value.

Generally applications should present controls to the user without assumptions about their purpose. Each control comes with a name string the user is supposed to understand. When the purpose is non-intuitive the driver writer should provide a user manual, a user interface plug-in or a driver specific panel application. Predefined IDs were introduced to change a few controls programmatically, for example to mute a device during a channel switch.

Drivers may enumerate different controls after switching the current video input or output, tuner or modulator, or audio input or output. Different in the sense of other bounds, another default and current value, step size or other menu items. A control with a certain custom ID can also change name and type.[9] Control values are stored globally, they do not change when switching except to stay within the reported bounds. They also do not change e. g. when the device is opened or closed, when the tuner radio frequency is changed or generally never without application request. Since V4L2 specifies no event mechanism, panel applications intended to cooperate with other panel applications (be they built into a larger application, as a TV viewer) may need to regularly poll control values to update their user interface.[10]

Table 1-1. Control IDs

ID Type Description
V4L2_CID_BASE   First predefined ID, equal to V4L2_CID_BRIGHTNESS.
V4L2_CID_USER_BASE   Synonym of V4L2_CID_BASE.
V4L2_CID_BRIGHTNESS integer Picture brightness, or more precisely, the black level.
V4L2_CID_CONTRAST integer Picture contrast or luma gain.
V4L2_CID_SATURATION integer Picture color saturation or chroma gain.
V4L2_CID_HUE integer Hue or color balance.
V4L2_CID_AUDIO_VOLUME integer Overall audio volume. Note some drivers also provide an OSS or ALSA mixer interface.
V4L2_CID_AUDIO_BALANCE integer Audio stereo balance. Minimum corresponds to all the way left, maximum to right.
V4L2_CID_AUDIO_BASS integer Audio bass adjustment.
V4L2_CID_AUDIO_TREBLE integer Audio treble adjustment.
V4L2_CID_AUDIO_MUTE boolean Mute audio, i. e. set the volume to zero, however without affecting V4L2_CID_AUDIO_VOLUME. Like ALSA drivers, V4L2 drivers must mute at load time to avoid excessive noise. Actually the entire device should be reset to a low power consumption state.
V4L2_CID_AUDIO_LOUDNESS boolean Loudness mode (bass boost).
V4L2_CID_BLACK_LEVEL integer Another name for brightness (not a synonym of V4L2_CID_BRIGHTNESS). This control is deprecated and should not be used in new drivers and applications.
V4L2_CID_AUTO_WHITE_BALANCE boolean Automatic white balance (cameras).
V4L2_CID_DO_WHITE_BALANCE button This is an action control. When set (the value is ignored), the device will do a white balance and then hold the current setting. Contrast this with the boolean V4L2_CID_AUTO_WHITE_BALANCE, which, when activated, keeps adjusting the white balance.
V4L2_CID_RED_BALANCE integer Red chroma balance.
V4L2_CID_BLUE_BALANCE integer Blue chroma balance.
V4L2_CID_GAMMA integer Gamma adjust.
V4L2_CID_WHITENESS integer Whiteness for grey-scale devices. This is a synonym for V4L2_CID_GAMMA. This control is deprecated and should not be used in new drivers and applications.
V4L2_CID_EXPOSURE integer Exposure (cameras). [Unit?]
V4L2_CID_AUTOGAIN boolean Automatic gain/exposure control.
V4L2_CID_GAIN integer Gain control.
V4L2_CID_HFLIP boolean Mirror the picture horizontally.
V4L2_CID_VFLIP boolean Mirror the picture vertically.
V4L2_CID_HCENTER_DEPRECATED (formerly V4L2_CID_HCENTER) integer Horizontal image centering. This control is deprecated. New drivers and applications should use the Camera class controls V4L2_CID_PAN_ABSOLUTE, V4L2_CID_PAN_RELATIVE and V4L2_CID_PAN_RESET instead.
V4L2_CID_VCENTER_DEPRECATED (formerly V4L2_CID_VCENTER) integer Vertical image centering. Centering is intended to physically adjust cameras. For image cropping see Section 1.11, for clipping Section 4.2. This control is deprecated. New drivers and applications should use the Camera class controls V4L2_CID_TILT_ABSOLUTE, V4L2_CID_TILT_RELATIVE and V4L2_CID_TILT_RESET instead.
V4L2_CID_POWER_LINE_FREQUENCY integer Enables a power line frequency filter to avoid flicker. Possible values are: V4L2_CID_POWER_LINE_FREQUENCY_DISABLED (0), V4L2_CID_POWER_LINE_FREQUENCY_50HZ (1) and V4L2_CID_POWER_LINE_FREQUENCY_60HZ (2).
V4L2_CID_HUE_AUTO boolean Enables automatic hue control by the device. The effect of setting V4L2_CID_HUE while automatic hue control is enabled is undefined, drivers should ignore such requests.
V4L2_CID_WHITE_BALANCE_TEMPERATURE integer This control specifies the white balance settings as a color temperature in Kelvin. A driver should have a minimum of 2800 (incandescent) to 6500 (daylight). For more information about color temperature see Wikipedia.
V4L2_CID_SHARPNESS integer Adjusts the sharpness filters in a camera. The minimum value disables the filters, higher values give a sharper picture.
V4L2_CID_BACKLIGHT_COMPENSATION integer Adjusts the backlight compensation in a camera. The minimum value disables backlight compensation.
V4L2_CID_LASTP1   End of the predefined control IDs (currently V4L2_CID_BACKLIGHT_COMPENSATION + 1).
V4L2_CID_PRIVATE_BASE   ID of the first custom (driver specific) control. Applications depending on particular custom controls should check the driver name and version, see Section 1.2.

Applications can enumerate the available controls with the VIDIOC_QUERYCTRL and VIDIOC_QUERYMENU ioctls, get and set a control value with the VIDIOC_G_CTRL and VIDIOC_S_CTRL ioctls. Drivers must implement VIDIOC_QUERYCTRL, VIDIOC_G_CTRL and VIDIOC_S_CTRL when the device has one or more controls, VIDIOC_QUERYMENU when it has one or more menu type controls.

Example 1-8. Enumerating all controls

struct v4l2_queryctrl queryctrl;
struct v4l2_querymenu querymenu;

static void
enumerate_menu (void)
{
        printf ("  Menu items:\n");

        memset (&querymenu, 0, sizeof (querymenu));
        querymenu.id = queryctrl.id;

        for (querymenu.index = queryctrl.minimum;
             querymenu.index <= queryctrl.maximum;
              querymenu.index++) {
                if (0 == ioctl (fd, VIDIOC_QUERYMENU, &querymenu)) {
                        printf ("  %s\n", querymenu.name);
                } else {
                        perror ("VIDIOC_QUERYMENU");
                        exit (EXIT_FAILURE);
                }
        }
}

memset (&queryctrl, 0, sizeof (queryctrl));

for (queryctrl.id = V4L2_CID_BASE;
     queryctrl.id < V4L2_CID_LASTP1;
     queryctrl.id++) {
        if (0 == ioctl (fd, VIDIOC_QUERYCTRL, &queryctrl)) {
                if (queryctrl.flags & V4L2_CTRL_FLAG_DISABLED)
                        continue;

                printf ("Control %s\n", queryctrl.name);

                if (queryctrl.type == V4L2_CTRL_TYPE_MENU)
                        enumerate_menu ();
        } else {
                if (errno == EINVAL)
                        continue;

                perror ("VIDIOC_QUERYCTRL");
                exit (EXIT_FAILURE);
        }
}

for (queryctrl.id = V4L2_CID_PRIVATE_BASE;;
     queryctrl.id++) {
        if (0 == ioctl (fd, VIDIOC_QUERYCTRL, &queryctrl)) {
                if (queryctrl.flags & V4L2_CTRL_FLAG_DISABLED)
                        continue;

                printf ("Control %s\n", queryctrl.name);

                if (queryctrl.type == V4L2_CTRL_TYPE_MENU)
                        enumerate_menu ();
        } else {
                if (errno == EINVAL)
                        break;

                perror ("VIDIOC_QUERYCTRL");
                exit (EXIT_FAILURE);
        }
}

Example 1-9. Changing controls

struct v4l2_queryctrl queryctrl;
struct v4l2_control control;

memset (&queryctrl, 0, sizeof (queryctrl));
queryctrl.id = V4L2_CID_BRIGHTNESS;

if (-1 == ioctl (fd, VIDIOC_QUERYCTRL, &queryctrl)) {
        if (errno != EINVAL) {
                perror ("VIDIOC_QUERYCTRL");
                exit (EXIT_FAILURE);
        } else {
                printf ("V4L2_CID_BRIGHTNESS is not supported\n");
        }
} else if (queryctrl.flags & V4L2_CTRL_FLAG_DISABLED) {
        printf ("V4L2_CID_BRIGHTNESS is not supported\n");
} else {
        memset (&control, 0, sizeof (control));
        control.id = V4L2_CID_BRIGHTNESS;
        control.value = queryctrl.default_value;

        if (-1 == ioctl (fd, VIDIOC_S_CTRL, &control)) {
                perror ("VIDIOC_S_CTRL");
                exit (EXIT_FAILURE);
        }
}

memset (&control, 0, sizeof (control));
control.id = V4L2_CID_CONTRAST;

if (0 == ioctl (fd, VIDIOC_G_CTRL, &control)) {
        control.value += 1;

        /* The driver may clamp the value or return ERANGE, ignored here */

        if (-1 == ioctl (fd, VIDIOC_S_CTRL, &control)
            && errno != ERANGE) {
                perror ("VIDIOC_S_CTRL");
                exit (EXIT_FAILURE);
        }
/* Ignore if V4L2_CID_CONTRAST is unsupported */
} else if (errno != EINVAL) {
        perror ("VIDIOC_G_CTRL");
        exit (EXIT_FAILURE);
}

control.id = V4L2_CID_AUDIO_MUTE;
control.value = TRUE; /* silence */

/* Errors ignored */
ioctl (fd, VIDIOC_S_CTRL, &control);

1.9. Extended Controls

1.9.1. Introduction

The control mechanism as originally designed was meant to be used for user settings (brightness, saturation, etc). However, it turned out to be a very useful model for implementing more complicated driver APIs where each driver implements only a subset of a larger API.

The MPEG encoding API was the driving force behind designing and implementing this extended control mechanism: the MPEG standard is quite large and the currently supported hardware MPEG encoders each only implement a subset of this standard. Furthermore, many parameters relating to how the video is encoded into an MPEG stream are specific to the MPEG encoding chip, since the MPEG standard only defines the format of the resulting MPEG stream, not how the video is actually encoded into that format.

Unfortunately, the original control API lacked some features needed for these new uses and so it was extended into the (not terribly originally named) extended control API.


1.9.2. The Extended Control API

Three new ioctls are available: VIDIOC_G_EXT_CTRLS, VIDIOC_S_EXT_CTRLS and VIDIOC_TRY_EXT_CTRLS. These ioctls act on arrays of controls (as opposed to the VIDIOC_G_CTRL and VIDIOC_S_CTRL ioctls that act on a single control). This is needed since it is often required to atomically change several controls at once.

Each of the new ioctls expects a pointer to a struct v4l2_ext_controls. This structure contains a pointer to the control array, a count of the number of controls in that array and a control class. Control classes are used to group similar controls into a single class. For example, control class V4L2_CTRL_CLASS_USER contains all user controls (i. e. all controls that can also be set using the old VIDIOC_S_CTRL ioctl). Control class V4L2_CTRL_CLASS_MPEG contains all controls relating to MPEG encoding, etc.

All controls in the control array must belong to the specified control class. An error is returned if this is not the case.

It is also possible to use an empty control array (count == 0) to check whether the specified control class is supported.

The control array is a struct v4l2_ext_control array. The v4l2_ext_control structure is very similar to struct v4l2_control, except for the fact that it also allows for 64-bit values and pointers to be passed (although the latter is not yet used anywhere).

It is important to realize that due to the flexibility of controls it is necessary to check whether the control you want to set actually is supported in the driver and what the valid range of values is. So use the VIDIOC_QUERYCTRL and VIDIOC_QUERYMENU ioctls to check this. Also note that it is possible that some of the menu indices in a control of type V4L2_CTRL_TYPE_MENU may not be supported (VIDIOC_QUERYMENU will return an error). A good example is the list of supported MPEG audio bitrates. Some drivers only support one or two bitrates, others support a wider range.


1.9.3. Enumerating Extended Controls

The recommended way to enumerate over the extended controls is by using VIDIOC_QUERYCTRL in combination with the V4L2_CTRL_FLAG_NEXT_CTRL flag:

struct v4l2_queryctrl qctrl;

qctrl.id = V4L2_CTRL_FLAG_NEXT_CTRL;
while (0 == ioctl (fd, VIDIOC_QUERYCTRL, &qctrl)) {
        /* ... */
        qctrl.id |= V4L2_CTRL_FLAG_NEXT_CTRL;
}

The initial control ID is set to 0 ORed with the V4L2_CTRL_FLAG_NEXT_CTRL flag. The VIDIOC_QUERYCTRL ioctl will return the first control with a higher ID than the specified one. When no such controls are found an error is returned.

If you want to get all controls within a specific control class, then you can set the initial qctrl.id value to the control class and add an extra check to break out of the loop when a control of another control class is found:

qctrl.id = V4L2_CTRL_CLASS_MPEG | V4L2_CTRL_FLAG_NEXT_CTRL;
while (0 == ioctl (fd, VIDIOC_QUERYCTRL, &qctrl)) {
        if (V4L2_CTRL_ID2CLASS (qctrl.id) != V4L2_CTRL_CLASS_MPEG)
                break;
        /* ... */
        qctrl.id |= V4L2_CTRL_FLAG_NEXT_CTRL;
}

The 32-bit qctrl.id value is subdivided into three bit ranges: the top 4 bits are reserved for flags (e. g. V4L2_CTRL_FLAG_NEXT_CTRL) and are not actually part of the ID. The remaining 28 bits form the control ID, of which the most significant 12 bits define the control class and the least significant 16 bits identify the control within the control class. It is guaranteed that these last 16 bits are always non-zero for controls. The range of 0x1000 and up is reserved for driver-specific controls. The macro V4L2_CTRL_ID2CLASS(id) returns the control class ID based on a control ID.

If the driver does not support extended controls, then VIDIOC_QUERYCTRL will fail when used in combination with V4L2_CTRL_FLAG_NEXT_CTRL. In that case the old method of enumerating controls should be used (see Section 1.8). But if it is supported, then it is guaranteed to enumerate over all controls, including driver-private controls.


1.9.4. Creating Control Panels

It is possible to create control panels for a graphical user interface where the user can select the various controls. Basically you will have to iterate over all controls using the method described above. Each control class starts with a control of type V4L2_CTRL_TYPE_CTRL_CLASS. VIDIOC_QUERYCTRL will return the name of this control class which can be used as the title of a tab page within a control panel.

The flags field of struct v4l2_queryctrl also contains hints on the behavior of the control. See the VIDIOC_QUERYCTRL documentation for more details.


1.9.5. MPEG Control Reference

Below all controls within the MPEG control class are described: first the generic controls, then the controls specific to certain hardware.


1.9.5.1. Generic MPEG Controls

Table 1-2. MPEG Control IDs

ID Type  
  Description
       
V4L2_CID_MPEG_CLASS  class  
  The MPEG class descriptor. Calling VIDIOC_QUERYCTRL for this control will return a description of this control class. This description can be used as the caption of a tab page in a GUI, for example.
       
V4L2_CID_MPEG_STREAM_TYPE  enum  
  The MPEG-1, -2 or -4 output stream type. One cannot assume anything here. Each hardware MPEG encoder tends to support different subsets of the available MPEG stream types. The currently defined stream types are:
 
V4L2_MPEG_STREAM_TYPE_MPEG2_PS  MPEG-2 program stream
V4L2_MPEG_STREAM_TYPE_MPEG2_TS  MPEG-2 transport stream
V4L2_MPEG_STREAM_TYPE_MPEG1_SS  MPEG-1 system stream
V4L2_MPEG_STREAM_TYPE_MPEG2_DVD  MPEG-2 DVD-compatible stream
V4L2_MPEG_STREAM_TYPE_MPEG1_VCD  MPEG-1 VCD-compatible stream
V4L2_MPEG_STREAM_TYPE_MPEG2_SVCD  MPEG-2 SVCD-compatible stream
       
V4L2_CID_MPEG_STREAM_PID_PMT  integer  
  Program Map Table Packet ID for the MPEG transport stream (default 16)
       
V4L2_CID_MPEG_STREAM_PID_AUDIO  integer  
  Audio Packet ID for the MPEG transport stream (default 256)
       
V4L2_CID_MPEG_STREAM_PID_VIDEO  integer  
  Video Packet ID for the MPEG transport stream (default 260)
       
V4L2_CID_MPEG_STREAM_PID_PCR  integer  
  Packet ID for the MPEG transport stream carrying PCR fields (default 259)
       
V4L2_CID_MPEG_STREAM_PES_ID_AUDIO  integer  
  Audio ID for MPEG PES
       
V4L2_CID_MPEG_STREAM_PES_ID_VIDEO  integer  
  Video ID for MPEG PES
       
V4L2_CID_MPEG_STREAM_VBI_FMT  enum  
  Some cards can embed VBI data (e. g. Closed Caption, Teletext) into the MPEG stream. This control selects whether VBI data should be embedded, and if so, what embedding method should be used. The list of possible VBI formats depends on the driver. The currently defined VBI format types are:
 
V4L2_MPEG_STREAM_VBI_FMT_NONE  No VBI in the MPEG stream
V4L2_MPEG_STREAM_VBI_FMT_IVTV  VBI in private packets, IVTV format (documented in the kernel sources in the file Documentation/video4linux/cx2341x/README.vbi)
       
V4L2_CID_MPEG_AUDIO_SAMPLING_FREQ  enum  
  MPEG Audio sampling frequency. Possible values are:
 
V4L2_MPEG_AUDIO_SAMPLING_FREQ_44100  44.1 kHz
V4L2_MPEG_AUDIO_SAMPLING_FREQ_48000  48 kHz
V4L2_MPEG_AUDIO_SAMPLING_FREQ_32000  32 kHz
       
V4L2_CID_MPEG_AUDIO_ENCODING  enum  
  MPEG Audio encoding. Possible values are:
 
V4L2_MPEG_AUDIO_ENCODING_LAYER_1  MPEG Layer I encoding
V4L2_MPEG_AUDIO_ENCODING_LAYER_2  MPEG Layer II encoding
V4L2_MPEG_AUDIO_ENCODING_LAYER_3  MPEG Layer III encoding
       
V4L2_CID_MPEG_AUDIO_L1_BITRATE  enum  
  Layer I bitrate. Possible values are:
 
V4L2_MPEG_AUDIO_L1_BITRATE_32K  32 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_64K  64 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_96K  96 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_128K  128 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_160K  160 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_192K  192 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_224K  224 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_256K  256 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_288K  288 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_320K  320 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_352K  352 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_384K  384 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_416K  416 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_448K  448 kbit/s
       
V4L2_CID_MPEG_AUDIO_L2_BITRATE  enum  
  Layer II bitrate. Possible values are:
 
V4L2_MPEG_AUDIO_L2_BITRATE_32K  32 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_48K  48 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_56K  56 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_64K  64 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_80K  80 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_96K  96 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_112K  112 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_128K  128 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_160K  160 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_192K  192 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_224K  224 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_256K  256 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_320K  320 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_384K  384 kbit/s
       
V4L2_CID_MPEG_AUDIO_L3_BITRATE  enum  
  Layer III bitrate. Possible values are:
 
V4L2_MPEG_AUDIO_L3_BITRATE_32K  32 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_40K  40 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_48K  48 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_56K  56 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_64K  64 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_80K  80 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_96K  96 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_112K  112 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_128K  128 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_160K  160 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_192K  192 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_224K  224 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_256K  256 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_320K  320 kbit/s
       
V4L2_CID_MPEG_AUDIO_MODE  enum  
  MPEG Audio mode. Possible values are:
 
V4L2_MPEG_AUDIO_MODE_STEREO  Stereo
V4L2_MPEG_AUDIO_MODE_JOINT_STEREO  Joint Stereo
V4L2_MPEG_AUDIO_MODE_DUAL  Bilingual
V4L2_MPEG_AUDIO_MODE_MONO  Mono
       
V4L2_CID_MPEG_AUDIO_MODE_EXTENSION  enum  
  Joint Stereo audio mode extension. In Layer I and II they indicate which subbands are in intensity stereo. All other subbands are coded in stereo. Layer III is not (yet) supported. Possible values are:
 
V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_4  Subbands 4-31 in intensity stereo
V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_8  Subbands 8-31 in intensity stereo
V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_12  Subbands 12-31 in intensity stereo
V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_16  Subbands 16-31 in intensity stereo
       
V4L2_CID_MPEG_AUDIO_EMPHASIS  enum  
  Audio Emphasis. Possible values are:
 
V4L2_MPEG_AUDIO_EMPHASIS_NONE  None
V4L2_MPEG_AUDIO_EMPHASIS_50_DIV_15_uS  50/15 microsecond emphasis
V4L2_MPEG_AUDIO_EMPHASIS_CCITT_J17  CCITT J.17
       
V4L2_CID_MPEG_AUDIO_CRC  enum  
  CRC method. Possible values are:
 
V4L2_MPEG_AUDIO_CRC_NONE  None
V4L2_MPEG_AUDIO_CRC_CRC16  16 bit parity check
       
V4L2_CID_MPEG_AUDIO_MUTE  bool  
  Mutes the audio when capturing. This is not done by muting audio hardware, which can still produce a slight hiss, but in the encoder itself, guaranteeing a fixed and reproducible audio bitstream. 0 = unmuted, 1 = muted.
       
V4L2_CID_MPEG_VIDEO_ENCODING  enum  
  MPEG Video encoding method. Possible values are:
 
V4L2_MPEG_VIDEO_ENCODING_MPEG_1  MPEG-1 Video encoding
V4L2_MPEG_VIDEO_ENCODING_MPEG_2  MPEG-2 Video encoding
       
V4L2_CID_MPEG_VIDEO_ASPECT  enum  
  Video aspect. Possible values are:
 
V4L2_MPEG_VIDEO_ASPECT_1x1   
V4L2_MPEG_VIDEO_ASPECT_4x3   
V4L2_MPEG_VIDEO_ASPECT_16x9   
V4L2_MPEG_VIDEO_ASPECT_221x100   
       
V4L2_CID_MPEG_VIDEO_B_FRAMES  integer  
  Number of B-Frames (default 2)
       
V4L2_CID_MPEG_VIDEO_GOP_SIZE  integer  
  GOP size (default 12)
       
V4L2_CID_MPEG_VIDEO_GOP_CLOSURE  bool  
  GOP closure (default 1)
       
V4L2_CID_MPEG_VIDEO_PULLDOWN  bool  
  Enable 3:2 pulldown (default 0)
       
V4L2_CID_MPEG_VIDEO_BITRATE_MODE  enum  
  Video bitrate mode. Possible values are:
 
V4L2_MPEG_VIDEO_BITRATE_MODE_VBR  Variable bitrate
V4L2_MPEG_VIDEO_BITRATE_MODE_CBR  Constant bitrate
       
V4L2_CID_MPEG_VIDEO_BITRATE  integer  
  Video bitrate in bits per second.
       
V4L2_CID_MPEG_VIDEO_BITRATE_PEAK  integer  
  Peak video bitrate in bits per second. Must be larger than or equal to the average video bitrate. It is ignored if the video bitrate mode is set to constant bitrate.
       
V4L2_CID_MPEG_VIDEO_TEMPORAL_DECIMATION  integer  
  For every captured frame, skip this many subsequent frames (default 0).
       
V4L2_CID_MPEG_VIDEO_MUTE  bool  
  "Mutes" the video to a fixed color when capturing. This is useful for testing, to produce a fixed video bitstream. 0 = unmuted, 1 = muted.
       
V4L2_CID_MPEG_VIDEO_MUTE_YUV  integer  
  Sets the "mute" color of the video. The supplied 32-bit integer is interpreted as follows (bit 0 = least significant bit):
 
Bit 0:7 V chrominance information
Bit 8:15 U chrominance information
Bit 16:23 Y luminance information
Bit 24:31 Must be zero.
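For illustration, the bit layout above can be expressed as a small helper (not part of the API; as an example, packing ITU-R BT.601 black, Y = 16 and U = V = 128, gives 0x00108080):

```c
#include <stdint.h>

/* Pack Y, U and V into the V4L2_CID_MPEG_VIDEO_MUTE_YUV layout:
   bits 0:7 = V, bits 8:15 = U, bits 16:23 = Y, bits 24:31 = zero. */
static uint32_t
mute_yuv_pack (uint8_t y, uint8_t u, uint8_t v)
{
        return ((uint32_t) y << 16) | ((uint32_t) u << 8) | (uint32_t) v;
}
```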

1.9.5.2. CX2341x MPEG Controls

The following MPEG class controls deal with MPEG encoding settings that are specific to the Conexant CX23415 and CX23416 MPEG encoding chips.

Table 1-3. CX2341x Control IDs

ID Type  
  Description
       
V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE  enum  
  Sets the Spatial Filter mode (default MANUAL). Possible values are:
 
V4L2_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE_MANUAL  Choose the filter manually
V4L2_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE_AUTO  Choose the filter automatically
       
V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER  integer (0-15)  
  The setting for the Spatial Filter. 0 = off, 15 = maximum. (Default is 0.)
       
V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE  enum  
  Select the algorithm to use for the Luma Spatial Filter (default 1D_HOR). Possible values:
 
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_OFF  No filter
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_1D_HOR  One-dimensional horizontal
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_1D_VERT  One-dimensional vertical
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_2D_HV_SEPARABLE  Two-dimensional separable
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_2D_SYM_NON_SEPARABLE  Two-dimensional symmetrical non-separable
       
V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE  enum  
  Select the algorithm for the Chroma Spatial Filter (default 1D_HOR). Possible values are:
 
V4L2_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE_OFF  No filter
V4L2_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE_1D_HOR  One-dimensional horizontal
       
V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE  enum  
  Sets the Temporal Filter mode (default MANUAL). Possible values are:
 
V4L2_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE_MANUAL  Choose the filter manually
V4L2_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE_AUTO  Choose the filter automatically
       
V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER  integer (0-31)  
  The setting for the Temporal Filter. 0 = off, 31 = maximum. (Default is 8 for full-scale capturing and 0 for scaled capturing.)
       
V4L2_CID_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE  enum  
  Median Filter Type (default OFF). Possible values are:
 
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_OFF  No filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_HOR  Horizontal filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_VERT  Vertical filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_HOR_VERT  Horizontal and vertical filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_DIAG  Diagonal filter
       
V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_BOTTOM  integer (0-255)  
  Threshold above which the luminance median filter is enabled (default 0)
       
V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_TOP  integer (0-255)  
  Threshold below which the luminance median filter is enabled (default 255)
       
V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_BOTTOM  integer (0-255)  
  Threshold above which the chroma median filter is enabled (default 0)
       
V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_TOP  integer (0-255)  
  Threshold below which the chroma median filter is enabled (default 255)
       
V4L2_CID_MPEG_CX2341X_STREAM_INSERT_NAV_PACKETS  bool  
  The CX2341X MPEG encoder can insert one empty MPEG-2 PES packet into the stream between every four video frames. The packet size is 2048 bytes, including the packet_start_code_prefix and stream_id fields. The stream_id is 0xBF (private stream 2). The payload consists of 0x00 bytes, to be filled in by the application. 0 = do not insert, 1 = insert packets.

1.9.6. Camera Control Reference

The Camera class includes controls for mechanical (or equivalent digital) features of a device such as controllable lenses or sensors.

Table 1-4. Camera Control IDs

ID Type  
  Description
       
V4L2_CID_CAMERA_CLASS  class  
  The Camera class descriptor. Calling VIDIOC_QUERYCTRL for this control will return a description of this control class.
       
V4L2_CID_EXPOSURE_AUTO  integer  
  Enables automatic adjustments of the exposure time and/or iris aperture. The effect of manual changes of the exposure time or iris aperture while these features are enabled is undefined; drivers should ignore such requests. Possible values are:
 
V4L2_EXPOSURE_AUTO  Automatic exposure time, automatic iris aperture.
V4L2_EXPOSURE_MANUAL  Manual exposure time, manual iris.
V4L2_EXPOSURE_SHUTTER_PRIORITY  Manual exposure time, auto iris.
V4L2_EXPOSURE_APERTURE_PRIORITY  Auto exposure time, manual iris.
       
V4L2_CID_EXPOSURE_ABSOLUTE  integer  
  Determines the exposure time of the camera sensor. The exposure time is limited by the frame interval. Drivers should interpret the values as 100 µs units, where the value 1 stands for 1/10000th of a second, 10000 for 1 second and 100000 for 10 seconds.
       
V4L2_CID_EXPOSURE_AUTO_PRIORITY  boolean  
  When V4L2_CID_EXPOSURE_AUTO is set to AUTO or SHUTTER_PRIORITY, this control determines if the device may dynamically vary the frame rate. By default this feature is disabled (0) and the frame rate must remain constant.
       
V4L2_CID_PAN_RELATIVE  integer  
  This control turns the camera horizontally by the specified amount. The unit is undefined. A positive value moves the camera to the right (clockwise when viewed from above), a negative value to the left. A value of zero does not cause motion.
       
V4L2_CID_TILT_RELATIVE  integer  
  This control turns the camera vertically by the specified amount. The unit is undefined. A positive value moves the camera up, a negative value down. A value of zero does not cause motion.
       
V4L2_CID_PAN_RESET  boolean  
  When this control is set to TRUE (1), the camera moves horizontally to the default position.
       
V4L2_CID_TILT_RESET  boolean  
  When this control is set to TRUE (1), the camera moves vertically to the default position.
       
V4L2_CID_PAN_ABSOLUTE  integer  
  This control turns the camera horizontally to the specified position. Positive values move the camera to the right (clockwise when viewed from above), negative values to the left. Drivers should interpret the values as arc seconds, with valid values between -180 * 3600 and +180 * 3600 inclusive.
       
V4L2_CID_TILT_ABSOLUTE  integer  
  This control turns the camera vertically to the specified position. Positive values move the camera up, negative values down. Drivers should interpret the values as arc seconds, with valid values between -180 * 3600 and +180 * 3600 inclusive.
       
V4L2_CID_FOCUS_ABSOLUTE  integer  
  This control sets the focal point of the camera to the specified position. The unit is undefined. Positive values set the focus closer to the camera, negative values towards infinity.
       
V4L2_CID_FOCUS_RELATIVE  integer  
  This control moves the focal point of the camera by the specified amount. The unit is undefined. Positive values move the focus closer to the camera, negative values towards infinity.
       
V4L2_CID_FOCUS_AUTO  boolean  
  Enables automatic focus adjustments. The effect of manual focus adjustments while this feature is enabled is undefined; drivers should ignore such requests.
       

1.10. Data Formats

1.10.1. Data Format Negotiation

Different devices exchange different kinds of data with applications, for example video images, raw or sliced VBI data, RDS datagrams. Even within one kind many different formats are possible, in particular an abundance of image formats. Although drivers must provide a default and the selection persists across closing and reopening a device, applications should always negotiate a data format before engaging in data exchange. Negotiation means the application asks for a particular format and the driver selects and reports the best the hardware can do to satisfy the request. Of course applications can also just query the current selection.

A single mechanism exists to negotiate all data formats using the aggregate struct v4l2_format and the VIDIOC_G_FMT and VIDIOC_S_FMT ioctls. Additionally the VIDIOC_TRY_FMT ioctl can be used to examine what the hardware could do, without actually selecting a new data format. The data formats supported by the V4L2 API are covered in the respective device section in Chapter 4. For a closer look at image formats see Chapter 2.

The VIDIOC_S_FMT ioctl is a major turning-point in the initialization sequence. Prior to this point multiple panel applications can access the same device concurrently to select the current input, change controls or modify other properties. The first VIDIOC_S_FMT assigns a logical stream (video data, VBI data etc.) exclusively to one file descriptor.

Exclusive means no other application, more precisely no other file descriptor, can grab this stream or change device properties inconsistent with the negotiated parameters. A video standard change for example, when the new standard uses a different number of scan lines, can invalidate the selected image format. Therefore only the file descriptor owning the stream can make invalidating changes. Accordingly multiple file descriptors which grabbed different logical streams prevent each other from interfering with their settings. When for example video overlay is about to start or already in progress, simultaneous video capturing may be restricted to the same cropping and image size.

When applications omit the VIDIOC_S_FMT ioctl its locking side effects are implied by the next step, the selection of an I/O method with the VIDIOC_REQBUFS ioctl, or implicitly with the first read() or write() call.

Generally only one logical stream can be assigned to a file descriptor, the exception being drivers permitting simultaneous video capturing and overlay using the same file descriptor for compatibility with V4L and earlier versions of V4L2. Switching the logical stream or returning into "panel mode" is possible by closing and reopening the device. Drivers may support a switch using VIDIOC_S_FMT.

All drivers exchanging data with applications must support the VIDIOC_G_FMT and VIDIOC_S_FMT ioctls. Implementation of VIDIOC_TRY_FMT is highly recommended but optional.


1.10.2. Image Format Enumeration

Apart from the generic format negotiation functions, a special ioctl to enumerate all image formats supported by video capture, overlay or output devices is available.[11]

The VIDIOC_ENUM_FMT ioctl must be supported by all drivers exchanging image data with applications.

Important: Drivers are not supposed to convert image formats in kernel space. They must enumerate only formats directly supported by the hardware. If necessary driver writers should publish an example conversion routine or library for integration into applications.


1.11. Image Cropping, Insertion and Scaling

Some video capture devices can sample a subsection of the picture and shrink or enlarge it to an image of arbitrary size. We call these abilities cropping and scaling. Some video output devices can scale an image up or down and insert it at an arbitrary scan line and horizontal offset into a video signal.

Applications can use the following API to select an area in the video signal, query the default area and the hardware limits. Despite their name, the VIDIOC_CROPCAP, VIDIOC_G_CROP and VIDIOC_S_CROP ioctls apply to input as well as output devices.

Scaling requires a source and a target. On a video capture or overlay device the source is the video signal, and the cropping ioctls determine the area actually sampled. The targets are the images read by the application or overlaid onto the graphics screen. Their size (and position for an overlay) is negotiated with the VIDIOC_G_FMT and VIDIOC_S_FMT ioctls.

On a video output device the source is the images passed in by the application, and their size is again negotiated with the VIDIOC_G/S_FMT ioctls, or may be encoded in a compressed video stream. The target is the video signal, and the cropping ioctls determine the area where the images are inserted.

Source and target rectangles are defined even if the device does not support scaling or the VIDIOC_G/S_CROP ioctls. Their size (and position where applicable) will be fixed in this case. All capture and output devices must support the VIDIOC_CROPCAP ioctl such that applications can determine if scaling takes place.


1.11.1. Cropping Structures

Figure 1-1. Image Cropping, Insertion and Scaling

For capture devices the coordinates of the top left corner, width and height of the area which can be sampled are given by the bounds substructure of the struct v4l2_cropcap returned by the VIDIOC_CROPCAP ioctl. To support a wide range of hardware this specification does not define an origin or units. However by convention drivers should count horizontally in unscaled samples relative to 0H (the leading edge of the horizontal sync pulse, see Figure 4-1), and vertically in ITU-R line numbers of the first field (Figure 4-2, Figure 4-3), multiplied by two if the driver can capture both fields.

The top left corner, width and height of the source rectangle, that is the area actually sampled, is given by struct v4l2_crop, using the same coordinate system as struct v4l2_cropcap. Applications can use the VIDIOC_G_CROP and VIDIOC_S_CROP ioctls to get and set this rectangle. It must lie completely within the capture boundaries and the driver may further adjust the requested size and/or position according to hardware limitations.

Each capture device has a default source rectangle, given by the defrect substructure of struct v4l2_cropcap. The center of this rectangle shall align with the center of the active picture area of the video signal, and cover what the driver writer considers the complete picture. Drivers shall reset the source rectangle to the default when the driver is first loaded, but not later.

For output devices these structures and ioctls are used accordingly, defining the target rectangle where the images will be inserted into the video signal.


1.11.2. Scaling Adjustments

Video hardware can have various cropping, insertion and scaling limitations. It may only scale up or down, support only discrete scaling factors, or have different scaling abilities in horizontal and vertical direction. Also it may not support scaling at all. At the same time the struct v4l2_crop rectangle may have to be aligned, and both the source and target rectangles may have arbitrary upper and lower size limits. In particular the maximum width and height in struct v4l2_crop may be smaller than the struct v4l2_cropcap.bounds area. Therefore, as usual, drivers are expected to adjust the requested parameters and return the actual values selected.

Applications can change the source or the target rectangle first, as they may prefer a particular image size or a certain area in the video signal. If the driver has to adjust both to satisfy hardware limitations, the last requested rectangle shall take priority, and the driver should preferably adjust the opposite one. The VIDIOC_TRY_FMT ioctl however shall not change the driver state and therefore only adjust the requested rectangle.

Suppose scaling on a video capture device is restricted to a factor 1:1 or 2:1 in either direction and the target image size must be a multiple of 16 × 16 pixels. The source cropping rectangle is set to defaults, which are also the upper limit in this example, of 640 × 400 pixels at offset 0, 0. An application requests an image size of 300 × 225 pixels, assuming video will be scaled down from the "full picture" accordingly. The driver sets the image size to the closest possible values 304 × 224, then chooses the cropping rectangle closest to the requested size, that is 608 × 224 (224 × 2:1 would exceed the limit 400). The offset 0, 0 is still valid, thus unmodified. Given the default cropping rectangle reported by VIDIOC_CROPCAP the application can easily propose another offset to center the cropping rectangle.

Now the application may insist on covering an area using a picture aspect ratio closer to the original request, so it asks for a cropping rectangle of 608 × 456 pixels. The present scaling factors limit cropping to 640 × 384, so the driver returns the cropping size 608 × 384 and adjusts the image size to the closest possible 304 × 192.


1.11.3. Examples

Source and target rectangles shall remain unchanged across closing and reopening a device, such that piping data into or out of a device will work without special preparations. More advanced applications should ensure the parameters are suitable before starting I/O.

Example 1-10. Resetting the cropping parameters

(A video capture device is assumed; change V4L2_BUF_TYPE_VIDEO_CAPTURE for other devices.)

struct v4l2_cropcap cropcap;
struct v4l2_crop crop;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

memset (&crop, 0, sizeof (crop));
crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
crop.c = cropcap.defrect; 

/* Ignore if cropping is not supported (EINVAL). */

if (-1 == ioctl (fd, VIDIOC_S_CROP, &crop)
    && errno != EINVAL) {
        perror ("VIDIOC_S_CROP");
        exit (EXIT_FAILURE);
}
      

Example 1-11. Simple downscaling

(A video capture device is assumed.)

struct v4l2_cropcap cropcap;
struct v4l2_format format;

/* cropcap is assumed to have been filled with VIDIOC_CROPCAP,
   as in the previous example. */
reset_cropping_parameters ();

/* Scale down to 1/4 size of full picture. */

memset (&format, 0, sizeof (format)); /* defaults */

format.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

format.fmt.pix.width = cropcap.defrect.width >> 1;
format.fmt.pix.height = cropcap.defrect.height >> 1;
format.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;

if (-1 == ioctl (fd, VIDIOC_S_FMT, &format)) {
        perror ("VIDIOC_S_FMT");
        exit (EXIT_FAILURE);
}

/* We could check the actual image size now, the actual scaling factor
   or if the driver can scale at all. */
        

Example 1-12. Selecting an output area

struct v4l2_cropcap cropcap;
struct v4l2_crop crop;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

memset (&crop, 0, sizeof (crop));

crop.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
crop.c = cropcap.defrect;

/* Scale the width and height to 50 % of their original size
   and center the output. */

crop.c.width /= 2;
crop.c.height /= 2;
crop.c.left += crop.c.width / 2;
crop.c.top += crop.c.height / 2;

/* Ignore if cropping is not supported (EINVAL). */

if (-1 == ioctl (fd, VIDIOC_S_CROP, &crop)
    && errno != EINVAL) {
        perror ("VIDIOC_S_CROP");
        exit (EXIT_FAILURE);
}

Example 1-13. Current scaling factor and pixel aspect

(A video capture device is assumed.)

struct v4l2_cropcap cropcap;
struct v4l2_crop crop;
struct v4l2_format format;
double hscale, vscale;
double aspect;
int dwidth, dheight;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

memset (&crop, 0, sizeof (crop));
crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_CROP, &crop)) {
        if (errno != EINVAL) {
                perror ("VIDIOC_G_CROP");
                exit (EXIT_FAILURE);
        }

        /* Cropping not supported. */
        crop.c = cropcap.defrect;
}

memset (&format, 0, sizeof (format));
format.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_FMT, &format)) {
        perror ("VIDIOC_G_FMT");
        exit (EXIT_FAILURE);
}

/* The scaling applied by the driver. */

hscale = format.fmt.pix.width / (double) crop.c.width;
vscale = format.fmt.pix.height / (double) crop.c.height;

aspect = cropcap.pixelaspect.numerator /
         (double) cropcap.pixelaspect.denominator;
aspect = aspect * hscale / vscale;

/* Devices following ITU-R BT.601 do not capture
   square pixels. For playback on a computer monitor
   we should scale the images to this size. */

dwidth = format.fmt.pix.width / aspect;
dheight = format.fmt.pix.height;
        

1.12. Streaming Parameters

Streaming parameters are intended to optimize the video capture process as well as I/O. Presently applications can request a high quality capture mode with the VIDIOC_S_PARM ioctl.

The current video standard determines a nominal number of frames per second. If less than this number of frames is to be captured or output, applications can request frame skipping or duplicating on the driver side. This is especially useful when using the read() or write() functions, which are not augmented by timestamps or sequence counters, and to avoid unnecessary data copying.
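Frame skipping is expressed through the timeperframe fraction in the streaming parameters. As an illustrative sketch (plain C, no device access; struct fract mirrors the shape of struct v4l2_fract and skip_every_other is a hypothetical helper), capturing every other frame of a 30000/1001 fps standard means doubling the nominal frame period:

```c
#include <assert.h>

/* Mirrors the numerator/denominator shape of struct v4l2_fract. */
struct fract { unsigned num, den; };

static unsigned gcd(unsigned a, unsigned b)
{
        while (b) { unsigned t = a % b; a = b; b = t; }
        return a;
}

/* Double the nominal frame period (capture every other frame) and
 * reduce the fraction, as an application might request via the
 * timeperframe field before calling VIDIOC_S_PARM. */
static struct fract skip_every_other(struct fract nominal)
{
        struct fract f = { nominal.num * 2, nominal.den };
        unsigned g = gcd(f.num, f.den);
        f.num /= g;
        f.den /= g;
        return f;
}
```

For NTSC (period 1001/30000 s) this yields 1001/15000 s, i.e. roughly 15 frames per second.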

Finally these ioctls can be used to determine the number of buffers used internally by a driver in read/write mode. For implications see the section discussing the read() function.

To get and set the streaming parameters applications call the VIDIOC_G_PARM and VIDIOC_S_PARM ioctl, respectively. They take a pointer to a struct v4l2_streamparm, which contains a union holding separate parameters for input and output devices.

These ioctls are optional, drivers need not implement them. If so, they return the EINVAL error code.


Chapter 2. Image Formats

The V4L2 API was primarily designed for devices exchanging image data with applications. The v4l2_pix_format structure defines the format and layout of an image in memory. Image formats are negotiated with the VIDIOC_S_FMT ioctl. (The explanations here focus on video capturing and output; for overlay frame buffer formats see also VIDIOC_G_FBUF.)

Table 2-1. struct v4l2_pix_format

__u32 width
        Image width in pixels.
__u32 height
        Image height in pixels.
        Applications set these fields to request an image size, drivers return the closest possible values. In case of planar formats the width and height apply to the largest plane. To avoid ambiguities drivers must return values rounded up to a multiple of the scale factor of any smaller planes. For example when the image format is YUV 4:2:0, width and height must be multiples of two.
__u32 pixelformat
        The pixel format or type of compression, set by the application. This is a little endian four character code. V4L2 defines standard RGB formats in Table 2-1, YUV formats in Section 2.5, and reserved codes in Table 2-8.
enum v4l2_field field
        Video images are typically interlaced. Applications can request to capture or output only the top or bottom field, or both fields interlaced or sequentially stored in one buffer or alternating in separate buffers. Drivers return the actual field order selected. For details see Section 3.6.
__u32 bytesperline
        Distance in bytes between the leftmost pixels in two adjacent lines.

        Both applications and drivers can set this field to request padding bytes at the end of each line. Drivers however may ignore the value requested by the application, returning width times bytes per pixel or a larger value required by the hardware. That implies applications can just set this field to zero to get a reasonable default.

        Video hardware may access padding bytes, therefore they must reside in accessible memory. Consider cases where padding bytes after the last line of an image cross a system page boundary. Input devices may write padding bytes, the value is undefined. Output devices ignore the contents of padding bytes.

        When the image format is planar the bytesperline value applies to the largest plane and is divided by the same factor as the width field for any smaller planes. For example the Cb and Cr planes of a YUV 4:2:0 image have half as many padding bytes following each line as the Y plane. To avoid ambiguities drivers must return a bytesperline value rounded up to a multiple of the scale factor.

__u32 sizeimage
        Size in bytes of the buffer to hold a complete image, set by the driver. Usually this is bytesperline times height. When the image consists of variable length compressed data this is the maximum number of bytes required to hold an image.
enum v4l2_colorspace colorspace
        This information supplements the pixelformat and must be set by the driver, see Section 2.2.
__u32 priv
        Reserved for custom (driver defined) additional information about formats. When not used drivers and applications must set this field to zero.
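To make the bytesperline and sizeimage rules concrete, the following is a hedged sketch for the planar YUV 4:2:0 case. The align parameter stands in for a hypothetical hardware row alignment (a power of two); real drivers may compute different values, but the chroma stride must be half the luma stride as required above:

```c
#include <assert.h>

/* Round v up to the next multiple of align (align a power of two). */
static unsigned align_up(unsigned v, unsigned align)
{
        return (v + align - 1) & ~(align - 1);
}

/* sizeimage for a planar YUV 4:2:0 image: one full-resolution Y
 * plane plus two chroma planes at half width and half height, each
 * row padded to the (hypothetical) hardware alignment. */
static unsigned yuv420_sizeimage(unsigned width, unsigned height,
                                 unsigned align)
{
        unsigned y_bpl = align_up(width, align);  /* luma stride   */
        unsigned c_bpl = y_bpl / 2;               /* chroma stride */

        return y_bpl * height + 2 * c_bpl * (height / 2);
}
```

For an unpadded 640 × 480 image this gives 640 × 480 + 2 × 320 × 240 = 460800 bytes.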

2.1. Standard Image Formats

In order to exchange images between drivers and applications, it is necessary to have standard image data formats which both sides will interpret the same way. V4L2 includes several such formats, and this section is intended to be an unambiguous specification of the standard image data formats in V4L2.

V4L2 drivers are not limited to these formats, however. Driver-specific formats are possible. In that case the application may depend on a codec to convert images to one of the standard formats when needed. But the data can still be stored and retrieved in the proprietary format. For example, a device may support a proprietary compressed format. Applications can still capture and save the data in the compressed format, saving much disk space, and later use a codec to convert the images to the X Windows screen format when the video is to be displayed.

Even so, ultimately, some standard formats are needed, so the V4L2 specification would not be complete without well-defined standard formats.

The V4L2 standard formats are mainly uncompressed formats. The pixels are always arranged in memory from left to right, and from top to bottom. The first byte of data in the image buffer is always for the leftmost pixel of the topmost row. Following that is the pixel immediately to its right, and so on until the end of the top row of pixels. Following the rightmost pixel of the row there may be zero or more bytes of padding to guarantee that each row of pixel data has a certain alignment. Following the pad bytes, if any, is data for the leftmost pixel of the second row from the top, and so on. The last row has just as many pad bytes after it as the other rows.

In V4L2 each format has an identifier which looks like PIX_FMT_XXX, defined in the videodev.h header file. These identifiers represent four character codes which are also listed below, however they are not the same as those used in the Windows world.


2.2. Colorspaces

[intro]

Gamma Correction

[to do]

E'R = f(R)

E'G = f(G)

E'B = f(B)

Construction of luminance and color-difference signals

[to do]

E'Y = CoeffR E'R + CoeffG E'G + CoeffB E'B

(E'R - E'Y) = E'R - CoeffR E'R - CoeffG E'G - CoeffB E'B

(E'B - E'Y) = E'B - CoeffR E'R - CoeffG E'G - CoeffB E'B

Re-normalized color-difference signals

The color-difference signals are scaled back to unity range [-0.5;+0.5]:

KB = 0.5 / (1 - CoeffB)

KR = 0.5 / (1 - CoeffR)

PB = KB (E'B - E'Y) = -0.5 (CoeffR / (1 - CoeffB)) E'R - 0.5 (CoeffG / (1 - CoeffB)) E'G + 0.5 E'B

PR = KR (E'R - E'Y) = 0.5 E'R - 0.5 (CoeffG / (1 - CoeffR)) E'G - 0.5 (CoeffB / (1 - CoeffR)) E'B

Quantization

[to do]

Y' = (Lum. Levels - 1) · E'Y + Lum. Offset

CB = (Chrom. Levels - 1) · PB + Chrom. Offset

CR = (Chrom. Levels - 1) · PR + Chrom. Offset

Rounding to the nearest integer and clamping to the range [0;255] finally yields the digital color components Y'CbCr stored in YUV images.

Example 2-1. ITU-R Rec. BT.601 color conversion

Forward Transformation

int ER, EG, EB;         /* gamma corrected RGB input [0;255] */
int Y1, Cb, Cr;         /* output [0;255] */

double r, g, b;         /* temporaries */
double y1, pb, pr;

int
clamp (double x)
{
        int r = (int)(x + 0.5); /* round to nearest */

        if (r < 0)         return 0;
        else if (r > 255)  return 255;
        else               return r;
}

r = ER / 255.0;
g = EG / 255.0;
b = EB / 255.0;

y1  =  0.299  * r + 0.587 * g + 0.114  * b;
pb  = -0.169  * r - 0.331 * g + 0.5    * b;
pr  =  0.5    * r - 0.419 * g - 0.081  * b;

Y1 = clamp (219 * y1 + 16);
Cb = clamp (224 * pb + 128);
Cr = clamp (224 * pr + 128);

/* or shorter */

y1 = 0.299 * ER + 0.587 * EG + 0.114 * EB;

Y1 = clamp ( (219 / 255.0)                    *       y1  + 16);
Cb = clamp (((224 / 255.0) / (2 - 2 * 0.114)) * (EB - y1) + 128);
Cr = clamp (((224 / 255.0) / (2 - 2 * 0.299)) * (ER - y1) + 128);
      

Inverse Transformation

int Y1, Cb, Cr;         /* gamma pre-corrected input [0;255] */
int ER, EG, EB;         /* output [0;255] */

double r, g, b;         /* temporaries */
double y1, pb, pr;

int
clamp (double x)
{
        int r = (int)(x + 0.5); /* round to nearest */

        if (r < 0)         return 0;
        else if (r > 255)  return 255;
        else               return r;
}

y1 = (255 / 219.0) * (Y1 - 16);
pb = (255 / 224.0) * (Cb - 128);
pr = (255 / 224.0) * (Cr - 128);

r = 1.0 * y1 + 0     * pb + 1.402 * pr;
g = 1.0 * y1 - 0.344 * pb - 0.714 * pr;
b = 1.0 * y1 + 1.772 * pb + 0     * pr;

ER = clamp (r * 255); /* [ok? one should prob. limit y1,pb,pr] */
EG = clamp (g * 255);
EB = clamp (b * 255);
      

Table 2-2. enum v4l2_colorspace

Identifier | Value | Description | Chromaticities[a]: Red; Green; Blue | White Point | Gamma Correction | Luminance E'Y | Quantization: Y'; Cb, Cr
V4L2_COLORSPACE_SMPTE170M | 1 | NTSC/PAL according to SMPTE 170M, ITU BT.601 | x = 0.630, y = 0.340; x = 0.310, y = 0.595; x = 0.155, y = 0.070 | x = 0.3127, y = 0.3290, Illuminant D65 | E' = 4.5 I for I ≤ 0.018; 1.099 I^0.45 - 0.099 for 0.018 < I | 0.299 E'R + 0.587 E'G + 0.114 E'B | 219 E'Y + 16; 224 PB,R + 128
V4L2_COLORSPACE_SMPTE240M | 2 | 1125-Line (US) HDTV, see SMPTE 240M | x = 0.630, y = 0.340; x = 0.310, y = 0.595; x = 0.155, y = 0.070 | x = 0.3127, y = 0.3290, Illuminant D65 | E' = 4 I for I ≤ 0.0228; 1.1115 I^0.45 - 0.1115 for 0.0228 < I | 0.212 E'R + 0.701 E'G + 0.087 E'B | 219 E'Y + 16; 224 PB,R + 128
V4L2_COLORSPACE_REC709 | 3 | HDTV and modern devices, see ITU BT.709 | x = 0.640, y = 0.330; x = 0.300, y = 0.600; x = 0.150, y = 0.060 | x = 0.3127, y = 0.3290, Illuminant D65 | E' = 4.5 I for I ≤ 0.018; 1.099 I^0.45 - 0.099 for 0.018 < I | 0.2125 E'R + 0.7154 E'G + 0.0721 E'B | 219 E'Y + 16; 224 PB,R + 128
V4L2_COLORSPACE_BT878 | 4 | Broken Bt878 extents[b], ITU BT.601 | ?; ?; ? | ? | ? | 0.299 E'R + 0.587 E'G + 0.114 E'B | 237 E'Y + 16; 224 PB,R + 128 (probably)
V4L2_COLORSPACE_470_SYSTEM_M | 5 | M/NTSC[c] according to ITU BT.470, ITU BT.601 | x = 0.67, y = 0.33; x = 0.21, y = 0.71; x = 0.14, y = 0.08 | x = 0.310, y = 0.316, Illuminant C | ? | 0.299 E'R + 0.587 E'G + 0.114 E'B | 219 E'Y + 16; 224 PB,R + 128
V4L2_COLORSPACE_470_SYSTEM_BG | 6 | 625-line PAL and SECAM systems according to ITU BT.470, ITU BT.601 | x = 0.64, y = 0.33; x = 0.29, y = 0.60; x = 0.15, y = 0.06 | x = 0.313, y = 0.329, Illuminant D65 | ? | 0.299 E'R + 0.587 E'G + 0.114 E'B | 219 E'Y + 16; 224 PB,R + 128
V4L2_COLORSPACE_JPEG | 7 | JPEG Y'CbCr, see JFIF, ITU BT.601 | ?; ?; ? | ? | ? | 0.299 E'R + 0.587 E'G + 0.114 E'B | 256 E'Y + 16[d]; 256 PB,R + 128
V4L2_COLORSPACE_SRGB | 8 | [?] | x = 0.640, y = 0.330; x = 0.300, y = 0.600; x = 0.150, y = 0.060 | x = 0.3127, y = 0.3290, Illuminant D65 | E' = 4.5 I for I ≤ 0.018; 1.099 I^0.45 - 0.099 for 0.018 < I | n/a
Notes:
a. The coordinates of the color primaries are given in the CIE system (1931).
b. The ubiquitous Bt878 video capture chip quantizes E'Y to 238 levels, yielding a range of Y' = 16 … 253, unlike Rec. 601 Y' = 16 … 235. This is not a typo in the Bt878 documentation, it has been implemented in silicon. The chroma extents are unclear.
c. No identifier exists for M/PAL which uses the chromaticities of M/NTSC, the remaining parameters are equal to B and G/PAL.
d. Note JFIF quantizes Y'PBPR in range [0;+1] and [-0.5;+0.5] to 257 levels, however Y'CbCr signals are still clamped to [0;255].

2.3. Indexed Format

In this format each pixel is represented by an 8 bit index into a 256 entry ARGB palette. It is intended for Video Output Overlays only. There are no ioctls to access the palette, this must be done with ioctls of the Linux framebuffer API.

Table 2-3. Indexed Image Format

Identifier Code   Byte 0                                                    
    Bit 7 6 5 4 3 2 1 0                                                    
V4L2_PIX_FMT_PAL8 'PAL8'   i7 i6 i5 i4 i3 i2 i1 i0                                                    

2.4. RGB Formats

Table of Contents
Packed RGB formats -- Packed RGB formats
V4L2_PIX_FMT_SBGGR8 ('BA81') -- Bayer RGB format
V4L2_PIX_FMT_SBGGR16 ('BA82') -- Bayer RGB format

Packed RGB formats

Name

Packed RGB formats -- Packed RGB formats

Description

These formats are designed to match the pixel formats of typical PC graphics frame buffers. They occupy 8, 16, 24 or 32 bits per pixel. These are all packed-pixel formats, meaning all the data for a pixel lie next to each other in memory.

When one of these formats is used, drivers shall report the colorspace V4L2_COLORSPACE_SRGB.

Table 2-1. Packed RGB Image Formats

Identifier Code   Byte 0 in memory   Byte 1   Byte 2   Byte 3
    Bit 7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0
V4L2_PIX_FMT_RGB332 'RGB1'   b1 b0 g2 g1 g0 r2 r1 r0                                                    
V4L2_PIX_FMT_RGB444 'R444'   g3 g2 g1 g0 b3 b2 b1 b0   a3 a2 a1 a0 r3 r2 r1 r0                                  
V4L2_PIX_FMT_RGB555 'RGBO'   g2 g1 g0 r4 r3 r2 r1 r0   a b4 b3 b2 b1 b0 g4 g3                                  
V4L2_PIX_FMT_RGB565 'RGBP'   g2 g1 g0 r4 r3 r2 r1 r0   b4 b3 b2 b1 b0 g5 g4 g3                                  
V4L2_PIX_FMT_RGB555X 'RGBQ'   a b4 b3 b2 b1 b0 g4 g3   g2 g1 g0 r4 r3 r2 r1 r0                                  
V4L2_PIX_FMT_RGB565X 'RGBR'   b4 b3 b2 b1 b0 g5 g4 g3   g2 g1 g0 r4 r3 r2 r1 r0                                  
V4L2_PIX_FMT_BGR24 'BGR3'   b7 b6 b5 b4 b3 b2 b1 b0   g7 g6 g5 g4 g3 g2 g1 g0   r7 r6 r5 r4 r3 r2 r1 r0                
V4L2_PIX_FMT_RGB24 'RGB3'   r7 r6 r5 r4 r3 r2 r1 r0   g7 g6 g5 g4 g3 g2 g1 g0   b7 b6 b5 b4 b3 b2 b1 b0                
V4L2_PIX_FMT_BGR32 'BGR4'   b7 b6 b5 b4 b3 b2 b1 b0   g7 g6 g5 g4 g3 g2 g1 g0   r7 r6 r5 r4 r3 r2 r1 r0   a7 a6 a5 a4 a3 a2 a1 a0
V4L2_PIX_FMT_RGB32 'RGB4'   r7 r6 r5 r4 r3 r2 r1 r0   g7 g6 g5 g4 g3 g2 g1 g0   b7 b6 b5 b4 b3 b2 b1 b0   a7 a6 a5 a4 a3 a2 a1 a0

Bit 7 is the most significant bit. The value of a = alpha bits is undefined when reading from the driver, ignored when writing to the driver, except when alpha blending has been negotiated for a Video Overlay or Video Output Overlay.

Example 2-1. V4L2_PIX_FMT_BGR24 4 × 4 pixelimage

Byte Order. Each cell is one byte.

start + 0: B00 G00 R00 B01 G01 R01 B02 G02 R02 B03 G03 R03
start + 12: B10 G10 R10 B11 G11 R11 B12 G12 R12 B13 G13 R13
start + 24: B20 G20 R20 B21 G21 R21 B22 G22 R22 B23 G23 R23
start + 36: B30 G30 R30 B31 G31 R31 B32 G32 R32 B33 G33 R33

Important: Drivers may interpret these formats differently.

Some RGB formats above are uncommon and were probably defined in error. Drivers may interpret them as in Table 2-2.

Table 2-2. Packed RGB Image Formats (corrected)

Identifier Code   Byte 0 in memory   Byte 1   Byte 2   Byte 3
    Bit 7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0
V4L2_PIX_FMT_RGB332 'RGB1'   r2 r1 r0 g2 g1 g0 b1 b0                                                    
V4L2_PIX_FMT_RGB444 'R444'   g3 g2 g1 g0 b3 b2 b1 b0   a3 a2 a1 a0 r3 r2 r1 r0                                  
V4L2_PIX_FMT_RGB555 'RGBO'   g2 g1 g0 b4 b3 b2 b1 b0   a r4 r3 r2 r1 r0 g4 g3                                  
V4L2_PIX_FMT_RGB565 'RGBP'   g2 g1 g0 b4 b3 b2 b1 b0   r4 r3 r2 r1 r0 g5 g4 g3                                  
V4L2_PIX_FMT_RGB555X 'RGBQ'   a r4 r3 r2 r1 r0 g4 g3   g2 g1 g0 b4 b3 b2 b1 b0                                  
V4L2_PIX_FMT_RGB565X 'RGBR'   r4 r3 r2 r1 r0 g5 g4 g3   g2 g1 g0 b4 b3 b2 b1 b0                                  
V4L2_PIX_FMT_BGR24 'BGR3'   b7 b6 b5 b4 b3 b2 b1 b0   g7 g6 g5 g4 g3 g2 g1 g0   r7 r6 r5 r4 r3 r2 r1 r0                
V4L2_PIX_FMT_RGB24 'RGB3'   r7 r6 r5 r4 r3 r2 r1 r0   g7 g6 g5 g4 g3 g2 g1 g0   b7 b6 b5 b4 b3 b2 b1 b0                
V4L2_PIX_FMT_BGR32 'BGR4'   b7 b6 b5 b4 b3 b2 b1 b0   g7 g6 g5 g4 g3 g2 g1 g0   r7 r6 r5 r4 r3 r2 r1 r0   a7 a6 a5 a4 a3 a2 a1 a0
V4L2_PIX_FMT_RGB32 'RGB4'   a7 a6 a5 a4 a3 a2 a1 a0   r7 r6 r5 r4 r3 r2 r1 r0   g7 g6 g5 g4 g3 g2 g1 g0   b7 b6 b5 b4 b3 b2 b1 b0

A test utility to determine which RGB formats a driver actually supports is available from the LinuxTV v4l-dvb repository. See http://linuxtv.org/repo/ for access instructions.

V4L2_PIX_FMT_SBGGR8 ('BA81')

Name

V4L2_PIX_FMT_SBGGR8 -- Bayer RGB format

Description

This is commonly the native format of digital cameras, reflecting the arrangement of sensors on the CCD device. Only one red, green or blue value is given for each pixel. Missing components must be interpolated from neighbouring pixels. From left to right the first row consists of a blue and green value, the second row of a green and red value. This scheme repeats to the right and down for every two columns and rows.

Example 2-1. V4L2_PIX_FMT_SBGGR8 4 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: B00 G01 B02 G03
start + 4: G10 R11 G12 R13
start + 8: B20 G21 B22 G23
start + 12: G30 R31 G32 R33
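The repeating two-by-two BGGR pattern can be expressed as a small lookup. This is only an illustration of the mosaic described above; sbggr8_color is a hypothetical helper, not part of the API:

```c
#include <assert.h>

/* Which component the sensor cell at (row, col) samples in the
 * SBGGR8 mosaic: B and G alternate on even rows, G and R on odd
 * rows, repeating every two columns and rows. */
static char sbggr8_color(unsigned row, unsigned col)
{
        if (row % 2 == 0)
                return (col % 2 == 0) ? 'B' : 'G';
        else
                return (col % 2 == 0) ? 'G' : 'R';
}
```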

V4L2_PIX_FMT_SBGGR16 ('BA82')

Name

V4L2_PIX_FMT_SBGGR16 -- Bayer RGB format

Description

This format is similar to V4L2_PIX_FMT_SBGGR8, except each pixel has a depth of 16 bits. The least significant byte is stored at lower memory addresses (little-endian). Note the actual sampling precision may be lower than 16 bits, for example 10 bits per pixel with values in range 0 to 1023.

Example 2-1. V4L2_PIX_FMT_SBGGR16 4 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: B00low B00high G01low G01high B02low B02high G03low G03high
start + 8: G10low G10high R11low R11high G12low G12high R13low R13high
start + 16: B20low B20high G21low G21high B22low B22high G23low G23high
start + 24: G30low G30high R31low R31high G32low G32high R33low R33high
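Because the samples are stored least significant byte first, portable code should assemble each 16-bit value explicitly rather than relying on the host byte order. A minimal sketch (le16_sample is a hypothetical helper):

```c
#include <assert.h>
#include <stdint.h>

/* Assemble one 16-bit sample from two bytes stored in
 * little-endian order, independent of host endianness. */
static uint16_t le16_sample(const uint8_t *p)
{
        return (uint16_t)(p[0] | (p[1] << 8));
}
```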

2.5. YUV Formats

Table of Contents
Packed YUV formats -- Packed YUV formats
V4L2_PIX_FMT_GREY ('GREY') -- Grey-scale image
V4L2_PIX_FMT_Y16 ('Y16 ') -- Grey-scale image
V4L2_PIX_FMT_YUYV ('YUYV') -- Packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2
V4L2_PIX_FMT_UYVY ('UYVY') -- Variation of V4L2_PIX_FMT_YUYV with different order of samples in memory
V4L2_PIX_FMT_Y41P ('Y41P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1
V4L2_PIX_FMT_YVU420 ('YV12'), V4L2_PIX_FMT_YUV420 ('YU12') -- Planar formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0
V4L2_PIX_FMT_YVU410 ('YVU9'), V4L2_PIX_FMT_YUV410 ('YUV9') -- Planar formats with ¼ horizontal and vertical chroma resolution, also known as YUV 4:1:0
V4L2_PIX_FMT_YUV422P ('422P') -- Format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Planar layout as opposed to V4L2_PIX_FMT_YUYV
V4L2_PIX_FMT_YUV411P ('411P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1. Planar layout as opposed to V4L2_PIX_FMT_Y41P
V4L2_PIX_FMT_NV12 ('NV12'), V4L2_PIX_FMT_NV21 ('NV21') -- Formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0. One luminance and one chrominance plane with alternating chroma samples as opposed to V4L2_PIX_FMT_YVU420

YUV is the format native to TV broadcast and composite video signals. It separates the brightness information (Y) from the color information (U and V or Cb and Cr). The color information consists of red and blue color difference signals, this way the green component can be reconstructed by subtracting from the brightness component. See Section 2.2 for conversion examples. YUV was chosen because early television would only transmit brightness information. To add color in a way compatible with existing receivers a new signal carrier was added to transmit the color difference signals. Secondly, in the YUV format the U and V components usually have lower resolution than the Y component. This is an analog video compression technique taking advantage of a property of the human visual system, which is more sensitive to brightness information.

Packed YUV formats

Name

Packed YUV formats -- Packed YUV formats

Description

Similar to the packed RGB formats these formats store the Y, Cb and Cr component of each pixel in one 16 or 32 bit word.

Table 2-1. Packed YUV Image Formats

Identifier Code   Byte 0 in memory   Byte 1   Byte 2   Byte 3
    Bit 7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0
V4L2_PIX_FMT_YUV444 'Y444'   Cb3 Cb2 Cb1 Cb0 Cr3 Cr2 Cr1 Cr0   a3 a2 a1 a0 Y'3 Y'2 Y'1 Y'0                                  
V4L2_PIX_FMT_YUV555 'YUVO'   Cb2 Cb1 Cb0 Cr4 Cr3 Cr2 Cr1 Cr0   a Y'4 Y'3 Y'2 Y'1 Y'0 Cb4 Cb3                                  
V4L2_PIX_FMT_YUV565 'YUVP'   Cb2 Cb1 Cb0 Cr4 Cr3 Cr2 Cr1 Cr0   Y'4 Y'3 Y'2 Y'1 Y'0 Cb5 Cb4 Cb3                                  
V4L2_PIX_FMT_YUV32 'YUV4'   a7 a6 a5 a4 a3 a2 a1 a0   Y'7 Y'6 Y'5 Y'4 Y'3 Y'2 Y'1 Y'0   Cb7 Cb6 Cb5 Cb4 Cb3 Cb2 Cb1 Cb0   Cr7 Cr6 Cr5 Cr4 Cr3 Cr2 Cr1 Cr0

Bit 7 is the most significant bit. The value of a = alpha bits is undefined when reading from the driver, ignored when writing to the driver, except when alpha blending has been negotiated for a Video Overlay or Video Output Overlay.

V4L2_PIX_FMT_GREY ('GREY')

Name

V4L2_PIX_FMT_GREY -- Grey-scale image

Description

This is a grey-scale image. It is really a degenerate Y'CbCr format which simply contains no Cb or Cr data.

Example 2-1. V4L2_PIX_FMT_GREY 4 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33

V4L2_PIX_FMT_Y16 ('Y16 ')

Name

V4L2_PIX_FMT_Y16 -- Grey-scale image

Description

This is a grey-scale image with a depth of 16 bits per pixel. The least significant byte is stored at lower memory addresses (little-endian). Note the actual sampling precision may be lower than 16 bits, for example 10 bits per pixel with values in range 0 to 1023.

Example 2-1. V4L2_PIX_FMT_Y16 4 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00low Y'00high Y'01low Y'01high Y'02low Y'02high Y'03low Y'03high
start + 8: Y'10low Y'10high Y'11low Y'11high Y'12low Y'12high Y'13low Y'13high
start + 16: Y'20low Y'20high Y'21low Y'21high Y'22low Y'22high Y'23low Y'23high
start + 24: Y'30low Y'30high Y'31low Y'31high Y'32low Y'32high Y'33low Y'33high

V4L2_PIX_FMT_YUYV ('YUYV')

Name

V4L2_PIX_FMT_YUYV -- Packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2

Description

In this format each four bytes is two pixels. Each four bytes is two Y's, a Cb and a Cr. Each Y goes to one of the pixels, and the Cb and Cr belong to both pixels. As you can see, the Cr and Cb components have half the horizontal resolution of the Y component. V4L2_PIX_FMT_YUYV is known in the Windows environment as YUY2.

Example 2-1. V4L2_PIX_FMT_YUYV 4 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Cb00 Y'01 Cr00 Y'02 Cb01 Y'03 Cr01
start + 8: Y'10 Cb10 Y'11 Cr10 Y'12 Cb11 Y'13 Cr11
start + 16: Y'20 Cb20 Y'21 Cr20 Y'22 Cb21 Y'23 Cr21
start + 24: Y'30 Cb30 Y'31 Cr30 Y'32 Cb31 Y'33 Cr31

Color Sample Location.

  0   1   2   3
0 Y C Y   Y C Y
1 Y C Y   Y C Y
2 Y C Y   Y C Y
3 Y C Y   Y C Y
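The byte layout above can be addressed directly. As an illustrative sketch (yuyv_pixel is a hypothetical helper, assuming a buffer with the given bytesperline), fetching the components of pixel (x, y):

```c
#include <assert.h>
#include <stdint.h>

/* Fetch Y', Cb, Cr for pixel (x, y) from a YUYV buffer. Each pair
 * of horizontally adjacent pixels occupies four bytes (Y'0 Cb Y'1 Cr)
 * and shares one Cb/Cr sample. */
static void yuyv_pixel(const uint8_t *buf, unsigned bytesperline,
                       unsigned x, unsigned y,
                       uint8_t *Y, uint8_t *Cb, uint8_t *Cr)
{
        const uint8_t *p = buf + y * bytesperline + (x / 2) * 4;

        *Y  = p[(x & 1) * 2];   /* Y'0 at offset 0, Y'1 at offset 2 */
        *Cb = p[1];
        *Cr = p[3];
}
```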

V4L2_PIX_FMT_UYVY ('UYVY')

Name

V4L2_PIX_FMT_UYVY -- Variation of V4L2_PIX_FMT_YUYV with different order of samples in memory

Description

In this format each four bytes is two pixels. Each four bytes is two Y's, a Cb and a Cr. Each Y goes to one of the pixels, and the Cb and Cr belong to both pixels. As you can see, the Cr and Cb components have half the horizontal resolution of the Y component.

Example 2-1. V4L2_PIX_FMT_UYVY 4 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: Cb00 Y'00 Cr00 Y'01 Cb01 Y'02 Cr01 Y'03
start + 8: Cb10 Y'10 Cr10 Y'11 Cb11 Y'12 Cr11 Y'13
start + 16: Cb20 Y'20 Cr20 Y'21 Cb21 Y'22 Cr21 Y'23
start + 24: Cb30 Y'30 Cr30 Y'31 Cb31 Y'32 Cr31 Y'33

Color Sample Location.

  0   1   2   3
0 Y C Y   Y C Y
1 Y C Y   Y C Y
2 Y C Y   Y C Y
3 Y C Y   Y C Y

V4L2_PIX_FMT_Y41P ('Y41P')

Name

V4L2_PIX_FMT_Y41P -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1

Description

In this format each 12 bytes is eight pixels. In the twelve bytes are two CbCr pairs and eight Y's. The first CbCr pair goes with the first four Y's, and the second CbCr pair goes with the other four Y's. The Cb and Cr components have one fourth the horizontal resolution of the Y component.

Do not confuse this format with V4L2_PIX_FMT_YUV411P. Y41P is derived from "YUV 4:1:1 packed", while YUV411P stands for "YUV 4:1:1 planar".

Example 2-1. V4L2_PIX_FMT_Y41P 8 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: Cb00 Y'00 Cr00 Y'01 Cb01 Y'02 Cr01 Y'03 Y'04 Y'05 Y'06 Y'07
start + 12: Cb10 Y'10 Cr10 Y'11 Cb11 Y'12 Cr11 Y'13 Y'14 Y'15 Y'16 Y'17
start + 24: Cb20 Y'20 Cr20 Y'21 Cb21 Y'22 Cr21 Y'23 Y'24 Y'25 Y'26 Y'27
start + 36: Cb30 Y'30 Cr30 Y'31 Cb31 Y'32 Cr31 Y'33 Y'34 Y'35 Y'36 Y'37

Color Sample Location.

  0   1   2   3   4   5   6   7
0 Y   Y C Y   Y   Y   Y C Y   Y
1 Y   Y C Y   Y   Y   Y C Y   Y
2 Y   Y C Y   Y   Y   Y C Y   Y
3 Y   Y C Y   Y   Y   Y C Y   Y

V4L2_PIX_FMT_YVU420 ('YV12'), V4L2_PIX_FMT_YUV420 ('YU12')

Name

V4L2_PIX_FMT_YVU420, V4L2_PIX_FMT_YUV420 -- Planar formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0

Description

These are planar formats, as opposed to a packed format. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. For V4L2_PIX_FMT_YVU420, the Cr plane immediately follows the Y plane in memory. The Cr plane is half the width and half the height of the Y plane (and of the image). Each Cr belongs to four pixels, a two-by-two square of the image. For example, Cr0 belongs to Y'00, Y'01, Y'10, and Y'11. Following the Cr plane is the Cb plane, just like the Cr plane. V4L2_PIX_FMT_YUV420 is the same except the Cb plane comes first, then the Cr plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have half as many pad bytes after their rows. In other words, two Cx rows (including padding) are exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YVU420 4 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cr00 Cr01    
start + 18: Cr10 Cr11    
start + 20: Cb00 Cb01    
start + 22: Cb10 Cb11    

Color Sample Location.

  0   1   2   3
0 Y   Y   Y   Y
    C       C  
1 Y   Y   Y   Y
             
2 Y   Y   Y   Y
    C       C  
3 Y   Y   Y   Y
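Assuming no row padding, the plane offsets of the example above follow directly from the width and height. A sketch (yvu420_offsets is a hypothetical helper, illustration only):

```c
#include <assert.h>

/* Byte offsets of the three planes of V4L2_PIX_FMT_YVU420
 * (Y first, then Cr, then Cb), assuming no row padding. */
struct yvu420_planes { unsigned y, cr, cb; };

static struct yvu420_planes
yvu420_offsets(unsigned width, unsigned height)
{
        struct yvu420_planes p;

        p.y  = 0;
        p.cr = width * height;                    /* after the Y plane  */
        p.cb = p.cr + (width / 2) * (height / 2); /* after the Cr plane */
        return p;
}
```

For the 4 × 4 example this reproduces the offsets shown: Y at 0, Cr at 16, Cb at 20.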

V4L2_PIX_FMT_YVU410 ('YVU9'), V4L2_PIX_FMT_YUV410 ('YUV9')

Name

V4L2_PIX_FMT_YVU410, V4L2_PIX_FMT_YUV410 -- Planar formats with ¼ horizontal and vertical chroma resolution, also known as YUV 4:1:0

Description

These are planar formats, as opposed to a packed format. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. For V4L2_PIX_FMT_YVU410, the Cr plane immediately follows the Y plane in memory. The Cr plane is ¼ the width and ¼ the height of the Y plane (and of the image). Each Cr belongs to 16 pixels, a four-by-four square of the image. Following the Cr plane is the Cb plane, just like the Cr plane. V4L2_PIX_FMT_YUV410 is the same, except the Cb plane comes first, then the Cr plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have ¼ as many pad bytes after their rows. In other words, four Cx rows (including padding) are exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YVU410 4 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cr00      
start + 17: Cb00      

Color Sample Location.

  0   1   2   3
0 Y   Y   Y   Y
             
1 Y   Y   Y   Y
        C      
2 Y   Y   Y   Y
             
3 Y   Y   Y   Y

V4L2_PIX_FMT_YUV422P ('422P')

Name

V4L2_PIX_FMT_YUV422P -- Format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Planar layout as opposed to V4L2_PIX_FMT_YUYV

Description

This format is not commonly used. This is a planar version of the YUYV format. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. The Cb plane immediately follows the Y plane in memory. The Cb plane is half the width of the Y plane (and of the image). Each Cb belongs to two pixels. For example, Cb0 belongs to Y'00 and Y'01. Following the Cb plane is the Cr plane, just like the Cb plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have half as many pad bytes after their rows. In other words, two Cx rows (including padding) are exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YUV422P 4 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cb00 Cb01    
start + 18: Cb10 Cb11    
start + 20: Cb20 Cb21    
start + 22: Cb30 Cb31    
start + 24: Cr00 Cr01    
start + 26: Cr10 Cr11    
start + 28: Cr20 Cr21    
start + 30: Cr30 Cr31    

Color Sample Location.

  0   1   2   3
0 Y C Y   Y C Y
1 Y C Y   Y C Y
2 Y C Y   Y C Y
3 Y C Y   Y C Y
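The 4:2:2 planar layout can be computed the same way. This is an illustrative sketch, not a V4L2 API function; `yuv422p_offsets()` is a hypothetical name. The chroma planes are half the Y width (and half the Y bytes-per-line, including padding) but full height:

```c
#include <assert.h>
#include <stddef.h>

/* Plane offsets for V4L2_PIX_FMT_YUV422P: Y, then Cb, then Cr.
 * Each chroma plane is half the width and the full height of the
 * Y plane. */
static void yuv422p_offsets(unsigned height, unsigned y_bytesperline,
                            size_t *cb_off, size_t *cr_off, size_t *total)
{
        size_t c_plane = (size_t)(y_bytesperline / 2) * height;

        *cb_off = (size_t)y_bytesperline * height;  /* Cb follows Y  */
        *cr_off = *cb_off + c_plane;                /* Cr follows Cb */
        *total  = *cr_off + c_plane;
}
```

For the unpadded 4 × 4 example above, Cb starts at byte 16 and Cr at byte 24, as in the byte order table.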

V4L2_PIX_FMT_YUV411P ('411P')

Name

V4L2_PIX_FMT_YUV411P -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1. Planar layout as opposed to V4L2_PIX_FMT_Y41P

Description

This format is not commonly used. This is a planar format similar to the 4:2:2 planar format except with half as many chroma samples. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. The Cb plane immediately follows the Y plane in memory. The Cb plane is ¼ the width of the Y plane (and of the image). Each Cb belongs to 4 pixels all on the same row. For example, Cb0 belongs to Y'00, Y'01, Y'02 and Y'03. Following the Cb plane is the Cr plane, just like the Cb plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have ¼ as many pad bytes after their rows. In other words, four Cx rows (including padding) are exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YUV411P 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cb00      
start + 17: Cb10      
start + 18: Cb20      
start + 19: Cb30      
start + 20: Cr00      
start + 21: Cr10      
start + 22: Cr20      
start + 23: Cr30      

Color Sample Location.

  0   1   2   3
0 Y   Y C Y   Y
1 Y   Y C Y   Y
2 Y   Y C Y   Y
3 Y   Y C Y   Y

V4L2_PIX_FMT_NV12 ('NV12'), V4L2_PIX_FMT_NV21 ('NV21')

Name

V4L2_PIX_FMT_NV12, V4L2_PIX_FMT_NV21 -- Formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0. One luminance and one chrominance plane with alternating chroma samples as opposed to V4L2_PIX_FMT_YVU420

Description

These are two-plane versions of the YUV 4:2:0 format. The three components are separated into two sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. For V4L2_PIX_FMT_NV12, a combined CbCr plane immediately follows the Y plane in memory. The CbCr plane is the same width, in bytes, as the Y plane (and of the image), but is half as tall in pixels. Each CbCr pair belongs to four pixels. For example, Cb0/Cr0 belongs to Y'00, Y'01, Y'10, Y'11. V4L2_PIX_FMT_NV21 is the same except the Cb and Cr bytes are swapped: the CrCb plane starts with a Cr byte.

If the Y plane has pad bytes after each row, then the CbCr plane has as many pad bytes after its rows.

Example 2-1. V4L2_PIX_FMT_NV12 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cb00 Cr00 Cb01 Cr01
start + 20: Cb10 Cr10 Cb11 Cr11

Color Sample Location.

  0   1   2   3
0 Y   Y   Y   Y
    C       C  
1 Y   Y   Y   Y
             
2 Y   Y   Y   Y
    C       C  
3 Y   Y   Y   Y
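The two-plane layout makes the size arithmetic particularly simple. The sketch below is illustrative only; `nv12_layout()` is a hypothetical helper, not a V4L2 API call. Because the CbCr plane has the same bytes-per-line as the Y plane and half its height, the total image size is 3/2 of the Y plane size:

```c
#include <assert.h>
#include <stddef.h>

/* Layout for V4L2_PIX_FMT_NV12 / NV21: a Y plane followed by one
 * interleaved chroma plane with the same bytes-per-line and half
 * the height. */
static void nv12_layout(unsigned height, unsigned y_bytesperline,
                        size_t *chroma_off, size_t *total)
{
        *chroma_off = (size_t)y_bytesperline * height;
        *total = *chroma_off + (size_t)y_bytesperline * (height / 2);
}
```

For the unpadded 4 × 4 example above, the CbCr plane starts at byte 16 and the image is 24 bytes, matching the byte order table.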

2.6. Compressed Formats

Table 2-7. Compressed Image Formats

Identifier Code Details
V4L2_PIX_FMT_JPEG 'JPEG' TBD. See also VIDIOC_G_JPEGCOMP, VIDIOC_S_JPEGCOMP.
V4L2_PIX_FMT_MPEG 'MPEG' MPEG stream. The actual format is determined by the extended control V4L2_CID_MPEG_STREAM_TYPE, see Table 1-2.

2.7. Reserved Format Identifiers

These formats are not defined by this specification, they are just listed for reference and to avoid naming conflicts. If you want to register your own format, send an e-mail to the V4L mailing list https://listman.redhat.com/mailman/listinfo/video4linux-list for inclusion in the videodev.h file. If you want to share your format with other developers, add a link to your documentation and send a copy to the maintainer of this document, Michael Schimek, for inclusion in this section. If you think your format should be listed in a standard format section, please make a proposal on the V4L mailing list.

Table 2-8. Reserved Image Formats

Identifier Code Details
V4L2_PIX_FMT_DV 'dvsd' unknown
V4L2_PIX_FMT_ET61X251 'E625' Compressed format of the ET61X251 driver.
V4L2_PIX_FMT_HI240 'HI24'

8 bit RGB format used by the BTTV driver,http://bytesex.org/bttv/

V4L2_PIX_FMT_HM12 'HM12'

YUV 4:2:0 format used by theIVTV driver, http://www.ivtvdriver.org/

The format is documented in thekernel sources in the fileDocumentation/video4linux/cx2341x/README.hm12

V4L2_PIX_FMT_MJPEG 'MJPG' Compressed format used by the Zoran driver
V4L2_PIX_FMT_PWC1 'PWC1' Compressed format of the PWC driver.
V4L2_PIX_FMT_PWC2 'PWC2' Compressed format of the PWC driver.
V4L2_PIX_FMT_SN9C10X 'S910' Compressed format of the SN9C102 driver.
V4L2_PIX_FMT_WNVA 'WNVA'

Used by the Winnov Videum driver, http://www.thedirks.org/winnov/

V4L2_PIX_FMT_YYUV 'YYUV' unknown
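The Code column entries in these tables are FourCC values: four ASCII characters packed into a 32-bit little-endian integer, which is what the v4l2_fourcc() macro in the kernel's videodev2.h header produces. A plain-C equivalent, shown only for illustration:

```c
#include <assert.h>
#include <stdint.h>

/* Pack four ASCII characters into a FourCC pixel format identifier,
 * least significant byte first (equivalent to v4l2_fourcc()). */
static uint32_t fourcc(char a, char b, char c, char d)
{
        return (uint32_t)a | ((uint32_t)b << 8) |
               ((uint32_t)c << 16) | ((uint32_t)d << 24);
}
```

For example, fourcc('Y','U','Y','V') yields 0x56595559, the value of V4L2_PIX_FMT_YUYV.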

Chapter 3. Input/Output

The V4L2 API defines several different methods to read from or write to a device. All drivers exchanging data with applications must support at least one of them.

The classic I/O method using the read() and write() function is automatically selected after opening a V4L2 device. When the driver does not support this method, attempts to read or write will fail at any time.

Other methods must be negotiated. To select the streaming I/O method with memory mapped or user buffers, applications call the VIDIOC_REQBUFS ioctl. The asynchronous I/O method is not defined yet.

Video overlay can be considered another I/O method, although the application does not directly receive the image data. It is selected by initiating video overlay with the VIDIOC_S_FMT ioctl. For more information see Section 4.2.

Generally exactly one I/O method, including overlay, is associated with each file descriptor. The only exceptions are applications not exchanging data with a driver ("panel applications", see Section 1.1) and drivers permitting simultaneous video capturing and overlay using the same file descriptor, for compatibility with V4L and earlier versions of V4L2.

VIDIOC_S_FMT and VIDIOC_REQBUFS would permit this to some degree, but for simplicity drivers need not support switching the I/O method (after first switching away from read/write) other than by closing and reopening the device.

The following sections describe the various I/O methods in more detail.


3.1. Read/Write

Input and output devices support the read() and write() function, respectively, when the V4L2_CAP_READWRITE flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl is set.

Drivers may need the CPU to copy the data, but they may also support DMA to or from user memory, so this I/O method is not necessarily less efficient than other methods merely exchanging buffer pointers. It is considered inferior though because no meta-information like frame counters or timestamps is passed. This information is necessary to recognize frame dropping and to synchronize with other data streams. However, this is also the simplest I/O method, requiring little or no setup to exchange data. It permits command line stunts like this (the vidctrl tool is fictitious):

> vidctrl /dev/video --input=0 --format=YUYV --size=352x288
> dd if=/dev/video of=myimage.422 bs=202752 count=1

To read from the device applications use the read() function, to write the write() function. Drivers must implement one I/O method if they exchange data with applications, but it need not be this one.[12] When reading or writing is supported, the driver must also support the select() and poll() function.[13]
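A frame-reading loop for this method can be sketched as below. Note that read() may return fewer bytes than requested, so a capture application should loop until the frame is complete; `read_full()` is a hypothetical helper, not a V4L2 call, and /dev/zero stands in for a video device node so the sketch runs anywhere:

```c
#include <assert.h>
#include <fcntl.h>
#include <stddef.h>
#include <unistd.h>

/* Read exactly `size` bytes unless an error or end of stream occurs.
 * Returns the number of bytes read, or -1 on error. */
static ssize_t read_full(int fd, unsigned char *buf, size_t size)
{
        size_t got = 0;

        while (got < size) {
                ssize_t n = read(fd, buf + got, size - got);
                if (n < 0)
                        return -1;      /* I/O error */
                if (n == 0)
                        break;          /* end of stream */
                got += (size_t)n;
        }
        return (ssize_t)got;
}
```

With a real device the application would open /dev/video0 instead, after negotiating the format, and pass a buffer of the negotiated image size (352 × 288 YUYV is 202752 bytes, as in the dd example above).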


3.2. Streaming I/O (Memory Mapping)

Input and output devices support this I/O method when the V4L2_CAP_STREAMING flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl is set. There are two streaming methods; to determine if the memory mapping flavor is supported, applications must call the VIDIOC_REQBUFS ioctl.

Streaming is an I/O method where only pointers to buffers are exchanged between application and driver, the data itself is not copied. Memory mapping is primarily intended to map buffers in device memory into the application's address space. Device memory can be for example the video memory on a graphics card with a video capture add-on. However, being the most efficient I/O method available for a long time, many other drivers support streaming as well, allocating buffers in DMA-able main memory.

A driver can support many sets of buffers. Each set is identified by a unique buffer type value. The sets are independent and each set can hold a different type of data. To access different sets at the same time different file descriptors must be used.[14]

To allocate device buffers applications call the VIDIOC_REQBUFS ioctl with the desired number of buffers and buffer type, for example V4L2_BUF_TYPE_VIDEO_CAPTURE. This ioctl can also be used to change the number of buffers or to free the allocated memory, provided none of the buffers are still mapped.

Before applications can access the buffers they must map them into their address space with the mmap() function. The location of the buffers in device memory can be determined with the VIDIOC_QUERYBUF ioctl. The m.offset and length returned in a struct v4l2_buffer are passed as sixth and second parameter to the mmap() function. The offset and length values must not be modified. Remember the buffers are allocated in physical memory, as opposed to virtual memory which can be swapped out to disk. Applications should free the buffers as soon as possible with the munmap() function.

Example 3-1. Mapping buffers

struct v4l2_requestbuffers reqbuf;
struct {
        void *start;
        size_t length;
} *buffers;
unsigned int i;

memset (&reqbuf, 0, sizeof (reqbuf));
reqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
reqbuf.memory = V4L2_MEMORY_MMAP;
reqbuf.count = 20;

if (-1 == ioctl (fd, VIDIOC_REQBUFS, &reqbuf)) {
        if (errno == EINVAL)
                printf ("Video capturing or mmap-streaming is not supported\n");
        else
                perror ("VIDIOC_REQBUFS");

        exit (EXIT_FAILURE);
}

/* We want at least five buffers. */

if (reqbuf.count < 5) {
        /* You may need to free the buffers here. */
        printf ("Not enough buffer memory\n");
        exit (EXIT_FAILURE);
}

buffers = calloc (reqbuf.count, sizeof (*buffers));
assert (buffers != NULL);

for (i = 0; i < reqbuf.count; i++) {
        struct v4l2_buffer buffer;

        memset (&buffer, 0, sizeof (buffer));
        buffer.type = reqbuf.type;
        buffer.memory = V4L2_MEMORY_MMAP;
        buffer.index = i;

        if (-1 == ioctl (fd, VIDIOC_QUERYBUF, &buffer)) {
                perror ("VIDIOC_QUERYBUF");
                exit (EXIT_FAILURE);
        }

        buffers[i].length = buffer.length; /* remember for munmap() */

        buffers[i].start = mmap (NULL, buffer.length,
                                 PROT_READ | PROT_WRITE, /* recommended */
                                 MAP_SHARED,             /* recommended */
                                 fd, buffer.m.offset);

        if (MAP_FAILED == buffers[i].start) {
                /* If you do not exit here you should unmap() and free()
                   the buffers mapped so far. */
                perror ("mmap");
                exit (EXIT_FAILURE);
        }
}

/* Cleanup. */

for (i = 0; i < reqbuf.count; i++)
        munmap (buffers[i].start, buffers[i].length);
      

Conceptually streaming drivers maintain two buffer queues, an incoming and an outgoing queue. They separate the synchronous capture or output operation locked to a video clock from the application which is subject to random disk or network delays and preemption by other processes, thereby reducing the probability of data loss. The queues are organized as FIFOs; buffers will be output in the order enqueued in the incoming FIFO, and were captured in the order dequeued from the outgoing FIFO.

The driver may require a minimum number of buffers enqueued at all times to function; apart from this no limit exists on the number of buffers applications can enqueue in advance, or dequeue and process. They can also enqueue in a different order than buffers have been dequeued, and the driver can fill enqueued empty buffers in any order.[15] The index number of a buffer (struct v4l2_buffer index) plays no role here, it only identifies the buffer.

Initially all mapped buffers are in dequeued state, inaccessible by the driver. For capturing applications it is customary to first enqueue all mapped buffers, then to start capturing and enter the read loop. Here the application waits until a filled buffer can be dequeued, and re-enqueues the buffer when the data is no longer needed. Output applications fill and enqueue buffers; when enough buffers are stacked up the output is started with VIDIOC_STREAMON. In the write loop, when the application runs out of free buffers, it must wait until an empty buffer can be dequeued and reused.

To enqueue and dequeue a buffer applications use the VIDIOC_QBUF and VIDIOC_DQBUF ioctl. The status of a buffer being mapped, enqueued, full or empty can be determined at any time using the VIDIOC_QUERYBUF ioctl. Two methods exist to suspend execution of the application until one or more buffers can be dequeued. By default VIDIOC_DQBUF blocks when no buffer is in the outgoing queue. When the O_NONBLOCK flag was given to the open() function, VIDIOC_DQBUF returns immediately with an EAGAIN error code when no buffer is available. The select() or poll() function are always available.

To start and stop capturing or output applications call the VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctl. Note VIDIOC_STREAMOFF removes all buffers from both queues as a side effect. Since there is no notion of doing anything "now" on a multitasking system, if an application needs to synchronize with another event it should examine the struct v4l2_buffer timestamp of captured buffers, or set the field before enqueuing buffers for output.

Drivers implementing memory mapping I/O must support the VIDIOC_REQBUFS, VIDIOC_QUERYBUF, VIDIOC_QBUF, VIDIOC_DQBUF, VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctl, the mmap(), munmap(), select() and poll() function.[16]

[capture example]


3.3. Streaming I/O (User Pointers)

Input and output devices support this I/O method when the V4L2_CAP_STREAMING flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl is set. Whether the particular user pointer method (not only memory mapping) is supported must be determined by calling the VIDIOC_REQBUFS ioctl.

This I/O method combines advantages of the read/write and memory mapping methods. Buffers are allocated by the application itself, and can reside for example in virtual or shared memory. Only pointers to data are exchanged; these pointers and meta-information are passed in struct v4l2_buffer. The driver must be switched into user pointer I/O mode by calling VIDIOC_REQBUFS with the desired buffer type. No buffers are allocated beforehand; consequently they are not indexed and cannot be queried like mapped buffers with the VIDIOC_QUERYBUF ioctl.

Example 3-2. Initiating streaming I/O with user pointers

struct v4l2_requestbuffers reqbuf;

memset (&reqbuf, 0, sizeof (reqbuf));
reqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
reqbuf.memory = V4L2_MEMORY_USERPTR;

if (ioctl (fd, VIDIOC_REQBUFS, &reqbuf) == -1) {
        if (errno == EINVAL)
                printf ("Video capturing or user pointer streaming is not supported\n");
        else
                perror ("VIDIOC_REQBUFS");

        exit (EXIT_FAILURE);
}
      

Buffer addresses and sizes are passed on the fly with the VIDIOC_QBUF ioctl. Although buffers are commonly cycled, applications can pass different addresses and sizes at each VIDIOC_QBUF call. If required by the hardware the driver swaps memory pages within physical memory to create a continuous area of memory. This happens transparently to the application in the virtual memory subsystem of the kernel. When buffer pages have been swapped out to disk they are brought back and finally locked in physical memory for DMA.[17]

Filled or displayed buffers are dequeued with the VIDIOC_DQBUF ioctl. The driver can unlock the memory pages at any time between the completion of the DMA and this ioctl. The memory is also unlocked when VIDIOC_STREAMOFF or VIDIOC_REQBUFS is called, or when the device is closed. Applications must take care not to free buffers without dequeuing. For one, the buffers remain locked until further notice, wasting physical memory. Second, the driver will not be notified when the memory is returned to the application's free list and subsequently reused for other purposes, possibly completing the requested DMA and overwriting valuable data.

For capturing applications it is customary to enqueue a number of empty buffers, to start capturing and enter the read loop. Here the application waits until a filled buffer can be dequeued, and re-enqueues the buffer when the data is no longer needed. Output applications fill and enqueue buffers; when enough buffers are stacked up output is started. In the write loop, when the application runs out of free buffers it must wait until an empty buffer can be dequeued and reused. Two methods exist to suspend execution of the application until one or more buffers can be dequeued. By default VIDIOC_DQBUF blocks when no buffer is in the outgoing queue. When the O_NONBLOCK flag was given to the open() function, VIDIOC_DQBUF returns immediately with an EAGAIN error code when no buffer is available. The select() or poll() function are always available.

To start and stop capturing or output applications call the VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctl. Note VIDIOC_STREAMOFF removes all buffers from both queues and unlocks all buffers as a side effect. Since there is no notion of doing anything "now" on a multitasking system, if an application needs to synchronize with another event it should examine the struct v4l2_buffer timestamp of captured buffers, or set the field before enqueuing buffers for output.

Drivers implementing user pointer I/O must support the VIDIOC_REQBUFS, VIDIOC_QBUF, VIDIOC_DQBUF, VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctl, the select() and poll() function.[18]


3.4. Asynchronous I/O

This method is not defined yet.


3.5. Buffers

A buffer contains data exchanged by application and driver using one of the Streaming I/O methods. Only pointers to buffers are exchanged, the data itself is not copied. These pointers, together with meta-information like timestamps or field parity, are stored in a struct v4l2_buffer, argument to the VIDIOC_QUERYBUF, VIDIOC_QBUF and VIDIOC_DQBUF ioctl.

Nominally timestamps refer to the first data byte transmitted. In practice however the wide range of hardware covered by the V4L2 API limits timestamp accuracy. Often an interrupt routine will sample the system clock shortly after the field or frame was stored completely in memory. So applications must expect a constant difference up to one field or frame period plus a small (few scan lines) random error. The delay and error can be much larger due to compression or transmission over an external bus when the frames are not properly stamped by the sender. This is frequently the case with USB cameras. Here timestamps refer to the instant the field or frame was received by the driver, not the capture time. These devices identify themselves by not enumerating any video standards, see Section 1.7.

Similar limitations apply to output timestamps. Typically the video hardware locks to a clock controlling the video timing, the horizontal and vertical synchronization pulses. At some point in the line sequence, possibly the vertical blanking, an interrupt routine samples the system clock, compares against the timestamp and programs the hardware to repeat the previous field or frame, or to display the buffer contents.

Apart from limitations of the video device and natural inaccuracies of all clocks, it should be noted system time itself is not perfectly stable. It can be affected by power saving cycles, warped to insert leap seconds, or even turned back or forth by the system administrator, affecting long term measurements. [19]
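To compare a buffer's struct timeval timestamp against the current system time, an application needs timeval arithmetic. A small sketch (`tv_diff_usec()` is a hypothetical helper name, not part of any API):

```c
#include <assert.h>
#include <sys/time.h>

/* Microseconds elapsed from `a` to `b`, e.g. from a captured buffer's
 * timestamp to the time returned by gettimeofday() at dequeue. */
static long long tv_diff_usec(struct timeval a, struct timeval b)
{
        return (long long)(b.tv_sec - a.tv_sec) * 1000000LL +
               (b.tv_usec - a.tv_usec);
}
```

Monitoring this difference over many frames exposes the drift between the video clock and the system clock mentioned in the timestamp field description below.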

Table 3-1. struct v4l2_buffer

__u32 index   Number of the buffer, set by the application. This field is only used for memory mapping I/O and can range from zero to the number of buffers allocated with the VIDIOC_REQBUFS ioctl (struct v4l2_requestbuffers count) minus one.
enum v4l2_buf_type type   Type of the buffer, same as struct v4l2_format type or struct v4l2_requestbuffers type, set by the application.
__u32 bytesused   The number of bytes occupied by the data in the buffer. It depends on the negotiated data format and may change with each buffer for compressed variable size data like JPEG images. Drivers must set this field when type refers to an input stream, applications when an output stream.
__u32 flags   Flags set by the application or driver, see Table 3-3.
enum v4l2_field field   Indicates the field order of the image in the buffer, see Table 3-8. This field is not used when the buffer contains VBI data. Drivers must set it when type refers to an input stream, applications when an output stream.
struct timeval timestamp  

For input streams this is the system time (as returned by the gettimeofday() function) when the first data byte was captured. For output streams the data will not be displayed before this time, secondary to the nominal frame rate determined by the current video standard in enqueued order. Applications can for example zero this field to display frames as soon as possible. The driver stores the time at which the first data byte was actually sent out in the timestamp field. This permits applications to monitor the drift between the video and system clock.

struct v4l2_timecode timecode   When type is V4L2_BUF_TYPE_VIDEO_CAPTURE and the V4L2_BUF_FLAG_TIMECODE flag is set in flags, this structure contains a frame timecode. In V4L2_FIELD_ALTERNATE mode the top and bottom field contain the same timecode. Timecodes are intended to help video editing and are typically recorded on video tapes, but also embedded in compressed formats like MPEG. This field is independent of the timestamp and sequence fields.
__u32 sequence   Set by the driver, counting the frames in the sequence.

In V4L2_FIELD_ALTERNATE mode the top and bottom field have the same sequence number. The count starts at zero and includes dropped or repeated frames. A dropped frame was received by an input device but could not be stored due to lack of free buffer space. A repeated frame was displayed again by an output device because the application did not pass new data in time.

Note this may count the frames received e.g. over USB, without taking into account the frames dropped by the remote hardware due to limited compression throughput or bus bandwidth. These devices identify themselves by not enumerating any video standards, see Section 1.7.

enum v4l2_memory memory   This field must be set by applications and/or drivers in accordance with the selected I/O method.
union m    
  __u32 offset When memory is V4L2_MEMORY_MMAP this is the offset of the buffer from the start of the device memory. The value is returned by the driver and apart from serving as parameter to the mmap() function not useful for applications. See Section 3.2 for details.
  unsigned long userptr When memory is V4L2_MEMORY_USERPTR this is a pointer to the buffer (cast to unsigned long type) in virtual memory, set by the application. See Section 3.3 for details.
__u32 length   Size of the buffer (not the payload) in bytes.
__u32 input   Some video capture drivers support rapid and synchronous video input changes, a function useful for example in video surveillance applications. For this purpose applications set the V4L2_BUF_FLAG_INPUT flag, and this field to the number of a video input as in struct v4l2_input field index.
__u32 reserved   A place holder for future extensions and custom (driver defined) buffer types V4L2_BUF_TYPE_PRIVATE and higher.

Table 3-2. enum v4l2_buf_type

V4L2_BUF_TYPE_VIDEO_CAPTURE 1 Buffer of a video capture stream, see Section 4.1.
V4L2_BUF_TYPE_VIDEO_OUTPUT 2 Buffer of a video output stream, see Section 4.3.
V4L2_BUF_TYPE_VIDEO_OVERLAY 3 Buffer for video overlay, see Section 4.2.
V4L2_BUF_TYPE_VBI_CAPTURE 4 Buffer of a raw VBI capture stream, see Section 4.7.
V4L2_BUF_TYPE_VBI_OUTPUT 5 Buffer of a raw VBI output stream, see Section 4.7.
V4L2_BUF_TYPE_SLICED_VBI_CAPTURE 6 Buffer of a sliced VBI capture stream, see Section 4.8.
V4L2_BUF_TYPE_SLICED_VBI_OUTPUT 7 Buffer of a sliced VBI output stream, see Section 4.8.
V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY 8 Buffer for video output overlay (OSD), see Section 4.4. Status:Experimental.
V4L2_BUF_TYPE_PRIVATE 0x80 This and higher values are reserved for custom(driver defined) buffer types.

Table 3-3. Buffer Flags

V4L2_BUF_FLAG_MAPPED 0x0001 The buffer resides in device memory and has been mapped into the application's address space, see Section 3.2 for details. Drivers set or clear this flag when the VIDIOC_QUERYBUF, VIDIOC_QBUF or VIDIOC_DQBUF ioctl is called. Set by the driver.
V4L2_BUF_FLAG_QUEUED 0x0002 Internally drivers maintain two buffer queues, an incoming and outgoing queue. When this flag is set, the buffer is currently on the incoming queue. It automatically moves to the outgoing queue after the buffer has been filled (capture devices) or displayed (output devices). Drivers set or clear this flag when the VIDIOC_QUERYBUF ioctl is called. After (successfully) calling the VIDIOC_QBUF ioctl it is always set, and after VIDIOC_DQBUF always cleared.
V4L2_BUF_FLAG_DONE 0x0004 When this flag is set, the buffer is currently on the outgoing queue, ready to be dequeued from the driver. Drivers set or clear this flag when the VIDIOC_QUERYBUF ioctl is called. After calling VIDIOC_QBUF or VIDIOC_DQBUF it is always cleared. Of course a buffer cannot be on both queues at the same time, the V4L2_BUF_FLAG_QUEUED and V4L2_BUF_FLAG_DONE flag are mutually exclusive. They can be both cleared however, then the buffer is in "dequeued" state, in the application domain so to say.
V4L2_BUF_FLAG_KEYFRAME 0x0008 Drivers set or clear this flag when calling the VIDIOC_DQBUF ioctl. It may be set by video capture devices when the buffer contains a compressed image which is a key frame (or field), i. e. can be decompressed on its own.
V4L2_BUF_FLAG_PFRAME 0x0010 Similar to V4L2_BUF_FLAG_KEYFRAME this flags predicted frames or fields which contain only differences to a previous key frame.
V4L2_BUF_FLAG_BFRAME 0x0020 Similar to V4L2_BUF_FLAG_PFRAME this is a bidirectional predicted frame or field. [ooc tbd]
V4L2_BUF_FLAG_TIMECODE 0x0100 The timecode field is valid. Drivers set or clear this flag when the VIDIOC_DQBUF ioctl is called.
V4L2_BUF_FLAG_INPUT 0x0200 The input field is valid. Applications set or clear this flag before calling the VIDIOC_QBUF ioctl.

Table 3-4. enum v4l2_memory

V4L2_MEMORY_MMAP 1 The buffer is used for memory mapping I/O.
V4L2_MEMORY_USERPTR 2 The buffer is used for user pointer I/O.
V4L2_MEMORY_OVERLAY 3 [to do]

3.5.1. Timecodes

The v4l2_timecode structure is designed to hold a SMPTE 12M or similar timecode. (struct timeval timestamps are stored in the struct v4l2_buffer field timestamp.)

Table 3-5. struct v4l2_timecode

__u32 type Frame rate the timecodes are based on, see Table 3-6.
__u32 flags Timecode flags, see Table 3-7.
__u8 frames Frame count, 0 ... 23/24/29/49/59, depending on the type of timecode.
__u8 seconds Seconds count, 0 ... 59. This is a binary, not BCD number.
__u8 minutes Minutes count, 0 ... 59. This is a binary, not BCD number.
__u8 hours Hours count, 0 ... 29. This is a binary, not BCD number.
__u8 userbits[4] The "user group" bits from the timecode.

Table 3-6. Timecode Types

V4L2_TC_TYPE_24FPS 1 24 frames per second, i. e. film.
V4L2_TC_TYPE_25FPS 2 25 frames per second, i. e. PAL or SECAM video.
V4L2_TC_TYPE_30FPS 3 30 frames per second, i. e. NTSC video.
V4L2_TC_TYPE_50FPS 4  
V4L2_TC_TYPE_60FPS 5  

Table 3-7. Timecode Flags

V4L2_TC_FLAG_DROPFRAME 0x0001 Indicates "drop frame" semantics for counting frames in 29.97 fps material. When set, frame numbers 0 and 1 at the start of each minute, except minutes 0, 10, 20, 30, 40, 50, are omitted from the count.
V4L2_TC_FLAG_COLORFRAME 0x0002 The "color frame" flag.
V4L2_TC_USERBITS_field 0x000C Field mask for the "binary group flags".
V4L2_TC_USERBITS_USERDEFINED 0x0000 Unspecified format.
V4L2_TC_USERBITS_8BITCHARS 0x0008 8-bit ISO characters.
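The drop-frame rule above can be expressed as a small predicate. This sketch is illustrative only; `tc_frame_dropped()` is a hypothetical helper, not part of the V4L2 API:

```c
#include <assert.h>
#include <stdbool.h>

/* SMPTE drop-frame counting for 29.97 fps material: frame numbers
 * 0 and 1 are skipped at the start of every minute except minutes
 * 0, 10, 20, 30, 40 and 50. Returns true if the given minute/frame
 * combination is one of the skipped timecodes. */
static bool tc_frame_dropped(unsigned minute, unsigned frame)
{
        return frame < 2 && (minute % 10) != 0;
}
```

A timecode-validating application could use this when stepping through V4L2_TC_FLAG_DROPFRAME material to predict which frame numbers never occur.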

3.6. Field Order

We have to distinguish between progressive and interlaced video. Progressive video transmits all lines of a video image sequentially. Interlaced video divides an image into two fields, containing only the odd and even lines of the image, respectively. The so-called odd and even fields are transmitted alternately, and due to a small delay between fields a cathode ray TV displays the lines interleaved, yielding the original frame. This curious technique was invented because at refresh rates similar to film the image would fade out too quickly. Transmitting fields reduces the flicker without the necessity of doubling the frame rate and with it the bandwidth required for each channel.

It is important to understand a video camera does not expose one frame at a time, merely transmitting the frames separated into fields. The fields are in fact captured at two different instances in time. An object on screen may well move between one field and the next. For applications analysing motion it is of paramount importance to recognize which field of a frame is older, the temporal order.

When the driver provides or accepts images field by field rather than interleaved, it is also important applications understand how the fields combine to frames. We distinguish between top and bottom fields, the spatial order: The first line of the top field is the first line of an interlaced frame, the first line of the bottom field is the second line of that frame.

However because fields were captured one after the other, arguing whether a frame commences with the top or bottom field is pointless. Any two successive top and bottom, or bottom and top fields yield a valid frame. Only when the source was progressive to begin with, e. g. when transferring film to video, two fields may come from the same frame, creating a natural order.

Counter to intuition the top field is not necessarily the older field. Whether the older field contains the top or bottom lines is a convention determined by the video standard. Hence the distinction between temporal and spatial order of fields. The diagrams below should make this clearer.

All video capture and output devices must report the current field order. Some drivers may permit the selection of a different order; to this end applications initialize the field field of struct v4l2_pix_format before calling the VIDIOC_S_FMT ioctl. If this is not desired it should have the value V4L2_FIELD_ANY (0).

Table 3-8. enum v4l2_field

V4L2_FIELD_ANY 0 Applications request this field order when any one of the V4L2_FIELD_NONE, V4L2_FIELD_TOP, V4L2_FIELD_BOTTOM, or V4L2_FIELD_INTERLACED formats is acceptable. Drivers choose depending on hardware capabilities or e. g. the requested image size, and return the actual field order. struct v4l2_buffer field can never be V4L2_FIELD_ANY.
V4L2_FIELD_NONE 1 Images are in progressive format, not interlaced. The driver may also indicate this order when it cannot distinguish between V4L2_FIELD_TOP and V4L2_FIELD_BOTTOM.
V4L2_FIELD_TOP 2 Images consist of the top field only.
V4L2_FIELD_BOTTOM 3 Images consist of the bottom field only. Applications may wish to prevent a device from capturing interlaced images because they will have "comb" or "feathering" artefacts around moving objects.
V4L2_FIELD_INTERLACED 4 Images contain both fields, interleaved line by line. The temporal order of the fields (whether the top or bottom field is first transmitted) depends on the current video standard. M/NTSC transmits the bottom field first, all other standards the top field first.
V4L2_FIELD_SEQ_TB 5 Images contain both fields, the top field lines are stored first in memory, immediately followed by the bottom field lines. Fields are always stored in temporal order, the older one first in memory. Image sizes refer to the frame, not fields.
V4L2_FIELD_SEQ_BT 6 Images contain both fields, the bottom field lines are stored first in memory, immediately followed by the top field lines. Fields are always stored in temporal order, the older one first in memory. Image sizes refer to the frame, not fields.
V4L2_FIELD_ALTERNATE 7 The two fields of a frame are passed in separate buffers, in temporal order, i. e. the older one first. To indicate the field parity (whether the current field is a top or bottom field) the driver or application, depending on data direction, must set struct v4l2_buffer field to V4L2_FIELD_TOP or V4L2_FIELD_BOTTOM. Any two successive fields pair to build a frame. Whether fields are successive, without any dropped fields between them (fields can drop individually), can be determined from the struct v4l2_buffer sequence field. Image sizes refer to the frame, not fields. This format cannot be selected when using the read/write I/O method.
V4L2_FIELD_INTERLACED_TB 8 Images contain both fields, interleaved line by line, top field first. The top field is transmitted first.
V4L2_FIELD_INTERLACED_BT 9 Images contain both fields, interleaved line by line, top field first. The bottom field is transmitted first.

Figure 3-1. Field Order, Top Field First Transmitted

Figure 3-2. Field Order, Bottom Field First Transmitted


Chapter 4. Interfaces

4.1. Video Capture Interface

Video capture devices sample an analog video signal and store the digitized images in memory. Today nearly all devices can capture at full 25 or 30 frames/second. With this interface applications can control the capture process and move images from the driver into user space.

Conventionally V4L2 video capture devices are accessed through character device special files named /dev/video and /dev/video0 to /dev/video63 with major number 81 and minor numbers 0 to 63. /dev/video is typically a symbolic link to the preferred video device. Note the same device files are used for video output devices.


4.1.1. Querying Capabilities

Devices supporting the video capture interface set the V4L2_CAP_VIDEO_CAPTURE flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. As secondary device functions they may also support the video overlay (V4L2_CAP_VIDEO_OVERLAY) and the raw VBI capture (V4L2_CAP_VBI_CAPTURE) interface. At least one of the read/write or streaming I/O methods must be supported. Tuners and audio inputs are optional.


4.1.2. Supplemental Functions

Video capture devices shall support audio input, tuner, controls, cropping and scaling and streaming parameter ioctls as needed. The video input and video standard ioctls must be supported by all video capture devices.


4.1.3. Image Format Negotiation

The result of a capture operation is determined by cropping and image format parameters. The former select an area of the video picture to capture, the latter how images are stored in memory, i. e. in RGB or YUV format, the number of bits per pixel or width and height. Together they also define how images are scaled in the process.

As usual these parameters are not reset at open() time to permit Unix tool chains, programming a device and then reading from it as if it was a plain file. Well written V4L2 applications ensure they really get what they want, including cropping and scaling.

Cropping initialization at minimum requires resetting the parameters to defaults. An example is given in Section 1.11.

To query the current image format applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_CAPTURE and call the VIDIOC_G_FMT ioctl with a pointer to this structure. Drivers fill the struct v4l2_pix_format pix member of the fmt union.

To request different parameters applications set the type field of a struct v4l2_format as above and initialize all fields of the struct v4l2_pix_format pix member of the fmt union, or better just modify the results of VIDIOC_G_FMT, and call the VIDIOC_S_FMT ioctl with a pointer to this structure. Drivers may adjust the parameters and finally return the actual parameters as VIDIOC_G_FMT does.

Like VIDIOC_S_FMT the VIDIOC_TRY_FMT ioctl can be used to learn about hardware limitations without disabling I/O or possibly time consuming hardware preparations.

The contents of struct v4l2_pix_format are discussed in Chapter 2. See also the specification of the VIDIOC_G_FMT, VIDIOC_S_FMT and VIDIOC_TRY_FMT ioctls for details. Video capture devices must implement both the VIDIOC_G_FMT and VIDIOC_S_FMT ioctl, even if VIDIOC_S_FMT ignores all requests and always returns default parameters as VIDIOC_G_FMT does. VIDIOC_TRY_FMT is optional.


4.1.4. Reading Images

A video capture device may support the read() function and/or streaming (memory mapping or user pointer) I/O. See Chapter 3 for details.


4.2. Video Overlay Interface

Also known as Framebuffer Overlay or Previewing

Video overlay devices have the ability to genlock (TV-)video into the (VGA-)video signal of a graphics card, or to store captured images directly in video memory of a graphics card, typically with clipping. This can be considerably more efficient than capturing images and displaying them by other means. In the old days when only nuclear power plants needed cooling towers this used to be the only way to put live video into a window.

Video overlay devices are accessed through the same character special files as video capture devices. Note the default function of a /dev/video device is video capturing. The overlay function is only available after calling the VIDIOC_S_FMT ioctl.

The driver may support simultaneous overlay and capturing using the read/write and streaming I/O methods. If so, operation at the nominal frame rate of the video standard is not guaranteed. Frames may be directed away from overlay to capture, or one field may be used for overlay and the other for capture if the capture parameters permit this.

Applications should use different file descriptors for capturing and overlay. This must be supported by all drivers capable of simultaneous capturing and overlay. Optionally these drivers may also permit capturing and overlay with a single file descriptor for compatibility with V4L and earlier versions of V4L2.[20]


4.2.1. Querying Capabilities

Devices supporting the video overlay interface set the V4L2_CAP_VIDEO_OVERLAY flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. The overlay I/O method specified below must be supported. Tuners and audio inputs are optional.


4.2.2. Supplemental Functions

Video overlay devices shall support audio input, tuner, controls, cropping and scaling and streaming parameter ioctls as needed. The video input and video standard ioctls must be supported by all video overlay devices.


4.2.3. Setup

Before overlay can commence applications must program the driver with frame buffer parameters, namely the address and size of the frame buffer and the image format, for example RGB 5:6:5. The VIDIOC_G_FBUF and VIDIOC_S_FBUF ioctls are available to get and set these parameters, respectively. The VIDIOC_S_FBUF ioctl is privileged because it allows setting up DMA into physical memory, bypassing the memory protection mechanisms of the kernel. Only the superuser can change the frame buffer address and size. Users are not supposed to run TV applications as root or with SUID bit set. A small helper application with suitable privileges should query the graphics system and program the V4L2 driver at the appropriate time.

Some devices add the video overlay to the output signal of the graphics card. In this case the frame buffer is not modified by the video device, and the frame buffer address and pixel format are not needed by the driver. The VIDIOC_S_FBUF ioctl is not privileged. An application can check for this type of device by calling the VIDIOC_G_FBUF ioctl.

A driver may support any (or none) of five clipping/blending methods:

  1. Chroma-keying displays the overlaid image only where pixels in the primary graphics surface assume a certain color.

  2. A bitmap can be specified where each bit corresponds to a pixel in the overlaid image. When the bit is set, the corresponding video pixel is displayed, otherwise a pixel of the graphics surface.

  3. A list of clipping rectangles can be specified. In these regions no video is displayed, so the graphics surface can be seen here.

  4. The framebuffer has an alpha channel that can be used to clip or blend the framebuffer with the video.

  5. A global alpha value can be specified to blend the framebuffer contents with video images.

When simultaneous capturing and overlay is supported and the hardware prohibits different image and frame buffer formats, the format requested first takes precedence. The attempt to capture (VIDIOC_S_FMT) or overlay (VIDIOC_S_FBUF) may fail with an EBUSY error code or return accordingly modified parameters.


4.2.4. Overlay Window

The overlaid image is determined by cropping and overlay window parameters. The former select an area of the video picture to capture, the latter how images are overlaid and clipped. Cropping initialization at minimum requires resetting the parameters to defaults. An example is given in Section 1.11.

The overlay window is described by a struct v4l2_window. It defines the size of the image, its position over the graphics surface and the clipping to be applied. To get the current parameters applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OVERLAY and call the VIDIOC_G_FMT ioctl. The driver fills the v4l2_window substructure named win. It is not possible to retrieve a previously programmed clipping list or bitmap.

To program the overlay window applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OVERLAY, initialize the win substructure and call the VIDIOC_S_FMT ioctl. The driver adjusts the parameters against hardware limits and returns the actual parameters as VIDIOC_G_FMT does. Like VIDIOC_S_FMT, the VIDIOC_TRY_FMT ioctl can be used to learn about driver capabilities without actually changing driver state. Unlike VIDIOC_S_FMT this also works after the overlay has been enabled.

The scaling factor of the overlaid image is implied by the width and height given in struct v4l2_window and the size of the cropping rectangle. For more information see Section 1.11.

When simultaneous capturing and overlay is supported and the hardware prohibits different image and window sizes, the size requested first takes precedence. The attempt to capture or overlay as well (VIDIOC_S_FMT) may fail with an EBUSY error code or return accordingly modified parameters.

Table 4-1. struct v4l2_window

struct v4l2_rect w Size and position of the window relative to the top, left corner of the frame buffer defined with VIDIOC_S_FBUF. The window can extend the frame buffer width and height, the x and y coordinates can be negative, and it can lie completely outside the frame buffer. The driver clips the window accordingly, or if that is not possible, modifies its size and/or position.
enum v4l2_field field Applications set this field to determine which video field shall be overlaid, typically one of V4L2_FIELD_ANY (0), V4L2_FIELD_TOP, V4L2_FIELD_BOTTOM or V4L2_FIELD_INTERLACED. Drivers may have to choose a different field order and return the actual setting here.
__u32 chromakey When chroma-keying has been negotiated with VIDIOC_S_FBUF applications set this field to the desired pixel value for the chroma key. The format is the same as the pixel format of the framebuffer (struct v4l2_framebuffer fmt.pixelformat field), with bytes in host order. E. g. for V4L2_PIX_FMT_BGR24 the value should be 0xRRGGBB on a little endian, 0xBBGGRR on a big endian host.
struct v4l2_clip * clips When chroma-keying has not been negotiated and VIDIOC_G_FBUF indicated this capability, applications can set this field to point to an array of clipping rectangles.
Like the window coordinates w, clipping rectangles are defined relative to the top, left corner of the frame buffer. However clipping rectangles must not extend the frame buffer width and height, and they must not overlap. If possible applications should merge adjacent rectangles. Whether this must create x-y or y-x bands, or the order of rectangles, is not defined. When clip lists are not supported the driver ignores this field. Its contents after calling VIDIOC_S_FMT are undefined.
__u32 clipcount When the application sets the clips field, this field must contain the number of clipping rectangles in the list. When clip lists are not supported the driver ignores this field, its contents after calling VIDIOC_S_FMT are undefined. When clip lists are supported but no clipping is desired this field must be set to zero.
void * bitmap When chroma-keying has not been negotiated and VIDIOC_G_FBUF indicated this capability, applications can set this field to point to a clipping bit mask.

It must be of the same size as the window, w.width and w.height. Each bit corresponds to a pixel in the overlaid image, which is displayed only when the bit is set. Pixel coordinates translate to bits like:

((__u8 *) bitmap)[w.width * y + x / 8] & (1 << (x & 7))

where 0 ≤ x < w.width and 0 ≤ y < w.height.a

When a clipping bit mask is not supported the driver ignores this field, its contents after calling VIDIOC_S_FMT are undefined. When a bit mask is supported but no clipping is desired this field must be set to NULL.

Applications need not create a clip list or bit mask. When they pass both, or despite negotiating chroma-keying, the results are undefined. Regardless of the chosen method, the clipping abilities of the hardware may be limited in quantity or quality. The results when these limits are exceeded are undefined.b

__u8 global_alpha

The global alpha value used to blend the framebuffer with video images, if global alpha blending has been negotiated (V4L2_FBUF_FLAG_GLOBAL_ALPHA, see VIDIOC_S_FBUF, Table 3).

Note this field was added in Linux 2.6.23, extending the structure. However the VIDIOC_G/S/TRY_FMT ioctls, which take a pointer to a v4l2_format parent structure with padding bytes at the end, are not affected.

Notes:
a. Should we require w.width to be a multiple of eight?
b. When the image is written into frame buffer memory it will be undesirable if the driver clips out fewer pixels than expected, because the application and graphics system are not aware these regions need to be refreshed. The driver should clip out more pixels or not write the image at all.

Table 4-2. struct v4l2_clip[21]

struct v4l2_rect c Coordinates of the clipping rectangle, relative to the top, left corner of the frame buffer. Only window pixels outside all clipping rectangles are displayed.
struct v4l2_clip * next Pointer to the next clipping rectangle, NULL when this is the last rectangle. Drivers ignore this field, it cannot be used to pass a linked list of clipping rectangles.

Table 4-3. struct v4l2_rect

__s32 left Horizontal offset of the top, left corner of the rectangle, in pixels.
__s32 top Vertical offset of the top, left corner of the rectangle, in pixels. Offsets increase to the right and down.
__s32 width Width of the rectangle, in pixels.
__s32 height Height of the rectangle, in pixels. Width and height cannot be negative, the fields are signed for hysterical reasons.

4.2.5. Enabling Overlay

To start or stop the frame buffer overlay applications call the VIDIOC_OVERLAY ioctl.


4.3. Video Output Interface

Video output devices encode stills or image sequences as an analog video signal. With this interface applications can control the encoding process and move images from user space to the driver.

Conventionally V4L2 video output devices are accessed through character device special files named /dev/video and /dev/video0 to /dev/video63 with major number 81 and minor numbers 0 to 63. /dev/video is typically a symbolic link to the preferred video device. Note the same device files are used for video capture devices.


4.3.1. Querying Capabilities

Devices supporting the video output interface set the V4L2_CAP_VIDEO_OUTPUT flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. As secondary device functions they may also support the raw VBI output (V4L2_CAP_VBI_OUTPUT) interface. At least one of the read/write or streaming I/O methods must be supported. Modulators and audio outputs are optional.


4.3.2. Supplemental Functions

Video output devices shall support audio output, modulator, controls, cropping and scaling and streaming parameter ioctls as needed. The video output and video standard ioctls must be supported by all video output devices.


4.3.3. Image Format Negotiation

The output is determined by cropping and image format parameters. The former select an area of the video picture where the image will appear, the latter how images are stored in memory, i. e. in RGB or YUV format, the number of bits per pixel or width and height. Together they also define how images are scaled in the process.

As usual these parameters are not reset at open() time to permit Unix tool chains, programming a device and then writing to it as if it was a plain file. Well written V4L2 applications ensure they really get what they want, including cropping and scaling.

Cropping initialization at minimum requires resetting the parameters to defaults. An example is given in Section 1.11.

To query the current image format applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OUTPUT and call the VIDIOC_G_FMT ioctl with a pointer to this structure. Drivers fill the struct v4l2_pix_format pix member of the fmt union.

To request different parameters applications set the type field of a struct v4l2_format as above and initialize all fields of the struct v4l2_pix_format pix member of the fmt union, or better just modify the results of VIDIOC_G_FMT, and call the VIDIOC_S_FMT ioctl with a pointer to this structure. Drivers may adjust the parameters and finally return the actual parameters as VIDIOC_G_FMT does.

Like VIDIOC_S_FMT the VIDIOC_TRY_FMT ioctl can be used to learn about hardware limitations without disabling I/O or possibly time consuming hardware preparations.

The contents of struct v4l2_pix_format are discussed in Chapter 2. See also the specification of the VIDIOC_G_FMT, VIDIOC_S_FMT and VIDIOC_TRY_FMT ioctls for details. Video output devices must implement both the VIDIOC_G_FMT and VIDIOC_S_FMT ioctl, even if VIDIOC_S_FMT ignores all requests and always returns default parameters as VIDIOC_G_FMT does. VIDIOC_TRY_FMT is optional.


4.3.4. Writing Images

A video output device may support the write() function and/or streaming (memory mapping or user pointer) I/O. See Chapter 3 for details.


4.4. Video Output Overlay Interface

Also known as On-Screen Display (OSD)

Experimental: This is an experimental interface and may change in the future.

Some video output devices can overlay a framebuffer image onto the outgoing video signal. Applications can set up such an overlay using this interface, which borrows structures and ioctls of the Video Overlay interface.

The OSD function is accessible through the same character special file as the Video Output function. Note the default function of such a /dev/video device is video capturing or output. The OSD function is only available after calling the VIDIOC_S_FMT ioctl.


4.4.1. Querying Capabilities

Devices supporting the Video Output Overlay interface set the V4L2_CAP_VIDEO_OUTPUT_OVERLAY flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl.


4.4.2. Framebuffer

Contrary to the Video Overlay interface the framebuffer is normally implemented on the TV card and not the graphics card. On Linux it is accessible as a framebuffer device (/dev/fbN). Given a V4L2 device, applications can find the corresponding framebuffer device by calling the VIDIOC_G_FBUF ioctl. It returns, amongst other information, the physical address of the framebuffer in the base field of struct v4l2_framebuffer. The framebuffer device ioctl FBIOGET_FSCREENINFO returns the same address in the smem_start field of struct fb_fix_screeninfo. The FBIOGET_FSCREENINFO ioctl and struct fb_fix_screeninfo are defined in the linux/fb.h header file.

The width and height of the framebuffer depends on the current video standard. A V4L2 driver may reject attempts to change the video standard (or any other ioctl which would imply a framebuffer size change) with an EBUSY error code until all applications closed the framebuffer device.

Example 4-1. Finding a framebuffer device for OSD

#include <stdio.h>
#include <stdlib.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/fb.h>
#include <linux/videodev2.h>

struct v4l2_framebuffer fbuf;
unsigned int i;
int fb_fd;

if (-1 == ioctl (fd, VIDIOC_G_FBUF, &fbuf)) {
        perror ("VIDIOC_G_FBUF");
        exit (EXIT_FAILURE);
}

for (i = 0; i < 30; ++i) {
        char dev_name[16];
        struct fb_fix_screeninfo si;

        snprintf (dev_name, sizeof (dev_name), "/dev/fb%u", i);

        fb_fd = open (dev_name, O_RDWR);
        if (-1 == fb_fd) {
                switch (errno) {
                case ENOENT: /* no such file */
                case ENXIO:  /* no driver */
                        continue;

                default:
                        perror ("open");
                        exit (EXIT_FAILURE);
                }
        }

        if (0 == ioctl (fb_fd, FBIOGET_FSCREENINFO, &si)) {
                if (si.smem_start == (unsigned long) fbuf.base)
                        break;
        } else {
                /* Apparently not a framebuffer device. */
        }

        close (fb_fd);
        fb_fd = -1;
}

/* fb_fd is the file descriptor of the framebuffer device
   for the video output overlay, or -1 if no device was found. */

4.4.3. Overlay Window and Scaling

The overlay is controlled by source and target rectangles. The source rectangle selects a subsection of the framebuffer image to be overlaid, the target rectangle an area in the outgoing video signal where the image will appear. Drivers may or may not support scaling, and arbitrary sizes and positions of these rectangles. Further drivers may support any (or none) of the clipping/blending methods defined for the Video Overlay interface.

A struct v4l2_window defines the size of the source rectangle, its position in the framebuffer and the clipping/blending method to be used for the overlay. To get the current parameters applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY and call the VIDIOC_G_FMT ioctl. The driver fills the v4l2_window substructure named win. It is not possible to retrieve a previously programmed clipping list or bitmap.

To program the source rectangle applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY, initialize the win substructure and call the VIDIOC_S_FMT ioctl. The driver adjusts the parameters against hardware limits and returns the actual parameters as VIDIOC_G_FMT does. Like VIDIOC_S_FMT, the VIDIOC_TRY_FMT ioctl can be used to learn about driver capabilities without actually changing driver state. Unlike VIDIOC_S_FMT this also works after the overlay has been enabled.

A struct v4l2_crop defines the size and position of the target rectangle. The scaling factor of the overlay is implied by the width and height given in struct v4l2_window and struct v4l2_crop. The cropping API applies to Video Output and Video Output Overlay devices in the same way as to Video Capture and Video Overlay devices, merely reversing the direction of the data flow. For more information see Section 1.11.


4.4.4. Enabling Overlay

There is no V4L2 ioctl to enable or disable the overlay, however the framebuffer interface of the driver may support the FBIOBLANK ioctl.


4.5. Codec Interface

Suspended: This interface has been suspended from the V4L2 API implemented in Linux 2.6 until we have more experience with codec device interfaces.

A V4L2 codec can compress, decompress, transform, or otherwise convert video data from one format into another format, in memory. Applications send data to be converted to the driver through a write() call, and receive the converted data through a read() call. For efficiency a driver may also support streaming I/O.

[to do]


4.6. Effect Devices Interface

Suspended: This interface has been suspended from the V4L2 API implemented in Linux 2.6 until we have more experience with effect device interfaces.

A V4L2 video effect device can do image effects, filtering, or combine two or more images or image streams. For example video transitions or wipes. Applications send data to be processed and receive the result data either with read() and write() functions, or through the streaming I/O mechanism.

[to do]


4.7. Raw VBI Data Interface

VBI is an abbreviation of Vertical Blanking Interval, a gap in the sequence of lines of an analog video signal. During VBI no picture information is transmitted, allowing some time while the electron beam of a cathode ray tube TV returns to the top of the screen. Using an oscilloscope you will find here the vertical synchronization pulses and short data packages ASK modulated[22] onto the video signal. These are transmissions of services such as Teletext or Closed Caption.

Subject of this interface type is raw VBI data, as sampled off a video signal, or to be added to a signal for output. The data format is similar to uncompressed video images, a number of lines times a number of samples per line, we call this a VBI image.

Conventionally V4L2 VBI devices are accessed through character device special files named /dev/vbi and /dev/vbi0 to /dev/vbi31 with major number 81 and minor numbers 224 to 255. /dev/vbi is typically a symbolic link to the preferred VBI device. This convention applies to both input and output devices.

To address the problems of finding related video and VBI devices VBI capturing and output is also available as device function under /dev/video. To capture or output raw VBI data with these devices applications must call the VIDIOC_S_FMT ioctl. Accessed as /dev/vbi, raw VBI capturing or output is the default device function.


4.7.1. Querying Capabilities

Devices supporting the raw VBI capturing or output API set the V4L2_CAP_VBI_CAPTURE or V4L2_CAP_VBI_OUTPUT flags, respectively, in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. At least one of the read/write, streaming or asynchronous I/O methods must be supported. VBI devices may or may not have a tuner or modulator.


4.7.2. Supplemental Functions

VBI devices shall support video input or output, tuner or modulator, and controls ioctls as needed. The video standard ioctls provide information vital to program a VBI device, therefore must be supported.


4.7.3. Raw VBI Format Negotiation

Raw VBI sampling abilities can vary, in particular the sampling frequency. To properly interpret the data V4L2 specifies an ioctl to query the sampling parameters. Moreover, to allow for some flexibility applications can also suggest different parameters.

As usual these parameters are not reset at open() time to permit Unix tool chains, programming a device and then reading from it as if it was a plain file. Well written V4L2 applications should always ensure they really get what they want, requesting reasonable parameters and then checking if the actual parameters are suitable.

To query the current raw VBI capture parameters applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VBI_CAPTURE or V4L2_BUF_TYPE_VBI_OUTPUT, and call the VIDIOC_G_FMT ioctl with a pointer to this structure. Drivers fill the struct v4l2_vbi_format vbi member of the fmt union.

To request different parameters applications set the type field of a struct v4l2_format as above and initialize all fields of the struct v4l2_vbi_format vbi member of the fmt union, or better just modify the results of VIDIOC_G_FMT, and call the VIDIOC_S_FMT ioctl with a pointer to this structure. Drivers return an EINVAL error code only when the given parameters are ambiguous, otherwise they modify the parameters according to the hardware capabilities and return the actual parameters. When the driver allocates resources at this point, it may return an EBUSY error code to indicate the returned parameters are valid but the required resources are currently not available. That may happen for instance when the video and VBI areas to capture would overlap, or when the driver supports multiple opens and another process already requested VBI capturing or output. Anyway, applications must expect other resource allocation points which may return EBUSY, at the VIDIOC_STREAMON ioctl and the first read(), write() and select() call.

VBI devices must implement both the VIDIOC_G_FMT and VIDIOC_S_FMT ioctl, even if VIDIOC_S_FMT ignores all requests and always returns default parameters as VIDIOC_G_FMT does. VIDIOC_TRY_FMT is optional.

Table 4-4. struct v4l2_vbi_format

__u32 sampling_rate Samples per second, i. e. unit 1 Hz.
__u32 offset

Horizontal offset of the VBI image, relative to the leading edge of the line synchronization pulse and counted in samples: The first sample in the VBI image will be located offset / sampling_rate seconds following the leading edge. See also Figure 4-1.

__u32 samples_per_line  
__u32 sample_format

Defines the sample format as in Chapter 2, a four-character-code.a Usually this is V4L2_PIX_FMT_GREY, i. e. each sample consists of 8 bits with lower values oriented towards the black level. Do not assume any other correlation of values with the signal level. For example, the MSB does not necessarily indicate if the signal is 'high' or 'low' because 128 may not be the mean value of the signal. Drivers shall not convert the sample format by software.

__u32 start[2] This is the scanning system line number associated with the first line of the VBI image, of the first and the second field respectively. See Figure 4-2 and Figure 4-3 for valid values. VBI input drivers can return start values 0 if the hardware cannot reliably identify scanning lines, VBI acquisition may not require this information.
__u32 count[2] The number of lines in the first and second field image, respectively.

Drivers should be as flexible as possible. For example, it may be possible to extend or move the VBI capture window down to the picture area, implementing a 'full field mode' to capture data service transmissions embedded in the picture.

An application can set the first or second count value to zero if no data is required from the respective field; count[1] if the scanning system is progressive, i. e. not interlaced. The corresponding start value shall be ignored by the application and driver. Anyway, drivers may not support single field capturing and return both count values non-zero.

Both count values set to zero, or line numbers outside the bounds depicted in Figure 4-2 and Figure 4-3, or a field image covering lines of two fields, are invalid and shall not be returned by the driver.

To initialize the start and count fields, applications must first determine the current video standard selection. The v4l2_std_id or the framelines field of struct v4l2_standard can be evaluated for this purpose.

__u32 flags See Table 4-5 below. Currently only drivers set flags, applications must set this field to zero.
__u32 reserved[2] This array is reserved for future extensions. Drivers and applications must set it to zero.
Notes:
a. A few devices may be unable to sample VBI data at all but can extend the video capture window to the VBI region.

Table 4-5. Raw VBI Format Flags

V4L2_VBI_UNSYNC 0x0001

This flag indicates hardware which does notproperly distinguish between fields. Normally the VBI image stores thefirst field (lower scanning line numbers) first in memory. This may bea top or bottom field depending on the video standard. When this flagis set the first or second field may be stored first, however thefields are still in correct temporal order with the older field firstin memory.a

V4L2_VBI_INTERLACED 0x0002 By default the two field images will be passedsequentially; all lines of the first field followed by all lines ofthe second field (compareSection 3.6V4L2_FIELD_SEQ_TB andV4L2_FIELD_SEQ_BT, whether the top or bottomfield is first in memory depends on the video standard). When thisflag is set, the two fields are interlaced (cf.V4L2_FIELD_INTERLACED). The first line of thefirst field followed by the first line of the second field, then thetwo second lines, and so on. Such a layout may be necessary when thehardware has been programmed to capture or output interlaced videoimages and is unable to separate the fields for VBI capturing atthe same time. For simplicity setting this flag implies that bothcount values are equal and non-zero.
Notes:
a. Most VBI services transmit on both fields, but some have different semantics depending on the field number. These cannot be reliably decoded or encoded when V4L2_VBI_UNSYNC is set.

Figure 4-1. Line synchronization

Figure 4-2. ITU-R 525 line numbering (M/NTSC and M/PAL)

(1) For the purpose of this specification field 2 starts in line 264 and not 263.5 because half line capturing is not supported.

Figure 4-3. ITU-R 625 line numbering

(1) For the purpose of this specification field 2 starts in line 314 and not 313.5 because half line capturing is not supported.

Remember the VBI image format depends on the selected video standard, therefore the application must choose a new standard or query the current standard first. Attempts to read or write data ahead of format negotiation, or after switching the video standard which may invalidate the negotiated VBI parameters, should be refused by the driver. A format change during active I/O is not permitted.


4.7.4. Reading and writing VBI images

To assure synchronization with the field number and easier implementation, the smallest unit of data passed at a time is one frame, consisting of two fields of VBI images immediately following in memory.

The total size of a frame computes as follows:

(count[0] + count[1]) *
samples_per_line * sample size in bytes

The sample size is most likely always one byte, applications must check the sample_format field though, to function properly with other drivers.

A VBI device may support read/write and/or streaming (memory mapping or user pointer) I/O. The latter bears the possibility of synchronizing video and VBI data by using buffer timestamps.

Remember the VIDIOC_STREAMON ioctl and the first read(), write() and select() call can be resource allocation points returning an EBUSY error code if the required hardware resources are temporarily unavailable, for example the device is already in use by another process.


4.8. Sliced VBI Data Interface

VBI stands for Vertical Blanking Interval, a gap in the sequence of lines of an analog video signal. During VBI no picture information is transmitted, allowing some time while the electron beam of a cathode ray tube TV returns to the top of the screen.

Sliced VBI devices use hardware to demodulate data transmitted in the VBI. V4L2 drivers shall not do this by software, see also the raw VBI interface. The data is passed as short packets of fixed size, covering one scan line each. The number of packets per video frame is variable.

Sliced VBI capture and output devices are accessed through the same character special files as raw VBI devices. When a driver supports both interfaces, the default function of a /dev/vbi device is raw VBI capturing or output, and the sliced VBI function is only available after calling the VIDIOC_S_FMT ioctl as defined below. Likewise a /dev/video device may support the sliced VBI API, however the default function here is video capturing or output. Different file descriptors must be used to pass raw and sliced VBI data simultaneously, if this is supported by the driver.


4.8.1. Querying Capabilities

Devices supporting the sliced VBI capturing or output API set the V4L2_CAP_SLICED_VBI_CAPTURE or V4L2_CAP_SLICED_VBI_OUTPUT flag respectively, in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. At least one of the read/write, streaming or asynchronous I/O methods must be supported. Sliced VBI devices may have a tuner or modulator.


4.8.2. Supplemental Functions

Sliced VBI devices shall support video input or output and tuner or modulator ioctls if they have these capabilities, and they may support control ioctls. The video standard ioctls provide information vital to program a sliced VBI device, therefore must be supported.


4.8.3. Sliced VBI Format Negotiation

To find out which data services are supported by the hardware applications can call the VIDIOC_G_SLICED_VBI_CAP ioctl. All drivers implementing the sliced VBI interface must support this ioctl. The results may differ from those of the VIDIOC_S_FMT ioctl when the number of VBI lines the hardware can capture or output per frame, or the number of services it can identify on a given line are limited. For example on PAL line 16 the hardware may be able to look for a VPS or Teletext signal, but not both at the same time.

To determine the currently selected services applications set the type field of struct v4l2_format to V4L2_BUF_TYPE_SLICED_VBI_CAPTURE or V4L2_BUF_TYPE_SLICED_VBI_OUTPUT, and the VIDIOC_G_FMT ioctl fills the fmt.sliced member, a struct v4l2_sliced_vbi_format.

Applications can request different parameters byinitializing or modifying the fmt.slicedmember and calling the VIDIOC_S_FMT ioctl with a pointer to thev4l2_format structure.

The sliced VBI API is more complicated than the raw VBI API because the hardware must be told which VBI service to expect on each scan line. Not all services may be supported by the hardware on all lines (this is especially true for VBI output where Teletext is often unsupported and other services can only be inserted in one specific line). In many cases, however, it is sufficient to just set the service_set field to the required services and let the driver fill the service_lines array according to hardware capabilities. Only if more precise control is needed should the programmer set the service_lines array explicitly.

The VIDIOC_S_FMT ioctl returns an EINVAL error code only when the given parameters are ambiguous, otherwise it modifies the parameters according to hardware capabilities. When the driver allocates resources at this point, it may return an EBUSY error code if the required resources are temporarily unavailable. Other resource allocation points which may return EBUSY can be the VIDIOC_STREAMON ioctl and the first read(), write() and select() call.

Table 4-6. struct v4l2_sliced_vbi_format

__u32 service_set

If service_set is non-zero when passed with VIDIOC_S_FMT or VIDIOC_TRY_FMT, the service_lines array will be filled by the driver according to the services specified in this field. For example, if service_set is initialized with V4L2_SLICED_TELETEXT_B | V4L2_SLICED_WSS_625, a driver for the cx25840 video decoder sets lines 7-22 of both fields[a] to V4L2_SLICED_TELETEXT_B and line 23 of the first field to V4L2_SLICED_WSS_625. If service_set is set to zero, then the values of service_lines will be used instead.

On return the driver sets this field to the union of all elements of the returned service_lines array. It may contain fewer services than requested, perhaps just one, if the hardware cannot handle more services simultaneously. It may be empty (zero) if none of the requested services are supported by the hardware.

__u16 service_lines[2][24]

Applications initialize this array with sets of data services the driver shall look for or insert on the respective scan line. Subject to hardware capabilities drivers return the requested set, a subset, which may be just a single service, or an empty set. When the hardware cannot handle multiple services on the same line the driver shall choose one. No assumptions can be made on which service the driver chooses.

Data services are defined in Table 4-7. Array indices map to ITU-R line numbers (see also Figure 4-2 and Figure 4-3) as follows:

    Element               525 line systems  625 line systems
    service_lines[0][1]   1                 1
    service_lines[0][23]  23                23
    service_lines[1][1]   264               314
    service_lines[1][23]  286               336

Drivers must set service_lines[0][0] and service_lines[1][0] to zero.
__u32 io_size Maximum number of bytes passed by one read() or write() call, and the buffer size in bytes for the VIDIOC_QBUF and VIDIOC_DQBUF ioctl. Drivers set this field to the size of struct v4l2_sliced_vbi_data times the number of non-zero elements in the returned service_lines array (that is the number of lines potentially carrying data).
__u32 reserved[2] This array is reserved for future extensions. Applications and drivers must set it to zero.
Notes:
a. According to ETS 300 706 lines 6-22 of the first field and lines 5-22 of the second field may carry Teletext data.

Table 4-7. Sliced VBI services

Symbol (Value) -- Reference -- Lines, usually -- Payload

V4L2_SLICED_TELETEXT_B, Teletext System B (0x0001) -- ETS 300 706, ITU BT.653 -- PAL/SECAM line 7-22, 320-335 (second field 7-22) -- Last 42 of the 45 byte Teletext packet, that is without clock run-in and framing code, lsb first transmitted.

V4L2_SLICED_VPS (0x0400) -- ETS 300 231 -- PAL line 16 -- Byte number 3 to 15 according to Figure 9 of ETS 300 231, lsb first transmitted.

V4L2_SLICED_CAPTION_525 (0x1000) -- EIA 608-B -- NTSC line 21, 284 (second field 21) -- Two bytes in transmission order, including parity bit, lsb first transmitted.

V4L2_SLICED_WSS_625 (0x4000) -- ITU BT.1119, EN 300 294 -- PAL/SECAM line 23 -- Payload bit layout:

    Byte         0                 1
          msb         lsb  msb           lsb
     Bit  7 6 5 4 3 2 1 0  x x 13 12 11 10 9

V4L2_SLICED_VBI_525 (0x1000) -- Set of services applicable to 525 line systems.
V4L2_SLICED_VBI_625 (0x4401) -- Set of services applicable to 625 line systems.

Drivers may return an EINVAL error code when applications attempt to read or write data without prior format negotiation, after switching the video standard (which may invalidate the negotiated VBI parameters) and after switching the video input (which may change the video standard as a side effect). The VIDIOC_S_FMT ioctl may return an EBUSY error code when applications attempt to change the format while I/O is in progress (between a VIDIOC_STREAMON and VIDIOC_STREAMOFF call, and after the first read() or write() call).


4.8.4. Reading and writing sliced VBI data

A single read() or write() call must pass all data belonging to one video frame. That is an array of v4l2_sliced_vbi_data structures with one or more elements and a total size not exceeding io_size bytes. Likewise in streaming I/O mode one buffer of io_size bytes must contain data of one video frame. The id of unused v4l2_sliced_vbi_data elements must be zero.

Table 4-8. struct v4l2_sliced_vbi_data

__u32 id A flag from Table 4-7 identifying the type of data in this packet. Only a single bit must be set. When the id of a captured packet is zero, the packet is empty and the contents of other fields are undefined. Applications shall ignore empty packets. When the id of a packet for output is zero the contents of the data field are undefined and the driver must no longer insert data on the requested field and line.
__u32 field The video field number this data has been captured from, or shall be inserted at. 0 for the first field, 1 for the second field.
__u32 line The field (as opposed to frame) line number this data has been captured from, or shall be inserted at. See Figure 4-2 and Figure 4-3 for valid values. Sliced VBI capture devices can set the line number of all packets to 0 if the hardware cannot reliably identify scan lines. The field number must always be valid.
__u32 reserved This field is reserved for future extensions. Applications and drivers must set it to zero.
__u8 data[48] The packet payload. See Table 4-7 for the contents and number of bytes passed for each data type. The contents of padding bytes at the end of this array are undefined, drivers and applications shall ignore them.

Packets are always passed in ascending line number order, without duplicate line numbers. The write() function and the VIDIOC_QBUF ioctl must return an EINVAL error code when applications violate this rule. They must also return an EINVAL error code when applications pass an incorrect field or line number, or a combination of field, line and id which has not been negotiated with the VIDIOC_G_FMT or VIDIOC_S_FMT ioctl. When the line numbers are unknown the driver must pass the packets in transmitted order. The driver can insert empty packets with id set to zero anywhere in the packet array.

To assure synchronization and to distinguish from frame dropping, when a captured frame does not carry any of the requested data services drivers must pass one or more empty packets. When an application fails to pass VBI data in time for output, the driver must output the last VPS and WSS packet again, and disable the output of Closed Caption and Teletext data, or output data which is ignored by Closed Caption and Teletext decoders.

A sliced VBI device may support read/write and/or streaming (memory mapping and/or user pointer) I/O. The latter bears the possibility of synchronizing video and VBI data by using buffer timestamps.


4.9. Teletext Interface

This interface aims at devices receiving and demodulating Teletext data [ETS 300 706, ITU BT.653], evaluating the Teletext packages and storing formatted pages in cache memory. Such devices are usually implemented as microcontrollers with serial interface (I2C) and can be found on older TV cards, dedicated Teletext decoding cards and home-brew devices connected to the PC parallel port.

The Teletext API was designed by Martin Buck. It is defined in the kernel header file linux/videotext.h, the specification is available from http://home.pages.de/~videotext/. (Videotext is the name of the German public television Teletext service.) Conventional character device file names are /dev/vtx and /dev/vttuner, with device number 83, 0 and 83, 16 respectively. A similar interface exists for the Philips SAA5249 Teletext decoder [specification?] with character device file names /dev/tlkN, device number 102, N.

Eventually the Teletext API was integrated into the V4L API with character device file names /dev/vtx0 to /dev/vtx31, device major number 81, minor numbers 192 to 223. For reference the V4L Teletext API specification is reproduced here in full: "Teletext interfaces talk the existing VTX API." Teletext devices with major number 83 and 102 will be removed in Linux 2.6.

There are no plans to replace the Teletext API or to integrate it into V4L2. Please write to the Video4Linux mailing list when the need arises: https://listman.redhat.com/mailman/listinfo/video4linux-list


4.10. Radio Interface

This interface is intended for AM and FM (analog) radioreceivers.

Conventionally V4L2 radio devices are accessed through character device special files named /dev/radio and /dev/radio0 to /dev/radio63 with major number 81 and minor numbers 64 to 127.


4.10.1. Querying Capabilities

Devices supporting the radio interface set the V4L2_CAP_RADIO and V4L2_CAP_TUNER flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. Other combinations of capability flags are reserved for future extensions.


4.10.2. Supplemental Functions

Radio devices can support controls, and must support the tuner ioctls.

They do not support the video input or output, audio inputor output, video standard, cropping and scaling, compression andstreaming parameter, or overlay ioctls. All other ioctls and I/Omethods are reserved for future extensions.


4.10.3. Programming

Radio devices may have a couple of audio controls (as discussed in Section 1.8) such as a volume control, possibly custom controls. Further all radio devices have one tuner (these are discussed in Section 1.6) with index number zero to select the radio frequency and to determine if a monaural or FM stereo program is received. Drivers switch automatically between AM and FM depending on the selected frequency. The VIDIOC_G_TUNER ioctl reports the supported frequency range.


4.11. RDS Interface

The Radio Data System transmits supplementary information in binary format, for example the station name or travel information, on an inaudible audio subcarrier of a radio program. This interface aims at devices capable of receiving and decoding RDS information.

The V4L API defines its RDS API as follows.

From radio devices supporting it, RDS data can be read with the read() function. The data is packed in groups of three, as follows:

  1. First Octet Least Significant Byte of RDS Block

  2. Second Octet Most Significant Byte of RDS Block

  3. Third Octet Bit 7: Error bit. Indicates that an uncorrectable error occurred during reception of this block. Bit 6: Corrected bit. Indicates that an error was corrected for this data block. Bits 5-3: Received Offset. Indicates the offset received by the sync system. Bits 2-0: Offset Name. Indicates the offset applied to this data.

It was argued the RDS API should be extended before integration into V4L2, no new API has been devised yet. Please write to the Video4Linux mailing list for discussion: https://listman.redhat.com/mailman/listinfo/video4linux-list. Meanwhile no V4L2 driver should set the V4L2_CAP_RDS_CAPTURE capability flag.

I. Function Reference

Table of Contents
V4L2 close() -- Close a V4L2 device
V4L2 ioctl() -- Program a V4L2 device
ioctl VIDIOC_CROPCAP -- Information about the video cropping and scaling abilities
ioctl VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER -- Read or write hardware registers
ioctl VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD -- Execute an encoder command
ioctl VIDIOC_ENUMAUDIO -- Enumerate audio inputs
ioctl VIDIOC_ENUMAUDOUT -- Enumerate audio outputs
ioctl VIDIOC_ENUM_FMT -- Enumerate image formats
ioctl VIDIOC_ENUM_FRAMESIZES -- Enumerate frame sizes
ioctl VIDIOC_ENUM_FRAMEINTERVALS -- Enumerate frame intervals
ioctl VIDIOC_ENUMINPUT -- Enumerate video inputs
ioctl VIDIOC_ENUMOUTPUT -- Enumerate video outputs
ioctl VIDIOC_ENUMSTD -- Enumerate supported video standards
ioctl VIDIOC_G_AUDIO, VIDIOC_S_AUDIO -- Query or select the current audio input and its attributes
ioctl VIDIOC_G_AUDOUT, VIDIOC_S_AUDOUT -- Query or select the current audio output
ioctl VIDIOC_G_CHIP_IDENT -- Identify the chips on a TV card
ioctl VIDIOC_G_CROP, VIDIOC_S_CROP -- Get or set the current cropping rectangle
ioctl VIDIOC_G_CTRL, VIDIOC_S_CTRL -- Get or set the value of a control
ioctl VIDIOC_G_ENC_INDEX -- Get meta data about a compressed video stream
ioctl VIDIOC_G_EXT_CTRLS, VIDIOC_S_EXT_CTRLS, VIDIOC_TRY_EXT_CTRLS -- Get or set the value of several controls, try control values
ioctl VIDIOC_G_FBUF, VIDIOC_S_FBUF -- Get or set frame buffer overlay parameters
ioctl VIDIOC_G_FMT, VIDIOC_S_FMT, VIDIOC_TRY_FMT -- Get or set the data format, try a format
ioctl VIDIOC_G_FREQUENCY, VIDIOC_S_FREQUENCY -- Get or set tuner or modulator radio frequency
ioctl VIDIOC_G_INPUT, VIDIOC_S_INPUT -- Query or select the current video input
ioctl VIDIOC_G_JPEGCOMP, VIDIOC_S_JPEGCOMP --
ioctl VIDIOC_G_MODULATOR, VIDIOC_S_MODULATOR -- Get or set modulator attributes
ioctl VIDIOC_G_OUTPUT, VIDIOC_S_OUTPUT -- Query or select the current video output
ioctl VIDIOC_G_PARM, VIDIOC_S_PARM -- Get or set streaming parameters
ioctl VIDIOC_G_PRIORITY, VIDIOC_S_PRIORITY -- Query or request the access priority associated with a file descriptor
ioctl VIDIOC_G_SLICED_VBI_CAP -- Query sliced VBI capabilities
ioctl VIDIOC_G_STD, VIDIOC_S_STD -- Query or select the video standard of the current input
ioctl VIDIOC_G_TUNER, VIDIOC_S_TUNER -- Get or set tuner attributes
ioctl VIDIOC_LOG_STATUS -- Log driver status information
ioctl VIDIOC_OVERLAY -- Start or stop video overlay
ioctl VIDIOC_QBUF, VIDIOC_DQBUF -- Exchange a buffer with the driver
ioctl VIDIOC_QUERYBUF -- Query the status of a buffer
ioctl VIDIOC_QUERYCAP -- Query device capabilities
ioctl VIDIOC_QUERYCTRL, VIDIOC_QUERYMENU -- Enumerate controls and menu control items
ioctl VIDIOC_QUERYSTD -- Sense the video standard received by the current input
ioctl VIDIOC_REQBUFS -- Initiate Memory Mapping or User Pointer I/O
ioctl VIDIOC_STREAMON, VIDIOC_STREAMOFF -- Start or stop streaming I/O
V4L2 mmap() -- Map device memory into application address space
V4L2 munmap() -- Unmap device memory
V4L2 open() -- Open a V4L2 device
V4L2 poll() -- Wait for some event on a file descriptor
V4L2 read() -- Read from a V4L2 device
V4L2 select() -- Synchronous I/O multiplexing
V4L2 write() -- Write to a V4L2 device

V4L2 close()

Name

v4l2-close -- Close a V4L2 device

Synopsis

#include <unistd.h>

int close(int fd);

Arguments

fd

File descriptor returned by open().

Description
