Video for Linux Two API Specification

Revision 0.24

Michael H Schimek

Bill Dirks

Hans Verkuil

Martin Rubli

This document is copyrighted © 1999-2008 by Bill Dirks, Michael H. Schimek, Hans Verkuil and Martin Rubli.

Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.1 or any later version published by the Free Software Foundation; with no Invariant Sections, with no Front-Cover Texts, and with no Back-Cover Texts. A copy of the license is included in the appendix entitled "GNU Free Documentation License".

Programming examples can be used and distributed without restrictions.


Table of Contents

Introduction

1.  Common API Elements
1.1.  Opening and Closing Devices
1.1.1.  Device Naming
1.1.2.  Related Devices
1.1.3.  Multiple Opens
1.1.4.  Shared Data Streams
1.1.5.  Functions
1.2.  Querying Capabilities
1.3.  Application Priority
1.4.  Video Inputs and Outputs
1.5.  Audio Inputs and Outputs
1.6.  Tuners and Modulators
1.6.1.  Tuners
1.6.2.  Modulators
1.6.3.  Radio Frequency
1.6.4.  Satellite Receivers
1.7.  Video Standards
1.8.  User Controls
1.9.  Extended Controls
1.9.1.  Introduction
1.9.2.  The Extended Control API
1.9.3.  Enumerating Extended Controls
1.9.4.  Creating Control Panels
1.9.5.  MPEG Control Reference
1.9.6.  Camera Control Reference
1.10.  Data Formats
1.10.1.  Data Format Negotiation
1.10.2.  Image Format Enumeration
1.11.  Image Cropping, Insertion and Scaling
1.11.1.  Cropping Structures
1.11.2.  Scaling Adjustments
1.11.3.  Examples
1.12.  Streaming Parameters
2.  Image Formats
2.1.  Standard Image Formats
2.2.  Colorspaces
2.3.  Indexed Format
2.4.  RGB Formats
Packed RGB formats -- Packed RGB formats
V4L2_PIX_FMT_SBGGR8 ('BA81') -- Bayer RGB format
V4L2_PIX_FMT_SBGGR16 ('BA82') -- Bayer RGB format
2.5.  YUV Formats
Packed YUV formats -- Packed YUV formats
V4L2_PIX_FMT_GREY ('GREY') -- Grey-scale image
V4L2_PIX_FMT_Y16 ('Y16 ') -- Grey-scale image
V4L2_PIX_FMT_YUYV ('YUYV') -- Packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2
V4L2_PIX_FMT_UYVY ('UYVY') -- Variation of V4L2_PIX_FMT_YUYV with different order of samples in memory
V4L2_PIX_FMT_Y41P ('Y41P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1
V4L2_PIX_FMT_YVU420 ('YV12'), V4L2_PIX_FMT_YUV420 ('YU12') -- Planar formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0
V4L2_PIX_FMT_YVU410 ('YVU9'), V4L2_PIX_FMT_YUV410 ('YUV9') -- Planar formats with ¼ horizontal and vertical chroma resolution, also known as YUV 4:1:0
V4L2_PIX_FMT_YUV422P ('422P') -- Format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Planar layout as opposed to V4L2_PIX_FMT_YUYV
V4L2_PIX_FMT_YUV411P ('411P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1. Planar layout as opposed to V4L2_PIX_FMT_Y41P
V4L2_PIX_FMT_NV12 ('NV12'), V4L2_PIX_FMT_NV21 ('NV21') -- Formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0. One luminance and one chrominance plane with alternating chroma samples as opposed to V4L2_PIX_FMT_YVU420
2.6.  Compressed Formats
2.7.  Reserved Format Identifiers
3.  Input/Output
3.1.  Read/Write
3.2.  Streaming I/O (Memory Mapping)
3.3.  Streaming I/O (User Pointers)
3.4.  Asynchronous I/O
3.5.  Buffers
3.5.1.  Timecodes
3.6.  Field Order
4.  Interfaces
4.1.  Video Capture Interface
4.1.1.  Querying Capabilities
4.1.2.  Supplemental Functions
4.1.3.  Image Format Negotiation
4.1.4.  Reading Images
4.2.  Video Overlay Interface
4.2.1.  Querying Capabilities
4.2.2.  Supplemental Functions
4.2.3.  Setup
4.2.4.  Overlay Window
4.2.5.  Enabling Overlay
4.3.  Video Output Interface
4.3.1.  Querying Capabilities
4.3.2.  Supplemental Functions
4.3.3.  Image Format Negotiation
4.3.4.  Writing Images
4.4.  Video Output Overlay Interface
4.4.1.  Querying Capabilities
4.4.2.  Framebuffer
4.4.3.  Overlay Window and Scaling
4.4.4.  Enabling Overlay
4.5.  Codec Interface
4.6.  Effect Devices Interface
4.7.  Raw VBI Data Interface
4.7.1.  Querying Capabilities
4.7.2.  Supplemental Functions
4.7.3.  Raw VBI Format Negotiation
4.7.4.  Reading and writing VBI images
4.8.  Sliced VBI Data Interface
4.8.1.  Querying Capabilities
4.8.2.  Supplemental Functions
4.8.3.  Sliced VBI Format Negotiation
4.8.4.  Reading and writing sliced VBI data
4.9.  Teletext Interface
4.10.  Radio Interface
4.10.1.  Querying Capabilities
4.10.2.  Supplemental Functions
4.10.3.  Programming
4.11.  RDS Interface
I.  Function Reference
V4L2 close() -- Close a V4L2 device
V4L2 ioctl() -- Program a V4L2 device
ioctl VIDIOC_CROPCAP -- Information about the video cropping and scaling abilities
ioctl VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER -- Read or write hardware registers
ioctl VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD -- Execute an encoder command
ioctl VIDIOC_ENUMAUDIO -- Enumerate audio inputs
ioctl VIDIOC_ENUMAUDOUT -- Enumerate audio outputs
ioctl VIDIOC_ENUM_FMT -- Enumerate image formats
ioctl VIDIOC_ENUM_FRAMESIZES -- Enumerate frame sizes
ioctl VIDIOC_ENUM_FRAMEINTERVALS -- Enumerate frame intervals
ioctl VIDIOC_ENUMINPUT -- Enumerate video inputs
ioctl VIDIOC_ENUMOUTPUT -- Enumerate video outputs
ioctl VIDIOC_ENUMSTD -- Enumerate supported video standards
ioctl VIDIOC_G_AUDIO, VIDIOC_S_AUDIO -- Query or select the current audio input and its attributes
ioctl VIDIOC_G_AUDOUT, VIDIOC_S_AUDOUT -- Query or select the current audio output
ioctl VIDIOC_G_CHIP_IDENT -- Identify the chips on a TV card
ioctl VIDIOC_G_CROP, VIDIOC_S_CROP -- Get or set the current cropping rectangle
ioctl VIDIOC_G_CTRL, VIDIOC_S_CTRL -- Get or set the value of a control
ioctl VIDIOC_G_ENC_INDEX -- Get meta data about a compressed video stream
ioctl VIDIOC_G_EXT_CTRLS, VIDIOC_S_EXT_CTRLS, VIDIOC_TRY_EXT_CTRLS -- Get or set the value of several controls, try control values
ioctl VIDIOC_G_FBUF, VIDIOC_S_FBUF -- Get or set frame buffer overlay parameters
ioctl VIDIOC_G_FMT, VIDIOC_S_FMT, VIDIOC_TRY_FMT -- Get or set the data format, try a format
ioctl VIDIOC_G_FREQUENCY, VIDIOC_S_FREQUENCY -- Get or set tuner or modulator radio frequency
ioctl VIDIOC_G_INPUT, VIDIOC_S_INPUT -- Query or select the current video input
ioctl VIDIOC_G_JPEGCOMP, VIDIOC_S_JPEGCOMP -- 
ioctl VIDIOC_G_MODULATOR, VIDIOC_S_MODULATOR -- Get or set modulator attributes
ioctl VIDIOC_G_OUTPUT, VIDIOC_S_OUTPUT -- Query or select the current video output
ioctl VIDIOC_G_PARM, VIDIOC_S_PARM -- Get or set streaming parameters
ioctl VIDIOC_G_PRIORITY, VIDIOC_S_PRIORITY -- Query or request the access priority associated with a file descriptor
ioctl VIDIOC_G_SLICED_VBI_CAP -- Query sliced VBI capabilities
ioctl VIDIOC_G_STD, VIDIOC_S_STD -- Query or select the video standard of the current input
ioctl VIDIOC_G_TUNER, VIDIOC_S_TUNER -- Get or set tuner attributes
ioctl VIDIOC_LOG_STATUS -- Log driver status information
ioctl VIDIOC_OVERLAY -- Start or stop video overlay
ioctl VIDIOC_QBUF, VIDIOC_DQBUF -- Exchange a buffer with the driver
ioctl VIDIOC_QUERYBUF -- Query the status of a buffer
ioctl VIDIOC_QUERYCAP -- Query device capabilities
ioctl VIDIOC_QUERYCTRL, VIDIOC_QUERYMENU -- Enumerate controls and menu control items
ioctl VIDIOC_QUERYSTD -- Sense the video standard received by the current input
ioctl VIDIOC_REQBUFS -- Initiate Memory Mapping or User Pointer I/O
ioctl VIDIOC_STREAMON, VIDIOC_STREAMOFF -- Start or stop streaming I/O
V4L2 mmap() -- Map device memory into application address space
V4L2 munmap() -- Unmap device memory
V4L2 open() -- Open a V4L2 device
V4L2 poll() -- Wait for some event on a file descriptor
V4L2 read() -- Read from a V4L2 device
V4L2 select() -- Synchronous I/O multiplexing
V4L2 write() -- Write to a V4L2 device
5.  V4L2 Driver Programming
6.  Changes
6.1.  Differences between V4L and V4L2
6.1.1.  Opening and Closing Devices
6.1.2.  Querying Capabilities
6.1.3.  Video Sources
6.1.4.  Tuning
6.1.5.  Image Properties
6.1.6.  Audio
6.1.7.  Frame Buffer Overlay
6.1.8.  Cropping
6.1.9.  Reading Images, Memory Mapping
6.1.10.  Reading Raw VBI Data
6.1.11.  Miscellaneous
6.2.  Changes of the V4L2 API
6.2.1.  Early Versions
6.2.2.  V4L2 Version 0.16 1999-01-31
6.2.3.  V4L2 Version 0.18 1999-03-16
6.2.4.  V4L2 Version 0.19 1999-06-05
6.2.5.  V4L2 Version 0.20 (1999-09-10)
6.2.6.  V4L2 Version 0.20 incremental changes
6.2.7.  V4L2 Version 0.20 2000-11-23
6.2.8.  V4L2 Version 0.20 2002-07-25
6.2.9.  V4L2 in Linux 2.5.46, 2002-10
6.2.10.  V4L2 2003-06-19
6.2.11.  V4L2 2003-11-05
6.2.12.  V4L2 in Linux 2.6.6, 2004-05-09
6.2.13.  V4L2 in Linux 2.6.8
6.2.14.  V4L2 spec erratum 2004-08-01
6.2.15.  V4L2 in Linux 2.6.14
6.2.16.  V4L2 in Linux 2.6.15
6.2.17.  V4L2 spec erratum 2005-11-27
6.2.18.  V4L2 spec erratum 2006-01-10
6.2.19.  V4L2 spec erratum 2006-02-03
6.2.20.  V4L2 spec erratum 2006-02-04
6.2.21.  V4L2 in Linux 2.6.17
6.2.22.  V4L2 spec erratum 2006-09-23 (Draft 0.15)
6.2.23.  V4L2 in Linux 2.6.18
6.2.24.  V4L2 in Linux 2.6.19
6.2.25.  V4L2 spec erratum 2006-10-12 (Draft 0.17)
6.2.26.  V4L2 in Linux 2.6.21
6.2.27.  V4L2 in Linux 2.6.22
6.2.28.  V4L2 in Linux 2.6.24
6.2.29.  V4L2 in Linux 2.6.25
6.3.  Relation of V4L2 to other Linux multimedia APIs
6.3.1.  X Video Extension
6.3.2.  Digital Video
6.3.3.  Audio Interfaces
6.4.  Experimental API Elements
6.5.  Obsolete API Elements
A.  Video For Linux Two Header File
B.  Video Capture Example
C.  GNU Free Documentation License
C.1.  0. PREAMBLE
C.2.  1. APPLICABILITY AND DEFINITIONS
C.3.  2. VERBATIM COPYING
C.4.  3. COPYING IN QUANTITY
C.5.  4. MODIFICATIONS
C.6.  5. COMBINING DOCUMENTS
C.7.  6. COLLECTIONS OF DOCUMENTS
C.8.  7. AGGREGATION WITH INDEPENDENT WORKS
C.9.  8. TRANSLATION
C.10.  9. TERMINATION
C.11.  10. FUTURE REVISIONS OF THIS LICENSE
C.12.  Addendum
List of Types
References
List of Figures
1-1.  Image Cropping, Insertion and Scaling
3-1.  Field Order, Top Field First Transmitted
3-2.  Field Order, Bottom Field First Transmitted
4-1.  Line synchronization
4-2.  ITU-R 525 line numbering (M/NTSC and M/PAL)
4-3.  ITU-R 625 line numbering
List of Examples
1-1.  Information about the current video input
1-2.  Switching to the first video input
1-3.  Information about the current audio input
1-4.  Switching to the first audio input
1-5.  Information about the current video standard
1-6.  Listing the video standards supported by the current input
1-7.  Selecting a new video standard
1-8.  Enumerating all controls
1-9.  Changing controls
1-10.  Resetting the cropping parameters
1-11.  Simple downscaling
1-12.  Selecting an output area
1-13.  Current scaling factor and pixel aspect
2-1.  ITU-R Rec. BT.601 color conversion
2-1.  V4L2_PIX_FMT_BGR24 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_SBGGR8 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_SBGGR16 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_GREY 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_Y16 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_YUYV 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_UYVY 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_Y41P 8 × 4 pixel image
2-1.  V4L2_PIX_FMT_YVU420 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_YVU410 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_YUV422P 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_YUV411P 4 × 4 pixel image
2-1.  V4L2_PIX_FMT_NV12 4 × 4 pixel image
3-1.  Mapping buffers
3-2.  Initiating streaming I/O with user pointers
4-1.  Finding a framebuffer device for OSD

Introduction

Video For Linux Two is the second version of the Video For Linux API, a kernel interface for analog radio and video capture and output drivers.

Early drivers used ad-hoc interfaces. These were replaced in Linux 2.2 by Alan Cox' V4L API, based on the interface of the bttv driver. In 1999 Bill Dirks started the development of V4L2 to fix some shortcomings of V4L and to support a wider range of devices. The API was revised again in 2002 prior to its inclusion in Linux 2.5/2.6, and work continues on improvements and additions while maintaining compatibility with existing drivers and applications. In 2006/2007 efforts began on FreeBSD drivers with a V4L2 interface.

This book documents the V4L2 API. The intended audience is driver and application writers.

If you have questions or ideas regarding the API, please write to the Video4Linux mailing list: https://listman.redhat.com/mailman/listinfo/video4linux-list. For inquiries about the V4L2 specification contact [email protected].

The latest version of this document and the DocBook SGML sources are hosted at http://v4l2spec.bytesex.org, and http://linuxtv.org/downloads/video4linux/API/V4L2_API.


Chapter 1. Common API Elements

Programming a V4L2 device consists of these steps:

  • Opening the device

  • Changing device properties, selecting a video and audio input, video standard, picture brightness, and so on

  • Negotiating a data format

  • Negotiating an input/output method

  • The actual input/output loop

  • Closing the device

In practice most steps are optional and can be executed out of order. It depends on the V4L2 device type; you can read about the details in Chapter 4. In this chapter we will discuss the basic concepts applicable to all devices.


1.1. Opening and Closing Devices

1.1.1. Device Naming

V4L2 drivers are implemented as kernel modules, loaded manually by the system administrator or automatically when a device is first opened. The driver modules plug into the "videodev" kernel module. It provides helper functions and a common application interface specified in this document.

Each driver thus loaded registers one or more device nodes with major number 81 and a minor number between 0 and 255. Assigning minor numbers to V4L2 devices is entirely up to the system administrator; this is primarily intended to solve conflicts between devices.[1] The module options to select minor numbers are named after the device special file with a "_nr" suffix. For example "video_nr" for /dev/video video capture devices. The number is an offset to the base minor number associated with the device type.[2] When the driver supports multiple devices of the same type more than one minor number can be assigned, separated by commas:

> insmod mydriver.o video_nr=0,1 radio_nr=0,1

In /etc/modules.conf this may be written as:

alias char-major-81-0 mydriver
alias char-major-81-1 mydriver
alias char-major-81-64 mydriver
options mydriver video_nr=0,1 radio_nr=0,1

When an application attempts to open a device special file with major number 81 and minor number 0, 1, or 64, load "mydriver" (and the "videodev" module it depends upon).
Register the first two video capture devices with minor numbers 0 and 1 (base number is 0), the first two radio devices with minor numbers 64 and 65 (base 64).
When no minor number is given as module option the driver supplies a default. Chapter 4 recommends the base minor numbers to be used for the various device types. Obviously minor numbers must be unique. When the number is already in use the offending device will not be registered.

By convention system administrators create various character device special files with these major and minor numbers in the /dev directory. The names recommended for the different V4L2 device types are listed in Chapter 4.

The creation of character special files (with mknod) is a privileged operation and devices cannot be opened by major and minor number. That means applications cannot reliably scan for loaded or installed drivers. The user must enter a device name, or the application can try the conventional device names.

Under the device filesystem (devfs) the minor number options are ignored. V4L2 drivers (or by proxy the "videodev" module) automatically create the required device files in the /dev/v4l directory using the conventional device names above.


1.1.2. Related Devices

Devices can support several related functions. For example video capturing, video overlay and VBI capturing are related because these functions share, amongst others, the same video input and tuner frequency. V4L and earlier versions of V4L2 used the same device name and minor number for video capturing and overlay, but different ones for VBI. Experience showed this approach has several problems[3], and to make things worse the V4L videodev module used to prohibit multiple opens of a device.

As a remedy the present version of the V4L2 API relaxed the concept of device types with specific names and minor numbers. For compatibility with old applications drivers must still register different minor numbers to assign a default function to the device. But if related functions are supported by the driver they must be available under all registered minor numbers. The desired function can be selected after opening the device as described in Chapter 4.

Imagine a driver supporting video capturing, video overlay, raw VBI capturing, and FM radio reception. It registers three devices with minor numbers 0, 64 and 224 (this numbering scheme is inherited from the V4L API). Regardless of whether /dev/video (81, 0) or /dev/vbi (81, 224) is opened, the application can select any one of the video capturing, overlay or VBI capturing functions. Without programming (e. g. reading from the device with dd or cat), /dev/video captures video images, while /dev/vbi captures raw VBI data. /dev/radio (81, 64) is invariably a radio device, unrelated to the video functions. Being unrelated does not imply the devices can be used at the same time, however. The open() function may very well return an EBUSY error code.

Besides video input or output the hardware may also support audio sampling or playback. If so, these functions are implemented as OSS or ALSA PCM devices and possibly an OSS or ALSA audio mixer. The V4L2 API makes no provisions yet to find these related devices. If you have an idea please write to the Video4Linux mailing list: https://listman.redhat.com/mailman/listinfo/video4linux-list.


1.1.3. Multiple Opens

In general, V4L2 devices can be opened more than once. When this is supported by the driver, users can for example start a "panel" application to change controls like brightness or audio volume, while another application captures video and audio. In other words, panel applications are comparable to an OSS or ALSA audio mixer application. When a device supports multiple functions like capturing and overlay simultaneously, multiple opens allow concurrent use of the device by forked processes or specialized applications.

Multiple opens are optional, although drivers should permit at least concurrent accesses without data exchange, i. e. panel applications. This implies open() can return an EBUSY error code when the device is already in use, as well as ioctl() functions initiating data exchange (namely the VIDIOC_S_FMT ioctl), and the read() and write() functions.

Merely opening a V4L2 device does not grant exclusive access.[4] Initiating data exchange however assigns the right to read or write the requested type of data, and to change related properties, to this file descriptor. Applications can request additional access privileges using the priority mechanism described in Section 1.3.


1.1.4. Shared Data Streams

V4L2 drivers should not support multiple applications reading or writing the same data stream on a device by copying buffers, time multiplexing or similar means. This is better handled by a proxy application in user space. When the driver supports stream sharing anyway it must be implemented transparently. The V4L2 API does not specify how conflicts are solved.


1.1.5. Functions

To open and close V4L2 devices applications use the open() and close() functions, respectively. Devices are programmed using the ioctl() function as explained in the following sections.
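
For illustration, a minimal sketch of this sequence, assuming a video capture device registered as /dev/video, the usual headers (<fcntl.h>, <unistd.h>, <stdio.h>, <stdlib.h>), and omitting the intermediate programming steps:

int fd;

fd = open ("/dev/video", O_RDWR);

if (-1 == fd) {
        perror ("open");
        exit (EXIT_FAILURE);
}

/* ... program the device with ioctl() as described below ... */

if (-1 == close (fd)) {
        perror ("close");
        exit (EXIT_FAILURE);
}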


1.2. Querying Capabilities

Because V4L2 covers a wide variety of devices not all aspects of the API are equally applicable to all types of devices. Furthermore devices of the same type have different capabilities and this specification permits the omission of a few complicated and less important parts of the API.

The VIDIOC_QUERYCAP ioctl is available to check if the kernel device is compatible with this specification, and to query the functions and I/O methods supported by the device. Other features can be queried by calling the respective ioctl, for example VIDIOC_ENUMINPUT to learn about the number, types and names of video connectors on the device. Although abstraction is a major objective of this API, the ioctl also allows driver specific applications to reliably identify the driver.

All V4L2 drivers must support VIDIOC_QUERYCAP. Applications should always call this ioctl after opening the device.
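
For illustration, a minimal sketch of querying the capabilities and checking for video capture support, assuming fd is an open V4L2 device and error handling is abbreviated:

struct v4l2_capability cap;

memset (&cap, 0, sizeof (cap));

if (-1 == ioctl (fd, VIDIOC_QUERYCAP, &cap)) {
        perror ("VIDIOC_QUERYCAP");
        exit (EXIT_FAILURE);
}

printf ("Driver: %s, card: %s\n", cap.driver, cap.card);

if (!(cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)) {
        fprintf (stderr, "Device does not support video capture\n");
        exit (EXIT_FAILURE);
}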


1.3. Application Priority

When multiple applications share a device it may be desirable to assign them different priorities. Contrary to the traditional "rm -rf /" school of thought a video recording application could for example block other applications from changing video controls or switching the current TV channel. Another objective is to permit low priority applications working in the background, which can be preempted by user controlled applications and automatically regain control of the device at a later time.

Since these features cannot be implemented entirely in user space V4L2 defines the VIDIOC_G_PRIORITY and VIDIOC_S_PRIORITY ioctls to request and query the access priority associated with a file descriptor. Opening a device assigns a medium priority, compatible with earlier versions of V4L2 and drivers not supporting these ioctls. Applications requiring a different priority will usually call VIDIOC_S_PRIORITY after verifying the device with the VIDIOC_QUERYCAP ioctl.

Ioctls changing driver properties, such as VIDIOC_S_INPUT, return an EBUSY error code after another application has obtained higher priority. An event mechanism to notify applications about asynchronous property changes has been proposed but not added yet.
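
For illustration, a minimal sketch of raising the priority of a file descriptor to the record level, assuming fd is an open V4L2 device:

enum v4l2_priority prio;

if (-1 == ioctl (fd, VIDIOC_G_PRIORITY, &prio)) {
        perror ("VIDIOC_G_PRIORITY");
        exit (EXIT_FAILURE);
}

prio = V4L2_PRIORITY_RECORD;

if (-1 == ioctl (fd, VIDIOC_S_PRIORITY, &prio)) {
        perror ("VIDIOC_S_PRIORITY");
        exit (EXIT_FAILURE);
}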


1.4. Video Inputs and Outputs

Video inputs and outputs are physical connectors of a device. These can be for example RF connectors (antenna/cable), CVBS a.k.a. Composite Video, S-Video or RGB connectors. Only video and VBI capture devices have inputs; output devices have outputs, at least one each. Radio devices have no video inputs or outputs.

To learn about the number and attributes of the available inputs and outputs applications can enumerate them with the VIDIOC_ENUMINPUT and VIDIOC_ENUMOUTPUT ioctl, respectively. The struct v4l2_input returned by the VIDIOC_ENUMINPUT ioctl also contains signal status information applicable when the current video input is queried.

The VIDIOC_G_INPUT and VIDIOC_G_OUTPUT ioctl return the index of the current video input or output. To select a different input or output applications call the VIDIOC_S_INPUT and VIDIOC_S_OUTPUT ioctl. Drivers must implement all the input ioctls when the device has one or more inputs, all the output ioctls when the device has one or more outputs.

Example 1-1. Information about the current video input

struct v4l2_input input;
int index;

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &index)) {
        perror ("VIDIOC_G_INPUT");
        exit (EXIT_FAILURE);
}

memset (&input, 0, sizeof (input));
input.index = index;

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
        perror ("VIDIOC_ENUMINPUT");
        exit (EXIT_FAILURE);
}

printf ("Current input: %s\n", input.name);
      

Example 1-2. Switching to the first video input

int index;

index = 0;

if (-1 == ioctl (fd, VIDIOC_S_INPUT, &index)) {
        perror ("VIDIOC_S_INPUT");
        exit (EXIT_FAILURE);
}
      

1.5. Audio Inputs and Outputs

Audio inputs and outputs are physical connectors of a device. Video capture devices have inputs, output devices have outputs, zero or more each. Radio devices have no audio inputs or outputs. They have exactly one tuner which in fact is an audio source, but this API associates tuners with video inputs or outputs only, and radio devices have none of these.[5] A connector on a TV card to loop back the received audio signal to a sound card is not considered an audio output.

Audio and video inputs and outputs are associated. Selecting a video source also selects an audio source. This is most evident when the video and audio source is a tuner. Further audio connectors can combine with more than one video input or output. Assuming two composite video inputs and two audio inputs exist, there may be up to four valid combinations. The relation of video and audio connectors is defined in the audioset field of the respective struct v4l2_input or struct v4l2_output, where each bit represents the index number, starting at zero, of one audio input or output.
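
For illustration, a minimal sketch of testing whether the first audio input combines with the current video input, assuming input has been filled in with VIDIOC_ENUMINPUT as in Example 1-1:

struct v4l2_audio audio;

memset (&audio, 0, sizeof (audio));
audio.index = 0; /* the first audio input */

if (input.audioset & (1 << audio.index)) {
        /* Valid combination, may be selected with VIDIOC_S_AUDIO. */
        if (-1 == ioctl (fd, VIDIOC_S_AUDIO, &audio)) {
                perror ("VIDIOC_S_AUDIO");
                exit (EXIT_FAILURE);
        }
} else {
        /* This audio input does not combine with the current video input. */
}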

To learn about the number and attributes of the available inputs and outputs applications can enumerate them with the VIDIOC_ENUMAUDIO and VIDIOC_ENUMAUDOUT ioctl, respectively. The struct v4l2_audio returned by the VIDIOC_ENUMAUDIO ioctl also contains signal status information applicable when the current audio input is queried.

The VIDIOC_G_AUDIO and VIDIOC_G_AUDOUT ioctl report the current audio input and output, respectively. Note that, unlike VIDIOC_G_INPUT and VIDIOC_G_OUTPUT, these ioctls return a structure as VIDIOC_ENUMAUDIO and VIDIOC_ENUMAUDOUT do, not just an index.

To select an audio input and change its properties applications call the VIDIOC_S_AUDIO ioctl. To select an audio output (which presently has no changeable properties) applications call the VIDIOC_S_AUDOUT ioctl.

Drivers must implement all input ioctls when the device has one or more inputs, all output ioctls when the device has one or more outputs. When the device has any audio inputs or outputs the driver must set the V4L2_CAP_AUDIO flag in the struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl.

Example 1-3. Information about the current audio input

struct v4l2_audio audio;

memset (&audio, 0, sizeof (audio));

if (-1 == ioctl (fd, VIDIOC_G_AUDIO, &audio)) {
        perror ("VIDIOC_G_AUDIO");
        exit (EXIT_FAILURE);
}

printf ("Current input: %s\n", audio.name);
      

Example 1-4. Switching to the first audio input

struct v4l2_audio audio;

memset (&audio, 0, sizeof (audio)); /* clear audio.mode, audio.reserved */

audio.index = 0;

if (-1 == ioctl (fd, VIDIOC_S_AUDIO, &audio)) {
        perror ("VIDIOC_S_AUDIO");
        exit (EXIT_FAILURE);
}
      

1.6. Tuners and Modulators

1.6.1. Tuners

Video input devices can have one or more tuners demodulating an RF signal. Each tuner is associated with one or more video inputs, depending on the number of RF connectors on the tuner. The type field of the respective struct v4l2_input returned by the VIDIOC_ENUMINPUT ioctl is set to V4L2_INPUT_TYPE_TUNER and its tuner field contains the index number of the tuner.

Radio devices have exactly one tuner with index zero, and no video inputs.

To query and change tuner properties applications use the VIDIOC_G_TUNER and VIDIOC_S_TUNER ioctl, respectively. The struct v4l2_tuner returned by VIDIOC_G_TUNER also contains signal status information applicable when the tuner of the current video input, or a radio tuner, is queried. Note that VIDIOC_S_TUNER does not switch the current tuner, when there is more than one at all. The tuner is solely determined by the current video input. Drivers must support both ioctls and set the V4L2_CAP_TUNER flag in the struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl when the device has one or more tuners.
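
For illustration, a minimal sketch of querying the tuner associated with the current video input, assuming fd is an open V4L2 device:

struct v4l2_input input;
struct v4l2_tuner tuner;
int index;

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &index)) {
        perror ("VIDIOC_G_INPUT");
        exit (EXIT_FAILURE);
}

memset (&input, 0, sizeof (input));
input.index = index;

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
        perror ("VIDIOC_ENUMINPUT");
        exit (EXIT_FAILURE);
}

if (input.type != V4L2_INPUT_TYPE_TUNER) {
        fprintf (stderr, "Current input has no tuner\n");
        exit (EXIT_FAILURE);
}

memset (&tuner, 0, sizeof (tuner));
tuner.index = input.tuner;

if (-1 == ioctl (fd, VIDIOC_G_TUNER, &tuner)) {
        perror ("VIDIOC_G_TUNER");
        exit (EXIT_FAILURE);
}

printf ("Tuner %s, signal strength %d\n", tuner.name, tuner.signal);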


1.6.2. Modulators

Video output devices can have one or more modulators which modulate a video signal for radiation or connection to the antenna input of a TV set or video recorder. Each modulator is associated with one or more video outputs, depending on the number of RF connectors on the modulator. The type field of the respective struct v4l2_output returned by the VIDIOC_ENUMOUTPUT ioctl is set to V4L2_OUTPUT_TYPE_MODULATOR and its modulator field contains the index number of the modulator. This specification does not define radio output devices.

To query and change modulator properties applications use the VIDIOC_G_MODULATOR and VIDIOC_S_MODULATOR ioctl. Note that VIDIOC_S_MODULATOR does not switch the current modulator, when there is more than one at all. The modulator is solely determined by the current video output. Drivers must support both ioctls and set the V4L2_CAP_TUNER (sic) flag in the struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl when the device has one or more modulators.


1.6.3. Radio Frequency

To get and set the tuner or modulator radio frequency applications use the VIDIOC_G_FREQUENCY and VIDIOC_S_FREQUENCY ioctl which both take a pointer to a struct v4l2_frequency. These ioctls are used for TV and radio devices alike. Drivers must support both ioctls when the tuner or modulator ioctls are supported, or when the device is a radio device.
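
For illustration, a minimal sketch of tuning a radio device to a hypothetical frequency of 91.2 MHz, assuming the tuner reports the V4L2_TUNER_CAP_LOW capability, i. e. frequency units of 62.5 Hz (91200000 / 62.5 = 1459200):

struct v4l2_frequency freq;

memset (&freq, 0, sizeof (freq));

freq.tuner = 0;                  /* radio devices have exactly one tuner */
freq.type = V4L2_TUNER_RADIO;
freq.frequency = 1459200;        /* 91.2 MHz in 62.5 Hz units */

if (-1 == ioctl (fd, VIDIOC_S_FREQUENCY, &freq)) {
        perror ("VIDIOC_S_FREQUENCY");
        exit (EXIT_FAILURE);
}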


1.6.4. Satellite Receivers

To be discussed. See also proposals by Peter Schlaf, [email protected] on 23 Oct 2002, subject: "Re: [V4L] Re: v4l2 api".


1.7. Video Standards

Video devices typically support one or more different video standards or variations of standards. Each video input and output may support another set of standards. This set is reported by the std field of struct v4l2_input and struct v4l2_output returned by the VIDIOC_ENUMINPUT and VIDIOC_ENUMOUTPUT ioctl, respectively.

V4L2 defines one bit for each analog video standard currently in use worldwide, and sets aside bits for driver defined standards, e. g. hybrid standards to watch NTSC video tapes on PAL TVs and vice versa. Applications can use the predefined bits to select a particular standard, although presenting the user a menu of supported standards is preferred. To enumerate and query the attributes of the supported standards applications use the VIDIOC_ENUMSTD ioctl.

Many of the defined standards are actually just variations of a few major standards. The hardware may in fact not distinguish between them, or do so internally and switch automatically. Therefore enumerated standards also contain sets of one or more standard bits.

Assume a hypothetical tuner capable of demodulating B/PAL, G/PAL and I/PAL signals. The first enumerated standard is a set of B and G/PAL, switched automatically depending on the selected radio frequency in the UHF or VHF band. Enumeration gives a "PAL-B/G" or "PAL-I" choice. Similarly a Composite input may collapse standards, enumerating "PAL-B/G/H/I", "NTSC-M" and "SECAM-D/K".[6]

To query and select the standard used by the current video input or output applications call the VIDIOC_G_STD and VIDIOC_S_STD ioctl, respectively. The received standard can be sensed with the VIDIOC_QUERYSTD ioctl. Note that the parameter of all these ioctls is a pointer to a v4l2_std_id type (a standard set), not an index into the standard enumeration.[7] Drivers must implement all video standard ioctls when the device has one or more video inputs or outputs.

Special rules apply to USB cameras where the notion of video standards makes little sense. More generally any capture device, output devices accordingly, which is

  • incapable of capturing fields or frames at the nominal rate of the video standard, or

  • where timestamps refer to the instant the field or frame was received by the driver, not the capture time, or

  • where sequence numbers refer to the frames received by the driver, not the captured frames.

Here the driver shall set the std field of struct v4l2_input and struct v4l2_output to zero, and the VIDIOC_G_STD, VIDIOC_S_STD, VIDIOC_QUERYSTD and VIDIOC_ENUMSTD ioctls shall return the EINVAL error code.[8]

Example 1-5. Information about the current video standard

v4l2_std_id std_id;
struct v4l2_standard standard;

if (-1 == ioctl (fd, VIDIOC_G_STD, &std_id)) {
        /* Note when VIDIOC_ENUMSTD always returns EINVAL this
           is not a video device or it falls under the USB exception,
           and VIDIOC_G_STD returning EINVAL is no error. */

        perror ("VIDIOC_G_STD");
        exit (EXIT_FAILURE);
}

memset (&standard, 0, sizeof (standard));
standard.index = 0;

while (0 == ioctl (fd, VIDIOC_ENUMSTD, &standard)) {
        if (standard.id & std_id) {
               printf ("Current video standard: %s\n", standard.name);
               exit (EXIT_SUCCESS);
        }

        standard.index++;
}

/* EINVAL indicates the end of the enumeration, which cannot be
   empty unless this device falls under the USB exception. */

if (errno == EINVAL || standard.index == 0) {
        perror ("VIDIOC_ENUMSTD");
        exit (EXIT_FAILURE);
}
      

Example 1-6. Listing the video standards supported by the current input

struct v4l2_input input;
struct v4l2_standard standard;

memset (&input, 0, sizeof (input));

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &input.index)) {
        perror ("VIDIOC_G_INPUT");
        exit (EXIT_FAILURE);
}

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
        perror ("VIDIOC_ENUM_INPUT");
        exit (EXIT_FAILURE);
}

printf ("Current input %s supports:\n", input.name);

memset (&standard, 0, sizeof (standard));
standard.index = 0;

while (0 == ioctl (fd, VIDIOC_ENUMSTD, &standard)) {
        if (standard.id & input.std)
                printf ("%s\n", standard.name);

        standard.index++;
}

/* EINVAL indicates the end of the enumeration, which cannot be
   empty unless this device falls under the USB exception. */

if (errno != EINVAL || standard.index == 0) {
        perror ("VIDIOC_ENUMSTD");
        exit (EXIT_FAILURE);
}
      

Example 1-7. Selecting a new video standard

struct v4l2_input input;
v4l2_std_id std_id;

memset (&input, 0, sizeof (input));

if (-1 == ioctl (fd, VIDIOC_G_INPUT, &input.index)) {
        perror ("VIDIOC_G_INPUT");
        exit (EXIT_FAILURE);
}

if (-1 == ioctl (fd, VIDIOC_ENUMINPUT, &input)) {
        perror ("VIDIOC_ENUM_INPUT");
        exit (EXIT_FAILURE);
}

if (0 == (input.std & V4L2_STD_PAL_BG)) {
        fprintf (stderr, "Oops. B/G PAL is not supported.\n");
        exit (EXIT_FAILURE);
}

/* Note this is also supposed to work when only B
   or G/PAL is supported. */

std_id = V4L2_STD_PAL_BG;

if (-1 == ioctl (fd, VIDIOC_S_STD, &std_id)) {
        perror ("VIDIOC_S_STD");
        exit (EXIT_FAILURE);
}
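
For illustration, a minimal sketch of sensing the received standard with the VIDIOC_QUERYSTD ioctl, assuming fd refers to a device whose driver supports standard detection:

v4l2_std_id std_id;

if (-1 == ioctl (fd, VIDIOC_QUERYSTD, &std_id)) {
        /* EINVAL when the driver cannot detect the standard or the
           device falls under the USB exception. */
        perror ("VIDIOC_QUERYSTD");
        exit (EXIT_FAILURE);
}

if (std_id & V4L2_STD_PAL)
        printf ("Received a PAL signal\n");
else if (std_id & V4L2_STD_NTSC)
        printf ("Received an NTSC signal\n");
else
        printf ("Received standard set: 0x%llx\n", (unsigned long long) std_id);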
      

1.8. User Controls

Devices typically have a number of user-settable controls such as brightness, saturation and so on, which would be presented to the user on a graphical user interface. But different devices will have different controls available, and furthermore, the range of possible values and the default value will vary from device to device. The control ioctls provide the information and a mechanism to create a nice user interface for these controls that will work correctly with any device.

All controls are accessed using an ID value. V4L2 defines several IDs for specific purposes. Drivers can also implement their own custom controls using V4L2_CID_PRIVATE_BASE and higher values. The pre-defined control IDs have the prefix V4L2_CID_, and are listed in Table 1-1. The ID is used when querying the attributes of a control, and when getting or setting the current value.

Generally applications should present controls to the user without assumptions about their purpose. Each control comes with a name string the user is supposed to understand. When the purpose is non-intuitive the driver writer should provide a user manual, a user interface plug-in or a driver specific panel application. Predefined IDs were introduced to change a few controls programmatically, for example to mute a device during a channel switch.

Drivers may enumerate different controls after switching the current video input or output, tuner or modulator, or audio input or output. Different in the sense of other bounds, another default and current value, step size or other menu items. A control with a certain custom ID can also change its name and type.[9] Control values are stored globally; they do not change when switching except to stay within the reported bounds. They also do not change, e. g., when the device is opened or closed or when the tuner radio frequency is changed; generally they never change without an application request. Since V4L2 specifies no event mechanism, panel applications intended to cooperate with other panel applications (be they built into a larger application, such as a TV viewer) may need to regularly poll control values to update their user interface.[10]

Table 1-1. Control IDs

ID Type Description
V4L2_CID_BASE   First predefined ID, equal to V4L2_CID_BRIGHTNESS.
V4L2_CID_USER_BASE   Synonym of V4L2_CID_BASE.
V4L2_CID_BRIGHTNESS integer Picture brightness, or more precisely, the black level.
V4L2_CID_CONTRAST integer Picture contrast or luma gain.
V4L2_CID_SATURATION integer Picture color saturation or chroma gain.
V4L2_CID_HUE integer Hue or color balance.
V4L2_CID_AUDIO_VOLUME integer Overall audio volume. Note some drivers also provide an OSS or ALSA mixer interface.
V4L2_CID_AUDIO_BALANCE integer Audio stereo balance. Minimum corresponds to all the way left, maximum to right.
V4L2_CID_AUDIO_BASS integer Audio bass adjustment.
V4L2_CID_AUDIO_TREBLE integer Audio treble adjustment.
V4L2_CID_AUDIO_MUTE boolean Mute audio, i. e. set the volume to zero, however without affecting V4L2_CID_AUDIO_VOLUME. Like ALSA drivers, V4L2 drivers must mute at load time to avoid excessive noise. Actually the entire device should be reset to a low power consumption state.
V4L2_CID_AUDIO_LOUDNESS boolean Loudness mode (bass boost).
V4L2_CID_BLACK_LEVEL integer Another name for brightness (not a synonym of V4L2_CID_BRIGHTNESS). This control is deprecated and should not be used in new drivers and applications.
V4L2_CID_AUTO_WHITE_BALANCE boolean Automatic white balance (cameras).
V4L2_CID_DO_WHITE_BALANCE button This is an action control. When set (the value is ignored), the device will do a white balance and then hold the current setting. Contrast this with the boolean V4L2_CID_AUTO_WHITE_BALANCE, which, when activated, keeps adjusting the white balance.
V4L2_CID_RED_BALANCE integer Red chroma balance.
V4L2_CID_BLUE_BALANCE integer Blue chroma balance.
V4L2_CID_GAMMA integer Gamma adjust.
V4L2_CID_WHITENESS integer Whiteness for grey-scale devices. This is a synonym for V4L2_CID_GAMMA. This control is deprecated and should not be used in new drivers and applications.
V4L2_CID_EXPOSURE integer Exposure (cameras). [Unit?]
V4L2_CID_AUTOGAIN boolean Automatic gain/exposure control.
V4L2_CID_GAIN integer Gain control.
V4L2_CID_HFLIP boolean Mirror the picture horizontally.
V4L2_CID_VFLIP boolean Mirror the picture vertically.
V4L2_CID_HCENTER_DEPRECATED (formerly V4L2_CID_HCENTER) integer Horizontal image centering. This control is deprecated. New drivers and applications should use the Camera class controls V4L2_CID_PAN_ABSOLUTE, V4L2_CID_PAN_RELATIVE and V4L2_CID_PAN_RESET instead.
V4L2_CID_VCENTER_DEPRECATED (formerly V4L2_CID_VCENTER) integer Vertical image centering. Centering is intended to physically adjust cameras. For image cropping see Section 1.11, for clipping Section 4.2. This control is deprecated. New drivers and applications should use the Camera class controls V4L2_CID_TILT_ABSOLUTE, V4L2_CID_TILT_RELATIVE and V4L2_CID_TILT_RESET instead.
V4L2_CID_POWER_LINE_FREQUENCY integer Enables a power line frequency filter to avoid flicker. Possible values are: V4L2_CID_POWER_LINE_FREQUENCY_DISABLED (0), V4L2_CID_POWER_LINE_FREQUENCY_50HZ (1) and V4L2_CID_POWER_LINE_FREQUENCY_60HZ (2).
V4L2_CID_HUE_AUTO boolean Enables automatic hue control by the device. The effect of setting V4L2_CID_HUE while automatic hue control is enabled is undefined, drivers should ignore such requests.
V4L2_CID_WHITE_BALANCE_TEMPERATURE integer This control specifies the white balance settings as a color temperature in Kelvin. A driver should have a minimum of 2800 (incandescent) to 6500 (daylight). For more information about color temperature see Wikipedia.
V4L2_CID_SHARPNESS integer Adjusts the sharpness filters in a camera. The minimum value disables the filters, higher values give a sharper picture.
V4L2_CID_BACKLIGHT_COMPENSATION integer Adjusts the backlight compensation in a camera. The minimum value disables backlight compensation.
V4L2_CID_LASTP1   End of the predefined control IDs (currently V4L2_CID_BACKLIGHT_COMPENSATION + 1).
V4L2_CID_PRIVATE_BASE   ID of the first custom (driver specific) control. Applications depending on particular custom controls should check the driver name and version, see Section 1.2.

Applications can enumerate the available controls with the VIDIOC_QUERYCTRL and VIDIOC_QUERYMENU ioctls, and get and set a control value with the VIDIOC_G_CTRL and VIDIOC_S_CTRL ioctls. Drivers must implement VIDIOC_QUERYCTRL, VIDIOC_G_CTRL and VIDIOC_S_CTRL when the device has one or more controls, and VIDIOC_QUERYMENU when it has one or more menu type controls.

Example 1-8. Enumerating all controls

struct v4l2_queryctrl queryctrl;
struct v4l2_querymenu querymenu;

static void
enumerate_menu (void)
{
        printf ("  Menu items:\n");

        memset (&querymenu, 0, sizeof (querymenu));
        querymenu.id = queryctrl.id;

        for (querymenu.index = queryctrl.minimum;
             querymenu.index <= queryctrl.maximum;
              querymenu.index++) {
                if (0 == ioctl (fd, VIDIOC_QUERYMENU, &querymenu)) {
                        printf ("  %s\n", querymenu.name);
                } else {
                        perror ("VIDIOC_QUERYMENU");
                        exit (EXIT_FAILURE);
                }
        }
}

memset (&queryctrl, 0, sizeof (queryctrl));

for (queryctrl.id = V4L2_CID_BASE;
     queryctrl.id < V4L2_CID_LASTP1;
     queryctrl.id++) {
        if (0 == ioctl (fd, VIDIOC_QUERYCTRL, &queryctrl)) {
                if (queryctrl.flags & V4L2_CTRL_FLAG_DISABLED)
                        continue;

                printf ("Control %s\n", queryctrl.name);

                if (queryctrl.type == V4L2_CTRL_TYPE_MENU)
                        enumerate_menu ();
        } else {
                if (errno == EINVAL)
                        continue;

                perror ("VIDIOC_QUERYCTRL");
                exit (EXIT_FAILURE);
        }
}

for (queryctrl.id = V4L2_CID_PRIVATE_BASE;;
     queryctrl.id++) {
        if (0 == ioctl (fd, VIDIOC_QUERYCTRL, &queryctrl)) {
                if (queryctrl.flags & V4L2_CTRL_FLAG_DISABLED)
                        continue;

                printf ("Control %s\n", queryctrl.name);

                if (queryctrl.type == V4L2_CTRL_TYPE_MENU)
                        enumerate_menu ();
        } else {
                if (errno == EINVAL)
                        break;

                perror ("VIDIOC_QUERYCTRL");
                exit (EXIT_FAILURE);
        }
}

Example 1-9. Changing controls

struct v4l2_queryctrl queryctrl;
struct v4l2_control control;

memset (&queryctrl, 0, sizeof (queryctrl));
queryctrl.id = V4L2_CID_BRIGHTNESS;

if (-1 == ioctl (fd, VIDIOC_QUERYCTRL, &queryctrl)) {
        if (errno != EINVAL) {
                perror ("VIDIOC_QUERYCTRL");
                exit (EXIT_FAILURE);
        } else {
                printf ("V4L2_CID_BRIGHTNESS is not supported\n");
        }
} else if (queryctrl.flags & V4L2_CTRL_FLAG_DISABLED) {
        printf ("V4L2_CID_BRIGHTNESS is not supported\n");
} else {
        memset (&control, 0, sizeof (control));
        control.id = V4L2_CID_BRIGHTNESS;
        control.value = queryctrl.default_value;

        if (-1 == ioctl (fd, VIDIOC_S_CTRL, &control)) {
                perror ("VIDIOC_S_CTRL");
                exit (EXIT_FAILURE);
        }
}

memset (&control, 0, sizeof (control));
control.id = V4L2_CID_CONTRAST;

if (0 == ioctl (fd, VIDIOC_G_CTRL, &control)) {
        control.value += 1;

        /* The driver may clamp the value or return ERANGE, ignored here */

        if (-1 == ioctl (fd, VIDIOC_S_CTRL, &control)
            && errno != ERANGE) {
                perror ("VIDIOC_S_CTRL");
                exit (EXIT_FAILURE);
        }
/* Ignore if V4L2_CID_CONTRAST is unsupported */
} else if (errno != EINVAL) {
        perror ("VIDIOC_G_CTRL");
        exit (EXIT_FAILURE);
}

control.id = V4L2_CID_AUDIO_MUTE;
control.value = TRUE; /* silence */

/* Errors ignored */
ioctl (fd, VIDIOC_S_CTRL, &control);

1.9. Extended Controls

1.9.1. Introduction

The control mechanism as originally designed was meant to be used for user settings (brightness, saturation, etc). However, it turned out to be a very useful model for implementing more complicated driver APIs where each driver implements only a subset of a larger API.

The MPEG encoding API was the driving force behind designing and implementing this extended control mechanism: the MPEG standard is quite large and the currently supported hardware MPEG encoders each only implement a subset of this standard. Furthermore, many parameters relating to how the video is encoded into an MPEG stream are specific to the MPEG encoding chip, since the MPEG standard only defines the format of the resulting MPEG stream, not how the video is actually encoded into that format.

Unfortunately, the original control API lacked some features needed for these new uses, and so it was extended into the (not terribly originally named) extended control API.


1.9.2. The Extended Control API

Three new ioctls are available: VIDIOC_G_EXT_CTRLS, VIDIOC_S_EXT_CTRLS and VIDIOC_TRY_EXT_CTRLS. These ioctls act on arrays of controls (as opposed to the VIDIOC_G_CTRL and VIDIOC_S_CTRL ioctls that act on a single control). This is needed since it is often required to atomically change several controls at once.

Each of the new ioctls expects a pointer to a struct v4l2_ext_controls. This structure contains a pointer to the control array, a count of the number of controls in that array and a control class. Control classes are used to group similar controls into a single class. For example, control class V4L2_CTRL_CLASS_USER contains all user controls (i.e. all controls that can also be set using the old VIDIOC_S_CTRL ioctl). Control class V4L2_CTRL_CLASS_MPEG contains all controls relating to MPEG encoding, etc.

All controls in the control array must belong to the specified control class. An error is returned if this is not the case.

It is also possible to use an empty control array (count == 0) to check whether the specified control class is supported.

The control array is a struct v4l2_ext_control array. The v4l2_ext_control structure is very similar to struct v4l2_control, except that it also allows 64-bit values and pointers to be passed (although the latter is not yet used anywhere).

It is important to realize that due to the flexibility of controls it is necessary to check whether the control you want to set is actually supported by the driver and what the valid range of values is. So use the VIDIOC_QUERYCTRL and VIDIOC_QUERYMENU ioctls to check this. Also note that it is possible that some of the menu indices in a control of type V4L2_CTRL_TYPE_MENU may not be supported (VIDIOC_QUERYMENU will return an error). A good example is the list of supported MPEG audio bitrates. Some drivers only support one or two bitrates, others support a wider range.
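For illustration, below is a minimal sketch of setting two MPEG class controls atomically with VIDIOC_S_EXT_CTRLS. The file descriptor fd and the chosen bitrate are assumptions, and a real application should first verify the controls and their valid ranges with VIDIOC_QUERYCTRL as described above.

struct v4l2_ext_control ctrl[2];
struct v4l2_ext_controls ctrls;

memset (ctrl, 0, sizeof (ctrl));
memset (&ctrls, 0, sizeof (ctrls));

ctrl[0].id    = V4L2_CID_MPEG_VIDEO_BITRATE_MODE;
ctrl[0].value = V4L2_MPEG_VIDEO_BITRATE_MODE_CBR;
ctrl[1].id    = V4L2_CID_MPEG_VIDEO_BITRATE;
ctrl[1].value = 6000000; /* 6 Mbit/s, hypothetical value */

ctrls.ctrl_class = V4L2_CTRL_CLASS_MPEG;
ctrls.count      = 2;
ctrls.controls   = ctrl;

/* Both controls are changed atomically, or neither is changed. */
if (-1 == ioctl (fd, VIDIOC_S_EXT_CTRLS, &ctrls)) {
        perror ("VIDIOC_S_EXT_CTRLS");
}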


1.9.3. Enumerating Extended Controls

The recommended way to enumerate over the extendedcontrols is by using VIDIOC_QUERYCTRL in combination with theV4L2_CTRL_FLAG_NEXT_CTRL flag:

struct v4l2_queryctrl qctrl;

qctrl.id = V4L2_CTRL_FLAG_NEXT_CTRL;
while (0 == ioctl (fd, VIDIOC_QUERYCTRL, &qctrl)) {
        /* ... */
        qctrl.id |= V4L2_CTRL_FLAG_NEXT_CTRL;
}

The initial control ID is set to 0 ORed with the V4L2_CTRL_FLAG_NEXT_CTRL flag. The VIDIOC_QUERYCTRL ioctl will return the first control with a higher ID than the specified one. When no such controls are found an error is returned.

If you want to get all controls within a specific control class, then you can set the initial qctrl.id value to the control class and add an extra check to break out of the loop when a control of another control class is found:

qctrl.id = V4L2_CTRL_CLASS_MPEG | V4L2_CTRL_FLAG_NEXT_CTRL;
while (0 == ioctl (fd, VIDIOC_QUERYCTRL, &qctrl)) {
        if (V4L2_CTRL_ID2CLASS (qctrl.id) != V4L2_CTRL_CLASS_MPEG)
                break;
        /* ... */
        qctrl.id |= V4L2_CTRL_FLAG_NEXT_CTRL;
}

The 32-bit qctrl.id value is subdivided into three bit ranges: the top 4 bits are reserved for flags (e.g. V4L2_CTRL_FLAG_NEXT_CTRL) and are not actually part of the ID. The remaining 28 bits form the control ID, of which the most significant 12 bits define the control class and the least significant 16 bits identify the control within the control class. It is guaranteed that these last 16 bits are always non-zero for controls. The range of 0x1000 and up is reserved for driver-specific controls. The macro V4L2_CTRL_ID2CLASS(id) returns the control class ID based on a control ID.

If the driver does not support extended controls, then VIDIOC_QUERYCTRL will fail when used in combination with V4L2_CTRL_FLAG_NEXT_CTRL. In that case the old method of enumerating controls should be used (see 1.8). But if it is supported, then it is guaranteed to enumerate over all controls, including driver-private controls.


1.9.4. Creating Control Panels

It is possible to create control panels for a graphical user interface where the user can select the various controls. Basically you will have to iterate over all controls using the method described above. Each control class starts with a control of type V4L2_CTRL_TYPE_CTRL_CLASS. VIDIOC_QUERYCTRL will return the name of this control class which can be used as the title of a tab page within a control panel.

The flags field of struct v4l2_queryctrl also contains hints on the behavior of the control. See the VIDIOC_QUERYCTRL documentation for more details.
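A minimal sketch of such an iteration follows, assuming an open file descriptor fd and a driver that supports extended controls; the printf calls merely stand in for the actual GUI code.

struct v4l2_queryctrl qctrl;

memset (&qctrl, 0, sizeof (qctrl));
qctrl.id = V4L2_CTRL_FLAG_NEXT_CTRL;

while (0 == ioctl (fd, VIDIOC_QUERYCTRL, &qctrl)) {
        if (qctrl.type == V4L2_CTRL_TYPE_CTRL_CLASS) {
                printf ("Tab: %s\n", qctrl.name);        /* new tab page */
        } else if (!(qctrl.flags & V4L2_CTRL_FLAG_DISABLED)) {
                printf ("  Control: %s\n", qctrl.name);  /* widget on the tab */
        }

        qctrl.id |= V4L2_CTRL_FLAG_NEXT_CTRL;
}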


1.9.5. MPEG Control Reference

Below all controls within the MPEG control class are described. First the generic controls, then controls specific to certain hardware.


1.9.5.1. Generic MPEG Controls

Table 1-2. MPEG Control IDs

ID Type  
  Description
       
V4L2_CID_MPEG_CLASS  class  
  The MPEG class descriptor. Calling VIDIOC_QUERYCTRL for this control will return a description of this control class. This description can be used as the caption of a Tab page in a GUI, for example.
       
V4L2_CID_MPEG_STREAM_TYPE  enum  
  The MPEG-1, -2 or -4 output stream type. One cannot assume anything here. Each hardware MPEG encoder tends to support different subsets of the available MPEG stream types. The currently defined stream types are:
 
V4L2_MPEG_STREAM_TYPE_MPEG2_PS  MPEG-2 program stream
V4L2_MPEG_STREAM_TYPE_MPEG2_TS  MPEG-2 transport stream
V4L2_MPEG_STREAM_TYPE_MPEG1_SS  MPEG-1 system stream
V4L2_MPEG_STREAM_TYPE_MPEG2_DVD  MPEG-2 DVD-compatible stream
V4L2_MPEG_STREAM_TYPE_MPEG1_VCD  MPEG-1 VCD-compatible stream
V4L2_MPEG_STREAM_TYPE_MPEG2_SVCD  MPEG-2 SVCD-compatible stream
       
V4L2_CID_MPEG_STREAM_PID_PMT  integer  
  Program Map Table Packet ID for the MPEG transport stream (default 16)
       
V4L2_CID_MPEG_STREAM_PID_AUDIO  integer  
  Audio Packet ID for the MPEG transport stream (default 256)
       
V4L2_CID_MPEG_STREAM_PID_VIDEO  integer  
  Video Packet ID for the MPEG transport stream (default 260)
       
V4L2_CID_MPEG_STREAM_PID_PCR  integer  
  Packet ID for the MPEG transport stream carrying PCR fields (default 259)
       
V4L2_CID_MPEG_STREAM_PES_ID_AUDIO  integer  
  Audio ID for MPEG PES
       
V4L2_CID_MPEG_STREAM_PES_ID_VIDEO  integer  
  Video ID for MPEG PES
       
V4L2_CID_MPEG_STREAM_VBI_FMT  enum  
  Some cards can embed VBI data (e.g. Closed Caption, Teletext) into the MPEG stream. This control selects whether VBI data should be embedded, and if so, what embedding method should be used. The list of possible VBI formats depends on the driver. The currently defined VBI format types are:
 
V4L2_MPEG_STREAM_VBI_FMT_NONE  No VBI in the MPEG stream
V4L2_MPEG_STREAM_VBI_FMT_IVTV  VBI in private packets, IVTV format (documented in the kernel sources in the file Documentation/video4linux/cx2341x/README.vbi)
       
V4L2_CID_MPEG_AUDIO_SAMPLING_FREQ  enum  
  MPEG Audio sampling frequency. Possible values are:
 
V4L2_MPEG_AUDIO_SAMPLING_FREQ_44100  44.1 kHz
V4L2_MPEG_AUDIO_SAMPLING_FREQ_48000  48 kHz
V4L2_MPEG_AUDIO_SAMPLING_FREQ_32000  32 kHz
       
V4L2_CID_MPEG_AUDIO_ENCODING  enum  
  MPEG Audio encoding. Possible values are:
 
V4L2_MPEG_AUDIO_ENCODING_LAYER_1  MPEG Layer I encoding
V4L2_MPEG_AUDIO_ENCODING_LAYER_2  MPEG Layer II encoding
V4L2_MPEG_AUDIO_ENCODING_LAYER_3  MPEG Layer III encoding
       
V4L2_CID_MPEG_AUDIO_L1_BITRATE  enum  
  Layer I bitrate. Possible values are:
 
V4L2_MPEG_AUDIO_L1_BITRATE_32K  32 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_64K  64 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_96K  96 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_128K  128 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_160K  160 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_192K  192 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_224K  224 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_256K  256 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_288K  288 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_320K  320 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_352K  352 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_384K  384 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_416K  416 kbit/s
V4L2_MPEG_AUDIO_L1_BITRATE_448K  448 kbit/s
       
V4L2_CID_MPEG_AUDIO_L2_BITRATE  enum  
  Layer II bitrate. Possible values are:
 
V4L2_MPEG_AUDIO_L2_BITRATE_32K  32 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_48K  48 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_56K  56 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_64K  64 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_80K  80 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_96K  96 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_112K  112 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_128K  128 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_160K  160 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_192K  192 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_224K  224 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_256K  256 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_320K  320 kbit/s
V4L2_MPEG_AUDIO_L2_BITRATE_384K  384 kbit/s
       
V4L2_CID_MPEG_AUDIO_L3_BITRATE  enum  
  Layer III bitrate. Possible values are:
 
V4L2_MPEG_AUDIO_L3_BITRATE_32K  32 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_40K  40 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_48K  48 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_56K  56 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_64K  64 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_80K  80 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_96K  96 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_112K  112 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_128K  128 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_160K  160 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_192K  192 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_224K  224 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_256K  256 kbit/s
V4L2_MPEG_AUDIO_L3_BITRATE_320K  320 kbit/s
       
V4L2_CID_MPEG_AUDIO_MODE  enum  
  MPEG Audio mode. Possible values are:
 
V4L2_MPEG_AUDIO_MODE_STEREO  Stereo
V4L2_MPEG_AUDIO_MODE_JOINT_STEREO  Joint Stereo
V4L2_MPEG_AUDIO_MODE_DUAL  Bilingual
V4L2_MPEG_AUDIO_MODE_MONO  Mono
       
V4L2_CID_MPEG_AUDIO_MODE_EXTENSION  enum  
  Joint Stereo audio mode extension. In Layer I and II they indicate which subbands are in intensity stereo. All other subbands are coded in stereo. Layer III is not (yet) supported. Possible values are:
 
V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_4  Subbands 4-31 in intensity stereo
V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_8  Subbands 8-31 in intensity stereo
V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_12  Subbands 12-31 in intensity stereo
V4L2_MPEG_AUDIO_MODE_EXTENSION_BOUND_16  Subbands 16-31 in intensity stereo
       
V4L2_CID_MPEG_AUDIO_EMPHASIS  enum  
  Audio Emphasis. Possible values are:
 
V4L2_MPEG_AUDIO_EMPHASIS_NONE  None
V4L2_MPEG_AUDIO_EMPHASIS_50_DIV_15_uS  50/15 microsecond emphasis
V4L2_MPEG_AUDIO_EMPHASIS_CCITT_J17  CCITT J.17
       
V4L2_CID_MPEG_AUDIO_CRC  enum  
  CRC method. Possible values are:
 
V4L2_MPEG_AUDIO_CRC_NONE  None
V4L2_MPEG_AUDIO_CRC_CRC16  16 bit parity check
       
V4L2_CID_MPEG_AUDIO_MUTE  bool  
  Mutes the audio when capturing. This is not done by muting audio hardware, which can still produce a slight hiss, but in the encoder itself, guaranteeing a fixed and reproducible audio bitstream. 0 = unmuted, 1 = muted.
       
V4L2_CID_MPEG_VIDEO_ENCODING  enum  
  MPEG Video encoding method. Possible values are:
 
V4L2_MPEG_VIDEO_ENCODING_MPEG_1  MPEG-1 Video encoding
V4L2_MPEG_VIDEO_ENCODING_MPEG_2  MPEG-2 Video encoding
       
V4L2_CID_MPEG_VIDEO_ASPECT  enum  
  Video aspect. Possible values are:
 
V4L2_MPEG_VIDEO_ASPECT_1x1   
V4L2_MPEG_VIDEO_ASPECT_4x3   
V4L2_MPEG_VIDEO_ASPECT_16x9   
V4L2_MPEG_VIDEO_ASPECT_221x100   
       
V4L2_CID_MPEG_VIDEO_B_FRAMES  integer  
  Number of B-Frames (default 2)
       
V4L2_CID_MPEG_VIDEO_GOP_SIZE  integer  
  GOP size (default 12)
       
V4L2_CID_MPEG_VIDEO_GOP_CLOSURE  bool  
  GOP closure (default 1)
       
V4L2_CID_MPEG_VIDEO_PULLDOWN  bool  
  Enable 3:2 pulldown (default 0)
       
V4L2_CID_MPEG_VIDEO_BITRATE_MODE  enum  
  Video bitrate mode. Possible values are:
 
V4L2_MPEG_VIDEO_BITRATE_MODE_VBR  Variable bitrate
V4L2_MPEG_VIDEO_BITRATE_MODE_CBR  Constant bitrate
       
V4L2_CID_MPEG_VIDEO_BITRATE  integer  
  Video bitrate in bits per second.
       
V4L2_CID_MPEG_VIDEO_BITRATE_PEAK  integer  
  Peak video bitrate in bits per second. Must be larger or equal to the average video bitrate. It is ignored if the video bitrate mode is set to constant bitrate.
       
V4L2_CID_MPEG_VIDEO_TEMPORAL_DECIMATION  integer  
  For every captured frame, skip this many subsequent frames (default 0).
       
V4L2_CID_MPEG_VIDEO_MUTE  bool  
  "Mutes" the video to a fixed color when capturing. This is useful for testing, to produce a fixed video bitstream. 0 = unmuted, 1 = muted.
       
V4L2_CID_MPEG_VIDEO_MUTE_YUV  integer  
  Sets the "mute" color of the video. The supplied 32-bit integer is interpreted as follows (bit 0 = least significant bit):
 
Bit 0:7 V chrominance information
Bit 8:15 U chrominance information
Bit 16:23 Y luminance information
Bit 24:31 Must be zero.
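For illustration, a minimal sketch of composing this control value from 8-bit Y', Cb (U) and Cr (V) components follows; fd and the chosen color are assumptions.

struct v4l2_control control;
__u8 y = 0x10, cb = 0x80, cr = 0x80; /* black in Y'CbCr, hypothetical */

memset (&control, 0, sizeof (control));
control.id    = V4L2_CID_MPEG_VIDEO_MUTE_YUV;
control.value = ((__s32) y << 16) | ((__s32) cb << 8) | (__s32) cr;
/* Bits 24:31 remain zero as required. */

/* Errors ignored in this sketch. */
ioctl (fd, VIDIOC_S_CTRL, &control);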

1.9.5.2. CX2341x MPEG Controls

The following MPEG class controls deal with MPEG encoding settings that are specific to the Conexant CX23415 and CX23416 MPEG encoding chips.

Table 1-3. CX2341x Control IDs

ID Type  
  Description
       
V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE  enum  
  Sets the Spatial Filter mode (default MANUAL). Possible values are:
 
V4L2_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE_MANUAL  Choose the filter manually
V4L2_MPEG_CX2341X_VIDEO_SPATIAL_FILTER_MODE_AUTO  Choose the filter automatically
       
V4L2_CID_MPEG_CX2341X_VIDEO_SPATIAL_FILTER  integer (0-15)  
  The setting for the Spatial Filter. 0 = off, 15 = maximum. (Default is 0.)
       
V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE  enum  
  Select the algorithm to use for the Luma Spatial Filter (default 1D_HOR). Possible values:
 
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_OFF  No filter
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_1D_HOR  One-dimensional horizontal
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_1D_VERT  One-dimensional vertical
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_2D_HV_SEPARABLE  Two-dimensional separable
V4L2_MPEG_CX2341X_VIDEO_LUMA_SPATIAL_FILTER_TYPE_2D_SYM_NON_SEPARABLE  Two-dimensional symmetrical non-separable
       
V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE  enum  
  Select the algorithm for the Chroma Spatial Filter (default 1D_HOR). Possible values are:
 
V4L2_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE_OFF  No filter
V4L2_MPEG_CX2341X_VIDEO_CHROMA_SPATIAL_FILTER_TYPE_1D_HOR  One-dimensional horizontal
       
V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE  enum  
  Sets the Temporal Filter mode (default MANUAL). Possible values are:
 
V4L2_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE_MANUAL  Choose the filter manually
V4L2_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER_MODE_AUTO  Choose the filter automatically
       
V4L2_CID_MPEG_CX2341X_VIDEO_TEMPORAL_FILTER  integer (0-31)  
  The setting for the Temporal Filter. 0 = off, 31 = maximum. (Default is 8 for full-scale capturing and 0 for scaled capturing.)
       
V4L2_CID_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE  enum  
  Median Filter Type (default OFF). Possible values are:
 
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_OFF  No filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_HOR  Horizontal filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_VERT  Vertical filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_HOR_VERT  Horizontal and vertical filter
V4L2_MPEG_CX2341X_VIDEO_MEDIAN_FILTER_TYPE_DIAG  Diagonal filter
       
V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_BOTTOM  integer (0-255)  
  Threshold above which the luminance median filter is enabled (default 0)
       
V4L2_CID_MPEG_CX2341X_VIDEO_LUMA_MEDIAN_FILTER_TOP  integer (0-255)  
  Threshold below which the luminance median filter is enabled (default 255)
       
V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_BOTTOM  integer (0-255)  
  Threshold above which the chroma median filter is enabled (default 0)
       
V4L2_CID_MPEG_CX2341X_VIDEO_CHROMA_MEDIAN_FILTER_TOP  integer (0-255)  
  Threshold below which the chroma median filter is enabled (default 255)
       
V4L2_CID_MPEG_CX2341X_STREAM_INSERT_NAV_PACKETS  bool  
  The CX2341X MPEG encoder can insert one empty MPEG-2 PES packet into the stream between every four video frames. The packet size is 2048 bytes, including the packet_start_code_prefix and stream_id fields. The stream_id is 0xBF (private stream 2). The payload consists of 0x00 bytes, to be filled in by the application. 0 = do not insert, 1 = insert packets.

1.9.6. Camera Control Reference

The Camera class includes controls for mechanical (or equivalent digital) features of a device such as controllable lenses or sensors.

Table 1-4. Camera Control IDs

ID Type  
  Description
       
V4L2_CID_CAMERA_CLASS  class  
  The Camera class descriptor. Calling VIDIOC_QUERYCTRL for this control will return a description of this control class.
       
V4L2_CID_EXPOSURE_AUTO  integer  
  Enables automatic adjustments of the exposure time and/or iris aperture. The effect of manual changes of the exposure time or iris aperture while these features are enabled is undefined; drivers should ignore such requests. Possible values are:
 
V4L2_EXPOSURE_AUTO  Automatic exposure time, automatic iris aperture.
V4L2_EXPOSURE_MANUAL  Manual exposure time, manual iris.
V4L2_EXPOSURE_SHUTTER_PRIORITY  Manual exposure time, auto iris.
V4L2_EXPOSURE_APERTURE_PRIORITY  Auto exposure time, manual iris.
       
V4L2_CID_EXPOSURE_ABSOLUTE  integer  
  Determines the exposure time of the camera sensor. The exposure time is limited by the frame interval. Drivers should interpret the values as 100 µs units, where the value 1 stands for 1/10000th of a second, 10000 for 1 second and 100000 for 10 seconds.
       
V4L2_CID_EXPOSURE_AUTO_PRIORITY  boolean  
  When V4L2_CID_EXPOSURE_AUTO is set to AUTO or SHUTTER_PRIORITY, this control determines if the device may dynamically vary the frame rate. By default this feature is disabled (0) and the frame rate must remain constant.
       
V4L2_CID_PAN_RELATIVE  integer  
  This control turns the camera horizontally by the specified amount. The unit is undefined. A positive value moves the camera to the right (clockwise when viewed from above), a negative value to the left. A value of zero does not cause motion.
       
V4L2_CID_TILT_RELATIVE  integer  
  This control turns the camera vertically by the specified amount. The unit is undefined. A positive value moves the camera up, a negative value down. A value of zero does not cause motion.
       
V4L2_CID_PAN_RESET  boolean  
  When this control is set to TRUE (1), the camera moves horizontally to the default position.
       
V4L2_CID_TILT_RESET  boolean  
  When this control is set to TRUE (1), the camera moves vertically to the default position.
       
V4L2_CID_PAN_ABSOLUTE  integer  
  This control turns the camera horizontally to the specified position. Positive values move the camera to the right (clockwise when viewed from above), negative values to the left. Drivers should interpret the values as arc seconds, with valid values between -180 * 3600 and +180 * 3600 inclusive.
       
V4L2_CID_TILT_ABSOLUTE  integer  
  This control turns the camera vertically to the specified position. Positive values move the camera up, negative values down. Drivers should interpret the values as arc seconds, with valid values between -180 * 3600 and +180 * 3600 inclusive.
       
V4L2_CID_FOCUS_ABSOLUTE  integer  
  This control sets the focal point of the camera to the specified position. The unit is undefined. Positive values set the focus closer to the camera, negative values towards infinity.
       
V4L2_CID_FOCUS_RELATIVE  integer  
  This control moves the focal point of the camera by the specified amount. The unit is undefined. Positive values move the focus closer to the camera, negative values towards infinity.
       
V4L2_CID_FOCUS_AUTO  boolean  
  Enables automatic focus adjustments. The effect of manual focus adjustments while this feature is enabled is undefined; drivers should ignore such requests.
       

1.10. Data Formats

1.10.1. Data Format Negotiation

Different devices exchange different kinds of data with applications, for example video images, raw or sliced VBI data, or RDS datagrams. Even within one kind many different formats are possible, in particular an abundance of image formats. Although drivers must provide a default and the selection persists across closing and reopening a device, applications should always negotiate a data format before engaging in data exchange. Negotiation means the application asks for a particular format and the driver selects and reports the best the hardware can do to satisfy the request. Of course applications can also just query the current selection.

A single mechanism exists to negotiate all data formats using the aggregate struct v4l2_format and the VIDIOC_G_FMT and VIDIOC_S_FMT ioctls. Additionally the VIDIOC_TRY_FMT ioctl can be used to examine what the hardware could do, without actually selecting a new data format. The data formats supported by the V4L2 API are covered in the respective device sections in Chapter 4. For a closer look at image formats see Chapter 2.

The VIDIOC_S_FMT ioctl is a major turning point in the initialization sequence. Prior to this point multiple panel applications can access the same device concurrently to select the current input, change controls or modify other properties. The first VIDIOC_S_FMT assigns a logical stream (video data, VBI data etc.) exclusively to one file descriptor.

Exclusive means no other application, more precisely no other file descriptor, can grab this stream or change device properties inconsistent with the negotiated parameters. A video standard change, for example, when the new standard uses a different number of scan lines, can invalidate the selected image format. Therefore only the file descriptor owning the stream can make invalidating changes. Accordingly multiple file descriptors which grabbed different logical streams prevent each other from interfering with their settings. When for example video overlay is about to start or already in progress, simultaneous video capturing may be restricted to the same cropping and image size.

When applications omit the VIDIOC_S_FMT ioctl, its locking side effects are implied by the next step, the selection of an I/O method with the VIDIOC_REQBUFS ioctl, or implicitly by the first read() or write() call.

Generally only one logical stream can be assigned to a file descriptor, the exception being drivers permitting simultaneous video capturing and overlay using the same file descriptor for compatibility with V4L and earlier versions of V4L2. Switching the logical stream or returning into "panel mode" is possible by closing and reopening the device. Drivers may support a switch using VIDIOC_S_FMT.

All drivers exchanging data with applications must support the VIDIOC_G_FMT and VIDIOC_S_FMT ioctls. Implementation of VIDIOC_TRY_FMT is highly recommended but optional.
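For illustration, a minimal sketch of this negotiation on a video capture device follows; the file descriptor fd, the requested size and the pixel format are assumptions, and the driver may select different values.

struct v4l2_format fmt;

memset (&fmt, 0, sizeof (fmt));
fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_FMT, &fmt)) {
        perror ("VIDIOC_G_FMT");
        exit (EXIT_FAILURE);
}

/* Ask for a particular size and pixel format ... */
fmt.fmt.pix.width       = 640;
fmt.fmt.pix.height      = 480;
fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
fmt.fmt.pix.field       = V4L2_FIELD_INTERLACED;

/* ... and accept whatever the driver selects instead. */
if (-1 == ioctl (fd, VIDIOC_S_FMT, &fmt)) {
        perror ("VIDIOC_S_FMT");
        exit (EXIT_FAILURE);
}

printf ("Negotiated %ux%u, %u bytes per line\n",
        fmt.fmt.pix.width, fmt.fmt.pix.height,
        fmt.fmt.pix.bytesperline);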


1.10.2. Image Format Enumeration

Apart from the generic format negotiation functions, a special ioctl to enumerate all image formats supported by video capture, overlay or output devices is available.[11]

The VIDIOC_ENUM_FMT ioctl must be supported by all drivers exchanging image data with applications.

Important: Drivers are not supposed to convert image formats in kernel space. They must enumerate only formats directly supported by the hardware. If necessary, driver writers should publish an example conversion routine or library for integration into applications.
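A minimal sketch of such an enumeration for a video capture device follows; fd is assumed to be an open device descriptor.

struct v4l2_fmtdesc fmtdesc;

memset (&fmtdesc, 0, sizeof (fmtdesc));
fmtdesc.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

for (fmtdesc.index = 0;
     0 == ioctl (fd, VIDIOC_ENUM_FMT, &fmtdesc);
     fmtdesc.index++) {
        printf ("Format %u: %s%s\n", fmtdesc.index, fmtdesc.description,
                (fmtdesc.flags & V4L2_FMT_FLAG_COMPRESSED) ?
                " (compressed)" : "");
}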


1.11. Image Cropping, Insertion and Scaling

Some video capture devices can sample a subsection of the picture and shrink or enlarge it to an image of arbitrary size. We call these abilities cropping and scaling. Some video output devices can scale an image up or down and insert it at an arbitrary scan line and horizontal offset into a video signal.

Applications can use the following API to select an area in the video signal, query the default area and the hardware limits. Despite their name, the VIDIOC_CROPCAP, VIDIOC_G_CROP and VIDIOC_S_CROP ioctls apply to input as well as output devices.

Scaling requires a source and a target. On a video capture or overlay device the source is the video signal, and the cropping ioctls determine the area actually sampled. The target is the images read by the application or overlaid onto the graphics screen. Their size (and position for an overlay) is negotiated with the VIDIOC_G_FMT and VIDIOC_S_FMT ioctls.

On a video output device the source is the images passed in by the application, and their size is again negotiated with the VIDIOC_G/S_FMT ioctls, or may be encoded in a compressed video stream. The target is the video signal, and the cropping ioctls determine the area where the images are inserted.

Source and target rectangles are defined even if the device does not support scaling or the VIDIOC_G/S_CROP ioctls. Their size (and position where applicable) will be fixed in this case. All capture and output devices must support the VIDIOC_CROPCAP ioctl such that applications can determine if scaling takes place.


1.11.1. Cropping Structures

Figure 1-1. Image Cropping, Insertion and Scaling

For capture devices the coordinates of the top left corner, width and height of the area which can be sampled are given by the bounds substructure of the struct v4l2_cropcap returned by the VIDIOC_CROPCAP ioctl. To support a wide range of hardware this specification does not define an origin or units. However, by convention drivers should horizontally count unscaled samples relative to 0H (the leading edge of the horizontal sync pulse, see Figure 4-1), and vertically ITU-R line numbers of the first field (Figure 4-2, Figure 4-3), multiplied by two if the driver can capture both fields.

The top left corner, width and height of the source rectangle, that is the area actually sampled, is given by struct v4l2_crop using the same coordinate system as struct v4l2_cropcap. Applications can use the VIDIOC_G_CROP and VIDIOC_S_CROP ioctls to get and set this rectangle. It must lie completely within the capture boundaries and the driver may further adjust the requested size and/or position according to hardware limitations.

Each capture device has a default source rectangle, given by the defrect substructure of struct v4l2_cropcap. The center of this rectangle shall align with the center of the active picture area of the video signal, and cover what the driver writer considers the complete picture. Drivers shall reset the source rectangle to the default when the driver is first loaded, but not later.

For output devices these structures and ioctls are used accordingly, defining the target rectangle where the images will be inserted into the video signal.


1.11.2. Scaling Adjustments

Video hardware can have various cropping, insertion and scaling limitations. It may only scale up or down, support only discrete scaling factors, or have different scaling abilities in horizontal and vertical direction. Also it may not support scaling at all. At the same time the struct v4l2_crop rectangle may have to be aligned, and both the source and target rectangles may have arbitrary upper and lower size limits. In particular the maximum width and height in struct v4l2_crop may be smaller than the struct v4l2_cropcap.bounds area. Therefore, as usual, drivers are expected to adjust the requested parameters and return the actual values selected.

Applications can change the source or the target rectangle first, as they may prefer a particular image size or a certain area in the video signal. If the driver has to adjust both to satisfy hardware limitations, the last requested rectangle shall take priority, and the driver should preferably adjust the opposite one. The VIDIOC_TRY_FMT ioctl however shall not change the driver state and therefore only adjust the requested rectangle.

Suppose scaling on a video capture device is restricted to a factor 1:1 or 2:1 in either direction and the target image size must be a multiple of 16 × 16 pixels. The source cropping rectangle is set to defaults, which are also the upper limit in this example, of 640 × 400 pixels at offset 0, 0. An application requests an image size of 300 × 225 pixels, assuming video will be scaled down from the "full picture" accordingly. The driver sets the image size to the closest possible values 304 × 224, then chooses the cropping rectangle closest to the requested size, that is 608 × 224 (224 × 2:1 would exceed the limit 400). The offset 0, 0 is still valid, thus unmodified. Given the default cropping rectangle reported by VIDIOC_CROPCAP the application can easily propose another offset to center the cropping rectangle.

Now the application may insist on covering an area using a picture aspect ratio closer to the original request, so it asks for a cropping rectangle of 608 × 456 pixels. The present scaling factors limit cropping to 640 × 384, so the driver returns the cropping size 608 × 384 and adjusts the image size to the closest possible 304 × 192.


1.11.3. Examples

Source and target rectangles shall remain unchanged across closing and reopening a device, such that piping data into or out of a device will work without special preparations. More advanced applications should ensure the parameters are suitable before starting I/O.

Example 1-10. Resetting the cropping parameters

(A video capture device is assumed; change V4L2_BUF_TYPE_VIDEO_CAPTURE for other devices.)

struct v4l2_cropcap cropcap;
struct v4l2_crop crop;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

memset (&crop, 0, sizeof (crop));
crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
crop.c = cropcap.defrect; 

/* Ignore if cropping is not supported (EINVAL). */

if (-1 == ioctl (fd, VIDIOC_S_CROP, &crop)
    && errno != EINVAL) {
        perror ("VIDIOC_S_CROP");
        exit (EXIT_FAILURE);
}
      

Example 1-11. Simple downscaling

(A video capture device is assumed.)

struct v4l2_cropcap cropcap;
struct v4l2_format format;
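/* reset_cropping_parameters() is assumed, as in Example 1-10, to query
   VIDIOC_CROPCAP (filling the cropcap variable above) and to reset the
   cropping rectangle to the default. */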

reset_cropping_parameters ();

/* Scale down to 1/4 size of full picture. */

memset (&format, 0, sizeof (format)); /* defaults */

format.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

format.fmt.pix.width = cropcap.defrect.width >> 1;
format.fmt.pix.height = cropcap.defrect.height >> 1;
format.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;

if (-1 == ioctl (fd, VIDIOC_S_FMT, &format)) {
        perror ("VIDIOC_S_FMT");
        exit (EXIT_FAILURE);
}

/* We could check the actual image size now, the actual scaling factor
   or if the driver can scale at all. */
        

Example 1-12. Selecting an output area

struct v4l2_cropcap cropcap;
struct v4l2_crop crop;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

memset (&crop, 0, sizeof (crop));

crop.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
crop.c = cropcap.defrect;

/* Scale the width and height to 50 % of their original size
   and center the output. */

crop.c.width /= 2;
crop.c.height /= 2;
crop.c.left += crop.c.width / 2;
crop.c.top += crop.c.height / 2;

/* Ignore if cropping is not supported (EINVAL). */

if (-1 == ioctl (fd, VIDIOC_S_CROP, &crop)
    && errno != EINVAL) {
        perror ("VIDIOC_S_CROP");
        exit (EXIT_FAILURE);
}

Example 1-13. Current scaling factor and pixel aspect

(A video capture device is assumed.)

struct v4l2_cropcap cropcap;
struct v4l2_crop crop;
struct v4l2_format format;
double hscale, vscale;
double aspect;
int dwidth, dheight;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

memset (&crop, 0, sizeof (crop));
crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_CROP, &crop)) {
        if (errno != EINVAL) {
                perror ("VIDIOC_G_CROP");
                exit (EXIT_FAILURE);
        }

        /* Cropping not supported. */
        crop.c = cropcap.defrect;
}

memset (&format, 0, sizeof (format));
format.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_FMT, &format)) {
        perror ("VIDIOC_G_FMT");
        exit (EXIT_FAILURE);
}

/* The scaling applied by the driver. */

hscale = format.fmt.pix.width / (double) crop.c.width;
vscale = format.fmt.pix.height / (double) crop.c.height;

aspect = cropcap.pixelaspect.numerator /
         (double) cropcap.pixelaspect.denominator;
aspect = aspect * hscale / vscale;

/* Devices following ITU-R BT.601 do not capture
   square pixels. For playback on a computer monitor
   we should scale the images to this size. */

dwidth = format.fmt.pix.width / aspect;
dheight = format.fmt.pix.height;
        

1.12. Streaming Parameters

Streaming parameters are intended to optimize the video capture process as well as I/O. Presently applications can request a high quality capture mode with the VIDIOC_S_PARM ioctl.

The current video standard determines a nominal number of frames per second. If less than this number of frames is to be captured or output, applications can request frame skipping or duplicating on the driver side. This is especially useful when using the read() or write() functions, which are not augmented by timestamps or sequence counters, and to avoid unnecessary data copying.

Finally these ioctls can be used to determine the number of buffers used internally by a driver in read/write mode. For implications see the section discussing the read() function.

To get and set the streaming parameters applications call the VIDIOC_G_PARM and VIDIOC_S_PARM ioctls, respectively. They take a pointer to a struct v4l2_streamparm, which contains a union holding separate parameters for input and output devices.

These ioctls are optional; drivers need not implement them. If not implemented, they return the EINVAL error code.
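For illustration, a minimal sketch of requesting a lower capture rate with these ioctls follows; fd and the figure of 15 frames per second are assumptions, and the driver may adjust or ignore the request.

struct v4l2_streamparm parm;

memset (&parm, 0, sizeof (parm));
parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_PARM, &parm)) {
        perror ("VIDIOC_G_PARM"); /* EINVAL: not supported by the driver */
        exit (EXIT_FAILURE);
}

if (parm.parm.capture.capability & V4L2_CAP_TIMEPERFRAME) {
        parm.parm.capture.timeperframe.numerator   = 1;
        parm.parm.capture.timeperframe.denominator = 15;

        if (-1 == ioctl (fd, VIDIOC_S_PARM, &parm)) {
                perror ("VIDIOC_S_PARM");
                exit (EXIT_FAILURE);
        }
}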


Chapter 2. Image Formats

The V4L2 API was primarily designed for devices exchanging image data with applications. The v4l2_pix_format structure defines the format and layout of an image in memory. Image formats are negotiated with the VIDIOC_S_FMT ioctl. (The explanations here focus on video capturing and output; for overlay frame buffer formats see also VIDIOC_G_FBUF.)

Table 2-1. struct v4l2_pix_format

__u32 width Image width in pixels.
__u32 height Image height in pixels.
Applications set these fields to request an image size, drivers return the closest possible values. In case of planar formats the width and height apply to the largest plane. To avoid ambiguities drivers must return values rounded up to a multiple of the scale factor of any smaller planes. For example when the image format is YUV 4:2:0, width and height must be multiples of two.
__u32 pixelformat The pixel format or type of compression, set by the application. This is a little endian four character code. V4L2 defines standard RGB formats in Table 2-1, YUV formats in Section 2.5, and reserved codes in Table 2-8.
enum v4l2_field field Video images are typically interlaced. Applications can request to capture or output only the top or bottom field, or both fields interlaced or sequentially stored in one buffer or alternating in separate buffers. Drivers return the actual field order selected. For details see Section 3.6.
__u32 bytesperline Distance in bytes between the leftmost pixels in two adjacent lines.

Both applications and drivers can set this field to request padding bytes at the end of each line. Drivers however may ignore the value requested by the application, returning width times bytes per pixel or a larger value required by the hardware. That implies applications can just set this field to zero to get a reasonable default.

Video hardware may access padding bytes, therefore they must reside in accessible memory. Consider cases where padding bytes after the last line of an image cross a system page boundary. Input devices may write padding bytes, the value is undefined. Output devices ignore the contents of padding bytes.

When the image format is planar the bytesperline value applies to the largest plane and is divided by the same factor as the width field for any smaller planes. For example the Cb and Cr planes of a YUV 4:2:0 image have half as many padding bytes following each line as the Y plane. To avoid ambiguities drivers must return a bytesperline value rounded up to a multiple of the scale factor.

__u32 sizeimage Size in bytes of the buffer to hold a complete image, set by the driver. Usually this is bytesperline times height. When the image consists of variable length compressed data this is the maximum number of bytes required to hold an image.
enum v4l2_colorspace colorspace This information supplements the pixelformat and must be set by the driver, see Section 2.2.
__u32 priv Reserved for custom (driver defined) additional information about formats. When not used drivers and applications must set this field to zero.
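As a worked example of the planar rules above, a minimal sketch computing bytesperline and sizeimage for an unpadded YUV 4:2:0 image follows; the 640 × 480 size is an assumption.

unsigned int width  = 640;
unsigned int height = 480;

unsigned int y_bytesperline = width;      /* one byte per pixel */
unsigned int c_bytesperline = width / 2;  /* half-width Cb and Cr planes */
unsigned int y_plane_size   = y_bytesperline * height;
unsigned int c_plane_size   = c_bytesperline * (height / 2);
unsigned int sizeimage      = y_plane_size + 2 * c_plane_size;

/* For 640 x 480 this yields bytesperline = 640 (largest plane) and
   sizeimage = 307200 + 2 * 76800 = 460800 bytes. */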

2.1. Standard Image Formats

In order to exchange images between drivers and applications, it is necessary to have standard image data formats which both sides will interpret the same way. V4L2 includes several such formats, and this section is intended to be an unambiguous specification of the standard image data formats in V4L2.

V4L2 drivers are not limited to these formats, however. Driver-specific formats are possible. In that case the application may depend on a codec to convert images to one of the standard formats when needed. But the data can still be stored and retrieved in the proprietary format. For example, a device may support a proprietary compressed format. Applications can still capture and save the data in the compressed format, saving much disk space, and later use a codec to convert the images to the X Windows screen format when the video is to be displayed.

Even so, ultimately, some standard formats are needed, so the V4L2 specification would not be complete without well-defined standard formats.

The V4L2 standard formats are mainly uncompressed formats. The pixels are always arranged in memory from left to right, and from top to bottom. The first byte of data in the image buffer is always for the leftmost pixel of the topmost row. Following that is the pixel immediately to its right, and so on until the end of the top row of pixels. Following the rightmost pixel of the row there may be zero or more bytes of padding to guarantee that each row of pixel data has a certain alignment. Following the pad bytes, if any, is data for the leftmost pixel of the second row from the top, and so on. The last row has just as many pad bytes after it as the other rows.

In V4L2 each format has an identifier which looks like V4L2_PIX_FMT_XXX, defined in the videodev.h header file. These identifiers represent four character codes, which are also listed below; however they are not the same as those used in the Windows world.


2.2. Colorspaces

[intro]

Gamma Correction

[to do]

E'R = f(R)

E'G = f(G)

E'B = f(B)

Construction of luminance and color-difference signals

[to do]

E'Y = CoeffR E'R + CoeffG E'G + CoeffB E'B

(E'R - E'Y) = E'R - CoeffR E'R - CoeffG E'G - CoeffB E'B

(E'B - E'Y) = E'B - CoeffR E'R - CoeffG E'G - CoeffB E'B

Re-normalized color-difference signals

The color-difference signals are scaled back to unity range [-0.5;+0.5]:

KB = 0.5 / (1 - CoeffB)

KR = 0.5 / (1 - CoeffR)

PB = KB (E'B - E'Y) = 0.5 (CoeffR / CoeffB) E'R + 0.5 (CoeffG / CoeffB) E'G + 0.5 E'B

PR = KR (E'R - E'Y) = 0.5 E'R + 0.5 (CoeffG / CoeffR) E'G + 0.5 (CoeffB / CoeffR) E'B

Quantization

[to do]

Y' = (Lum. Levels - 1) · E'Y + Lum. Offset

CB = (Chrom. Levels - 1) · PB + Chrom. Offset

CR = (Chrom. Levels - 1) · PR + Chrom. Offset

Rounding to the nearest integer and clamping to the range [0;255] finally yields the digital color components Y'CbCr stored in YUV images.

Example 2-1. ITU-R Rec. BT.601 color conversion

Forward Transformation

int ER, EG, EB;         /* gamma corrected RGB input [0;255] */
int Y1, Cb, Cr;         /* output [0;255] */

double r, g, b;         /* temporaries */
double y1, pb, pr;

int
clamp (double x)
{
        int r = x;      /* round to nearest */

        if (r < 0)         return 0;
        else if (r > 255)  return 255;
        else               return r;
}

r = ER / 255.0;
g = EG / 255.0;
b = EB / 255.0;

y1  =  0.299  * r + 0.587 * g + 0.114  * b;
pb  = -0.169  * r - 0.331 * g + 0.5    * b;
pr  =  0.5    * r - 0.419 * g - 0.081  * b;

Y1 = clamp (219 * y1 + 16);
Cb = clamp (224 * pb + 128);
Cr = clamp (224 * pr + 128);

/* or shorter */

y1 = 0.299 * ER + 0.587 * EG + 0.114 * EB;

Y1 = clamp ( (219 / 255.0)                    *       y1  + 16);
Cb = clamp (((224 / 255.0) / (2 - 2 * 0.114)) * (EB - y1) + 128);
Cr = clamp (((224 / 255.0) / (2 - 2 * 0.299)) * (ER - y1) + 128);
      

Inverse Transformation

int Y1, Cb, Cr;         /* gamma pre-corrected input [0;255] */
int ER, EG, EB;         /* output [0;255] */

double r, g, b;         /* temporaries */
double y1, pb, pr;

int
clamp (double x)
{
        int r = x;      /* round to nearest */

        if (r < 0)         return 0;
        else if (r > 255)  return 255;
        else               return r;
}

y1 = (255 / 219.0) * (Y1 - 16);
pb = (255 / 224.0) * (Cb - 128);
pr = (255 / 224.0) * (Cr - 128);

r = 1.0 * y1 + 0     * pb + 1.402 * pr;
g = 1.0 * y1 - 0.344 * pb - 0.714 * pr;
b = 1.0 * y1 + 1.772 * pb + 0     * pr;

ER = clamp (r * 255); /* [ok? one should prob. limit y1,pb,pr] */
EG = clamp (g * 255);
EB = clamp (b * 255);
      

Table 2-2. enum v4l2_colorspace

Identifier Value Description Chromaticities[a] White Point Gamma Correction Luminance E'Y Quantization
Red Green Blue Y' Cb, Cr
V4L2_COLORSPACE_SMPTE170M 1 NTSC/PAL according to SMPTE 170M, ITU BT.601 x = 0.630, y = 0.340 x = 0.310, y = 0.595 x = 0.155, y = 0.070 x = 0.3127, y = 0.3290, Illuminant D65 E' = 4.5 I for I ≤ 0.018, 1.099 I^0.45 - 0.099 for 0.018 < I 0.299 E'R + 0.587 E'G + 0.114 E'B 219 E'Y + 16 224 PB,R + 128
V4L2_COLORSPACE_SMPTE240M 2 1125-Line (US) HDTV, see SMPTE 240M x = 0.630, y = 0.340 x = 0.310, y = 0.595 x = 0.155, y = 0.070 x = 0.3127, y = 0.3290, Illuminant D65 E' = 4 I for I ≤ 0.0228, 1.1115 I^0.45 - 0.1115 for 0.0228 < I 0.212 E'R + 0.701 E'G + 0.087 E'B 219 E'Y + 16 224 PB,R + 128
V4L2_COLORSPACE_REC709 3 HDTV and modern devices, see ITU BT.709 x = 0.640, y = 0.330 x = 0.300, y = 0.600 x = 0.150, y = 0.060 x = 0.3127, y = 0.3290, Illuminant D65 E' = 4.5 I for I ≤ 0.018, 1.099 I^0.45 - 0.099 for 0.018 < I 0.2125 E'R + 0.7154 E'G + 0.0721 E'B 219 E'Y + 16 224 PB,R + 128
V4L2_COLORSPACE_BT878 4 Broken Bt878 extents[b], ITU BT.601 ? ? ? ? ? 0.299 E'R + 0.587 E'G + 0.114 E'B 237 E'Y + 16 224 PB,R + 128 (probably)
V4L2_COLORSPACE_470_SYSTEM_M 5 M/NTSC[c] according to ITU BT.470, ITU BT.601 x = 0.67, y = 0.33 x = 0.21, y = 0.71 x = 0.14, y = 0.08 x = 0.310, y = 0.316, Illuminant C ? 0.299 E'R + 0.587 E'G + 0.114 E'B 219 E'Y + 16 224 PB,R + 128
V4L2_COLORSPACE_470_SYSTEM_BG 6 625-line PAL and SECAM systems according to ITU BT.470, ITU BT.601 x = 0.64, y = 0.33 x = 0.29, y = 0.60 x = 0.15, y = 0.06 x = 0.313, y = 0.329, Illuminant D65 ? 0.299 E'R + 0.587 E'G + 0.114 E'B 219 E'Y + 16 224 PB,R + 128
V4L2_COLORSPACE_JPEG 7 JPEG Y'CbCr, see JFIF, ITU BT.601 ? ? ? ? ? 0.299 E'R + 0.587 E'G + 0.114 E'B 256 E'Y + 16[d] 256 PB,R + 128
V4L2_COLORSPACE_SRGB 8 [?] x = 0.640, y = 0.330 x = 0.300, y = 0.600 x = 0.150, y = 0.060 x = 0.3127, y = 0.3290, Illuminant D65 E' = 4.5 I for I ≤ 0.018, 1.099 I^0.45 - 0.099 for 0.018 < I n/a
Notes:
a. The coordinates of the color primaries are given in the CIE system (1931).
b. The ubiquitous Bt878 video capture chip quantizes E'Y to 238 levels, yielding a range of Y' = 16 … 253, unlike Rec. 601 Y' = 16 … 235. This is not a typo in the Bt878 documentation, it has been implemented in silicon. The chroma extents are unclear.
c. No identifier exists for M/PAL which uses the chromaticities of M/NTSC; the remaining parameters are equal to B and G/PAL.
d. Note JFIF quantizes Y'PBPR in range [0;+1] and [-0.5;+0.5] to 257 levels, however Y'CbCr signals are still clamped to [0;255].

2.3. Indexed Format

In this format each pixel is represented by an 8 bit index into a 256 entry ARGB palette. It is intended for Video Output Overlays only. There are no ioctls to access the palette; this must be done with ioctls of the Linux framebuffer API.

Table 2-3. Indexed Image Format

Identifier Code   Byte 0                                                    
    Bit 7 6 5 4 3 2 1 0                                                    
V4L2_PIX_FMT_PAL8 'PAL8'   i7 i6 i5 i4 i3 i2 i1 i0                                                    

2.4. RGB Formats

Table of Contents
Packed RGB formats -- Packed RGB formats
V4L2_PIX_FMT_SBGGR8 ('BA81') -- Bayer RGB format
V4L2_PIX_FMT_SBGGR16 ('BA82') -- Bayer RGB format

Packed RGB formats

Name

Packed RGB formats -- Packed RGB formats

Description

These formats are designed to match the pixel formats of typical PC graphics frame buffers. They occupy 8, 16, 24 or 32 bits per pixel. These are all packed-pixel formats, meaning all the data for a pixel lie next to each other in memory.

When one of these formats is used, drivers shall report the colorspace V4L2_COLORSPACE_SRGB.

Table 2-1. Packed RGB Image Formats

Identifier Code   Byte 0 in memory   Byte 1   Byte 2   Byte 3
    Bit 7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0
V4L2_PIX_FMT_RGB332 'RGB1'   b1 b0 g2 g1 g0 r2 r1 r0                                                    
V4L2_PIX_FMT_RGB444 'R444'   g3 g2 g1 g0 b3 b2 b1 b0   a3 a2 a1 a0 r3 r2 r1 r0                                  
V4L2_PIX_FMT_RGB555 'RGBO'   g2 g1 g0 r4 r3 r2 r1 r0   a b4 b3 b2 b1 b0 g4 g3                                  
V4L2_PIX_FMT_RGB565 'RGBP'   g2 g1 g0 r4 r3 r2 r1 r0   b4 b3 b2 b1 b0 g5 g4 g3                                  
V4L2_PIX_FMT_RGB555X 'RGBQ'   a b4 b3 b2 b1 b0 g4 g3   g2 g1 g0 r4 r3 r2 r1 r0                                  
V4L2_PIX_FMT_RGB565X 'RGBR'   b4 b3 b2 b1 b0 g5 g4 g3   g2 g1 g0 r4 r3 r2 r1 r0                                  
V4L2_PIX_FMT_BGR24 'BGR3'   b7 b6 b5 b4 b3 b2 b1 b0   g7 g6 g5 g4 g3 g2 g1 g0   r7 r6 r5 r4 r3 r2 r1 r0                
V4L2_PIX_FMT_RGB24 'RGB3'   r7 r6 r5 r4 r3 r2 r1 r0   g7 g6 g5 g4 g3 g2 g1 g0   b7 b6 b5 b4 b3 b2 b1 b0                
V4L2_PIX_FMT_BGR32 'BGR4'   b7 b6 b5 b4 b3 b2 b1 b0   g7 g6 g5 g4 g3 g2 g1 g0   r7 r6 r5 r4 r3 r2 r1 r0   a7 a6 a5 a4 a3 a2 a1 a0
V4L2_PIX_FMT_RGB32 'RGB4'   r7 r6 r5 r4 r3 r2 r1 r0   g7 g6 g5 g4 g3 g2 g1 g0   b7 b6 b5 b4 b3 b2 b1 b0   a7 a6 a5 a4 a3 a2 a1 a0

Bit 7 is the most significant bit. The value of a = alpha bits is undefined when reading from the driver, ignored when writing to the driver, except when alpha blending has been negotiated for a Video Overlay or Video Output Overlay.

Example 2-1. V4L2_PIX_FMT_BGR24 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: B00 G00 R00 B01 G01 R01 B02 G02 R02 B03 G03 R03
start + 12: B10 G10 R10 B11 G11 R11 B12 G12 R12 B13 G13 R13
start + 24: B20 G20 R20 B21 G21 R21 B22 G22 R22 B23 G23 R23
start + 36: B30 G30 R30 B31 G31 R31 B32 G32 R32 B33 G33 R33

Important: Drivers may interpret these formats differently.

Some RGB formats above are uncommon and were probably defined in error. Drivers may interpret them as in Table 2-2.

Table 2-2. Packed RGB Image Formats (corrected)

Identifier Code   Byte 0 in memory   Byte 1   Byte 2   Byte 3
    Bit 7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0
V4L2_PIX_FMT_RGB332 'RGB1'   r2 r1 r0 g2 g1 g0 b1 b0                                                    
V4L2_PIX_FMT_RGB444 'R444'   g3 g2 g1 g0 b3 b2 b1 b0   a3 a2 a1 a0 r3 r2 r1 r0                                  
V4L2_PIX_FMT_RGB555 'RGBO'   g2 g1 g0 b4 b3 b2 b1 b0   a r4 r3 r2 r1 r0 g4 g3                                  
V4L2_PIX_FMT_RGB565 'RGBP'   g2 g1 g0 b4 b3 b2 b1 b0   r4 r3 r2 r1 r0 g5 g4 g3                                  
V4L2_PIX_FMT_RGB555X 'RGBQ'   a r4 r3 r2 r1 r0 g4 g3   g2 g1 g0 b4 b3 b2 b1 b0                                  
V4L2_PIX_FMT_RGB565X 'RGBR'   r4 r3 r2 r1 r0 g5 g4 g3   g2 g1 g0 b4 b3 b2 b1 b0                                  
V4L2_PIX_FMT_BGR24 'BGR3'   b7 b6 b5 b4 b3 b2 b1 b0   g7 g6 g5 g4 g3 g2 g1 g0   r7 r6 r5 r4 r3 r2 r1 r0                
V4L2_PIX_FMT_RGB24 'RGB3'   r7 r6 r5 r4 r3 r2 r1 r0   g7 g6 g5 g4 g3 g2 g1 g0   b7 b6 b5 b4 b3 b2 b1 b0                
V4L2_PIX_FMT_BGR32 'BGR4'   b7 b6 b5 b4 b3 b2 b1 b0   g7 g6 g5 g4 g3 g2 g1 g0   r7 r6 r5 r4 r3 r2 r1 r0   a7 a6 a5 a4 a3 a2 a1 a0
V4L2_PIX_FMT_RGB32 'RGB4'   a7 a6 a5 a4 a3 a2 a1 a0   r7 r6 r5 r4 r3 r2 r1 r0   g7 g6 g5 g4 g3 g2 g1 g0   b7 b6 b5 b4 b3 b2 b1 b0

A test utility to determine which RGB formats a driver actually supports is available from the LinuxTV v4l-dvb repository. See http://linuxtv.org/repo/ for access instructions.
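For illustration, a minimal sketch unpacking one V4L2_PIX_FMT_RGB565 pixel according to the corrected layout in Table 2-2 follows; the function name and the expansion to 8 bits per component are choices made here, not part of the API.

/* Returns the pixel as 0xRRGGBB. */
static unsigned int
rgb565_to_rgb888 (const unsigned char *p)
{
        unsigned int pixel = p[0] | (p[1] << 8); /* little endian 16-bit word */
        unsigned int b = (pixel >>  0) & 0x1f;
        unsigned int g = (pixel >>  5) & 0x3f;
        unsigned int r = (pixel >> 11) & 0x1f;

        r = (r << 3) | (r >> 2); /* expand 5 to 8 bits */
        g = (g << 2) | (g >> 4); /* expand 6 to 8 bits */
        b = (b << 3) | (b >> 2);

        return (r << 16) | (g << 8) | b;
}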

V4L2_PIX_FMT_SBGGR8 ('BA81')

Name

V4L2_PIX_FMT_SBGGR8 -- Bayer RGB format

Description

This is commonly the native format of digital cameras, reflecting the arrangement of sensors on the CCD device. Only one red, green or blue value is given for each pixel. Missing components must be interpolated from neighbouring pixels. From left to right the first row consists of a blue and green value, the second row of a green and red value. This scheme repeats to the right and down for every two columns and rows.

Example 2-1. V4L2_PIX_FMT_SBGGR8 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: B00 G01 B02 G03
start + 4: G10 R11 G12 R13
start + 8: B20 G21 B22 G23
start + 12: G30 R31 G32 R33
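A minimal sketch showing which component the byte at column x, row y carries in this layout follows; the function name is a choice made here, and a real application would still interpolate the two missing components from neighbouring samples.

static char
sbggr8_component (unsigned int x, unsigned int y)
{
        if (y & 1)
                return (x & 1) ? 'R' : 'G'; /* odd rows:  G R G R ... */
        else
                return (x & 1) ? 'G' : 'B'; /* even rows: B G B G ... */
}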

V4L2_PIX_FMT_SBGGR16 ('BA82')

Name

V4L2_PIX_FMT_SBGGR16 -- Bayer RGB format

Description

This format is similar to V4L2_PIX_FMT_SBGGR8, except each pixel has a depth of 16 bits. The least significant byte is stored at lower memory addresses (little-endian). Note the actual sampling precision may be lower than 16 bits, for example 10 bits per pixel with values in range 0 to 1023.

Example 2-1. V4L2_PIX_FMT_SBGGR16 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: B00low B00high G01low G01high B02low B02high G03low G03high
start + 8: G10low G10high R11low R11high G12low G12high R13low R13high
start + 16: B20low B20high G21low G21high B22low B22high G23low G23high
start + 24: G30low G30high R31low R31high G32low G32high R33low R33high

2.5. YUV Formats

Table of Contents
Packed YUV formats -- Packed YUV formats
V4L2_PIX_FMT_GREY ('GREY') -- Grey-scale image
V4L2_PIX_FMT_Y16 ('Y16 ') -- Grey-scale image
V4L2_PIX_FMT_YUYV ('YUYV') -- Packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2
V4L2_PIX_FMT_UYVY ('UYVY') -- Variation of V4L2_PIX_FMT_YUYV with different order of samples in memory
V4L2_PIX_FMT_Y41P ('Y41P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1
V4L2_PIX_FMT_YVU420 ('YV12'), V4L2_PIX_FMT_YUV420 ('YU12') -- Planar formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0
V4L2_PIX_FMT_YVU410 ('YVU9'), V4L2_PIX_FMT_YUV410 ('YUV9') -- Planar formats with ¼ horizontal and vertical chroma resolution, also known as YUV 4:1:0
V4L2_PIX_FMT_YUV422P ('422P') -- Format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Planar layout as opposed to V4L2_PIX_FMT_YUYV
V4L2_PIX_FMT_YUV411P ('411P') -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1. Planar layout as opposed to V4L2_PIX_FMT_Y41P
V4L2_PIX_FMT_NV12 ('NV12'), V4L2_PIX_FMT_NV21 ('NV21') -- Formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0. One luminance and one chrominance plane with alternating chroma samples as opposed to V4L2_PIX_FMT_YVU420

YUV is the format native to TV broadcast and composite video signals. It separates the brightness information (Y) from the color information (U and V or Cb and Cr). The color information consists of red and blue color difference signals; this way the green component can be reconstructed by subtracting from the brightness component. See Section 2.2 for conversion examples. YUV was chosen because early television would only transmit brightness information. To add color in a way compatible with existing receivers a new signal carrier was added to transmit the color difference signals. Secondly, in the YUV format the U and V components usually have lower resolution than the Y component. This is an analog video compression technique taking advantage of a property of the human visual system, being more sensitive to brightness information.

Packed YUV formats

Name

Packed YUV formats -- Packed YUV formats

Description

Similar to the packed RGB formats these formats store the Y, Cb and Cr component of each pixel in one 16 or 32 bit word.

Table 2-1. Packed YUV Image Formats

Identifier Code   Byte 0 in memory   Byte 1   Byte 2   Byte 3
    Bit 7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0   7 6 5 4 3 2 1 0
V4L2_PIX_FMT_YUV444 'Y444'   Cb3 Cb2 Cb1 Cb0 Cr3 Cr2 Cr1 Cr0   a3 a2 a1 a0 Y'3 Y'2 Y'1 Y'0                                  
V4L2_PIX_FMT_YUV555 'YUVO'   Cb2 Cb1 Cb0 Cr4 Cr3 Cr2 Cr1 Cr0   a Y'4 Y'3 Y'2 Y'1 Y'0 Cb4 Cb3                                  
V4L2_PIX_FMT_YUV565 'YUVP'   Cb2 Cb1 Cb0 Cr4 Cr3 Cr2 Cr1 Cr0   Y'4 Y'3 Y'2 Y'1 Y'0 Cb5 Cb4 Cb3                                  
V4L2_PIX_FMT_YUV32 'YUV4'   a7 a6 a5 a4 a3 a2 a1 a0   Y'7 Y'6 Y'5 Y'4 Y'3 Y'2 Y'1 Y'0   Cb7 Cb6 Cb5 Cb4 Cb3 Cb2 Cb1 Cb0   Cr7 Cr6 Cr5 Cr4 Cr3 Cr2 Cr1 Cr0

Bit 7 is the most significant bit. The value of a = alpha bits is undefined when reading from the driver, ignored when writing to the driver, except when alpha blending has been negotiated for a Video Overlay or Video Output Overlay.

V4L2_PIX_FMT_GREY ('GREY')

Name

V4L2_PIX_FMT_GREY -- Grey-scale image

Description

This is a grey-scale image. It is really a degenerate Y'CbCr format which simply contains no Cb or Cr data.

Example 2-1. V4L2_PIX_FMT_GREY 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33

V4L2_PIX_FMT_Y16 ('Y16 ')

Name

V4L2_PIX_FMT_Y16 -- Grey-scale image

Description

This is a grey-scale image with a depth of 16 bits per pixel. The least significant byte is stored at lower memory addresses (little-endian). Note the actual sampling precision may be lower than 16 bits, for example 10 bits per pixel with values in range 0 to 1023.

Example 2-1. V4L2_PIX_FMT_Y16 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00low Y'00high Y'01low Y'01high Y'02low Y'02high Y'03low Y'03high
start + 8: Y'10low Y'10high Y'11low Y'11high Y'12low Y'12high Y'13low Y'13high
start + 16: Y'20low Y'20high Y'21low Y'21high Y'22low Y'22high Y'23low Y'23high
start + 24: Y'30low Y'30high Y'31low Y'31high Y'32low Y'32high Y'33low Y'33high

V4L2_PIX_FMT_YUYV ('YUYV')

Name

V4L2_PIX_FMT_YUYV -- Packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2

Description

In this format each four bytes is two pixels. Each four bytes is two Y's, a Cb and a Cr. Each Y goes to one of the pixels, and the Cb and Cr belong to both pixels. As you can see, the Cr and Cb components have half the horizontal resolution of the Y component. V4L2_PIX_FMT_YUYV is known in the Windows environment as YUY2.

Example 2-1. V4L2_PIX_FMT_YUYV 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Cb00 Y'01 Cr00 Y'02 Cb01 Y'03 Cr01
start + 8: Y'10 Cb10 Y'11 Cr10 Y'12 Cb11 Y'13 Cr11
start + 16: Y'20 Cb20 Y'21 Cr20 Y'22 Cb21 Y'23 Cr21
start + 24: Y'30 Cb30 Y'31 Cr30 Y'32 Cb31 Y'33 Cr31

Color Sample Location.

  0   1   2   3
0 Y C Y   Y C Y
1 Y C Y   Y C Y
2 Y C Y   Y C Y
3 Y C Y   Y C Y
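For illustration, a minimal sketch fetching the Y', Cb and Cr values of an arbitrary pixel from a V4L2_PIX_FMT_YUYV buffer follows; buf and bytesperline are assumed to come from the negotiated format, and the function name is a choice made here.

static void
yuyv_pixel (const unsigned char *buf, unsigned int bytesperline,
            unsigned int x, unsigned int y,
            unsigned char *Y, unsigned char *Cb, unsigned char *Cr)
{
        /* Each pair of pixels occupies four bytes: Y'0 Cb Y'1 Cr. */
        const unsigned char *p = buf + y * bytesperline + (x & ~1u) * 2;

        *Y  = p[(x & 1) ? 2 : 0]; /* Y'0 or Y'1 of the pair */
        *Cb = p[1];               /* shared by both pixels  */
        *Cr = p[3];
}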

V4L2_PIX_FMT_UYVY ('UYVY')

Name

V4L2_PIX_FMT_UYVY -- Variation of V4L2_PIX_FMT_YUYV with different order of samples in memory

Description

In this format each four bytes is two pixels. Each four bytes is two Y's, a Cb and a Cr. Each Y goes to one of the pixels, and the Cb and Cr belong to both pixels. As you can see, the Cr and Cb components have half the horizontal resolution of the Y component.

Example 2-1. V4L2_PIX_FMT_UYVY 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Cb00 Y'00 Cr00 Y'01 Cb01 Y'02 Cr01 Y'03
start + 8: Cb10 Y'10 Cr10 Y'11 Cb11 Y'12 Cr11 Y'13
start + 16: Cb20 Y'20 Cr20 Y'21 Cb21 Y'22 Cr21 Y'23
start + 24: Cb30 Y'30 Cr30 Y'31 Cb31 Y'32 Cr31 Y'33

Color Sample Location.

  0   1   2   3
0 Y C Y   Y C Y
1 Y C Y   Y C Y
2 Y C Y   Y C Y
3 Y C Y   Y C Y

V4L2_PIX_FMT_Y41P ('Y41P')

Name

V4L2_PIX_FMT_Y41P -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1

Description

In this format each 12 bytes is eight pixels. In the twelve bytes are two CbCr pairs and eight Y's. The first CbCr pair goes with the first four Y's, and the second CbCr pair goes with the other four Y's. The Cb and Cr components have one fourth the horizontal resolution of the Y component.

Do not confuse this format with V4L2_PIX_FMT_YUV411P. Y41P is derived from "YUV 4:1:1packed", whileYUV411P stands for "YUV 4:1:1planar".

Example 2-1. V4L2_PIX_FMT_Y41P 8 × 4pixel image

Byte Order. Each cell is one byte.

start + 0: Cb00 Y'00 Cr00 Y'01 Cb01 Y'02 Cr01 Y'03 Y'04 Y'05 Y'06 Y'07
start + 12: Cb10 Y'10 Cr10 Y'11 Cb11 Y'12 Cr11 Y'13 Y'14 Y'15 Y'16 Y'17
start + 24: Cb20 Y'20 Cr20 Y'21 Cb21 Y'22 Cr21 Y'23 Y'24 Y'25 Y'26 Y'27
start + 36: Cb30 Y'30 Cr30 Y'31 Cb31 Y'32 Cr31 Y'33 Y'34 Y'35 Y'36 Y'37

Color Sample Location.

  0   1   2   3   4   5   6   7
0 Y   Y C Y   Y   Y   Y C Y   Y
1 Y   Y C Y   Y   Y   Y C Y   Y
2 Y   Y C Y   Y   Y   Y C Y   Y
3 Y   Y C Y   Y   Y   Y C Y   Y

V4L2_PIX_FMT_YVU420 ('YV12'), V4L2_PIX_FMT_YUV420 ('YU12')

Name

V4L2_PIX_FMT_YVU420, V4L2_PIX_FMT_YUV420 -- Planar formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0

Description

These are planar formats, as opposed to a packed format. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. For V4L2_PIX_FMT_YVU420, the Cr plane immediately follows the Y plane in memory. The Cr plane is half the width and half the height of the Y plane (and of the image). Each Cr belongs to four pixels, a two-by-two square of the image. For example, Cr00 belongs to Y'00, Y'01, Y'10, and Y'11. Following the Cr plane is the Cb plane, just like the Cr plane. V4L2_PIX_FMT_YUV420 is the same except the Cb plane comes first, then the Cr plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have half as many pad bytes after their rows. In other words, two Cx rows (including padding) are exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YVU420 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cr00 Cr01    
start + 18: Cr10 Cr11    
start + 20: Cb00 Cb01    
start + 22: Cb10 Cb11    

Color Sample Location.

  0   1   2   3
0 Y   Y   Y   Y
    C       C  
1 Y   Y   Y   Y
             
2 Y   Y   Y   Y
    C       C  
3 Y   Y   Y   Y
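
A minimal sketch of how the plane start addresses could be derived from the negotiated struct v4l2_pix_format values; the helper is illustrative only and assumes chroma rows are padded to exactly half the luma stride, as described above:

#include <stdint.h>
#include <stddef.h>

/* Illustrative helper: compute the start of the Y, Cr and Cb planes of
   a V4L2_PIX_FMT_YVU420 image, assuming the chroma stride is exactly
   half the luma stride (bytesperline). For V4L2_PIX_FMT_YUV420 swap
   the cb and cr results. */
static void
yvu420_planes (uint8_t *start, size_t bytesperline, unsigned int height,
               uint8_t **y, uint8_t **cr, uint8_t **cb)
{
        size_t luma_size   = bytesperline * height;
        size_t chroma_size = (bytesperline / 2) * (height / 2);

        *y  = start;
        *cr = start + luma_size;
        *cb = start + luma_size + chroma_size;
}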

V4L2_PIX_FMT_YVU410 ('YVU9'), V4L2_PIX_FMT_YUV410 ('YUV9')

Name

V4L2_PIX_FMT_YVU410, V4L2_PIX_FMT_YUV410 -- Planar formats with ¼ horizontal and vertical chroma resolution, also known as YUV 4:1:0

Description

These are planar formats, as opposed to a packed format. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. For V4L2_PIX_FMT_YVU410, the Cr plane immediately follows the Y plane in memory. The Cr plane is ¼ the width and ¼ the height of the Y plane (and of the image). Each Cr belongs to 16 pixels, a four-by-four square of the image. Following the Cr plane is the Cb plane, just like the Cr plane. V4L2_PIX_FMT_YUV410 is the same, except the Cb plane comes first, then the Cr plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have ¼ as many pad bytes after their rows. In other words, four Cx rows (including padding) are exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YVU410 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cr00      
start + 17: Cb00      

Color Sample Location.

  0   1   2   3
0 Y   Y   Y   Y
             
1 Y   Y   Y   Y
        C      
2 Y   Y   Y   Y
             
3 Y   Y   Y   Y

V4L2_PIX_FMT_YUV422P ('422P')

Name

V4L2_PIX_FMT_YUV422P -- Format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Planar layout as opposed to V4L2_PIX_FMT_YUYV

Description

This format is not commonly used. This is a planar version of the YUYV format. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. The Cb plane immediately follows the Y plane in memory. The Cb plane is half the width of the Y plane (and of the image). Each Cb belongs to two pixels. For example, Cb00 belongs to Y'00 and Y'01. Following the Cb plane is the Cr plane, just like the Cb plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have half as many pad bytes after their rows. In other words, two Cx rows (including padding) are exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YUV422P 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cb00 Cb01    
start + 18: Cb10 Cb11    
start + 20: Cb20 Cb21    
start + 22: Cb30 Cb31    
start + 24: Cr00 Cr01    
start + 26: Cr10 Cr11    
start + 28: Cr20 Cr21    
start + 30: Cr30 Cr31    

Color Sample Location.

  0   1   2   3
0 Y C Y   Y C Y
1 Y C Y   Y C Y
2 Y C Y   Y C Y
3 Y C Y   Y C Y

V4L2_PIX_FMT_YUV411P ('411P')

Name

V4L2_PIX_FMT_YUV411P -- Format with ¼ horizontal chroma resolution, also known as YUV 4:1:1. Planar layout as opposed to V4L2_PIX_FMT_Y41P

Description

This format is not commonly used. This is a planar format similar to the 4:2:2 planar format except with half as many chroma samples. The three components are separated into three sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. The Cb plane immediately follows the Y plane in memory. The Cb plane is ¼ the width of the Y plane (and of the image). Each Cb belongs to 4 pixels all on the same row. For example, Cb00 belongs to Y'00, Y'01, Y'02 and Y'03. Following the Cb plane is the Cr plane, just like the Cb plane.

If the Y plane has pad bytes after each row, then the Cr and Cb planes have ¼ as many pad bytes after their rows. In other words, four Cx rows (including padding) are exactly as long as one Y row (including padding).

Example 2-1. V4L2_PIX_FMT_YUV411P 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cb00      
start + 17: Cb10      
start + 18: Cb20      
start + 19: Cb30      
start + 20: Cr00      
start + 21: Cr10      
start + 22: Cr20      
start + 23: Cr30      

Color Sample Location.

  0   1   2   3
0 Y   Y C Y   Y
1 Y   Y C Y   Y
2 Y   Y C Y   Y
3 Y   Y C Y   Y

V4L2_PIX_FMT_NV12 ('NV12'), V4L2_PIX_FMT_NV21 ('NV21')

Name

V4L2_PIX_FMT_NV12, V4L2_PIX_FMT_NV21 -- Formats with ½ horizontal and vertical chroma resolution, also known as YUV 4:2:0. One luminance and one chrominance plane with alternating chroma samples as opposed to V4L2_PIX_FMT_YVU420

Description

These are two-plane versions of the YUV 4:2:0 format. The three components are separated into two sub-images or planes. The Y plane is first. The Y plane has one byte per pixel. For V4L2_PIX_FMT_NV12, a combined CbCr plane immediately follows the Y plane in memory. The CbCr plane is the same width, in bytes, as the Y plane (and of the image), but is half as tall in pixels. Each CbCr pair belongs to four pixels. For example, Cb00/Cr00 belongs to Y'00, Y'01, Y'10 and Y'11. V4L2_PIX_FMT_NV21 is the same except the Cb and Cr bytes are swapped: the CrCb plane starts with a Cr byte.

If the Y plane has pad bytes after each row, then the CbCr plane has as many pad bytes after its rows.

Example 2-1. V4L2_PIX_FMT_NV12 4 × 4 pixel image

Byte Order. Each cell is one byte.

start + 0: Y'00 Y'01 Y'02 Y'03
start + 4: Y'10 Y'11 Y'12 Y'13
start + 8: Y'20 Y'21 Y'22 Y'23
start + 12: Y'30 Y'31 Y'32 Y'33
start + 16: Cb00 Cr00 Cb01 Cr01
start + 20: Cb10 Cr10 Cb11 Cr11

Color Sample Location.

  0   1   2   3
0 Y   Y   Y   Y
    C       C  
1 Y   Y   Y   Y
             
2 Y   Y   Y   Y
    C       C  
3 Y   Y   Y   Y
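
A minimal sketch locating the chroma samples for a given pixel; the helper is illustrative only and assumes the CbCr plane directly follows the Y plane with the same bytesperline stride, as described above:

#include <stdint.h>
#include <stddef.h>

/* Illustrative helper: return pointers to the Cb and Cr samples shared
   by the two-by-two pixel square containing (x, y) in a
   V4L2_PIX_FMT_NV12 image. The combined CbCr plane follows the Y plane
   and uses the same bytesperline stride. For V4L2_PIX_FMT_NV21 the two
   bytes are swapped. */
static void
nv12_chroma (uint8_t *start, size_t bytesperline, unsigned int height,
             unsigned int x, unsigned int y,
             uint8_t **cb, uint8_t **cr)
{
        uint8_t *cbcr = start + bytesperline * height;      /* chroma plane */
        uint8_t *pair = cbcr + (y / 2) * bytesperline + (x / 2) * 2;

        *cb = &pair[0];
        *cr = &pair[1];
}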

2.6. Compressed Formats

Table 2-7. Compressed Image Formats

Identifier Code Details
V4L2_PIX_FMT_JPEG 'JPEG' TBD. See also VIDIOC_G_JPEGCOMP, VIDIOC_S_JPEGCOMP.
V4L2_PIX_FMT_MPEG 'MPEG' MPEG stream. The actual format is determined by the extended control V4L2_CID_MPEG_STREAM_TYPE, see Table 1-2.

2.7. Reserved Format Identifiers

These formats are not defined by this specification; they are just listed for reference and to avoid naming conflicts. If you want to register your own format, send an e-mail to the V4L mailing list (https://listman.redhat.com/mailman/listinfo/video4linux-list) for inclusion in the videodev.h file. If you want to share your format with other developers, add a link to your documentation and send a copy to the maintainer of this document, Michael Schimek, for inclusion in this section. If you think your format should be listed in a standard format section, please make a proposal on the V4L mailing list.

Table 2-8. Reserved Image Formats

Identifier Code Details
V4L2_PIX_FMT_DV 'dvsd' unknown
V4L2_PIX_FMT_ET61X251 'E625' Compressed format of the ET61X251 driver.
V4L2_PIX_FMT_HI240 'HI24'

8 bit RGB format used by the BTTV driver, http://bytesex.org/bttv/

V4L2_PIX_FMT_HM12 'HM12'

YUV 4:2:0 format used by the IVTV driver, http://www.ivtvdriver.org/

The format is documented in the kernel sources in the file Documentation/video4linux/cx2341x/README.hm12

V4L2_PIX_FMT_MJPEG 'MJPG' Compressed format used by the Zoran driver
V4L2_PIX_FMT_PWC1 'PWC1' Compressed format of the PWC driver.
V4L2_PIX_FMT_PWC2 'PWC2' Compressed format of the PWC driver.
V4L2_PIX_FMT_SN9C10X 'S910' Compressed format of the SN9C102 driver.
V4L2_PIX_FMT_WNVA 'WNVA'

Used by the Winnov Videum driver, http://www.thedirks.org/winnov/

V4L2_PIX_FMT_YYUV 'YYUV' unknown

Chapter 3. Input/Output

The V4L2 API defines several different methods to read from or write to a device. All drivers exchanging data with applications must support at least one of them.

The classic I/O method using the read() and write() functions is automatically selected after opening a V4L2 device. When the driver does not support this method, attempts to read or write will fail at any time.

Other methods must be negotiated. To select the streaming I/O method with memory mapped or user buffers, applications call the VIDIOC_REQBUFS ioctl. The asynchronous I/O method is not defined yet.

Video overlay can be considered another I/O method, although the application does not directly receive the image data. It is selected by initiating video overlay with the VIDIOC_S_FMT ioctl. For more information see Section 4.2.

Generally exactly one I/O method, including overlay, is associated with each file descriptor. The only exceptions are applications not exchanging data with a driver ("panel applications", see Section 1.1) and drivers permitting simultaneous video capturing and overlay using the same file descriptor, for compatibility with V4L and earlier versions of V4L2.

VIDIOC_S_FMT and VIDIOC_REQBUFS would permit this to some degree, but for simplicity drivers need not support switching the I/O method (after first switching away from read/write) other than by closing and reopening the device.

The following sections describe the various I/O methods inmore detail.


3.1. Read/Write

Input and output devices support the read() and write() functions, respectively, when the V4L2_CAP_READWRITE flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl is set.

Drivers may need the CPU to copy the data, but they may also support DMA to or from user memory, so this I/O method is not necessarily less efficient than other methods merely exchanging buffer pointers. It is considered inferior though because no meta-information like frame counters or timestamps is passed. This information is necessary to recognize frame dropping and to synchronize with other data streams. However this is also the simplest I/O method, requiring little or no setup to exchange data. It permits command line stunts like this (the vidctrl tool is fictitious):

> vidctrl /dev/video --input=0 --format=YUYV --size=352x288
> dd if=/dev/video of=myimage.422 bs=202752 count=1

To read from the device applications use the read() function, to write the write() function. Drivers must implement one I/O method if they exchange data with applications, but it need not be this one.[12] When reading or writing is supported, the driver must also support the select() and poll() functions.[13]
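
A minimal sketch of the read() method for a capture device; the function name, the frame processing hook and the assumption that the device was already configured with VIDIOC_S_FMT are illustrative only:

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

/* Illustrative read() loop: "fd" is an open capture device, "sizeimage"
   is the buffer size negotiated with VIDIOC_S_FMT
   (fmt.fmt.pix.sizeimage). process_image() stands for application
   code and is not shown. */
static void
capture_read (int fd, size_t sizeimage, unsigned int frames)
{
        void *buffer = malloc (sizeimage);

        if (NULL == buffer)
                exit (EXIT_FAILURE);

        while (frames-- > 0) {
                /* Drivers typically return at most one frame per call. */
                ssize_t n = read (fd, buffer, sizeimage);

                if (-1 == n) {
                        perror ("read");
                        exit (EXIT_FAILURE);
                }

                /* process_image (buffer, n); */
        }

        free (buffer);
}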


3.2. Streaming I/O (Memory Mapping)

Input and output devices support this I/O method when the V4L2_CAP_STREAMING flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl is set. There are two streaming methods; to determine if the memory mapping flavor is supported, applications must call the VIDIOC_REQBUFS ioctl.

Streaming is an I/O method where only pointers to buffers are exchanged between application and driver, the data itself is not copied. Memory mapping is primarily intended to map buffers in device memory into the application's address space. Device memory can be for example the video memory on a graphics card with a video capture add-on. However, being the most efficient I/O method available for a long time, many other drivers support streaming as well, allocating buffers in DMA-able main memory.

A driver can support many sets of buffers. Each set is identified by a unique buffer type value. The sets are independent and each set can hold a different type of data. To access different sets at the same time different file descriptors must be used.[14]

To allocate device buffers applications call the VIDIOC_REQBUFS ioctl with the desired number of buffers and buffer type, for example V4L2_BUF_TYPE_VIDEO_CAPTURE. This ioctl can also be used to change the number of buffers or to free the allocated memory, provided none of the buffers are still mapped.

Before applications can access the buffers they must map them into their address space with the mmap() function. The location of the buffers in device memory can be determined with the VIDIOC_QUERYBUF ioctl. The m.offset and length returned in a struct v4l2_buffer are passed as sixth and second parameter to the mmap() function. The offset and length values must not be modified. Remember the buffers are allocated in physical memory, as opposed to virtual memory which can be swapped out to disk. Applications should free the buffers as soon as possible with the munmap() function.

Example 3-1. Mapping buffers

struct v4l2_requestbuffers reqbuf;
struct {
        void *start;
        size_t length;
} *buffers;
unsigned int i;

memset (&reqbuf, 0, sizeof (reqbuf));
reqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
reqbuf.memory = V4L2_MEMORY_MMAP;
reqbuf.count = 20;

if (-1 == ioctl (fd, VIDIOC_REQBUFS, &reqbuf)) {
        if (errno == EINVAL)
                printf ("Video capturing or mmap-streaming is not supported\n");
        else
                perror ("VIDIOC_REQBUFS");

        exit (EXIT_FAILURE);
}

/* We want at least five buffers. */

if (reqbuf.count < 5) {
        /* You may need to free the buffers here. */
        printf ("Not enough buffer memory\n");
        exit (EXIT_FAILURE);
}

buffers = calloc (reqbuf.count, sizeof (*buffers));
assert (buffers != NULL);

for (i = 0; i < reqbuf.count; i++) {
        struct v4l2_buffer buffer;

        memset (&buffer, 0, sizeof (buffer));
        buffer.type = reqbuf.type;
        buffer.memory = V4L2_MEMORY_MMAP;
        buffer.index = i;

        if (-1 == ioctl (fd, VIDIOC_QUERYBUF, &buffer)) {
                perror ("VIDIOC_QUERYBUF");
                exit (EXIT_FAILURE);
        }

        buffers[i].length = buffer.length; /* remember for munmap() */

        buffers[i].start = mmap (NULL, buffer.length,
                                 PROT_READ | PROT_WRITE, /* recommended */
                                 MAP_SHARED,             /* recommended */
                                 fd, buffer.m.offset);

        if (MAP_FAILED == buffers[i].start) {
                /* If you do not exit here you should unmap() and free()
                   the buffers mapped so far. */
                perror ("mmap");
                exit (EXIT_FAILURE);
        }
}

/* Cleanup. */

for (i = 0; i < reqbuf.count; i++)
        munmap (buffers[i].start, buffers[i].length);
      

Conceptually streaming drivers maintain two buffer queues, an incoming and an outgoing queue. They separate the synchronous capture or output operation locked to a video clock from the application, which is subject to random disk or network delays and preemption by other processes, thereby reducing the probability of data loss. The queues are organized as FIFOs: buffers will be output in the order they were enqueued in the incoming FIFO, and were captured in the order they are dequeued from the outgoing FIFO.

The driver may require a minimum number of buffers enqueued at all times to function; apart from this no limit exists on the number of buffers applications can enqueue in advance, or dequeue and process. They can also enqueue buffers in a different order than they have been dequeued, and the driver can fill enqueued empty buffers in any order.[15] The index number of a buffer (struct v4l2_buffer index) plays no role here, it only identifies the buffer.

Initially all mapped buffers are in dequeued state, inaccessible by the driver. For capturing applications it is customary to first enqueue all mapped buffers, then to start capturing and enter the read loop. Here the application waits until a filled buffer can be dequeued, and re-enqueues the buffer when the data is no longer needed. Output applications fill and enqueue buffers; when enough buffers are stacked up, the output is started with VIDIOC_STREAMON. In the write loop, when the application runs out of free buffers, it must wait until an empty buffer can be dequeued and reused.

To enqueue and dequeue a buffer applications use the VIDIOC_QBUF and VIDIOC_DQBUF ioctls. The status of a buffer, whether it is mapped, enqueued, full or empty, can be determined at any time using the VIDIOC_QUERYBUF ioctl. Two methods exist to suspend execution of the application until one or more buffers can be dequeued. By default VIDIOC_DQBUF blocks when no buffer is in the outgoing queue. When the O_NONBLOCK flag was given to the open() function, VIDIOC_DQBUF returns immediately with an EAGAIN error code when no buffer is available. The select() or poll() functions are always available.

To start and stop capturing or output applications call the VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctls. Note VIDIOC_STREAMOFF removes all buffers from both queues as a side effect. Since there is no notion of doing anything "now" on a multitasking system, if an application needs to synchronize with another event it should examine the struct v4l2_buffer timestamp of captured buffers, or set the field before enqueuing buffers for output.

Drivers implementing memory mapping I/O must support the VIDIOC_REQBUFS, VIDIOC_QUERYBUF, VIDIOC_QBUF, VIDIOC_DQBUF, VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctls, and the mmap(), munmap(), select() and poll() functions.[16]

[capture example]
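
A hedged sketch of such a capture loop, continuing Example 3-1 (fd, reqbuf and buffers are assumed to be still in scope; process_image() stands for application code):

/* Continuation of Example 3-1: enqueue all mapped buffers, start
   streaming, then repeatedly dequeue, process and re-enqueue.
   process_image() is a hypothetical application function. */
unsigned int i;
enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

for (i = 0; i < reqbuf.count; i++) {
        struct v4l2_buffer buffer;

        memset (&buffer, 0, sizeof (buffer));
        buffer.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buffer.memory = V4L2_MEMORY_MMAP;
        buffer.index = i;

        if (-1 == ioctl (fd, VIDIOC_QBUF, &buffer)) {
                perror ("VIDIOC_QBUF");
                exit (EXIT_FAILURE);
        }
}

if (-1 == ioctl (fd, VIDIOC_STREAMON, &type)) {
        perror ("VIDIOC_STREAMON");
        exit (EXIT_FAILURE);
}

for (;;) {
        struct v4l2_buffer buffer;

        memset (&buffer, 0, sizeof (buffer));
        buffer.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buffer.memory = V4L2_MEMORY_MMAP;

        /* Blocks until a filled buffer is available, unless the
           device was opened with O_NONBLOCK. */
        if (-1 == ioctl (fd, VIDIOC_DQBUF, &buffer)) {
                perror ("VIDIOC_DQBUF");
                exit (EXIT_FAILURE);
        }

        /* process_image (buffers[buffer.index].start, buffer.bytesused); */

        if (-1 == ioctl (fd, VIDIOC_QBUF, &buffer)) {
                perror ("VIDIOC_QBUF");
                exit (EXIT_FAILURE);
        }
}

/* Eventually: ioctl (fd, VIDIOC_STREAMOFF, &type); */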


3.3. Streaming I/O (User Pointers)

Input and output devices support this I/O method when the V4L2_CAP_STREAMING flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl is set. Whether the particular user pointer method (not only memory mapping) is supported must be determined by calling the VIDIOC_REQBUFS ioctl.

This I/O method combines advantages of the read/write and memory mapping methods. Buffers are allocated by the application itself, and can reside for example in virtual or shared memory. Only pointers to data are exchanged, these pointers and meta-information are passed in struct v4l2_buffer. The driver must be switched into user pointer I/O mode by calling the VIDIOC_REQBUFS ioctl with the desired buffer type. No buffers are allocated beforehand; consequently they are not indexed and cannot be queried like mapped buffers with the VIDIOC_QUERYBUF ioctl.

Example 3-2. Initiating streaming I/O with user pointers

struct v4l2_requestbuffers reqbuf;

memset (&reqbuf, 0, sizeof (reqbuf));
reqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
reqbuf.memory = V4L2_MEMORY_USERPTR;

if (ioctl (fd, VIDIOC_REQBUFS, &reqbuf) == -1) {
        if (errno == EINVAL)
                printf ("Video capturing or user pointer streaming is not supported\n");
        else
                perror ("VIDIOC_REQBUFS");

        exit (EXIT_FAILURE);
}
      

Buffer addresses and sizes are passed on the fly with the VIDIOC_QBUF ioctl. Although buffers are commonly cycled, applications can pass different addresses and sizes at each VIDIOC_QBUF call. If required by the hardware the driver swaps memory pages within physical memory to create a continuous area of memory. This happens transparently to the application in the virtual memory subsystem of the kernel. When buffer pages have been swapped out to disk they are brought back and finally locked in physical memory for DMA.[17]
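
A minimal sketch of enqueuing one application-allocated buffer; buf_start and buf_size are assumed to describe memory obtained for example with malloc():

/* Illustrative fragment: enqueue one application-allocated buffer for
   user pointer I/O. buf_start and buf_size describe memory obtained
   for example with malloc(). */
struct v4l2_buffer buffer;

memset (&buffer, 0, sizeof (buffer));
buffer.type      = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buffer.memory    = V4L2_MEMORY_USERPTR;
buffer.m.userptr = (unsigned long) buf_start;
buffer.length    = buf_size;

if (-1 == ioctl (fd, VIDIOC_QBUF, &buffer)) {
        perror ("VIDIOC_QBUF");
        exit (EXIT_FAILURE);
}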

Filled or displayed buffers are dequeued with the VIDIOC_DQBUF ioctl. The driver can unlock the memory pages at any time between the completion of the DMA and this ioctl. The memory is also unlocked when VIDIOC_STREAMOFF or VIDIOC_REQBUFS is called, or when the device is closed. Applications must take care not to free buffers without dequeuing them. First, the buffers remain locked until further notice, wasting physical memory. Second, the driver will not be notified when the memory is returned to the application's free list and subsequently reused for other purposes, possibly completing the requested DMA and overwriting valuable data.

For capturing applications it is customary to enqueue a number of empty buffers, to start capturing and enter the read loop. Here the application waits until a filled buffer can be dequeued, and re-enqueues the buffer when the data is no longer needed. Output applications fill and enqueue buffers; when enough buffers are stacked up, output is started. In the write loop, when the application runs out of free buffers it must wait until an empty buffer can be dequeued and reused. Two methods exist to suspend execution of the application until one or more buffers can be dequeued. By default VIDIOC_DQBUF blocks when no buffer is in the outgoing queue. When the O_NONBLOCK flag was given to the open() function, VIDIOC_DQBUF returns immediately with an EAGAIN error code when no buffer is available. The select() or poll() functions are always available.

To start and stop capturing or output applications call the VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctls. Note VIDIOC_STREAMOFF removes all buffers from both queues and unlocks all buffers as a side effect. Since there is no notion of doing anything "now" on a multitasking system, if an application needs to synchronize with another event it should examine the struct v4l2_buffer timestamp of captured buffers, or set the field before enqueuing buffers for output.

Drivers implementing user pointer I/O must support the VIDIOC_REQBUFS, VIDIOC_QBUF, VIDIOC_DQBUF, VIDIOC_STREAMON and VIDIOC_STREAMOFF ioctls, and the select() and poll() functions.[18]


3.4. Asynchronous I/O

This method is not defined yet.


3.5. Buffers

A buffer contains data exchanged by application and driver using one of the Streaming I/O methods. Only pointers to buffers are exchanged, the data itself is not copied. These pointers, together with meta-information like timestamps or field parity, are stored in a struct v4l2_buffer, the argument to the VIDIOC_QUERYBUF, VIDIOC_QBUF and VIDIOC_DQBUF ioctls.

Nominally timestamps refer to the first data byte transmitted. In practice however the wide range of hardware covered by the V4L2 API limits timestamp accuracy. Often an interrupt routine will sample the system clock shortly after the field or frame was stored completely in memory. So applications must expect a constant difference up to one field or frame period plus a small (few scan lines) random error. The delay and error can be much larger due to compression or transmission over an external bus when the frames are not properly stamped by the sender. This is frequently the case with USB cameras. Here timestamps refer to the instant the field or frame was received by the driver, not the capture time. These devices identify themselves by not enumerating any video standards, see Section 1.7.

Similar limitations apply to output timestamps. Typically the video hardware locks to a clock controlling the video timing, the horizontal and vertical synchronization pulses. At some point in the line sequence, possibly the vertical blanking, an interrupt routine samples the system clock, compares it against the timestamp and programs the hardware to repeat the previous field or frame, or to display the buffer contents.

Apart from limitations of the video device and natural inaccuracies of all clocks, it should be noted that system time itself is not perfectly stable. It can be affected by power saving cycles, warped to insert leap seconds, or even turned back or forth by the system administrator, affecting long term measurements. [19]

Table 3-1. struct v4l2_buffer

__u32 index   Number of the buffer, set by the application. This field is only used for memory mapping I/O and can range from zero to the number of buffers allocated with the VIDIOC_REQBUFS ioctl (struct v4l2_requestbuffers count) minus one.
enum v4l2_buf_type type   Type of the buffer, same as struct v4l2_format type or struct v4l2_requestbuffers type, set by the application.
__u32 bytesused   The number of bytes occupied by the data in the buffer. It depends on the negotiated data format and may change with each buffer for compressed variable size data like JPEG images. Drivers must set this field when type refers to an input stream, applications when an output stream.
__u32 flags   Flags set by the application or driver, see Table 3-3.
enum v4l2_field field   Indicates the field order of the image in the buffer, see Table 3-8. This field is not used when the buffer contains VBI data. Drivers must set it when type refers to an input stream, applications when an output stream.
struct timeval timestamp  

For input streams this is the system time (as returned by the gettimeofday() function) when the first data byte was captured. For output streams the data will not be displayed before this time, secondary to the nominal frame rate determined by the current video standard in enqueued order. Applications can for example zero this field to display frames as soon as possible. The driver stores the time at which the first data byte was actually sent out in the timestamp field. This permits applications to monitor the drift between the video and system clock.

struct v4l2_timecode timecode   When type is V4L2_BUF_TYPE_VIDEO_CAPTURE and the V4L2_BUF_FLAG_TIMECODE flag is set in flags, this structure contains a frame timecode. In V4L2_FIELD_ALTERNATE mode the top and bottom field contain the same timecode. Timecodes are intended to help video editing and are typically recorded on video tapes, but also embedded in compressed formats like MPEG. This field is independent of the timestamp and sequence fields.
__u32 sequence   Set by the driver, counting the frames in the sequence.

In V4L2_FIELD_ALTERNATE mode the top and bottom field have the same sequence number. The count starts at zero and includes dropped or repeated frames. A dropped frame was received by an input device but could not be stored due to lack of free buffer space. A repeated frame was displayed again by an output device because the application did not pass new data in time.

Note this may count the frames received e.g. over USB, without taking into account the frames dropped by the remote hardware due to limited compression throughput or bus bandwidth. These devices identify themselves by not enumerating any video standards, see Section 1.7.

enum v4l2_memory memory   This field must be set by applications and/or drivers in accordance with the selected I/O method.
union m    
  __u32 offset When memory is V4L2_MEMORY_MMAP this is the offset of the buffer from the start of the device memory. The value is returned by the driver and, apart from serving as parameter to the mmap() function, not useful for applications. See Section 3.2 for details.
  unsigned long userptr When memory is V4L2_MEMORY_USERPTR this is a pointer to the buffer (cast to unsigned long type) in virtual memory, set by the application. See Section 3.3 for details.
__u32 length   Size of the buffer (not the payload) in bytes.
__u32 input   Some video capture drivers support rapid and synchronous video input changes, a function useful for example in video surveillance applications. For this purpose applications set the V4L2_BUF_FLAG_INPUT flag, and this field to the number of a video input as in struct v4l2_input field index.
__u32 reserved   A place holder for future extensions and custom (driver defined) buffer types V4L2_BUF_TYPE_PRIVATE and higher.

Table 3-2. enum v4l2_buf_type

V4L2_BUF_TYPE_VIDEO_CAPTURE 1 Buffer of a video capture stream, see Section 4.1.
V4L2_BUF_TYPE_VIDEO_OUTPUT 2 Buffer of a video output stream, see Section 4.3.
V4L2_BUF_TYPE_VIDEO_OVERLAY 3 Buffer for video overlay, see Section 4.2.
V4L2_BUF_TYPE_VBI_CAPTURE 4 Buffer of a raw VBI capture stream, see Section 4.7.
V4L2_BUF_TYPE_VBI_OUTPUT 5 Buffer of a raw VBI output stream, see Section 4.7.
V4L2_BUF_TYPE_SLICED_VBI_CAPTURE 6 Buffer of a sliced VBI capture stream, see Section 4.8.
V4L2_BUF_TYPE_SLICED_VBI_OUTPUT 7 Buffer of a sliced VBI output stream, see Section 4.8.
V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY 8 Buffer for video output overlay (OSD), see Section 4.4. Status: Experimental.
V4L2_BUF_TYPE_PRIVATE 0x80 This and higher values are reserved for custom (driver defined) buffer types.

Table 3-3. Buffer Flags

V4L2_BUF_FLAG_MAPPED 0x0001 The buffer resides in device memory and has been mapped into the application's address space, see Section 3.2 for details. Drivers set or clear this flag when the VIDIOC_QUERYBUF, VIDIOC_QBUF or VIDIOC_DQBUF ioctl is called. Set by the driver.
V4L2_BUF_FLAG_QUEUED 0x0002 Internally drivers maintain two buffer queues, an incoming and outgoing queue. When this flag is set, the buffer is currently on the incoming queue. It automatically moves to the outgoing queue after the buffer has been filled (capture devices) or displayed (output devices). Drivers set or clear this flag when the VIDIOC_QUERYBUF ioctl is called. After (successfully) calling the VIDIOC_QBUF ioctl it is always set and after VIDIOC_DQBUF always cleared.
V4L2_BUF_FLAG_DONE 0x0004 When this flag is set, the buffer is currently on the outgoing queue, ready to be dequeued from the driver. Drivers set or clear this flag when the VIDIOC_QUERYBUF ioctl is called. After calling the VIDIOC_QBUF or VIDIOC_DQBUF ioctl it is always cleared. Of course a buffer cannot be on both queues at the same time, the V4L2_BUF_FLAG_QUEUED and V4L2_BUF_FLAG_DONE flags are mutually exclusive. They can both be cleared however, then the buffer is in "dequeued" state, in the application domain so to speak.
V4L2_BUF_FLAG_KEYFRAME 0x0008 Drivers set or clear this flag when calling theVIDIOC_DQBUF ioctl. It may be set by videocapture devices when the buffer contains a compressed image which is akey frame (or field), i. e. can be decompressed on its own.
V4L2_BUF_FLAG_PFRAME 0x0010 Similar to V4L2_BUF_FLAG_KEYFRAMEthis flags predicted frames or fields which contain only differences to aprevious key frame.
V4L2_BUF_FLAG_BFRAME 0x0020 Similar to V4L2_BUF_FLAG_PFRAME this is a bidirectional predicted frame or field. [ooc tbd]
V4L2_BUF_FLAG_TIMECODE 0x0100 The timecode field is valid. Drivers set or clear this flag when the VIDIOC_DQBUF ioctl is called.
V4L2_BUF_FLAG_INPUT 0x0200 The input field is valid.Applications set or clear this flag before calling theVIDIOC_QBUF ioctl.

Table 3-4. enum v4l2_memory

V4L2_MEMORY_MMAP 1 The buffer is used for memory mapping I/O.
V4L2_MEMORY_USERPTR 2 The buffer is used for user pointer I/O.
V4L2_MEMORY_OVERLAY 3 [to do]

3.5.1. Timecodes

The v4l2_timecode structure is designed to hold a SMPTE 12M or similar timecode. (struct timeval timestamps are stored in the struct v4l2_buffer timestamp field.)

Table 3-5. struct v4l2_timecode

__u32 type Frame rate the timecodes are based on, see Table 3-6.
__u32 flags Timecode flags, see Table 3-7.
__u8 frames Frame count, 0 ... 23/24/29/49/59, depending on the type of timecode.
__u8 seconds Seconds count, 0 ... 59. This is a binary, not BCD number.
__u8 minutes Minutes count, 0 ... 59. This is a binary, not BCD number.
__u8 hours Hours count, 0 ... 23. This is a binary, not BCD number.
__u8 userbits[4] The "user group" bits from the timecode.

Table 3-6. Timecode Types

V4L2_TC_TYPE_24FPS 1 24 frames per second, i. e. film.
V4L2_TC_TYPE_25FPS 2 25 frames per second, i. e. PAL or SECAM video.
V4L2_TC_TYPE_30FPS 3 30 frames per second, i. e. NTSC video.
V4L2_TC_TYPE_50FPS 4  
V4L2_TC_TYPE_60FPS 5  

Table 3-7. Timecode Flags

V4L2_TC_FLAG_DROPFRAME 0x0001 Indicates "drop frame" semantics for counting frames in 29.97 fps material. When set, frame numbers 0 and 1 at the start of each minute, except minutes 0, 10, 20, 30, 40 and 50, are omitted from the count.
V4L2_TC_FLAG_COLORFRAME 0x0002 The "color frame" flag.
V4L2_TC_USERBITS_field 0x000C Field mask for the "binary group flags".
V4L2_TC_USERBITS_USERDEFINED 0x0000 Unspecified format.
V4L2_TC_USERBITS_8BITCHARS 0x0008 8-bit ISO characters.
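
A minimal sketch of reading the timecode from a just dequeued buffer; the drop-frame separator convention shown is illustrative only:

/* Illustrative fragment: "buffer" is the struct v4l2_buffer just
   returned by VIDIOC_DQBUF. */
if (buffer.flags & V4L2_BUF_FLAG_TIMECODE) {
        printf ("%02u:%02u:%02u%c%02u\n",
                (unsigned int) buffer.timecode.hours,
                (unsigned int) buffer.timecode.minutes,
                (unsigned int) buffer.timecode.seconds,
                (buffer.timecode.flags & V4L2_TC_FLAG_DROPFRAME) ? ';' : ':',
                (unsigned int) buffer.timecode.frames);
}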

3.6. Field Order

We have to distinguish between progressive and interlaced video. Progressive video transmits all lines of a video image sequentially. Interlaced video divides an image into two fields, containing only the odd and even lines of the image, respectively. The so-called odd and even fields are transmitted alternately, and due to a small delay between fields a cathode ray TV displays the lines interleaved, yielding the original frame. This curious technique was invented because at refresh rates similar to film the image would fade out too quickly. Transmitting fields reduces the flicker without the necessity of doubling the frame rate and with it the bandwidth required for each channel.

It is important to understand a video camera does not expose one frame at a time, merely transmitting the frames separated into fields. The fields are in fact captured at two different instances in time. An object on screen may well move between one field and the next. For applications analysing motion it is of paramount importance to recognize which field of a frame is older, the temporal order.

When the driver provides or accepts images field by field rather than interleaved, it is also important applications understand how the fields combine to frames. We distinguish between top and bottom fields, the spatial order: the first line of the top field is the first line of an interlaced frame, the first line of the bottom field is the second line of that frame.

However because fields were captured one after the other, arguing whether a frame commences with the top or bottom field is pointless. Any two successive top and bottom, or bottom and top fields yield a valid frame. Only when the source was progressive to begin with, e. g. when transferring film to video, two fields may come from the same frame, creating a natural order.

Counter to intuition the top field is not necessarily the older field. Whether the older field contains the top or bottom lines is a convention determined by the video standard. Hence the distinction between temporal and spatial order of fields. The diagrams below should make this clearer.

All video capture and output devices must report the current field order. Some drivers may permit the selection of a different order; to this end applications initialize the field field of struct v4l2_pix_format before calling the VIDIOC_S_FMT ioctl. If this is not desired it should have the value V4L2_FIELD_ANY (0).

Table 3-8. enum v4l2_field

V4L2_FIELD_ANY 0 Applications request this field order when any one of the V4L2_FIELD_NONE, V4L2_FIELD_TOP, V4L2_FIELD_BOTTOM, or V4L2_FIELD_INTERLACED formats is acceptable. Drivers choose depending on hardware capabilities or e. g. the requested image size, and return the actual field order. struct v4l2_buffer field can never be V4L2_FIELD_ANY.
V4L2_FIELD_NONE 1 Images are in progressive format, not interlaced. The driver may also indicate this order when it cannot distinguish between V4L2_FIELD_TOP and V4L2_FIELD_BOTTOM.
V4L2_FIELD_TOP 2 Images consist of the top field only.
V4L2_FIELD_BOTTOM 3 Images consist of the bottom field only. Applications may wish to prevent a device from capturing interlaced images because they will have "comb" or "feathering" artefacts around moving objects.
V4L2_FIELD_INTERLACED 4 Images contain both fields, interleaved line by line. The temporal order of the fields (whether the top or bottom field is first transmitted) depends on the current video standard. M/NTSC transmits the bottom field first, all other standards the top field first.
V4L2_FIELD_SEQ_TB 5 Images contain both fields, the top field lines are stored first in memory, immediately followed by the bottom field lines. Fields are always stored in temporal order, the older one first in memory. Image sizes refer to the frame, not fields.
V4L2_FIELD_SEQ_BT 6 Images contain both fields, the bottom field lines are stored first in memory, immediately followed by the top field lines. Fields are always stored in temporal order, the older one first in memory. Image sizes refer to the frame, not fields.
V4L2_FIELD_ALTERNATE 7 The two fields of a frame are passed in separate buffers, in temporal order, i. e. the older one first. To indicate the field parity (whether the current field is a top or bottom field) the driver or application, depending on data direction, must set struct v4l2_buffer field to V4L2_FIELD_TOP or V4L2_FIELD_BOTTOM. Any two successive fields pair to build a frame. Whether fields are successive, without any dropped fields between them (fields can drop individually), can be determined from the struct v4l2_buffer sequence field. Image sizes refer to the frame, not fields. This format cannot be selected when using the read/write I/O method.
V4L2_FIELD_INTERLACED_TB 8 Images contain both fields, interleaved line by line, top field first. The top field is transmitted first.
V4L2_FIELD_INTERLACED_BT 9 Images contain both fields, interleaved line by line, top field first. The bottom field is transmitted first.

Figure 3-1. Field Order, Top Field First Transmitted

Figure 3-2. Field Order, Bottom Field First Transmitted


Chapter 4. Interfaces

4.1. Video Capture Interface

Video capture devices sample an analog video signal and store the digitized images in memory. Today nearly all devices can capture at full 25 or 30 frames/second. With this interface applications can control the capture process and move images from the driver into user space.

Conventionally V4L2 video capture devices are accessed through character device special files named /dev/video and /dev/video0 to /dev/video63 with major number 81 and minor numbers 0 to 63. /dev/video is typically a symbolic link to the preferred video device. Note the same device files are used for video output devices.


4.1.1. Querying Capabilities

Devices supporting the video capture interface set the V4L2_CAP_VIDEO_CAPTURE flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. As secondary device functions they may also support the video overlay (V4L2_CAP_VIDEO_OVERLAY) and the raw VBI capture (V4L2_CAP_VBI_CAPTURE) interfaces. At least one of the read/write or streaming I/O methods must be supported. Tuners and audio inputs are optional.


4.1.2. Supplemental Functions

Video capture devices shall support audio input, tuner, controls, cropping and scaling and streaming parameter ioctls as needed. The video input and video standard ioctls must be supported by all video capture devices.


4.1.3. Image Format Negotiation

The result of a capture operation is determined by cropping and image format parameters. The former select an area of the video picture to capture, the latter how images are stored in memory, i. e. in RGB or YUV format, the number of bits per pixel or width and height. Together they also define how images are scaled in the process.

As usual these parameters are not reset at open() time to permit Unix tool chains, programming a device and then reading from it as if it was a plain file. Well written V4L2 applications ensure they really get what they want, including cropping and scaling.

Cropping initialization at minimum requires resetting the parameters to defaults. An example is given in Section 1.11.

To query the current image format applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_CAPTURE and call the VIDIOC_G_FMT ioctl with a pointer to this structure. Drivers fill the struct v4l2_pix_format pix member of the fmt union.

To request different parameters applications set the type field of a struct v4l2_format as above and initialize all fields of the struct v4l2_pix_format pix member of the fmt union, or better just modify the results of VIDIOC_G_FMT, and call the VIDIOC_S_FMT ioctl with a pointer to this structure. Drivers may adjust the parameters and finally return the actual parameters as VIDIOC_G_FMT does.

Like VIDIOC_S_FMT the VIDIOC_TRY_FMT ioctl can be used to learn about hardware limitations without disabling I/O or possibly time consuming hardware preparations.

The contents of struct v4l2_pix_format are discussed in Chapter 2. See also the specification of the VIDIOC_G_FMT, VIDIOC_S_FMT and VIDIOC_TRY_FMT ioctls for details. Video capture devices must implement both the VIDIOC_G_FMT and VIDIOC_S_FMT ioctl, even if VIDIOC_S_FMT ignores all requests and always returns default parameters as VIDIOC_G_FMT does. VIDIOC_TRY_FMT is optional.
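
A hedged sketch of this negotiation; the requested 640 × 480 YUYV parameters are arbitrary examples, and the driver may return something entirely different:

/* Illustrative fragment: query the current capture format, request a
   different one and accept the driver's adjustments. */
struct v4l2_format format;

memset (&format, 0, sizeof (format));
format.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_FMT, &format)) {
        perror ("VIDIOC_G_FMT");
        exit (EXIT_FAILURE);
}

format.fmt.pix.width       = 640;                  /* example request */
format.fmt.pix.height      = 480;
format.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
format.fmt.pix.field       = V4L2_FIELD_ANY;

if (-1 == ioctl (fd, VIDIOC_S_FMT, &format)) {
        perror ("VIDIOC_S_FMT");
        exit (EXIT_FAILURE);
}

/* The driver may have adjusted width, height, pixelformat,
   bytesperline and sizeimage; always use the returned values. */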


4.1.4. Reading Images

A video capture device may support the read() function and/or streaming (memory mapping or user pointer) I/O. See Chapter 3 for details.


4.2. Video Overlay Interface

Also known as Framebuffer Overlay or Previewing

Video overlay devices have the ability to genlock (TV-)video into the (VGA-)video signal of a graphics card, or to store captured images directly in video memory of a graphics card, typically with clipping. This can be considerably more efficient than capturing images and displaying them by other means. In the old days when only nuclear power plants needed cooling towers this used to be the only way to put live video into a window.

Video overlay devices are accessed through the same character special files as video capture devices. Note the default function of a /dev/video device is video capturing. The overlay function is only available after calling the VIDIOC_S_FMT ioctl.

The driver may support simultaneous overlay and capturing using the read/write and streaming I/O methods. If so, operation at the nominal frame rate of the video standard is not guaranteed. Frames may be directed away from overlay to capture, or one field may be used for overlay and the other for capture if the capture parameters permit this.

Applications should use different file descriptors for capturing and overlay. This must be supported by all drivers capable of simultaneous capturing and overlay. Optionally these drivers may also permit capturing and overlay with a single file descriptor for compatibility with V4L and earlier versions of V4L2.[20]


4.2.1. Querying Capabilities

Devices supporting the video overlay interface set the V4L2_CAP_VIDEO_OVERLAY flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. The overlay I/O method specified below must be supported. Tuners and audio inputs are optional.


4.2.2. Supplemental Functions

Video overlay devices shall support audio input, tuner, controls, cropping and scaling and streaming parameter ioctls as needed. The video input and video standard ioctls must be supported by all video overlay devices.


4.2.3. Setup

Before overlay can commence applications must program the driver with frame buffer parameters, namely the address and size of the frame buffer and the image format, for example RGB 5:6:5. The VIDIOC_G_FBUF and VIDIOC_S_FBUF ioctls are available to get and set these parameters, respectively. The VIDIOC_S_FBUF ioctl is privileged because it allows setting up DMA into physical memory, bypassing the memory protection mechanisms of the kernel. Only the superuser can change the frame buffer address and size. Users are not supposed to run TV applications as root or with the SUID bit set. A small helper application with suitable privileges should query the graphics system and program the V4L2 driver at the appropriate time.

Some devices add the video overlay to the output signal of the graphics card. In this case the frame buffer is not modified by the video device, and the frame buffer address and pixel format are not needed by the driver. The VIDIOC_S_FBUF ioctl is not privileged. An application can check for this type of device by calling the VIDIOC_G_FBUF ioctl.
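
A minimal sketch of such a check using VIDIOC_G_FBUF; the V4L2_FBUF_CAP_EXTERNOVERLAY capability flag distinguishes devices that add the overlay to the output signal from those that write into frame buffer memory:

/* Illustrative fragment: determine whether the overlay is added to the
   output signal (non-destructive) or written into frame buffer memory. */
struct v4l2_framebuffer fbuf;

if (-1 == ioctl (fd, VIDIOC_G_FBUF, &fbuf)) {
        perror ("VIDIOC_G_FBUF");
        exit (EXIT_FAILURE);
}

if (fbuf.capability & V4L2_FBUF_CAP_EXTERNOVERLAY)
        printf ("Overlay is mixed into the video output signal.\n");
else
        printf ("Overlay writes into the frame buffer at %p.\n", fbuf.base);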

A driver may support any (or none) of five clipping/blending methods:

  1. Chroma-keying displays the overlaid image only where pixels in the primary graphics surface assume a certain color.

  2. A bitmap can be specified where each bit corresponds to a pixel in the overlaid image. When the bit is set, the corresponding video pixel is displayed, otherwise a pixel of the graphics surface.

  3. A list of clipping rectangles can be specified. In these regions no video is displayed, so the graphics surface can be seen here.

  4. The framebuffer has an alpha channel that can be used to clip or blend the framebuffer with the video.

  5. A global alpha value can be specified to blend the framebuffer contents with video images.

When simultaneous capturing and overlay is supported and the hardware prohibits different image and frame buffer formats, the format requested first takes precedence. The attempt to capture (VIDIOC_S_FMT) or overlay (VIDIOC_S_FBUF) may fail with an EBUSY error code or return accordingly modified parameters.


4.2.4. Overlay Window

The overlaid image is determined by cropping and overlay window parameters. The former select an area of the video picture to capture, the latter how images are overlaid and clipped. Cropping initialization at minimum requires resetting the parameters to defaults. An example is given in Section 1.11.

The overlay window is described by a struct v4l2_window. It defines the size of the image, its position over the graphics surface and the clipping to be applied. To get the current parameters applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OVERLAY and call the VIDIOC_G_FMT ioctl. The driver fills the v4l2_window substructure named win. It is not possible to retrieve a previously programmed clipping list or bitmap.

To program the overlay window applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OVERLAY, initialize the win substructure and call the VIDIOC_S_FMT ioctl. The driver adjusts the parameters against hardware limits and returns the actual parameters as VIDIOC_G_FMT does. Like VIDIOC_S_FMT, the VIDIOC_TRY_FMT ioctl can be used to learn about driver capabilities without actually changing driver state. Unlike VIDIOC_S_FMT this also works after the overlay has been enabled.

The scaling factor of the overlaid image is implied by the width and height given in struct v4l2_window and the size of the cropping rectangle. For more information see Section 1.11.

When simultaneous capturing and overlay is supported and the hardware prohibits different image and window sizes, the size requested first takes precedence. The attempt to capture or overlay as well (VIDIOC_S_FMT) may fail with an EBUSY error code or return accordingly modified parameters.

Table 4-1. struct v4l2_window

struct v4l2_rect w Size and position of the window relative to the top, left corner of the frame buffer defined with VIDIOC_S_FBUF. The window can extend the frame buffer width and height, the x and y coordinates can be negative, and it can lie completely outside the frame buffer. The driver clips the window accordingly, or if that is not possible, modifies its size and/or position.
enum v4l2_field field Applications set this field to determine which video field shall be overlaid, typically one of V4L2_FIELD_ANY (0), V4L2_FIELD_TOP, V4L2_FIELD_BOTTOM or V4L2_FIELD_INTERLACED. Drivers may have to choose a different field order and return the actual setting here.
__u32 chromakey When chroma-keying has been negotiated with VIDIOC_S_FBUF applications set this field to the desired pixel value for the chroma key. The format is the same as the pixel format of the framebuffer (struct v4l2_framebuffer fmt.pixelformat field), with bytes in host order. E. g. for V4L2_PIX_FMT_BGR24 the value should be 0xRRGGBB on a little endian, 0xBBGGRR on a big endian host.
struct v4l2_clip * clips When chroma-keying has not been negotiated and VIDIOC_G_FBUF indicated this capability, applications can set this field to point to an array of clipping rectangles.
Like the window coordinates w, clipping rectangles are defined relative to the top, left corner of the frame buffer. However clipping rectangles must not extend the frame buffer width and height, and they must not overlap. If possible applications should merge adjacent rectangles. Whether this must create x-y or y-x bands, or the order of rectangles, is not defined. When clip lists are not supported the driver ignores this field. Its contents after calling VIDIOC_S_FMT are undefined.
__u32 clipcount When the application set the clips field, this field must contain the number of clipping rectangles in the list. When clip lists are not supported the driver ignores this field, its contents after calling VIDIOC_S_FMT are undefined. When clip lists are supported but no clipping is desired this field must be set to zero.
void * bitmap When chroma-keying has not been negotiated and VIDIOC_G_FBUF indicated this capability, applications can set this field to point to a clipping bit mask.

It must be of the same size as the window, w.width and w.height. Each bit corresponds to a pixel in the overlaid image, which is displayed only when the bit is set. Pixel coordinates translate to bits like:

((__u8 *) bitmap)[w.width * y + x / 8] & (1 << (x & 7))

where 0 ≤ x < w.width and 0 ≤ y < w.height. [a]

When a clipping bit mask is not supported the driver ignores this field, its contents after calling VIDIOC_S_FMT are undefined. When a bit mask is supported but no clipping is desired this field must be set to NULL.

Applications need not create a clip list or bit mask. When they pass both, or despite negotiating chroma-keying, the results are undefined. Regardless of the chosen method, the clipping abilities of the hardware may be limited in quantity or quality. The results when these limits are exceeded are undefined. [b]

__u8 global_alpha

The global alpha value used to blend the framebuffer with video images, if global alpha blending has been negotiated (V4L2_FBUF_FLAG_GLOBAL_ALPHA, see VIDIOC_S_FBUF, Table 3).

Note this field was added in Linux 2.6.23, extending the structure. However the VIDIOC_G/S/TRY_FMT ioctls, which take a pointer to a v4l2_format parent structure with padding bytes at the end, are not affected.

Notes:
a. Should we require w.width to be a multiple of eight?
b. When the image is written into frame buffer memory it will be undesirable if the driver clips out fewer pixels than expected, because the application and graphics system are not aware these regions need to be refreshed. The driver should clip out more pixels or not write the image at all.

Table 4-2. struct v4l2_clip[21]

struct v4l2_rect c Coordinates of the clipping rectangle, relative to the top, left corner of the frame buffer. Only window pixels outside all clipping rectangles are displayed.
struct v4l2_clip * next Pointer to the next clipping rectangle, NULL when this is the last rectangle. Drivers ignore this field, it cannot be used to pass a linked list of clipping rectangles.

Table 4-3. struct v4l2_rect

__s32 left Horizontal offset of the top, left corner of the rectangle, in pixels.
__s32 top Vertical offset of the top, left corner of the rectangle, in pixels. Offsets increase to the right and down.
__s32 width Width of the rectangle, in pixels.
__s32 height Height of the rectangle, in pixels. Width and height cannot be negative, the fields are signed for historical reasons.

4.2.5. Enabling Overlay

To start or stop the frame buffer overlay applications callthe VIDIOC_OVERLAY ioctl.
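
A minimal sketch; the VIDIOC_OVERLAY ioctl takes a pointer to an integer which is non-zero to start and zero to stop the overlay:

/* Illustrative fragment: switch the overlay on, and later off again. */
int on = 1;

if (-1 == ioctl (fd, VIDIOC_OVERLAY, &on)) {
        perror ("VIDIOC_OVERLAY");
        exit (EXIT_FAILURE);
}

/* ... overlay running ... */

on = 0;
if (-1 == ioctl (fd, VIDIOC_OVERLAY, &on))
        perror ("VIDIOC_OVERLAY");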


4.3. Video Output Interface

Video output devices encode stills or image sequences as an analog video signal. With this interface applications can control the encoding process and move images from user space to the driver.

Conventionally V4L2 video output devices are accessed through character device special files named /dev/video and /dev/video0 to /dev/video63 with major number 81 and minor numbers 0 to 63. /dev/video is typically a symbolic link to the preferred video device. Note the same device files are used for video capture devices.


4.3.1. Querying Capabilities

Devices supporting the video output interface set the V4L2_CAP_VIDEO_OUTPUT flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. As secondary device functions they may also support the raw VBI output (V4L2_CAP_VBI_OUTPUT) interface. At least one of the read/write or streaming I/O methods must be supported. Modulators and audio outputs are optional.


4.3.2. Supplemental Functions

Video output devices shall support audio output, modulator, controls, cropping and scaling and streaming parameter ioctls as needed. The video output and video standard ioctls must be supported by all video output devices.


4.3.3. Image Format Negotiation

The output is determined by cropping and image format parameters. The former select an area of the video picture where the image will appear, the latter how images are stored in memory, i. e. in RGB or YUV format, the number of bits per pixel or width and height. Together they also define how images are scaled in the process.

As usual these parameters are not reset at open() time to permit Unix tool chains, programming a device and then writing to it as if it was a plain file. Well written V4L2 applications ensure they really get what they want, including cropping and scaling.

Cropping initialization at minimum requires resetting the parameters to defaults. An example is given in Section 1.11.

To query the current image format applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OUTPUT and call the VIDIOC_G_FMT ioctl with a pointer to this structure. Drivers fill the struct v4l2_pix_format pix member of the fmt union.

To request different parameters applications set the type field of a struct v4l2_format as above and initialize all fields of the struct v4l2_pix_format pix member of the fmt union, or better just modify the results of VIDIOC_G_FMT, and call the VIDIOC_S_FMT ioctl with a pointer to this structure. Drivers may adjust the parameters and finally return the actual parameters as VIDIOC_G_FMT does.

Like VIDIOC_S_FMT the VIDIOC_TRY_FMT ioctl can be used to learn about hardware limitations without disabling I/O or possibly time consuming hardware preparations.

The contents of struct v4l2_pix_format are discussed in Chapter 2. See also the specification of the VIDIOC_G_FMT, VIDIOC_S_FMT and VIDIOC_TRY_FMT ioctls for details. Video output devices must implement both the VIDIOC_G_FMT and VIDIOC_S_FMT ioctl, even if VIDIOC_S_FMT ignores all requests and always returns default parameters as VIDIOC_G_FMT does. VIDIOC_TRY_FMT is optional.


4.3.4. Writing Images

A video output device may support the write() function and/or streaming (memory mapping or user pointer) I/O. See Chapter 3 for details.


4.4. Video Output Overlay Interface

Also known as On-Screen Display (OSD)

Experimental: This is an experimental interface and may change in the future.

Some video output devices can overlay a framebuffer image onto the outgoing video signal. Applications can set up such an overlay using this interface, which borrows structures and ioctls of the Video Overlay interface.

The OSD function is accessible through the same character special file as the Video Output function. Note the default function of such a /dev/video device is video capturing or output. The OSD function is only available after calling the VIDIOC_S_FMT ioctl.


4.4.1. Querying Capabilities

Devices supporting the Video Output Overlay interface set the V4L2_CAP_VIDEO_OUTPUT_OVERLAY flag in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl.


4.4.2. Framebuffer

Contrary to the Video Overlay interface the framebuffer is normally implemented on the TV card and not the graphics card. On Linux it is accessible as a framebuffer device (/dev/fbN). Given a V4L2 device, applications can find the corresponding framebuffer device by calling the VIDIOC_G_FBUF ioctl. It returns, amongst other information, the physical address of the framebuffer in the base field of struct v4l2_framebuffer. The framebuffer device ioctl FBIOGET_FSCREENINFO returns the same address in the smem_start field of struct fb_fix_screeninfo. The FBIOGET_FSCREENINFO ioctl and struct fb_fix_screeninfo are defined in the linux/fb.h header file.

The width and height of the framebuffer depend on the current video standard. A V4L2 driver may reject attempts to change the video standard (or any other ioctl which would imply a framebuffer size change) with an EBUSY error code until all applications have closed the framebuffer device.

Example 4-1. Finding a framebuffer device for OSD

#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/fb.h>
#include <linux/videodev2.h>

struct v4l2_framebuffer fbuf;
unsigned int i;
int fb_fd;

if (-1 == ioctl (fd, VIDIOC_G_FBUF, &fbuf)) {
        perror ("VIDIOC_G_FBUF");
        exit (EXIT_FAILURE);
}

for (i = 0; i < 30; ++i) {
        char dev_name[16];
        struct fb_fix_screeninfo si;

        snprintf (dev_name, sizeof (dev_name), "/dev/fb%u", i);

        fb_fd = open (dev_name, O_RDWR);
        if (-1 == fb_fd) {
                switch (errno) {
                case ENOENT: /* no such file */
                case ENXIO:  /* no driver */
                        continue;

                default:
                        perror ("open");
                        exit (EXIT_FAILURE);
                }
        }

        if (0 == ioctl (fb_fd, FBIOGET_FSCREENINFO, &si)) {
                if (si.smem_start == (unsigned long) fbuf.base)
                        break;
        } else {
                /* Apparently not a framebuffer device. */
        }

        close (fb_fd);
        fb_fd = -1;
}

/* fb_fd is the file descriptor of the framebuffer device
   for the video output overlay, or -1 if no device was found. */

4.4.3. Overlay Window and Scaling

The overlay is controlled by source and target rectangles. The source rectangle selects a subsection of the framebuffer image to be overlaid, the target rectangle an area in the outgoing video signal where the image will appear. Drivers may or may not support scaling, and arbitrary sizes and positions of these rectangles. Further drivers may support any (or none) of the clipping/blending methods defined for the Video Overlay interface.

A struct v4l2_window defines the size of the source rectangle, its position in the framebuffer and the clipping/blending method to be used for the overlay. To get the current parameters applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY and call the VIDIOC_G_FMT ioctl. The driver fills the v4l2_window substructure named win. It is not possible to retrieve a previously programmed clipping list or bitmap.

To program the source rectangle applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY, initialize the win substructure and call the VIDIOC_S_FMT ioctl. The driver adjusts the parameters against hardware limits and returns the actual parameters as VIDIOC_G_FMT does. Like VIDIOC_S_FMT, the VIDIOC_TRY_FMT ioctl can be used to learn about driver capabilities without actually changing driver state. Unlike VIDIOC_S_FMT this also works after the overlay has been enabled.
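A minimal sketch of programming the source rectangle, assuming fd refers to a device supporting the Video Output Overlay interface; the 640 × 480 window at the framebuffer origin is an arbitrary example.

struct v4l2_format fmt;

memset (&fmt, 0, sizeof (fmt));
fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY;

if (-1 == ioctl (fd, VIDIOC_G_FMT, &fmt)) {
        perror ("VIDIOC_G_FMT");
        exit (EXIT_FAILURE);
}

/* Overlay the top-left 640 x 480 area of the framebuffer. */
fmt.fmt.win.w.left   = 0;
fmt.fmt.win.w.top    = 0;
fmt.fmt.win.w.width  = 640;
fmt.fmt.win.w.height = 480;

if (-1 == ioctl (fd, VIDIOC_S_FMT, &fmt)) {
        perror ("VIDIOC_S_FMT");
        exit (EXIT_FAILURE);
}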

A struct v4l2_crop defines the size and position of the target rectangle. The scaling factor of the overlay is implied by the width and height given in struct v4l2_window and struct v4l2_crop. The cropping API applies to Video Output and Video Output Overlay devices in the same way as to Video Capture and Video Overlay devices, merely reversing the direction of the data flow. For more information see Section 1.11.
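For illustration, a target rectangle could be programmed with the cropping API as sketched below; the 720 × 576 full-picture target is an assumed example and drivers may adjust or refuse it.

struct v4l2_crop crop;

memset (&crop, 0, sizeof (crop));
crop.type     = V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY;
crop.c.left   = 0;
crop.c.top    = 0;
crop.c.width  = 720;
crop.c.height = 576;

if (-1 == ioctl (fd, VIDIOC_S_CROP, &crop))
        perror ("VIDIOC_S_CROP");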


4.4.4. Enabling Overlay

There is no V4L2 ioctl to enable or disable the overlay, however the framebuffer interface of the driver may support the FBIOBLANK ioctl.


4.5. Codec Interface

Suspended: This interface has been suspended from the V4L2 API implemented in Linux 2.6 until we have more experience with codec device interfaces.

A V4L2 codec can compress, decompress, transform, or otherwise convert video data from one format into another format, in memory. Applications send data to be converted to the driver through a write() call, and receive the converted data through a read() call. For efficiency a driver may also support streaming I/O.

[to do]


4.6. Effect Devices Interface

Suspended: This interface has been suspended from the V4L2 API implemented in Linux 2.6 until we have more experience with effect device interfaces.

A V4L2 video effect device can do image effects, filtering, or combine two or more images or image streams, for example video transitions or wipes. Applications send data to be processed and receive the result data either with read() and write() functions, or through the streaming I/O mechanism.

[to do]


4.7. Raw VBI Data Interface

VBI is an abbreviation of Vertical Blanking Interval, a gap in the sequence of lines of an analog video signal. During VBI no picture information is transmitted, allowing some time while the electron beam of a cathode ray tube TV returns to the top of the screen. Using an oscilloscope you will find here the vertical synchronization pulses and short data packages ASK modulated[22] onto the video signal. These are transmissions of services such as Teletext or Closed Caption.

Subject of this interface type is raw VBI data, as sampled off a video signal, or to be added to a signal for output. The data format is similar to uncompressed video images, a number of lines times a number of samples per line, we call this a VBI image.

Conventionally V4L2 VBI devices are accessed through character device special files named /dev/vbi and /dev/vbi0 to /dev/vbi31 with major number 81 and minor numbers 224 to 255. /dev/vbi is typically a symbolic link to the preferred VBI device. This convention applies to both input and output devices.

To address the problems of finding related video and VBI devices, VBI capturing and output is also available as a device function under /dev/video. To capture or output raw VBI data with these devices applications must call the VIDIOC_S_FMT ioctl. Accessed as /dev/vbi, raw VBI capturing or output is the default device function.


4.7.1. Querying Capabilities

Devices supporting the raw VBI capturing or output API set the V4L2_CAP_VBI_CAPTURE or V4L2_CAP_VBI_OUTPUT flags, respectively, in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. At least one of the read/write, streaming or asynchronous I/O methods must be supported. VBI devices may or may not have a tuner or modulator.


4.7.2. Supplemental Functions

VBI devices shall support video input or output, tuner or modulator, and controls ioctls as needed. The video standard ioctls provide information vital to program a VBI device, therefore they must be supported.


4.7.3. Raw VBI Format Negotiation

Raw VBI sampling abilities can vary, in particular the sampling frequency. To properly interpret the data V4L2 specifies an ioctl to query the sampling parameters. Moreover, to allow for some flexibility applications can also suggest different parameters.

As usual these parameters are not reset at open() time, to permit Unix tool chains programming a device and then reading from it as if it was a plain file. Well written V4L2 applications should always ensure they really get what they want, requesting reasonable parameters and then checking if the actual parameters are suitable.

To query the current raw VBI capture parameters applications set the type field of a struct v4l2_format to V4L2_BUF_TYPE_VBI_CAPTURE or V4L2_BUF_TYPE_VBI_OUTPUT, and call the VIDIOC_G_FMT ioctl with a pointer to this structure. Drivers fill the struct v4l2_vbi_format vbi member of the fmt union.

To request different parameters applications set the type field of a struct v4l2_format as above and initialize all fields of the struct v4l2_vbi_format vbi member of the fmt union, or better just modify the results of VIDIOC_G_FMT, and call the VIDIOC_S_FMT ioctl with a pointer to this structure. Drivers return an EINVAL error code only when the given parameters are ambiguous, otherwise they modify the parameters according to the hardware capabilities and return the actual parameters. When the driver allocates resources at this point, it may return an EBUSY error code to indicate the returned parameters are valid but the required resources are currently not available. That may happen for instance when the video and VBI areas to capture would overlap, or when the driver supports multiple opens and another process already requested VBI capturing or output. Anyway, applications must expect other resource allocation points which may return EBUSY, at the VIDIOC_STREAMON ioctl and the first read(), write() and select() call.

VBI devices must implement both the VIDIOC_G_FMT and VIDIOC_S_FMT ioctls, even if VIDIOC_S_FMT ignores all requests and always returns default parameters as VIDIOC_G_FMT does. VIDIOC_TRY_FMT is optional.
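A minimal sketch of raw VBI format negotiation for capture; it keeps the driver defaults and merely reads back the resulting parameters. fd is assumed to be an open VBI capture device.

struct v4l2_format fmt;

memset (&fmt, 0, sizeof (fmt));
fmt.type = V4L2_BUF_TYPE_VBI_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_G_FMT, &fmt)) {
        perror ("VIDIOC_G_FMT");
        exit (EXIT_FAILURE);
}

/* Accept the defaults, or modify fmt.fmt.vbi here, then let the
   driver adjust the parameters to its capabilities. */
if (-1 == ioctl (fd, VIDIOC_S_FMT, &fmt)) {
        perror ("VIDIOC_S_FMT");
        exit (EXIT_FAILURE);
}

printf ("VBI image: %u + %u lines, %u samples/line, rate %u Hz\n",
        fmt.fmt.vbi.count[0], fmt.fmt.vbi.count[1],
        fmt.fmt.vbi.samples_per_line, fmt.fmt.vbi.sampling_rate);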

Table 4-4. struct v4l2_vbi_format

__u32 sampling_rate Samples per second, i. e. unit 1 Hz.
__u32 offset

Horizontal offset of the VBI image, relative to the leading edge of the line synchronization pulse and counted in samples: The first sample in the VBI image will be located offset / sampling_rate seconds following the leading edge. See also Figure 4-1.

__u32 samples_per_line  
__u32 sample_format

Defines the sample format as in Chapter 2, a four-character-code.a Usually this is V4L2_PIX_FMT_GREY, i. e. each sample consists of 8 bits with lower values oriented towards the black level. Do not assume any other correlation of values with the signal level. For example, the MSB does not necessarily indicate if the signal is 'high' or 'low' because 128 may not be the mean value of the signal. Drivers shall not convert the sample format by software.

__u32 start[2] This is the scanning system line number associated with the first line of the VBI image, of the first and the second field respectively. See Figure 4-2 and Figure 4-3 for valid values. VBI input drivers can return start values 0 if the hardware cannot reliably identify scanning lines, VBI acquisition may not require this information.
__u32 count[2] The number of lines in the first and second field image, respectively.

Drivers should be as flexible as possible. For example, it may be possible to extend or move the VBI capture window down to the picture area, implementing a 'full field mode' to capture data service transmissions embedded in the picture.

An application can set the first or second count value to zero if no data is required from the respective field; count[1] if the scanning system is progressive, i. e. not interlaced. The corresponding start value shall be ignored by the application and driver. Anyway, drivers may not support single field capturing and return both count values non-zero.

Both count values set to zero, or line numbers outside the bounds depicted in Figure 4-2 and Figure 4-3, or a field image covering lines of two fields, are invalid and shall not be returned by the driver.

To initialize the start and count fields, applications must first determine the current video standard selection. The v4l2_std_id or the framelines field of struct v4l2_standard can be evaluated for this purpose.

__u32 flags See Table 4-5 below. Currently only drivers set flags, applications must set this field to zero.
__u32 reserved[2] This array is reserved for future extensions. Drivers and applications must set it to zero.
Notes:
a. A few devices may be unable to sample VBI data at all but can extend the video capture window to the VBI region.

Table 4-5. Raw VBI Format Flags

V4L2_VBI_UNSYNC 0x0001

This flag indicates hardware which does not properly distinguish between fields. Normally the VBI image stores the first field (lower scanning line numbers) first in memory. This may be a top or bottom field depending on the video standard. When this flag is set the first or second field may be stored first, however the fields are still in correct temporal order with the older field first in memory.a

V4L2_VBI_INTERLACED 0x0002 By default the two field images will be passed sequentially; all lines of the first field followed by all lines of the second field (compare Section 3.6, V4L2_FIELD_SEQ_TB and V4L2_FIELD_SEQ_BT, whether the top or bottom field is first in memory depends on the video standard). When this flag is set, the two fields are interlaced (cf. V4L2_FIELD_INTERLACED). The first line of the first field followed by the first line of the second field, then the two second lines, and so on. Such a layout may be necessary when the hardware has been programmed to capture or output interlaced video images and is unable to separate the fields for VBI capturing at the same time. For simplicity setting this flag implies that both count values are equal and non-zero.
Notes:
a. Most VBI services transmit on both fields, but some have different semantics depending on the field number. These cannot be reliably decoded or encoded when V4L2_VBI_UNSYNC is set.

Figure 4-1. Line synchronization

Figure 4-2. ITU-R 525 line numbering (M/NTSC and M/PAL)

(1) For the purpose of this specification field 2 starts in line 264 and not 263.5 because half line capturing is not supported.

Figure 4-3. ITU-R 625 line numbering

(1) For the purpose of this specification field 2 starts in line 314 and not 313.5 because half line capturing is not supported.

Remember the VBI image format depends on the selected video standard, therefore the application must choose a new standard or query the current standard first. Attempts to read or write data ahead of format negotiation, or after switching the video standard which may invalidate the negotiated VBI parameters, should be refused by the driver. A format change during active I/O is not permitted.


4.7.4. Reading and writing VBI images

To assure synchronization with the field number and easier implementation, the smallest unit of data passed at a time is one frame, consisting of two fields of VBI images immediately following in memory.

The total size of a frame computes as follows:

(count[0] + count[1]) *
samples_per_line * sample size in bytes

The sample size is most likely always one byte, applications must check the sample_format field though, to function properly with other drivers.
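As a sketch, the frame size could be computed from the negotiated parameters like this, assuming fmt holds the result of VIDIOC_G_FMT or VIDIOC_S_FMT and the samples are 8 bit (V4L2_PIX_FMT_GREY):

size_t sample_size = 1; /* bytes; check fmt.fmt.vbi.sample_format */
size_t frame_size  = (fmt.fmt.vbi.count[0] + fmt.fmt.vbi.count[1])
                     * fmt.fmt.vbi.samples_per_line * sample_size;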

A VBI device may support read/write and/or streaming (memory mapping or user pointer) I/O. The latter bears the possibility of synchronizing video and VBI data by using buffer timestamps.

Remember the VIDIOC_STREAMON ioctl and the first read(), write() and select() call can be resource allocation points returning an EBUSY error code if the required hardware resources are temporarily unavailable, for example the device is already in use by another process.


4.8. Sliced VBI Data Interface

VBI stands for Vertical Blanking Interval, a gap in the sequence of lines of an analog video signal. During VBI no picture information is transmitted, allowing some time while the electron beam of a cathode ray tube TV returns to the top of the screen.

Sliced VBI devices use hardware to demodulate data transmitted in the VBI. V4L2 drivers shall not do this by software, see also the raw VBI interface. The data is passed as short packets of fixed size, covering one scan line each. The number of packets per video frame is variable.

Sliced VBI capture and output devices are accessed through the same character special files as raw VBI devices. When a driver supports both interfaces, the default function of a /dev/vbi device is raw VBI capturing or output, and the sliced VBI function is only available after calling the VIDIOC_S_FMT ioctl as defined below. Likewise a /dev/video device may support the sliced VBI API, however the default function here is video capturing or output. Different file descriptors must be used to pass raw and sliced VBI data simultaneously, if this is supported by the driver.


4.8.1. Querying Capabilities

Devices supporting the sliced VBI capturing or output API set the V4L2_CAP_SLICED_VBI_CAPTURE or V4L2_CAP_SLICED_VBI_OUTPUT flag, respectively, in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. At least one of the read/write, streaming or asynchronous I/O methods must be supported. Sliced VBI devices may have a tuner or modulator.


4.8.2. Supplemental Functions

Sliced VBI devices shall support video input or output and tuner or modulator ioctls if they have these capabilities, and they may support control ioctls. The video standard ioctls provide information vital to program a sliced VBI device, therefore they must be supported.


4.8.3. Sliced VBI Format Negotiation

To find out which data services are supported by the hardware applications can call the VIDIOC_G_SLICED_VBI_CAP ioctl. All drivers implementing the sliced VBI interface must support this ioctl. The results may differ from those of the VIDIOC_S_FMT ioctl when the number of VBI lines the hardware can capture or output per frame, or the number of services it can identify on a given line are limited. For example on PAL line 16 the hardware may be able to look for a VPS or Teletext signal, but not both at the same time.

To determine the currently selected services applications set the type field of struct v4l2_format to V4L2_BUF_TYPE_SLICED_VBI_CAPTURE or V4L2_BUF_TYPE_SLICED_VBI_OUTPUT, and the VIDIOC_G_FMT ioctl fills the fmt.sliced member, a struct v4l2_sliced_vbi_format.

Applications can request different parameters by initializing or modifying the fmt.sliced member and calling the VIDIOC_S_FMT ioctl with a pointer to the v4l2_format structure.

The sliced VBI API is more complicated than the raw VBI API because the hardware must be told which VBI service to expect on each scan line. Not all services may be supported by the hardware on all lines (this is especially true for VBI output where Teletext is often unsupported and other services can only be inserted in one specific line). In many cases, however, it is sufficient to just set the service_set field to the required services and let the driver fill the service_lines array according to hardware capabilities. Only if more precise control is needed should the programmer set the service_lines array explicitly.

The VIDIOC_S_FMT ioctl returns an EINVAL error code only when the given parameters are ambiguous, otherwise it modifies the parameters according to hardware capabilities. When the driver allocates resources at this point, it may return an EBUSY error code if the required resources are temporarily unavailable. Other resource allocation points which may return EBUSY can be the VIDIOC_STREAMON ioctl and the first read(), write() and select() call.
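A minimal sketch of the simple case described above, requesting services by service_set only and letting the driver fill service_lines; fd is assumed to be an open sliced VBI capture device.

struct v4l2_format fmt;

memset (&fmt, 0, sizeof (fmt));
fmt.type = V4L2_BUF_TYPE_SLICED_VBI_CAPTURE;
fmt.fmt.sliced.service_set = V4L2_SLICED_TELETEXT_B | V4L2_SLICED_WSS_625;

if (-1 == ioctl (fd, VIDIOC_S_FMT, &fmt)) {
        perror ("VIDIOC_S_FMT");
        exit (EXIT_FAILURE);
}

if (0 == fmt.fmt.sliced.service_set)
        fprintf (stderr, "None of the requested services is supported\n");

/* fmt.fmt.sliced.io_size is the buffer size to use for read(). */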

Table 4-6. structv4l2_sliced_vbi_format

__u32 service_set

If service_set is non-zero when passed with VIDIOC_S_FMT or VIDIOC_TRY_FMT, the service_lines array will be filled by the driver according to the services specified in this field. For example, if service_set is initialized with V4L2_SLICED_TELETEXT_B | V4L2_SLICED_WSS_625, a driver for the cx25840 video decoder sets lines 7-22 of both fieldsa to V4L2_SLICED_TELETEXT_B and line 23 of the first field to V4L2_SLICED_WSS_625. If service_set is set to zero, then the values of service_lines will be used instead.

On return the driver sets this field to the union of all elements of the returned service_lines array. It may contain fewer services than requested, perhaps just one, if the hardware cannot handle more services simultaneously. It may be empty (zero) if none of the requested services are supported by the hardware.

__u16 service_lines[2][24]

Applications initialize this array with sets of data services the driver shall look for or insert on the respective scan line. Subject to hardware capabilities drivers return the requested set, a subset, which may be just a single service, or an empty set. When the hardware cannot handle multiple services on the same line the driver shall choose one. No assumptions can be made on which service the driver chooses.

Data services are defined in Table 4-7. Array indices map to ITU-R line numbers (see also Figure 4-2 and Figure 4-3) as follows:

    Element               525 line systems  625 line systems
    service_lines[0][1]   1                 1
    service_lines[0][23]  23                23
    service_lines[1][1]   264               314
    service_lines[1][23]  286               336
    Drivers must set service_lines[0][0] and service_lines[1][0] to zero.
__u32 io_size Maximum number of bytes passed by one read() or write() call, and the buffer size in bytes for the VIDIOC_QBUF and VIDIOC_DQBUF ioctl. Drivers set this field to the size of struct v4l2_sliced_vbi_data times the number of non-zero elements in the returned service_lines array (that is the number of lines potentially carrying data).
__u32 reserved[2] This array is reserved for future extensions. Applications and drivers must set it to zero.
Notes:
a. According to ETS 300 706 lines 6-22 of the first field and lines 5-22 of the second field may carry Teletext data.

Table 4-7. Sliced VBI services

Symbol Value Reference Lines, usually Payload
V4L2_SLICED_TELETEXT_B (Teletext System B) 0x0001 ETS 300 706, ITU BT.653 PAL/SECAM line 7-22, 320-335 (second field 7-22) Last 42 of the 45 byte Teletext packet, that is without clock run-in and framing code, lsb first transmitted.
V4L2_SLICED_VPS 0x0400 ETS 300 231 PAL line 16 Byte number 3 to 15 according to Figure 9 of ETS 300 231, lsb first transmitted.
V4L2_SLICED_CAPTION_525 0x1000 EIA 608-B NTSC line 21, 284 (second field 21) Two bytes in transmission order, including parity bit, lsb first transmitted.
V4L2_SLICED_WSS_625 0x4000 ITU BT.1119, EN 300 294 PAL/SECAM line 23
Byte         0                 1
      msb         lsb  msb           lsb
 Bit  7 6 5 4 3 2 1 0  x x 13 12 11 10 9
V4L2_SLICED_VBI_525 0x1000 Set of services applicable to 525 line systems.
V4L2_SLICED_VBI_625 0x4401 Set of services applicable to 625 line systems.

Drivers may return an EINVAL error code when applications attempt to read or write data without prior format negotiation, after switching the video standard (which may invalidate the negotiated VBI parameters) and after switching the video input (which may change the video standard as a side effect). The VIDIOC_S_FMT ioctl may return an EBUSY error code when applications attempt to change the format while I/O is in progress (between a VIDIOC_STREAMON and VIDIOC_STREAMOFF call, and after the first read() or write() call).


4.8.4. Reading and writing sliced VBI data

A single read() or write() call must pass all data belonging to one video frame. That is an array of v4l2_sliced_vbi_data structures with one or more elements and a total size not exceeding io_size bytes. Likewise in streaming I/O mode one buffer of io_size bytes must contain data of one video frame. The id of unused v4l2_sliced_vbi_data elements must be zero.
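A minimal reading sketch, assuming fmt holds the sliced VBI format negotiated in the previous section:

struct v4l2_sliced_vbi_data *buf;
unsigned int i, n;
ssize_t r;

buf = malloc (fmt.fmt.sliced.io_size);
if (NULL == buf)
        exit (EXIT_FAILURE);

r = read (fd, buf, fmt.fmt.sliced.io_size);
if (-1 == r) {
        perror ("read");
        exit (EXIT_FAILURE);
}

n = r / sizeof (*buf);
for (i = 0; i < n; ++i) {
        if (0 == buf[i].id)
                continue; /* empty packet, ignore */
        /* buf[i].id, buf[i].field, buf[i].line and buf[i].data[]
           describe one line of sliced VBI data. */
}

free (buf);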

Table 4-8. structv4l2_sliced_vbi_data

__u32 id A flag from Table 2 identifying the type of data in this packet. Only a single bit must be set. When the id of a captured packet is zero, the packet is empty and the contents of other fields are undefined. Applications shall ignore empty packets. When the id of a packet for output is zero the contents of the data field are undefined and the driver must no longer insert data on the requested field and line.
__u32 field The video field number this data has been captured from, or shall be inserted at. 0 for the first field, 1 for the second field.
__u32 line The field (as opposed to frame) line number this data has been captured from, or shall be inserted at. See Figure 4-2 and Figure 4-3 for valid values. Sliced VBI capture devices can set the line number of all packets to 0 if the hardware cannot reliably identify scan lines. The field number must always be valid.
__u32 reserved This field is reserved for future extensions. Applications and drivers must set it to zero.
__u8 data[48] The packet payload. See Table 2 for the contents and number of bytes passed for each data type. The contents of padding bytes at the end of this array are undefined, drivers and applications shall ignore them.

Packets are always passed in ascending line number order, without duplicate line numbers. The write() function and the VIDIOC_QBUF ioctl must return an EINVAL error code when applications violate this rule. They must also return an EINVAL error code when applications pass an incorrect field or line number, or a combination of field, line and id which has not been negotiated with the VIDIOC_G_FMT or VIDIOC_S_FMT ioctl. When the line numbers are unknown the driver must pass the packets in transmitted order. The driver can insert empty packets with id set to zero anywhere in the packet array.

To assure synchronization and to distinguish from frame dropping, when a captured frame does not carry any of the requested data services drivers must pass one or more empty packets. When an application fails to pass VBI data in time for output, the driver must output the last VPS and WSS packet again, and disable the output of Closed Caption and Teletext data, or output data which is ignored by Closed Caption and Teletext decoders.

A sliced VBI device may support read/write and/or streaming (memory mapping and/or user pointer) I/O. The latter bears the possibility of synchronizing video and VBI data by using buffer timestamps.


4.9. Teletext Interface

This interface aims at devices receiving and demodulating Teletext data [ETS 300 706, ITU BT.653], evaluating the Teletext packages and storing formatted pages in cache memory. Such devices are usually implemented as microcontrollers with serial interface (I2C) and can be found on older TV cards, dedicated Teletext decoding cards and home-brew devices connected to the PC parallel port.

The Teletext API was designed by Martin Buck. It is defined in the kernel header file linux/videotext.h, the specification is available from http://home.pages.de/~videotext/. (Videotext is the name of the German public television Teletext service.) Conventional character device file names are /dev/vtx and /dev/vttuner, with device number 83, 0 and 83, 16 respectively. A similar interface exists for the Philips SAA5249 Teletext decoder [specification?] with character device file names /dev/tlkN, device number 102, N.

Eventually the Teletext API was integrated into the V4L API with character device file names /dev/vtx0 to /dev/vtx31, device major number 81, minor numbers 192 to 223. For reference the V4L Teletext API specification is reproduced here in full: "Teletext interfaces talk the existing VTX API." Teletext devices with major number 83 and 102 will be removed in Linux 2.6.

There are no plans to replace the Teletext API or to integrate it into V4L2. Please write to the Video4Linux mailing list: https://listman.redhat.com/mailman/listinfo/video4linux-list when the need arises.


4.10. Radio Interface

This interface is intended for AM and FM (analog) radioreceivers.

Conventionally V4L2 radio devices are accessed through character device special files named /dev/radio and /dev/radio0 to /dev/radio63 with major number 81 and minor numbers 64 to 127.


4.10.1. Querying Capabilities

Devices supporting the radio interface set the V4L2_CAP_RADIO and V4L2_CAP_TUNER flags in the capabilities field of struct v4l2_capability returned by the VIDIOC_QUERYCAP ioctl. Other combinations of capability flags are reserved for future extensions.


4.10.2. Supplemental Functions

Radio devices can support controls, and must support the tuner ioctls.

They do not support the video input or output, audio input or output, video standard, cropping and scaling, compression and streaming parameter, or overlay ioctls. All other ioctls and I/O methods are reserved for future extensions.


4.10.3. Programming

Radio devices may have a couple of audio controls (as discussed in Section 1.8) such as a volume control, possibly custom controls. Further all radio devices have one tuner (these are discussed in Section 1.6) with index number zero to select the radio frequency and to determine if a monaural or FM stereo program is received. Drivers switch automatically between AM and FM depending on the selected frequency. The VIDIOC_G_TUNER ioctl reports the supported frequency range.
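A minimal tuning sketch, assuming fd is an open radio device; the 91.5 MHz frequency is an arbitrary example.

struct v4l2_tuner tuner;
struct v4l2_frequency freq;
double unit;

memset (&tuner, 0, sizeof (tuner));
tuner.index = 0; /* radio devices have exactly one tuner */

if (-1 == ioctl (fd, VIDIOC_G_TUNER, &tuner)) {
        perror ("VIDIOC_G_TUNER");
        exit (EXIT_FAILURE);
}

/* Tuner frequencies are expressed in units of 62.5 kHz, or of
   62.5 Hz when the V4L2_TUNER_CAP_LOW capability flag is set. */
unit = (tuner.capability & V4L2_TUNER_CAP_LOW) ? 62.5 : 62500.0;

memset (&freq, 0, sizeof (freq));
freq.tuner     = 0;
freq.type      = V4L2_TUNER_RADIO;
freq.frequency = (__u32) (91.5e6 / unit);

if (-1 == ioctl (fd, VIDIOC_S_FREQUENCY, &freq))
        perror ("VIDIOC_S_FREQUENCY");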


4.11. RDS Interface

The Radio Data System transmits supplementary information in binary format, for example the station name or travel information, on an inaudible audio subcarrier of a radio program. This interface aims at devices capable of receiving and decoding RDS information.

The V4L API defines its RDS API as follows.

From radio devices supporting it, RDS data can be read with the read() function. The data is packed in groups of three, as follows (a decoding sketch follows the list):

  1. First Octet Least Significant Byte of RDS Block

  2. Second Octet Most Significant Byte of RDS Block

  3. Third Octet Bit 7: Error bit. Indicates that an uncorrectable error occurred during reception of this block. Bit 6: Corrected bit. Indicates that an error was corrected for this data block. Bits 5-3: Received Offset. Indicates the offset received by the sync system. Bits 2-0: Offset Name. Indicates the offset applied to this data.
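A decoding sketch under the layout above; the struct name rds_block is our own and fd is assumed to be an open radio device delivering RDS data through read().

struct rds_block {
        unsigned char lsb;    /* least significant byte of the block */
        unsigned char msb;    /* most significant byte of the block */
        unsigned char status; /* error, corrected, offset bits */
} blk;

while (sizeof (blk) == read (fd, &blk, sizeof (blk))) {
        unsigned int block, offset_name;

        if (blk.status & 0x80)
                continue; /* bit 7: uncorrectable error */

        block       = (blk.msb << 8) | blk.lsb;
        offset_name = blk.status & 0x07; /* bits 2-0 */

        /* process the 16 bit RDS block according to offset_name ... */
}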

It was argued the RDS API should be extended before integration into V4L2, but no new API has been devised yet. Please write to the Video4Linux mailing list for discussion: https://listman.redhat.com/mailman/listinfo/video4linux-list. Meanwhile no V4L2 driver should set the V4L2_CAP_RDS_CAPTURE capability flag.

I. Function Reference

Table of Contents
V4L2 close() -- Close a V4L2 device
V4L2 ioctl() -- Program a V4L2 device
ioctl VIDIOC_CROPCAP -- Information about the video cropping and scaling abilities
ioctl VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER -- Read or write hardware registers
ioctl VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD -- Execute an encoder command
ioctl VIDIOC_ENUMAUDIO -- Enumerate audio inputs
ioctl VIDIOC_ENUMAUDOUT -- Enumerate audio outputs
ioctl VIDIOC_ENUM_FMT -- Enumerate image formats
ioctl VIDIOC_ENUM_FRAMESIZES -- Enumerate frame sizes
ioctl VIDIOC_ENUM_FRAMEINTERVALS -- Enumerate frame intervals
ioctl VIDIOC_ENUMINPUT -- Enumerate video inputs
ioctl VIDIOC_ENUMOUTPUT -- Enumerate video outputs
ioctl VIDIOC_ENUMSTD -- Enumerate supported video standards
ioctl VIDIOC_G_AUDIO, VIDIOC_S_AUDIO -- Query or select the current audio input and its attributes
ioctl VIDIOC_G_AUDOUT, VIDIOC_S_AUDOUT -- Query or select the current audio output
ioctl VIDIOC_G_CHIP_IDENT -- Identify the chips on a TV card
ioctl VIDIOC_G_CROP, VIDIOC_S_CROP -- Get or set the current cropping rectangle
ioctl VIDIOC_G_CTRL, VIDIOC_S_CTRL -- Get or set the value of a control
ioctl VIDIOC_G_ENC_INDEX -- Get meta data about a compressed video stream
ioctl VIDIOC_G_EXT_CTRLS, VIDIOC_S_EXT_CTRLS, VIDIOC_TRY_EXT_CTRLS -- Get or set the value of several controls, try control values
ioctl VIDIOC_G_FBUF, VIDIOC_S_FBUF -- Get or set frame buffer overlay parameters
ioctl VIDIOC_G_FMT, VIDIOC_S_FMT, VIDIOC_TRY_FMT -- Get or set the data format, try a format
ioctl VIDIOC_G_FREQUENCY, VIDIOC_S_FREQUENCY -- Get or set tuner or modulator radio frequency
ioctl VIDIOC_G_INPUT, VIDIOC_S_INPUT -- Query or select the current video input
ioctl VIDIOC_G_JPEGCOMP, VIDIOC_S_JPEGCOMP -- 
ioctl VIDIOC_G_MODULATOR, VIDIOC_S_MODULATOR -- Get or set modulator attributes
ioctl VIDIOC_G_OUTPUT, VIDIOC_S_OUTPUT -- Query or select the current video output
ioctl VIDIOC_G_PARM, VIDIOC_S_PARM -- Get or set streaming parameters
ioctl VIDIOC_G_PRIORITY, VIDIOC_S_PRIORITY -- Query or request the access priority associated with a file descriptor
ioctl VIDIOC_G_SLICED_VBI_CAP -- Query sliced VBI capabilities
ioctl VIDIOC_G_STD, VIDIOC_S_STD -- Query or select the video standard of the current input
ioctl VIDIOC_G_TUNER, VIDIOC_S_TUNER -- Get or set tuner attributes
ioctl VIDIOC_LOG_STATUS -- Log driver status information
ioctl VIDIOC_OVERLAY -- Start or stop video overlay
ioctl VIDIOC_QBUF, VIDIOC_DQBUF -- Exchange a buffer with the driver
ioctl VIDIOC_QUERYBUF -- Query the status of a buffer
ioctl VIDIOC_QUERYCAP -- Query device capabilities
ioctl VIDIOC_QUERYCTRL, VIDIOC_QUERYMENU -- Enumerate controls and menu control items
ioctl VIDIOC_QUERYSTD -- Sense the video standard received by the current input
ioctl VIDIOC_REQBUFS -- Initiate Memory Mapping or User Pointer I/O
ioctl VIDIOC_STREAMON, VIDIOC_STREAMOFF -- Start or stop streaming I/O
V4L2 mmap() -- Map device memory into application address space
V4L2 munmap() -- Unmap device memory
V4L2 open() -- Open a V4L2 device
V4L2 poll() -- Wait for some event on a file descriptor
V4L2 read() -- Read from a V4L2 device
V4L2 select() -- Synchronous I/O multiplexing
V4L2 write() -- Write to a V4L2 device

V4L2 close()

Name

v4l2-close -- Close a V4L2 device

Synopsis

#include <unistd.h>

int close(int fd);

Arguments

fd

File descriptor returned by open().

Description

Closes the device. Any I/O in progress is terminated and resources associated with the file descriptor are freed. However data format parameters, current input or output, control values or other properties remain unchanged.

Return Value

The function returns 0 on success, -1 on failure and the errno variable is set appropriately. Possible error codes:

EBADF

fd is not a valid open file descriptor.

V4L2 ioctl()

Name

v4l2-ioctl -- Program a V4L2 device

Synopsis

#include <sys/ioctl.h>

int ioctl(int fd, int request, void *argp);

Arguments

fd

File descriptor returned by open().

request

V4L2 ioctl request code as defined in the videodev.h header file, for example VIDIOC_QUERYCAP.

argp

Pointer to a function parameter, usually a structure.

Description

The ioctl() function is used to program V4L2 devices. The argument fd must be an open file descriptor. An ioctl request has encoded in it whether the argument is an input, output or read/write parameter, and the size of the argument argp in bytes. Macros and defines specifying V4L2 ioctl requests are located in the videodev.h header file. Applications should use their own copy, not include the version in the kernel sources on the system they compile on. All V4L2 ioctl requests, their respective function and parameters are specified in Reference I, Function Reference.

Return Value

On success the ioctl() function returns 0 and does not reset the errno variable. On failure -1 is returned, when the ioctl takes an output or read/write parameter it remains unmodified, and the errno variable is set appropriately. See below for possible error codes. Generic errors like EBADF or EFAULT are not listed in the sections discussing individual ioctl requests.

Note ioctls may return undefined error codes. Since errors may have side effects such as a driver reset, applications should abort on unexpected errors.

EBADF

fd is not a valid open file descriptor.

EBUSY

The property cannot be changed right now. Typically this error code is returned when I/O is in progress or the driver supports multiple opens and another process locked the property.

EFAULT

argp references an inaccessible memory area.

ENOTTY

fd is not associated with a character special device.

EINVAL

The request or the data pointed to by argp is not valid. This is a very common error code, see the individual ioctl requests listed in Reference I, Function Reference for actual causes.

ENOMEM

Not enough physical or virtual memory was available to complete the request.

ERANGE

The application attempted to set a control with the VIDIOC_S_CTRL ioctl to a value which is out of bounds.
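Many V4L2 applications wrap ioctl() in a small helper that retries calls interrupted by a signal; a minimal sketch follows (the name xioctl is just a common convention, not part of the API).

#include <errno.h>
#include <sys/ioctl.h>

static int
xioctl (int fd, int request, void *argp)
{
        int r;

        do {
                r = ioctl (fd, request, argp);
        } while (-1 == r && EINTR == errno);

        return r;
}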

ioctl VIDIOC_CROPCAP

Name

VIDIOC_CROPCAP -- Information about the video cropping and scaling abilities

Synopsis

int ioctl(int fd, int request, struct v4l2_cropcap *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_CROPCAP

argp

Description

Applications use this function to query the cropping limits, the pixel aspect of images and to calculate scale factors. They set the type field of a v4l2_cropcap structure to the respective buffer (stream) type and call the VIDIOC_CROPCAP ioctl with a pointer to this structure. Drivers fill the rest of the structure. The results are constant except when switching the video standard. Remember this switch can occur implicitly when switching the video input or output.
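A minimal query sketch, assuming fd refers to a video capture device:

struct v4l2_cropcap cropcap;

memset (&cropcap, 0, sizeof (cropcap));
cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

if (-1 == ioctl (fd, VIDIOC_CROPCAP, &cropcap)) {
        perror ("VIDIOC_CROPCAP");
        exit (EXIT_FAILURE);
}

printf ("bounds %dx%d at (%d,%d), default %dx%d, pixel aspect %u/%u\n",
        cropcap.bounds.width, cropcap.bounds.height,
        cropcap.bounds.left, cropcap.bounds.top,
        cropcap.defrect.width, cropcap.defrect.height,
        cropcap.pixelaspect.numerator, cropcap.pixelaspect.denominator);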

Table 1. struct v4l2_cropcap

enum v4l2_buf_type type Type of the data stream, set by the application. Only these types are valid here: V4L2_BUF_TYPE_VIDEO_CAPTURE, V4L2_BUF_TYPE_VIDEO_OUTPUT, V4L2_BUF_TYPE_VIDEO_OVERLAY, and custom (driver defined) types with code V4L2_BUF_TYPE_PRIVATE and higher.
struct v4l2_rect bounds Defines the window within which capturing or output is possible; this may exclude for example the horizontal and vertical blanking areas. The cropping rectangle cannot exceed these limits. Width and height are defined in pixels, the driver writer is free to choose origin and units of the coordinate system in the analog domain.
struct v4l2_rect defrect Default cropping rectangle, it shall cover the "whole picture". Assuming pixel aspect 1/1 this could be for example a 640 × 480 rectangle for NTSC, a 768 × 576 rectangle for PAL and SECAM centered over the active picture area. The same co-ordinate system as for bounds is used.
struct v4l2_fract pixelaspect

This is the pixel aspect (y / x) when no scaling is applied, the ratio of the actual sampling frequency and the frequency required to get square pixels.

When cropping coordinates refer to square pixels, the driver sets pixelaspect to 1/1. Other common values are 54/59 for PAL and SECAM, 11/10 for NTSC sampled according to [ITU BT.601].

Table 2. struct v4l2_rect

__s32 left Horizontal offset of the top, left corner of the rectangle, in pixels.
__s32 top Vertical offset of the top, left corner of the rectangle, in pixels.
__s32 width Width of the rectangle, in pixels.
__s32 height Height of the rectangle, in pixels. Width and height cannot be negative, the fields are signed for hysterical reasons.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The struct v4l2_cropcap type is invalid or the ioctl is not supported. This is not permitted for video capture, output and overlay devices, which must support VIDIOC_CROPCAP.

ioctl VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER

Name

VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER -- Read or write hardware registers

Synopsis

int ioctl(int fd, int request, struct v4l2_register *argp);

int ioctl(int fd, int request, const struct v4l2_register *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_DBG_G_REGISTER, VIDIOC_DBG_S_REGISTER

argp

Description

Experimental: This is an experimental interface and may change in the future.

For driver debugging purposes these ioctls allow test applications to access hardware registers directly. Regular applications should not use them.

Since writing or even reading registers can jeopardize system security and stability, and can damage the hardware, both ioctls require superuser privileges. Additionally the Linux kernel must be compiled with the CONFIG_VIDEO_ADV_DEBUG option to enable these ioctls.

To write a register applications must initialize all fields of a struct v4l2_register and call VIDIOC_DBG_S_REGISTER with a pointer to this structure. The match_type and match_chip fields select a chip on the TV card, the reg field specifies a register number and the val field the value to be written into the register.

To read a register applications must initialize the match_type, match_chip and reg fields, and call VIDIOC_DBG_G_REGISTER with a pointer to this structure. On success the driver stores the register value in the val field. On failure the structure remains unchanged.

When match_type is V4L2_CHIP_MATCH_HOST, match_chip selects the nth non-I2C chip on the TV card. Drivers may also interpret match_chip as a random ID, but we recommend against that. The number zero always selects the host chip, e. g. the chip connected to the PCI bus. You can find out which chips are present with the VIDIOC_G_CHIP_IDENT ioctl.

When match_type is V4L2_CHIP_MATCH_I2C_DRIVER, match_chip contains a driver ID as defined in the linux/i2c-id.h header file. For instance I2C_DRIVERID_SAA7127 will match any chip supported by the saa7127 driver, regardless of its I2C bus address. When multiple chips supported by the same driver are present, the effect of these ioctls is undefined. Again with the VIDIOC_G_CHIP_IDENT ioctl you can find out which I2C chips are present.

When match_type is V4L2_CHIP_MATCH_I2C_ADDR, match_chip selects a chip by its 7 bit I2C bus address.
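A minimal read sketch using the structure from Table 1; the register number 0x100 is an arbitrary example, and root privileges plus CONFIG_VIDEO_ADV_DEBUG are required as noted above.

struct v4l2_register reg;

memset (&reg, 0, sizeof (reg));
reg.match_type = V4L2_CHIP_MATCH_HOST;
reg.match_chip = 0;      /* the host chip */
reg.reg        = 0x100;  /* example register number */

if (-1 == ioctl (fd, VIDIOC_DBG_G_REGISTER, &reg))
        perror ("VIDIOC_DBG_G_REGISTER");
else
        printf ("reg 0x%llx = 0x%llx\n",
                (unsigned long long) reg.reg,
                (unsigned long long) reg.val);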

Success not guaranteed: Due to a flaw in the Linux I2C bus driver these ioctls may return successfully without actually reading or writing a register. To catch the most likely failure we recommend a VIDIOC_G_CHIP_IDENT call confirming the presence of the selected I2C chip.

These ioctls are optional, not all drivers may support them. However when a driver supports these ioctls it must also support VIDIOC_G_CHIP_IDENT. Conversely it may support VIDIOC_G_CHIP_IDENT but not these ioctls.

VIDIOC_DBG_G_REGISTER and VIDIOC_DBG_S_REGISTER were introduced in Linux 2.6.21.

We recommend the v4l2-dbg utility over calling these ioctls directly. It is available from the LinuxTV v4l-dvb repository; see http://linuxtv.org/repo/ for access instructions.

Table 1. struct v4l2_register

__u32 match_type See Table 2 for a list of possible types.  
__u32 match_chip Match a chip by this number, interpreted according to the match_type field.  
__u64 reg A register number.  
__u64 val The value read from, or to be written into the register.  

Table 2. Chip Match Types

V4L2_CHIP_MATCH_HOST 0 Match the nth chip on the card, zero for the host chip. Does not match I2C chips.
V4L2_CHIP_MATCH_I2C_DRIVER 1 Match an I2C chip by its driver ID from the linux/i2c-id.h header file.
V4L2_CHIP_MATCH_I2C_ADDR 2 Match a chip by its 7 bit I2C bus address.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The driver does not support this ioctl, or the kernel was not compiled with the CONFIG_VIDEO_ADV_DEBUG option, or the match_type is invalid, or the selected chip or register does not exist.

EPERM

Insufficient permissions. Root privileges are required to execute these ioctls.

ioctl VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD

Name

VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD -- Execute an encoder command

Synopsis

int ioctl(int fd, int request, struct v4l2_encoder_cmd *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENCODER_CMD, VIDIOC_TRY_ENCODER_CMD

argp

Description

Experimental: This is an experimental interface and may change in the future.

These ioctls control an audio/video (usually MPEG-) encoder. VIDIOC_ENCODER_CMD sends a command to the encoder, VIDIOC_TRY_ENCODER_CMD can be used to try a command without actually executing it.

To send a command applications must initialize all fields of a struct v4l2_encoder_cmd and call VIDIOC_ENCODER_CMD or VIDIOC_TRY_ENCODER_CMD with a pointer to this structure.

The cmd field must contain the command code. The flags field is currently only used by the STOP command and contains one bit: If the V4L2_ENC_CMD_STOP_AT_GOP_END flag is set, encoding will continue until the end of the current Group Of Pictures, otherwise it will stop immediately.

A read() call sends a START command to the encoder if it has not been started yet. After a STOP command, read() calls will read the remaining data buffered by the driver. When the buffer is empty, read() will return zero and the next read() call will restart the encoder.

A close() call sends an immediate STOP to the encoder, and all buffered data is discarded.
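A minimal sketch sending a STOP command that lets the encoder finish the current GOP; fd is assumed to be an open encoder device.

struct v4l2_encoder_cmd cmd;

memset (&cmd, 0, sizeof (cmd));
cmd.cmd   = V4L2_ENC_CMD_STOP;
cmd.flags = V4L2_ENC_CMD_STOP_AT_GOP_END;

if (-1 == ioctl (fd, VIDIOC_ENCODER_CMD, &cmd))
        perror ("VIDIOC_ENCODER_CMD");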

These ioctls are optional, not all drivers may supportthem. They were introduced in Linux 2.6.21.

Table 1. struct v4l2_encoder_cmd

__u32 cmd The encoder command, see Table 2.
__u32 flags Flags to go with the command, see Table 3. If no flags are defined for this command, drivers and applications must set this field to zero.
__u32 data[8] Reserved for future extensions. Drivers and applications must set the array to zero.

Table 2. Encoder Commands

V4L2_ENC_CMD_START 0 Start the encoder. When the encoder is already running or paused, this command does nothing. No flags are defined for this command.
V4L2_ENC_CMD_STOP 1 Stop the encoder. When the V4L2_ENC_CMD_STOP_AT_GOP_END flag is set, encoding will continue until the end of the current Group Of Pictures, otherwise encoding will stop immediately. When the encoder is already stopped, this command does nothing.
V4L2_ENC_CMD_PAUSE 2 Pause the encoder. When the encoder has not been started yet, the driver will return an EPERM error code. When the encoder is already paused, this command does nothing. No flags are defined for this command.
V4L2_ENC_CMD_RESUME 3 Resume encoding after a PAUSE command. When the encoder has not been started yet, the driver will return an EPERM error code. When the encoder is already running, this command does nothing. No flags are defined for this command.

Table 3. Encoder Command Flags

V4L2_ENC_CMD_STOP_AT_GOP_END 0x0001 Stop encoding at the end of the current Group Of Pictures, rather than immediately.

Return Value

On success 0 is returned, on error -1 and the errno variable is set appropriately:

EINVAL

The driver does not support this ioctl, or the cmd field is invalid.

EPERM

The application sent a PAUSE or RESUME command when the encoder was not running.

ioctl VIDIOC_ENUMAUDIO

Name

VIDIOC_ENUMAUDIO -- Enumerate audio inputs

Synopsis

int ioctl(int fd, int request, struct v4l2_audio *argp);

Arguments

fd

File descriptor returned by open().

request

VIDIOC_ENUMAUDIO

argp

Description

To query the attributes of an audio input applications initialize the index field and zero out the reserved array of a struct v4l2_audio and call the VIDIOC_ENUMAUDIO ioctl with a pointer to this structure.
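A minimal enumeration sketch, incrementing the index until the driver returns an error for an out-of-range index:

struct v4l2_audio audio;
unsigned int index;

for (index = 0; ; ++index) {
        memset (&audio, 0, sizeof (audio));
        audio.index = index;

        if (-1 == ioctl (fd, VIDIOC_ENUMAUDIO, &audio))
                break; /* EINVAL: no more audio inputs */

        printf ("Audio input %u: %s\n", audio.index,
                (char *) audio.name);
}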
