On Linux, the main API through which applications access real-time video and audio hardware, such as webcams, TV tuners, video capture cards, FM radio tuners, and video output devices, is Video4Linux.
Video4Linux is a kernel API, so a kernel driver must exist for every supported device. At the user level, device access is standardized through device files. In the case of video capture devices such as webcams, which are the focus of this document, the files are /dev/video0, /dev/video1 and so on, one for each connected device.
For each device class, the data formats exchanged between the device files and user-level applications are standardized. This makes an application immediately compatible with every video capture device that has a Linux driver.
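As an illustration of this device-file interface, the following minimal sketch opens such a device node directly and queries the driver's capabilities with the VIDIOC_QUERYCAP ioctl. The path /dev/video0 is an assumption; adjust it for the device at hand:

#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
    /* Open the first video capture device node (assumed path) */
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) {
        perror("open /dev/video0");
        return 1;
    }

    /* Ask the driver which capabilities it offers */
    struct v4l2_capability cap;
    if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) {
        perror("VIDIOC_QUERYCAP");
        close(fd);
        return 1;
    }

    printf("driver: %s, card: %s\n", cap.driver, cap.card);
    if (cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)
        printf("device supports video capture\n");

    close(fd);
    return 0;
}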
The camera built into some Nokia Internet Tablets is compatible with the Video4Linux version 2 API (http://www.thedirks.org/v4l2/). In principle, any application compatible with this API is easy to port to the Maemo platform.
Since the Maemo platform delegates all multimedia handling to the GStreamer framework, applications that need to access the built-in camera should use GStreamer for this instead of accessing the Video4Linux device directly, by means of the v4l2src GStreamer module.
Thanks to GStreamer's flexibility, developers can fully test a given application on an ordinary desktop PC with a webcam attached, and then do the final testing on the Internet Tablet itself without a single change in the source code, because GStreamer refers to its modules by textual name.
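A plausible way to select those names at compile time, consistent with the VIDEO_SRC and VIDEO_SINK macros used by the sample code below (the exact defines are an assumption, reconstructed from the code's own comments about Xephyr and XVideo):

#ifdef __arm__
/* On the Internet Tablet itself: camera through v4l2src,
 * display through the XVideo-accelerated xvimagesink */
#define VIDEO_SRC  "v4l2src"
#define VIDEO_SINK "xvimagesink"
#else
/* On the x86 SDK: Xephyr does not support the XVideo extension,
 * so plain ximagesink must be used; the source element depends
 * on the desktop webcam's driver and may need to be changed */
#define VIDEO_SRC  "v4lsrc"
#define VIDEO_SINK "ximagesink"
#endif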
An important note about the camera on the tablet: only one application can use it at any given time. So while an application is holding the camera, other tasks that might use it (such as video calls) are blocked.
To demonstrate how to operate the camera, a sample application is provided and discussed below.
In the example, the function initialize_pipeline() is the most interesting one, since it is responsible for creating the GStreamer pipeline, sourcing data from Video4Linux and sinking it to xvimagesink (an X-optimized framebuffer). The pipeline scheme is the following:
                            |Screen|  |Screen|
                          ->|queue |->|sink  |-> Display
|Camera|  |CSP   |  |Tee|/
|src   |->|Filter|->|   |\
                            |Image|  |Image |  |Image|
                          ->|queue|->|filter|->|sink |-> JPEG file
This sample application is no different from other GStreamer applications, be they generic Linux or Maemo-specific apps:
static gboolean initialize_pipeline(AppData *appdata,
        int *argc, char ***argv)
{
    GstElement *pipeline, *camera_src, *screen_sink, *image_sink;
    GstElement *screen_queue, *image_queue;
    GstElement *csp_filter, *image_filter, *tee;
    GstCaps *caps;
    GstBus *bus;

    /* Initialize Gstreamer */
    gst_init(argc, argv);

    /* Create the pipeline and attach a watcher to its message bus
     * (bus_callback handles messages posted from GStreamer threads) */
    pipeline = gst_pipeline_new("test-camera");

    bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
    gst_bus_add_watch(bus, (GstBusFunc)bus_callback, appdata);
    gst_object_unref(GST_OBJECT(bus));

    /* Save the pipeline handle so that the callbacks can use it */
    appdata->pipeline = pipeline;

    /* Create elements */
    /* Camera video stream comes from a Video4Linux driver */
    camera_src = gst_element_factory_make(VIDEO_SRC, "camera_src");
    /* Colorspace filter is needed to make sure that sinks understand
     * the stream coming from the camera */
    csp_filter = gst_element_factory_make("ffmpegcolorspace", "csp_filter");
    /* Tee that copies the stream to multiple outputs */
    tee = gst_element_factory_make("tee", "tee");
    /* Queue creates a new thread for the stream */
    screen_queue = gst_element_factory_make("queue", "screen_queue");
    /* Sink that shows the image on screen. Xephyr doesn't support the
     * XVideo extension, so it needs to use ximagesink, but the device
     * uses xvimagesink */
    screen_sink = gst_element_factory_make(VIDEO_SINK, "screen_sink");
    /* Creates a separate thread for the stream from which the image
     * is captured */
    image_queue = gst_element_factory_make("queue", "image_queue");
    /* Filter to convert the stream to a format that the gdkpixbuf
     * library can use */
    image_filter = gst_element_factory_make("ffmpegcolorspace", "image_filter");
    /* A dummy sink for the image stream. Goes to bitheaven */
    image_sink = gst_element_factory_make("fakesink", "image_sink");

    /* Check that elements are correctly initialized */
    if(!(pipeline && camera_src && screen_sink && csp_filter && tee
        && screen_queue && image_queue && image_filter && image_sink))
    {
        g_critical("Couldn't create pipeline elements");
        return FALSE;
    }

    /* Set image sink to emit handoff-signal before throwing away
     * its buffer */
    g_object_set(G_OBJECT(image_sink),
            "signal-handoffs", TRUE, NULL);

    /* Add elements to the pipeline. This has to be done prior to
     * linking them */
    gst_bin_add_many(GST_BIN(pipeline), camera_src, csp_filter,
            tee, screen_queue, screen_sink, image_queue,
            image_filter, image_sink, NULL);

    /* Specify what kind of video is wanted from the camera */
    caps = gst_caps_new_simple("video/x-raw-rgb",
            "width", G_TYPE_INT, 640,
            "height", G_TYPE_INT, 480,
            NULL);

    /* Link the camera source and colorspace filter using the
     * capabilities specified */
    if(!gst_element_link_filtered(camera_src, csp_filter, caps))
    {
        return FALSE;
    }
    gst_caps_unref(caps);

    /* Connect Colorspace Filter -> Tee -> Screen Queue -> Screen Sink.
     * This finalizes the initialization of the screen part of the
     * pipeline */
    if(!gst_element_link_many(csp_filter, tee, screen_queue,
                screen_sink, NULL))
    {
        return FALSE;
    }

    /* gdkpixbuf requires 8 bits per sample which is 24 bits per
     * pixel */
    caps = gst_caps_new_simple("video/x-raw-rgb",
            "width", G_TYPE_INT, 640,
            "height", G_TYPE_INT, 480,
            "bpp", G_TYPE_INT, 24,
            "depth", G_TYPE_INT, 24,
            "framerate", GST_TYPE_FRACTION, 15, 1,
            NULL);

    /* Link the image branch of the pipeline. The pipeline is
     * ready after this */
    if(!gst_element_link_many(tee, image_queue, image_filter, NULL))
        return FALSE;
    if(!gst_element_link_filtered(image_filter, image_sink, caps))
        return FALSE;

    gst_caps_unref(caps);

    /* As soon as the screen is exposed, the window ID will be advised
     * to the sink */
    g_signal_connect(appdata->screen, "expose-event", G_CALLBACK(expose_cb),
            screen_sink);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    return TRUE;
}
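The AppData structure is passed between the callbacks; its definition is not reproduced in this section, but based on the fields the code accesses (screen, pipeline, buffer_cb_id) it presumably looks roughly like this (the window field is an assumption):

typedef struct
{
    GtkWindow *window;      /* main application window (assumed) */
    GtkWidget *screen;      /* widget the video is drawn into */
    guint buffer_cb_id;     /* id of the "handoff" signal handler */
    GstElement *pipeline;   /* the pipeline built above */
} AppData;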
When the user has pressed the "Take photo" button and the image sink receives data, the following function is called back. It forwards the image buffer to create_jpeg():
static gboolean buffer_probe_callback(
        GstElement *image_sink,
        GstBuffer *buffer, GstPad *pad, AppData *appdata)
{
    GstMessage *message;
    gchar *message_name;
    /* This is the raw RGB data that the image sink is about
     * to discard */
    unsigned char *data_photo =
        (unsigned char *) GST_BUFFER_DATA(buffer);

    /* Create a JPEG of the data and check the status */
    if(!create_jpeg(data_photo))
        message_name = "photo-failed";
    else
        message_name = "photo-taken";

    /* Disconnect the handler so that no more photos
     * are taken */
    g_signal_handler_disconnect(G_OBJECT(image_sink),
            appdata->buffer_cb_id);

    /* Create and send an application message which will be
     * caught in the bus watcher function. It has to be
     * sent as a message because this callback is called in
     * a GStreamer thread, and calling GUI functions here would
     * lead to X server synchronization problems */
    message = gst_message_new_application(GST_OBJECT(appdata->pipeline),
            gst_structure_new(message_name, NULL));
    gst_element_post_message(appdata->pipeline, message);

    /* Returning TRUE means that the buffer is OK to be
     * sent forward. When using fakesink this doesn't really
     * matter because the data is discarded anyway */
    return TRUE;
}
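The handler above is connected when the user asks for a photo. The button callback itself is not reproduced in this section, but a minimal sketch of it, consistent with the buffer_cb_id field and the "signal-handoffs" setup in initialize_pipeline(), could look as follows (the function name take_photo is an assumption):

/* Hypothetical "Take photo" button callback: arm the image sink
 * so that its next buffer triggers buffer_probe_callback() once */
static void take_photo(AppData *appdata)
{
    GstElement *image_sink;

    /* Fetch the image sink element from the pipeline by the name
     * it was given in initialize_pipeline() */
    image_sink = gst_bin_get_by_name(GST_BIN(appdata->pipeline),
            "image_sink");

    /* Connect the "handoff" signal; the handler disconnects itself
     * after the first buffer, so exactly one photo is taken */
    appdata->buffer_cb_id = g_signal_connect(
            G_OBJECT(image_sink), "handoff",
            G_CALLBACK(buffer_probe_callback), appdata);
}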
The xvimagesink GStreamer module would normally just create a new window of its own. Since the video should be shown inside the main application window, the X Window System window ID needs to be passed to the module as soon as the ID exists:

static gboolean expose_cb(GtkWidget * widget, GdkEventExpose * event,
        gpointer data)
{
    /* Tell the xvimagesink/ximagesink the x-window-id of the screen
     * widget in which the video is shown. After this the video
     * is shown in the correct widget */
    gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(data),
            GDK_WINDOW_XWINDOW(widget->window));
    return FALSE;
}
For the sake of completeness, the JPEG encoding function follows. It is worth mentioning that the buffer coming from GStreamer is a simple linear framebuffer:

static gboolean create_jpeg(unsigned char *data)
{
    GdkPixbuf *pixbuf = NULL;
    GError *error = NULL;
    guint height, width, bpp;
    const gchar *directory;
    GString *filename;
    guint base_len, i;
    struct stat statbuf;

    width = 640;
    height = 480;
    bpp = 24;

    /* Define the save folder */
    directory = SAVE_FOLDER_DEFAULT;
    if(directory == NULL)
    {
        directory = g_get_tmp_dir();
    }

    /* Create a unique file name */
    filename = g_string_new(g_build_filename(directory,
                PHOTO_NAME_DEFAULT, NULL));
    base_len = filename->len;
    g_string_append(filename, PHOTO_NAME_SUFFIX_DEFAULT);
    for(i = 1; !stat(filename->str, &statbuf); ++i)
    {
        g_string_truncate(filename, base_len);
        g_string_append_printf(filename, "%d%s",
                i, PHOTO_NAME_SUFFIX_DEFAULT);
    }

    /* Create a pixbuf object from the data */
    pixbuf = gdk_pixbuf_new_from_data(data,
            GDK_COLORSPACE_RGB, /* RGB-colorspace */
            FALSE, /* No alpha-channel */
            bpp/3, /* Bits per RGB-component */
            width, height, /* Dimensions */
            3*width, /* Number of bytes between lines (ie stride) */
            NULL, NULL); /* Callbacks */

    /* Save the pixbuf contents to a jpeg file and check for
     * errors */
    if(!gdk_pixbuf_save(pixbuf, filename->str,
                "jpeg", &error, NULL))
    {
        g_warning("%s\n", error->message);
        g_error_free(error);
        gdk_pixbuf_unref(pixbuf);
        g_string_free(filename, TRUE);
        return FALSE;
    }

    /* Free allocated resources and return TRUE, which means
     * that the operation was successful */
    g_string_free(filename, TRUE);
    gdk_pixbuf_unref(pixbuf);
    return TRUE;
}
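The "photo-taken" and "photo-failed" application messages posted from buffer_probe_callback() are picked up by the bus watcher attached in initialize_pipeline(). That watcher is not reproduced in this section; a minimal sketch of what it could look like follows (the GUI reactions are placeholders, not the sample's actual behavior):

/* Hypothetical bus watcher: runs in the GTK+ main loop, so it is
 * safe to touch the GUI here, unlike in the GStreamer thread */
static gboolean bus_callback(GstBus *bus, GstMessage *message,
        AppData *appdata)
{
    if(GST_MESSAGE_TYPE(message) == GST_MESSAGE_APPLICATION)
    {
        const GstStructure *s = gst_message_get_structure(message);

        if(gst_structure_has_name(s, "photo-taken"))
            g_message("Photo saved");          /* placeholder GUI action */
        else if(gst_structure_has_name(s, "photo-failed"))
            g_warning("Saving photo failed");  /* placeholder GUI action */
    }
    else if(GST_MESSAGE_TYPE(message) == GST_MESSAGE_ERROR)
    {
        /* This branch also fires if the camera could not be opened,
         * for example because another application is holding it */
        GError *error = NULL;
        gst_message_parse_error(message, &error, NULL);
        g_warning("%s", error->message);
        g_error_free(error);
    }

    /* Returning TRUE keeps the watch installed */
    return TRUE;
}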
For more details, see the official documentation: http://maemo.org/development/documentation/manuals/4-0-x/how_to_use_camera_api/