Streaming OpenCV Videos Over the Network



Table of Contents:

  1. Introduction
  2. Design of the System
  3. Implementation of the Server-side
    1. Frame Grabber
    2. Stream Server
  4. Implementation of the Client-side
    1. Stream Client
    2. Video Player
  5. Compiling
  6. Experiments & Gallery
  7. Summary
  8. Resources

1. Introduction

One of the questions OpenCV users ask most often is, "How can I stream video over the network so I can process it on a different computer?" OpenCV does not provide such a function out of the box (or so I thought), so we have to write custom code to accomplish this task.

If you have experience with network programming, this should be quite easy. It works just like sending text files over the network; only this time we need to convert the received data into OpenCV's IplImage format.
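To give an idea of that conversion: once the raw grayscale bytes of a whole frame have arrived, they can be wrapped in an IplImage header without copying any pixels. The helper below is only a sketch using OpenCV's old C API; its name and signature are mine, not something taken from the code later in this tutorial.

    #include <opencv/cv.h>

    /* Wrap width*height received grayscale bytes in an IplImage header.
     * No pixel data is copied; buf must stay valid while img is in use. */
    IplImage *buffer_to_image(char *buf, int width, int height)
    {
        IplImage *img = cvCreateImageHeader(cvSize(width, height), IPL_DEPTH_8U, 1);
        cvSetData(img, buf, width);        /* widthStep == width for one channel */
        return img;
    }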

In this tutorial, I will explain the system I built to stream OpenCV videos over a TCP/IP network. Keep in mind that there are many ways to achieve this, not to mention freely available video streaming tools such as FFmpeg and VLC. This is not about which one is better; it is just about sharing the knowledge.

2. Design of the System

The system follows the client-server model. The computer that has the video input acts as the server. It waits for a client to connect and streams the video once the connection has been established. The diagram is shown below.

Fig 1. Streaming OpenCV videos over the network.

The diagram above shows several clients connecting to the server and receiving the streamed video simultaneously. However, to keep things simple, I made the server accept only one client at a time.

If we look deeper into the server side, it consists of two parts: one that reads the video input in a loop, and one that waits for the client and sends the video frames. These two parts cannot live in a single sequential block of code, since they have to run simultaneously. To overcome this, we have to write a multi-threaded program.

The same also applies to the client side.

But another problem occurs: Windows and Unix-like systems handle threads differently. While it is possible to write code that compiles and runs on both systems (using the C preprocessor), it is not necessary here. Let's just use Unix and leave Windows aside.

In addition, I use Berkeley sockets, which are widely available on Unix-like systems, for the networking code.
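For reference, that part of the networking boils down to the usual socket(), bind(), listen(), accept() sequence. The helper below is only a sketch (error checking is omitted, and the function name is mine); closing the listening socket right after accept() reflects the one-client-at-a-time simplification described earlier.

    #include <netinet/in.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    /* Sketch of the server-side socket setup: wait for a single client on
     * the given port and return the connected socket. */
    int wait_for_client(unsigned short port)
    {
        int listener = socket(AF_INET, SOCK_STREAM, 0);

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family      = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port        = htons(port);

        bind(listener, (struct sockaddr *)&addr, sizeof(addr));
        listen(listener, 1);               /* queue at most one pending client */

        int client = accept(listener, NULL, NULL);
        close(listener);                   /* one client at a time, as above */
        return client;
    }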

In summary, to make this as simple as possible, we keep these things in mind:

  • The operating system is Unix-like. Therefore we're using POSIX Threads and Berkeley sockets. (If you use Windows, install Cygwin first.)
  • Only one client is connected at a time.
  • The client knows the width and the height of the expected frame, so it knows exactly how many bytes make up one frame (a small helper for this is sketched after this list).
  • The client receives grayscale (single-channel) frames.
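One consequence of the last two points is worth spelling out: one frame on the wire is exactly width * height bytes, and since TCP may deliver those bytes in several pieces, the client has to keep calling recv() until it has them all. A small helper like the following does that (an illustration of the idea, not code taken from the article).

    #include <sys/socket.h>

    /* Receive exactly len bytes into buf, looping because recv() may
     * return fewer bytes than requested.  Returns 0 on success, -1 on
     * error or when the connection is closed. */
    int recv_all(int sock, char *buf, int len)
    {
        int got = 0;
        while (got < len) {
            int n = recv(sock, buf + got, len - got, 0);
            if (n <= 0)
                return -1;
            got += n;
        }
        return 0;
    }

On the client side, len would simply be width * height for one grayscale frame.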

3. Implementation of the Server-side

The server side is the computer that has the video input to be streamed. As I mentioned before, it consists of two parts: one that reads the video input in a loop, and one that waits for the client to connect and sends the video frames.

Fig 2. Stream server diagram.

In the diagram above, we see two threads running on the server side: Frame Grabber and Stream Server. The input is taken from a webcam, but you can use other sources too, such as an AVI file.

Frame Grabber grabs a frame from the webcam and stores it in a global variable img. Stream Server waits for a client to connect. Once the connection has been established, it repeatedly sends img to the client whenever a newer version of img is available.
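In other words, the two threads have to share some state and a way to signal that a newer frame has arrived. As a sketch of the general idea (the variable names below are illustrative, not necessarily those used in stream_server.c), a mutex plus a flag is enough:

    #include <pthread.h>
    #include <opencv/cv.h>

    /* State shared by the two threads.  The Frame Grabber sets
     * is_data_ready after writing a new frame into img; the Stream Server
     * sends img and clears the flag, both while holding the mutex. */
    IplImage        *img           = NULL;
    int              is_data_ready = 0;
    pthread_mutex_t  mutex         = PTHREAD_MUTEX_INITIALIZER;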

The full listing of the server side is in stream_server.c. Next, we'll look at the details of both threads.

3.a. Frame Grabber

This is the main thread of the server side. It's just like the usual code to display video from a webcam. Below is the code snippet from stream_server.c.
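In outline, the grabber opens the capture device, converts each frame to grayscale into the shared img, and marks it as ready for the Stream Server. The following is only a sketch of that loop, continuing the shared variables declared above; it is not guaranteed to match the original listing line for line.

    #include <pthread.h>
    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    void *streamServer(void *arg);   /* the Stream Server thread, covered next */

    int main(void)
    {
        /* Open the webcam; a file source would use cvCaptureFromAVI() instead.
         * Grab one frame first just to learn the frame size. */
        CvCapture *capture = cvCaptureFromCAM(0);
        IplImage  *frame   = cvQueryFrame(capture);

        img = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 1);  /* shared, grayscale */

        pthread_t thread;
        pthread_create(&thread, NULL, streamServer, NULL);       /* start Stream Server */

        cvNamedWindow("stream_server", CV_WINDOW_AUTOSIZE);

        while ((frame = cvQueryFrame(capture)) != NULL) {
            pthread_mutex_lock(&mutex);
            cvCvtColor(frame, img, CV_BGR2GRAY);   /* store a grayscale copy in img */
            is_data_ready = 1;                     /* tell the Stream Server it's new */
            pthread_mutex_unlock(&mutex);

            cvShowImage("stream_server", img);
            if (cvWaitKey(30) == 'q') break;       /* also gives HighGUI time to draw */
        }

        cvReleaseCapture(&capture);
        return 0;
    }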
