In this article we demonstrate 7 technologically different ways to display a video stream from an IP camera with RTSP support on a web page in a browser.
As a rule, browsers do not support RTSP, so the video stream is converted for a browser using an intermediate server.
Browsers do not support the RTMP protocol either, but guess who does? The old faithful Flash Player. It works well enough, and even though not every browser supports it, it can display the video stream.
The code of the player is written in Action Script 3 and looks as follows:
var nc:NetConnection = new NetConnection();
nc.connect("rtmp://192.168.88.59/live", obj); // obj – additional connection parameters passed to the server
var subscribeStream:NetStream = new NetStream(nc);
subscribeStream.play("rtsp://192.168.88.5/live.sdp");
In this example:
rtmp://192.168.88.59/live – the address of the intermediate server that fetches the RTSP video stream from the camera and converts it to RTMP.
rtsp://192.168.88.5/live.sdp – the RTSP address of the camera.
A slightly more verbose variant of the player, written in Flex and AS3, is available here.
This method looks as follows:
It is hard to find anyone willing to keep coding in Action Script 3 these days, so there is a method with an HTML wrapper that allows controlling the RTMP player from JavaScript. In this variant, Flash is loaded into the HTML page only to display the picture and play the sound.
var session = Flashphoner.createSession({urlServer: "wss://192.168.88.59:8443"});
session.createStream({name: "rtsp://192.168.88.5/live.sdp", display: myVideo}).play();
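The snippet above assumes that the Web SDK script is already loaded and that myVideo is a div element on the page into which the player renders. A minimal page sketch under those assumptions (the script path, the element id and the use of the ESTABLISHED callback are illustrative, not a verbatim copy of the official example) could look like this:
<script src="flashphoner.js"></script>
<div id="myVideo" style="width: 640px; height: 480px;"></div>
<script>
    // Initialize the Web SDK, open a session to the server, then play the RTSP stream
    Flashphoner.init({});
    var myVideo = document.getElementById("myVideo");
    Flashphoner.createSession({urlServer: "wss://192.168.88.59:8443"})
        .on(Flashphoner.constants.SESSION_STATUS.ESTABLISHED, function (session) {
            session.createStream({name: "rtsp://192.168.88.5/live.sdp", display: myVideo}).play();
        });
</script>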
The full source code of the player is here. And the method looks like this:
The RTMFP protocol also works inside Flash Player. The difference from RTMP is that RTMFP works on top of UDP, so it is much better suited for low-latency broadcasting.
The AS3 code of the player is identical to the RTMP version except for one letter, F, added in the line where the connection to the server is established.
var nc:NetConnection = new NetConnection();
nc.connect("rtmfp://192.168.88.59/live", obj); // obj – additional connection parameters passed to the server
var subscribeStream:NetStream = new NetStream(nc);
subscribeStream.play("rtsp://192.168.88.5/live.sdp");
Nevertheless, here is a screenshot of playback via RTMFP:
This method is identical to method 2, except that during initialization we specify the RTMFP protocol for the underlying Flash (swf object).
var session = Flashphoner.createSession({urlServer: "wss://192.168.88.59:8443", flashProto: "rtmfp"});
session.createStream({name: "rtsp://192.168.88.5/live.sdp", display: myVideo}).play();
Player picture:
In this case we do not use Flash at all; the video stream is played by the browser itself, without any third-party plugins. This method works both in Chrome and in Firefox for Android, where Flash is not available. WebRTC gives the lowest latency, under 0.5 seconds.
The source code of the player is the same:
var session = Flashphoner.createSession({urlServer: "wss://192.168.88.59:8443"});
session.createStream({name: "rtsp://192.168.88.5/live.sdp", display: myVideo}).play();
The script automatically detects WebRTC support, and if it is supported, the stream is played using WebRTC.
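The exact detection logic lives inside the API, but the idea can be illustrated with a simple capability check; this is a rough sketch, not Flashphoner's actual code:
// Rough sketch of a WebRTC capability check (not the API's real detection logic)
function hasWebRTC() {
    return typeof window.RTCPeerConnection === "function" ||
           typeof window.webkitRTCPeerConnection === "function";
}

// If WebRTC is unavailable, the player falls back to another transport (Flash or Websocket)
console.log(hasWebRTC() ? "playing via WebRTC" : "falling back to Flash / Websocket");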
WebRTC and Flash do not cover all browsers and platforms. For instance, the iOS Safari browser does not support them.
You can deliver a video stream to iOS Safari using Websocket transport (a TCP connection between the browser and the server). The RTSP video stream is then channelled through Websockets. Once the binary data is received, it can be decoded in JavaScript and rendered on an HTML5 Canvas element.
This is what the Websocket player does in the iOS Safari browser. The code of the player looks the same:
var session = Flashphoner.createSession({urlServer: "wss://192.168.88.59:8443"});
session.createStream({name: "rtsp://192.168.88.5/live.sdp", display: myVideo}).play();
This is somewhat similar to the Flash-based methods, where the swf element lies under HTML5. Here, a JavaScript application sits under HTML5: it fetches data via Websockets, decodes it and renders it on Canvas in multiple threads.
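The decoding pipeline itself is hidden inside the player, but the general idea of receiving binary data over a Websocket and painting frames onto a Canvas can be sketched as follows; the endpoint URL and the assumption that messages arrive as ready-to-draw RGBA frames are hypothetical:
// Conceptual sketch only – not the actual player implementation
var canvas = document.getElementById("canvas");
var ctx = canvas.getContext("2d");
var ws = new WebSocket("wss://192.168.88.59:8443/ws-player"); // hypothetical endpoint
ws.binaryType = "arraybuffer";
ws.onmessage = function (event) {
    // The real player first decodes H.264 in JavaScript; here we assume the message
    // already contains a decoded RGBA frame matching the canvas dimensions
    var pixels = new Uint8ClampedArray(event.data);
    ctx.putImageData(new ImageData(pixels, canvas.width, canvas.height), 0, 0);
};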
Here is how an RTSP stream rendered on Canvas looks in the iOS Safari browser:
When RTSP is converted to HLS, the video stream is divided into segments that are downloaded from the server and displayed in the HLS player.
As an HLS player we use video.js. The source code of the player can be downloaded here.
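For reference, a minimal video.js setup might look like the sketch below; the playlist URL is an assumption, since the actual HLS address depends on how the server exposes the converted stream:
<link rel="stylesheet" href="video-js.css">
<script src="video.js"></script>
<video id="hlsVideo" class="video-js vjs-default-skin" controls></video>
<script>
    var player = videojs("hlsVideo");
    player.src({
        // Hypothetical playlist URL – check the server documentation for the real format
        src: "https://192.168.88.59:8445/rtsp-stream/playlist.m3u8",
        type: "application/vnd.apple.mpegurl"
    });
    player.play();
</script>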
The player looks as follows:
The application retrieves the stream from the server via WebRTC. The goal of the server here is to convert RTSP to WebRTC and feed the result to the mobile application.
The Java-code of the player for Android is here and looks like this:
SessionOptions sessionOptions = new SessionOptions("wss://192.168.88.59:8443");
Session session = Flashphoner.createSession(sessionOptions);
StreamOptions streamOptions = new StreamOptions("rtsp://192.168.88.5/live.sdp");
Stream playStream = session.createStream(streamOptions);
playStream.play();
A test mobile app of the player can be installed from Google Play, and the sources of the application can be downloaded from here.
Here is how RTSP stream playback via WebRTC looks on an Asus Android tablet:
Just like its Android brethren, the iOS application fetches a video stream from the server via WebRTC.
The Objective-C code of the player looks as shown below:
FPWCSApi2SessionOptions *sessionOptions = [[FPWCSApi2SessionOptions alloc] init];
sessionOptions.urlServer = @"wss://192.168.88.59:8443";
FPWCSApi2Session *session = [FPWCSApi2 createSession:sessionOptions error:&error];
FPWCSApi2StreamOptions *streamOptions = [[FPWCSApi2StreamOptions alloc] init];
streamOptions.name = @"rtsp://192.168.88.5/live.sdp";
FPWCSApi2Stream *stream = [session createStream:streamOptions error:nil];
[stream play:&error];
You can download the source code of the player for iOS here.
And you can install the test application that uses the above code chunks from App Store. Operation of the player with the RTSP stream looks as follows:
Let’s put the results together into a summary table:
# | Display method | Best for | Latency
1 | RTMP | Legacy Flash, Flex or Adobe Air applications | medium
2 | RTMP + HTML5 | IE, Edge, Mac Safari browsers with Flash Player installed | medium
3 | RTMFP | Legacy Flash, Flex or Adobe Air applications that require low latency | low
4 | RTMFP + HTML5 | IE, Edge, Mac Safari browsers with Flash Player installed, when low latency is crucial | low
5 | WebRTC | Chrome, Firefox and Opera browsers on desktops and Android mobile devices, when real-time playback is crucial | real-time
6 | Websocket | Browsers that lack support for Flash and WebRTC, when low to medium latency is required | medium
7 | HLS | Any browser, as long as latency is not important | high
8 | Android app, WebRTC | Native mobile applications for Android that require real-time latency | real-time
9 | iOS app, WebRTC | Native mobile applications for iOS that require real-time latency | real-time
For testing these methods we used Web Call Server 5, which can convert an RTSP stream and deliver it in all nine ways described above.
Web Call Server 5 – a server to broadcast an RTSP stream.
Flash Streaming – an example swf application playing streams via RTMP and RTMFP. Corresponds to methods 1 and 3.
Source – the source code of the swf application in Flex / AS3.
Player – an example web application that plays an RTSP stream via RTMP, RTMFP, WebRTC and Websocket. Methods 2, 4, 5, 6.
Source – the source code of the web player.
HLS player – an example web player playing HLS. Corresponds to method 7.
Source – the source code of the HLS player.
Android WebRTC player – the example of a mobile application that plays a video stream via WebRTC. Method 8.
Source – the source code of the mobile application.
iOS WebRTC player – the example of a mobile application that plays a video stream via WebRTC. Method 9.
Source – the source code of the mobile application.