2
votes

I need to stream live h.264-encoded video from an IP camera to the browser, while supporting all common browsers and mobile devices (i.e. Android, Firefox, Chrome, IE, Safari (Mac OS and iOS)), and while keeping bandwidth requirements and latency to a minimum.

MPEG-DASH requires browser support for Media Source Extensions, which are NOT supported by iOS. So that's out.

HLS is only supported by Safari and Edge.

Also DASH seems to impose a latency of several seconds, which is not preferable.

I would like to be able to chunk the incoming h.264 data (i.e. fragmented MP4), pass the chunked data to the browser via Websockets, then dump the chunks into some sort of player as they arrive.

Broadway and its forks are JavaScript H.264 decoders, and there is a Broadway-stream project that supports streams instead of files, but the docs are poor and I can only find examples of streaming when the source is not live.

The most pressing question is: how do I hand the "chunked data" to a player or Video HTML element as it arrives at the browser?

I think the ideal setup would be to

  1. Use ffmpeg to transcode the original video to a chunked format (fMP4)
  2. Pipe the chunked output to a Node JS app which emits each chunk out through a Websocket to all connected viewers
  3. Viewers' browsers dump each incoming chunk into some sort of decoder which renders the video.

I'm clear up to the point of handing the received chunks to a video decoder. How can that be done without depending on Media Source Extensions, and allowing viewers to join the stream at random times?

Low bandwidth, low latency, all platforms: that's the holy grail of streaming. What exactly do you mean by "chunk" in your terminology? The term "chunk" has different meanings in HTTP chunked transfer and in low-latency CMAF (if you don't know low-latency CMAF, reading up on it may answer a lot of your questions). – szatmary
Also, in browsers, there is no "some sort of decoder". MSE and WebRTC are all you get. – szatmary
Yes, there are software decoders like Broadway. – Ryan Griggs
Regarding chunking, see fMP4, which I referenced above. – Ryan Griggs
Make sure you understand the H.264 license. You will likely have to pay MPEG LA to use such a decoder (every download counts, even if the same user downloaded it multiple times). So by "chunk" you mean one fMP4 fragment? Because each fragment can include several CMAF chunks, and each CMAF chunk can be split into HTTP chunks. – szatmary

1 Answer

1
vote

You are contradicting yourself a little: low latency and chunked data can't go together; it's one or the other. When you accumulate a chunk N seconds long, you introduce N seconds of latency.

So if you need low-latency playback of your H.264 live stream that will play in browsers on all devices, your only choice is WebRTC. Chunk-based streaming like HLS or DASH will not help you.

Media Source Extensions via WebSockets is another alternative: chunks are streamed over a WebSocket to MSE in the browser, but you need to send chunks 30–100 ms long to stay at sub-second latency. It also doesn't work on iOS.
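For browsers that do support MSE, the WebSocket-to-MSE path looks roughly like the sketch below (the codec string and URL are placeholders). The queue exists because `SourceBuffer.appendBuffer()` throws if called while a previous append is still updating, so incoming chunks must be held until the `updateend` event fires:

```javascript
// Queue chunks into any object exposing appendBuffer() and an
// 'updateend' event (a real SourceBuffer, or a mock for testing).
function makeFeeder(sourceBuffer) {
  const queue = [];
  let busy = false;
  function pump() {
    if (busy || queue.length === 0) return;
    busy = true;
    sourceBuffer.appendBuffer(queue.shift()); // one append in flight at a time
  }
  sourceBuffer.addEventListener('updateend', () => { busy = false; pump(); });
  return (chunk) => { queue.push(chunk); pump(); };
}

// Browser wiring (illustrative; the MIME/codec string must match your encode):
// const ms = new MediaSource();
// video.src = URL.createObjectURL(ms);
// ms.addEventListener('sourceopen', () => {
//   const sb = ms.addSourceBuffer('video/mp4; codecs="avc1.64001f"');
//   const feed = makeFeeder(sb);
//   const ws = new WebSocket('wss://example.invalid/stream');
//   ws.binaryType = 'arraybuffer';
//   ws.onmessage = (e) => feed(new Uint8Array(e.data));
// });
```

The first message each client receives must be the init segment (ftyp+moov); after that, any moof+mdat fragment starting at a keyframe is playable, which is what lets viewers join mid-stream.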

Here you can compare the latency of WebRTC and MSE with a live H.264 IP camera: http://umediaserver.net/umediaserver/demohtml5WebRTCplayer.html http://umediaserver.net/umediaserver/demohtml5MSEplayer.html