I need to stream live H.264-encoded video from an IP camera to the browser, while supporting all common browsers and mobile devices (i.e. Android, Firefox, Chrome, IE, and Safari on both macOS and iOS), and while keeping bandwidth requirements and latency to a minimum.
MPEG-DASH requires browser support for Media Source Extensions, which are NOT supported by iOS. So that's out.
HLS is natively supported only by Safari and Edge.
DASH also seems to impose several seconds of latency, which is undesirable.
I would like to chunk the incoming H.264 data (i.e. into fragmented MP4), pass the chunks to the browser via WebSockets, and then feed them into some sort of player as they arrive.
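For the chunking step, ffmpeg can already emit fragmented MP4 straight to stdout. Something like this should work (the RTSP URL is a placeholder for my camera's actual address; -c:v copy remuxes without re-encoding, and the movflags make the output streamable):

```
ffmpeg -i rtsp://camera.local/stream \
  -c:v copy -an \
  -f mp4 -movflags frag_keyframe+empty_moov+default_base_moof \
  pipe:1
```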
Broadway and its forks are JavaScript H.264 decoders, and the Broadway-stream project supports streams instead of files, but the docs are poor and I can only find examples where the source is not live.
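From reading Broadway's Player source, the browser-side usage appears to be roughly the following (I'm assuming the Player.js/Decoder.js API from the Broadway repo, and the WebSocket URL is a placeholder). Note that Broadway decodes a raw Annex-B H.264 elementary stream rather than MP4 boxes, so this path would want ffmpeg's -f h264 output instead of fMP4:

```js
// Minimal sketch: feed live NAL units from a WebSocket into Broadway.
// Assumes Player.js (and Decoder.js for the worker) are already loaded.
const player = new Player({ useWorker: true, workerFile: 'Decoder.js' });
document.body.appendChild(player.canvas); // Broadway renders to a canvas

const ws = new WebSocket('ws://localhost:8080'); // placeholder endpoint
ws.binaryType = 'arraybuffer';
ws.onmessage = (event) => {
  // Each message should carry one or more raw Annex-B H.264 NAL units.
  player.decode(new Uint8Array(event.data));
};
```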
The most pressing question is: how do I hand the chunked data to a player or an HTML video element as it arrives at the browser?
I think the ideal setup would be to:
- Use ffmpeg to transcode the original video into a chunked format (fMP4), as in the command above
- Pipe the chunked output to a Node.js app that broadcasts each chunk over a WebSocket to all connected viewers (see the sketch after this list)
- Viewers' browsers feed each incoming chunk into some sort of decoder, which renders the video.
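A rough sketch of the first two steps, using Node's child_process and the ws package (the camera URL is a placeholder, error handling omitted):

```js
const { spawn } = require('child_process');
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

// Same idea as the ffmpeg command above: copy the camera's H.264
// and fragment it into fMP4 on stdout.
const ffmpeg = spawn('ffmpeg', [
  '-i', 'rtsp://camera.local/stream', // placeholder camera address
  '-c:v', 'copy', '-an',
  '-f', 'mp4',
  '-movflags', 'frag_keyframe+empty_moov+default_base_moof',
  'pipe:1',
]);

// Broadcast every chunk to all connected viewers as a binary frame.
ffmpeg.stdout.on('data', (chunk) => {
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(chunk);
  }
});
```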
I'm clear up to the point of handing the received chunks to a video decoder. How can that be done without depending on Media Source Extensions, while still allowing viewers to join the stream at random times?
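My guess for the random-join part is that the server would at least have to cache the stream's initialization data (the ftyp/moov boxes for fMP4, or SPS/PPS for raw H.264) and replay it to each new connection before the live chunks, continuing the hypothetical server above:

```js
// Naive sketch: assume the first stdout chunk contains the whole init
// segment (not guaranteed in practice) and replay it to late joiners.
let initSegment = null;
ffmpeg.stdout.once('data', (chunk) => { initSegment = chunk; });

wss.on('connection', (client) => {
  if (initSegment) client.send(initSegment);
});
```

But even with that, I don't know how a decoder is supposed to pick up mid-stream before the next keyframe arrives, hence the question.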