125
votes

PROBLEM:

WebRTC gives us peer-to-peer video/audio connections. It is perfect for p2p calls and hangouts. But what about broadcasting (one-to-many, for example, 1-to-10000)?

Let's say we have a broadcaster "B" and two attendees "A1", "A2". At first glance this seems solvable: we just connect B with A1 and then B with A2. So B sends one video/audio stream directly to A1 and another stream to A2. In other words, B sends its stream twice.

Now let's imagine there are 10000 attendees: A1, A2, ..., A10000. That means B must send 10000 streams. Each stream is ~40 KB/s, so B needs ~400 MB/s of outgoing bandwidth to maintain this broadcast. Unacceptable.
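For reference, a quick back-of-the-envelope check (a JavaScript sketch, assuming the ~40 KB/s per-stream figure above) shows how fast the required upload grows:

    // Upload bandwidth B needs when every attendee gets its own copy of the stream.
    // Assumes the ~40 KB/s per-stream estimate from the question.
    const STREAM_KBPS = 40; // KB/s per attendee

    for (const attendees of [2, 100, 1000, 10000]) {
      const totalKBps = attendees * STREAM_KBPS;
      console.log(`${attendees} attendees -> ${totalKBps / 1000} MB/s upload`);
    }
    // 10000 attendees -> 400 MB/s of upload, far beyond a typical uplink.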

ORIGINAL QUESTION (OBSOLETE)

Is it possible to solve this somehow, so that B sends only one stream to some server and attendees just pull this stream from that server? Yes, this means the outgoing bandwidth on that server must be high, but I can maintain that.

Or does this defeat the whole idea of WebRTC?

NOTES

Flash does not work for my needs because of its poor UX for end customers.

SOLUTION (NOT REALLY)

26.05.2015 - At the moment there is no solution for scalable WebRTC broadcasting that avoids media servers entirely. There are server-side solutions as well as hybrid ones (p2p + server-side, depending on conditions) on the market.

There are some promising technologies, though, such as https://github.com/muaz-khan/WebRTC-Scalable-Broadcast, but they still need to address these possible issues: latency, overall network connection stability, and the scalability formula (they are probably not infinitely scalable).

SUGGESTIONS

  1. Decrease CPU/bandwidth usage by tweaking both the audio and video codecs (see the sketch below);
  2. Get a media server.
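For suggestion 1, here is a minimal sketch (assuming an already-negotiated RTCPeerConnection called pc) of capping the outgoing video bitrate with the standard RTCRtpSender.setParameters() API, which reduces the per-attendee cost without changing the topology:

    // Cap the video bitrate of an existing RTCPeerConnection.
    // "pc" is assumed to be an already-negotiated RTCPeerConnection.
    async function capVideoBitrate(pc, maxBitrateBps) {
      const sender = pc.getSenders().find((s) => s.track && s.track.kind === 'video');
      if (!sender) return;

      const params = sender.getParameters();
      if (!params.encodings || params.encodings.length === 0) {
        params.encodings = [{}]; // some browsers return an empty encodings list
      }
      params.encodings[0].maxBitrate = maxBitrateBps;
      await sender.setParameters(params);
    }

    // Usage: limit outgoing video to roughly 300 kbit/s.
    // capVideoBitrate(pc, 300000);

That only shrinks each stream, though; it does not change the fact that B still uploads one copy per attendee.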
"The only way to build a scalable app is to use a server side solution." That seems pretty clear... As for WebRTC, it was never intended for large-scale broadcasts. Use something that supports multicast for that, or if you have to go over the Internet, a server-based one-to-one connection, as ISPs do not route multicast.Dark Falcon
Why not use WebRTC from client to server? The issue is in distribution, in that the client's connection can't handle it, so send one steam to the server and stream to clients from there. Bandwidth is going to be expensive, but you cannot get around either sending a single stream to each user or having the user send a stream to other users.Dark Falcon
There are at least two companies that I am aware of that are trying to do webrtc-based p2p video delivery: affovi.com/rtcplayer.html - mostly for live video; and peer5.com - mostly for VOD.Svetlin Mladenov
@igorpavlov You may wanna check: github.com/muaz-khan/WebRTC-Scalable-Broadcast Though it works only in chrome, and no audio-broadcast yet.Muaz Khan
There is no way to reach that scalability without a MCU of some sort. WebRTC is designed to be Peer-to-Peer. You cannot broadcast from it without absolutely slamming your broadcaster(with a unique peer connection for each stream, which interns, is another stream being encoded). As for relaying the media from peer-to-peer, that could be possible, but of course, this would incur additional latency for every peer added to the stream later. For quality, and scalability, having a webrtc MCU server is the only realistic solution.Benjamin Trent

11 Answers

73
votes

As has been pretty much covered here already, what you are trying to do is not possible with plain, old-fashioned WebRTC (strictly peer-to-peer), because, as was said earlier, WebRTC connections negotiate encryption keys to encrypt data for each session. So your broadcaster (B) will indeed need to upload its stream as many times as there are attendees.

However, there is a fairly simple solution which works very well (I have tested it): a WebRTC gateway. Janus is a good example. It is completely open source (GitHub repo here).

This works as follows: your broadcaster contacts the gateway (Janus) which speaks WebRTC. So there is a key negotiation: B transmits securely (encrypted streams) to Janus.

Now, when attendees connect, they connect to Janus: again, WebRTC negotiation, secured keys, etc. From then on, Janus emits the stream back to each attendee.

This works well because the broadcaster (B) only uploads its stream once, to Janus. Janus then decodes the data using its own key, has access to the raw data (that is, RTP packets) and can emit those packets back to each attendee (Janus takes care of encryption for you). And since you put Janus on a server, it has a great upload bandwidth, so you will be able to stream to many peers.

So yes, it does involve a server, but that server speaks WebRTC, and you "own" it: you implement the Janus part, so you don't have to worry about data corruption or a man in the middle. Well, unless your server is compromised, of course. But there is only so much you can do.

To show you how easy it is to use: in Janus you have a function called incoming_rtp() (and incoming_rtcp()) that you can call, which gives you a pointer to the RT(C)P packets. You can then send them to each attendee (they are stored in sessions that Janus makes very easy to use). Look here for one implementation of the incoming_rtp() function; a couple of lines below you can see how to transmit the packets to all attendees, and here you can see the actual function to relay an RTP packet.

It all works pretty well, and the documentation is fairly easy to read and understand. I suggest you start with the "echotest" example; it is the simplest, and you can understand the inner workings of Janus from it. Then edit the echo test file to make your own, because there is a lot of redundant code to write, so you might as well start from a complete file.
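To make the relay idea concrete without diving into Janus' plugin API, here is a toy Node.js sketch (this is not Janus code; the attendee addresses are made up, and a real gateway also handles ICE, DTLS-SRTP and RTCP) of the core fan-out: receive each RTP packet once from the broadcaster and resend it to every attendee:

    // Toy fan-out relay: one incoming RTP packet -> N outgoing copies.
    const dgram = require('dgram');

    const attendees = [
      // Hypothetical attendee transport addresses, normally learned via signaling.
      { address: '203.0.113.10', port: 40000 },
      { address: '203.0.113.11', port: 40002 },
    ];

    const socket = dgram.createSocket('udp4');

    socket.on('message', (rtpPacket) => {
      for (const peer of attendees) {
        socket.send(rtpPacket, peer.port, peer.address);
      }
    });

    socket.bind(5004, () => console.log('Relay listening for RTP on udp/5004'));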

Have fun! Hope I helped.

11
votes

As @MuazKhan noted above:

https://github.com/muaz-khan/WebRTC-Scalable-Broadcast

It works only in Chrome, with no audio broadcast yet, but it seems to be a first solution.

A Scalable WebRTC peer-to-peer broadcasting demo.

This module simply initializes socket.io and configures it in such a way that a single broadcast can be relayed to an unlimited number of users without any bandwidth/CPU usage issues. Everything happens peer-to-peer!


This should definitely be possible to complete.
Others are also able to achieve this: http://www.streamroot.io/

7
votes

AFAIK the only current implementation of this that is relevant and mature is Adobe Flash Player, which has supported p2p multicast for peer-to-peer video broadcasting since version 10.1.

http://tomkrcha.com/?p=1526.

6
votes

"Scalable" broadcasting is not possible on the Internet, because the IP UDP multicasting is not allowed there. But in theory it's possible on a LAN.
The problem with Websockets is that you don't have access to RAW UDP by design and it won't be allowed.
The problem with WebRTC is that it's data channels use a form of SRTP, where each session has own encryption key. So unless somebody "invents" or an API allows a way to share one session key between all clients, the multicast is useless.

5
votes

There is also the peer-assisted delivery solution, meaning the approach is hybrid: both the server and the peers help distribute the resource. That's the approach peer5.com and peercdn.com have taken.

If we're talking specifically about live broadcast it'll look something like this:

  1. The broadcaster sends the live video to a server.
  2. The server saves the video (and usually also transcodes it to all the relevant formats).
  3. Metadata about this live stream is created, compatible with HLS, HDS or MPEG-DASH.
  4. Consumers browse to the relevant live stream, where the player gets the metadata and knows which chunks of the video to get next.
  5. At the same time, the consumer is connected to other consumers (via WebRTC).
  6. The player then downloads the relevant chunk either directly from the server or from peers (see the sketch below).

Following such a model can save up to ~90% of the server's bandwidth, depending on the bitrate of the live stream and the combined uplink of the viewers.
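To illustrate step 6, here is a minimal sketch of the player-side decision, assuming a hypothetical helper requestChunkFromPeers() that asks connected viewers for a segment over an RTCDataChannel and resolves to null if nobody delivers it in time:

    // Try the P2P mesh first, fall back to the server so playback never stalls.
    async function loadSegment(segmentUrl) {
      const fromPeer = await requestChunkFromPeers(segmentUrl, { timeoutMs: 500 });
      if (fromPeer) {
        return fromPeer; // ArrayBuffer served by another viewer, no server cost
      }
      const response = await fetch(segmentUrl); // origin/CDN fallback
      return response.arrayBuffer();
    }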

Disclaimer: the author works at Peer5.

5
votes

My master's thesis is focused on the development of a hybrid CDN/P2P live streaming protocol using WebRTC. I've published my first results at http://bem.tv

Everything is open source and I'm looking for contributors! :-)

2
votes

The answer from Angel Genchev seems to be correct; however, there is a theoretical architecture that allows low-latency broadcasting via WebRTC. Imagine B (broadcaster) streams to A1 (attendee 1). Then A2 (attendee 2) connects. Instead of streaming from B to A2, A1 starts relaying the video it receives from B on to A2. If A1 disconnects, then A2 starts receiving from B.

This architecture could only work if there were no latencies and no connection timeouts, so it is right in theory but not in practice.
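For what it's worth, the relaying step itself is easy to express with the standard API. A rough sketch, assuming pcFromB is A1's existing connection to the broadcaster and pcToA2 is a new connection A1 opens towards A2 (signaling omitted):

    const pcToA2 = new RTCPeerConnection();

    // Forward whatever A1 receives from B straight on to A2.
    pcFromB.ontrack = ({ track, streams }) => {
      pcToA2.addTrack(track, ...streams);
    };

    // If A1's link to B fails, A2 has to renegotiate with B (or another peer),
    // which is exactly where the latency and timeout problems appear.
    pcFromB.oniceconnectionstatechange = () => {
      if (pcFromB.iceConnectionState === 'failed') {
        // tellA2ToReconnect(); // hypothetical signaling step
      }
    };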

At the moment I am using a server-side solution.

2
votes

I'm developing a WebRTC broadcasting system using the Kurento Media Server. Kurento supports several streaming protocols, such as RTSP, WebRTC and HLS. It also works well in terms of real-time delivery and scaling.

However, Kurento doesn't support RTMP, which is what YouTube and Twitch use at the moment. One of the problems for me is the number of concurrent users this can handle.

Hope it helps.

1
votes

You are describing using WebRTC with a one-to-many requirement. WebRTC is designed for peer-to-peer streaming; however, there are configurations that will let you benefit from the low latency of WebRTC while delivering video to many viewers.

The trick is to not tax the streaming client with every viewer and, as you mentioned, to have a "relay" media server. You can build this yourself, but honestly the best solution is often to use something like Wowza's WebRTC Streaming product.

To stream efficiently from a phone you can use Wowza's GoCoder SDK, but in my experience a more advanced SDK like StreamGears works best.

0
votes

Peer1 is the only peer that invokes getUserMedia(), i.e. peer1 creates the room.

  1. So, peer1 captures media and starts the room.
  2. Peer2 joins the room, gets the stream (data) from peer1, and also opens a parallel connection named "peer2-connection".
  3. Peer3 joins the room, gets the stream (data) from peer2, and also opens a parallel connection named "peer3-connection", and so on.

This process continues as more and more peers get connected to each other.

In this way, a single broadcast can be transferred to an unlimited number of users without any bandwidth/CPU problems on the broadcaster (see the signaling sketch below).

The description above is referenced from Link.
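This is not the linked module's actual code, but a rough sketch of the signaling side of such a chain on a Socket.IO server: every newcomer is pointed at the most recently joined peer instead of at the broadcaster (the event names here are made up):

    const { Server } = require('socket.io');
    const io = new Server(3000);

    const lastPeerInRoom = {}; // room name -> socket.id of the newest peer

    io.on('connection', (socket) => {
      socket.on('join-room', (room) => {
        socket.join(room);
        const upstream = lastPeerInRoom[room]; // broadcaster or previous viewer
        if (upstream) {
          // Ask the upstream peer to open an RTCPeerConnection to the newcomer.
          io.to(upstream).emit('new-downstream-peer', socket.id);
        }
        lastPeerInRoom[room] = socket.id;
      });

      // Relay offers, answers and ICE candidates between the paired peers.
      socket.on('signal', ({ to, data }) => {
        io.to(to).emit('signal', { from: socket.id, data });
      });
    });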

0
votes

I am working on a relay version of WebRTC, but I am not sure if it will work. My test is just for one user, Johnny, to see if that stream can be relayed to other users.

  1. We have two browser windows open. The first is user Johnny, the second is the special user Relay.
  2. In the display you will have a local and a remote video element for testing.
  3. When you start browsing, the users connected to the hub are automatically displayed in the browser window.
  4. Click on user Relay in the first window; that user will show up in the remote video element of the first browser window, and Johnny will show up in the remote video element of the second browser window.
  5. Now comes the big trick: all other users that want to connect to Johnny will have to connect to the remote stream of the special user Relay. This example is with one user, but the relay window could hold more connections (RTCPeerConnections) for more users to connect to.
  6. The relay browser window acts as the server for the other users. All users connect to the relay browser window, and an RTCPeerConnection is created for each connected user.

In my example I visualize it with <video> elements, but in the relay browser window RTCPeerConnections alone should be enough.

Is this idea logical, or am I missing something?