3
votes

I am trying to build a mobile app that can stream video in both directions (i.e. something like video calling).

I looked into WebRTC, but it doesn't seem to be ready for native mobile apps yet; besides, what WebRTC does is let the browser capture camera and audio directly without requiring plugins. In a native mobile app, capturing camera and audio isn't the problem; what I really need is a low-latency, two-way transport layer. In many articles and places I've read about using WebRTC over WebSockets.

So I thought I could just stream the video over WebSockets. Is that correct, or am I missing something?

I understand there is one more difference: WebRTC is directly client-to-client, whereas WebSockets would be client-server-client. Is there any way to avoid that, and what would it mean in terms of latency?


1 Answer

1
votes

You're missing something.

  • webRTC works very well on mobile. There are example clients in the reference code (AppRTCDemo) at webrtc.org for both iOS and Android, and multiple apps out there. The most recent one to have been announced is appear.in.

  • getting the video and audio streams from the device is part of the media API (getUserMedia), not the webRTC API per se.

  • webRTC is really the p2p connection (RTCPeerConnection: transport and firewall traversal) plus the media engine (encoding, packetizing, encrypting) side of the equation, and exactly what you're looking for (see the client-side sketch after this list).

  • webSockets is just a transport mechanism. It does not handle firewall/NAT traversal, media processing, or packetizing/chunking, all of which you would then have to implement at the application level.

  • as far as signaling is concerned, webRTC does not specify/impose any protocol; the usual way to do it is to set up a signaling server. The app connects to that server using a web socket, XHR, or something else to do the original handshake/call set-up. Many apps abstract this with libraries like socket.io (a minimal relay sketch is shown below as well).
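
To make the pieces concrete, here is a minimal client-side sketch of how getUserMedia, RTCPeerConnection, and a WebSocket signaling channel fit together. The signaling URL (`wss://example.com/signal`) and the message shapes (`"offer"` / `"answer"` / `"candidate"`) are my own assumptions for illustration, not anything mandated by webRTC:

```typescript
// Assumes a signaling server at wss://example.com/signal that relays JSON
// messages between exactly two peers (see the relay sketch below).
const signaling = new WebSocket("wss://example.com/signal");

const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// Send our ICE candidates to the other peer through the signaling channel.
pc.onicecandidate = (ev) => {
  if (ev.candidate) {
    signaling.send(JSON.stringify({ type: "candidate", candidate: ev.candidate }));
  }
};

// Render the remote stream once media starts flowing (this part is p2p, not via the server).
pc.ontrack = (ev) => {
  (document.getElementById("remote") as HTMLVideoElement).srcObject = ev.streams[0];
};

// Caller side: call this once the signaling socket is open (e.g. on a button click).
async function startCall() {
  // getUserMedia belongs to the media capture API, not webRTC itself.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  stream.getTracks().forEach((t) => pc.addTrack(t, stream));

  // Create and send the SDP offer over the signaling channel.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send(JSON.stringify({ type: "offer", sdp: pc.localDescription }));
}

// Handle signaling messages relayed from the other peer.
signaling.onmessage = async (msg) => {
  const data = JSON.parse(msg.data);
  if (data.type === "offer") {
    await pc.setRemoteDescription(data.sdp);
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    signaling.send(JSON.stringify({ type: "answer", sdp: pc.localDescription }));
  } else if (data.type === "answer") {
    await pc.setRemoteDescription(data.sdp);
  } else if (data.type === "candidate") {
    await pc.addIceCandidate(data.candidate);
  }
};
```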
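
And a sketch of the signaling server side, which is nothing more than a dumb relay that forwards offers, answers, and ICE candidates between the peers. This one assumes the Node.js "ws" package purely for brevity; socket.io works just as well, as mentioned above:

```typescript
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });
const peers = new Set<WebSocket>();

wss.on("connection", (socket) => {
  peers.add(socket);

  // Relay every signaling message (offer/answer/candidate) to the other connected peer(s).
  socket.on("message", (data) => {
    for (const peer of peers) {
      if (peer !== socket && peer.readyState === WebSocket.OPEN) {
        peer.send(data.toString());
      }
    }
  });

  socket.on("close", () => peers.delete(socket));
});
```

Note that only this small handshake traffic goes client-server-client; once the offer/answer and candidates are exchanged, the audio/video itself flows peer-to-peer over the RTCPeerConnection, which is where the latency advantage over a pure WebSocket media pipe comes from.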