I am trying to build a mobile app that can stream video in both directions (i.e. something like video calling).
I looked into WebRTC, but it doesn't seem ready for native mobile apps yet. In any case, what WebRTC mainly solves is letting the browser capture camera and audio directly without plugins; in a native mobile app, capturing camera and audio isn't an issue, and what's really needed is a low-latency, two-way (full-duplex) transport layer. In many articles and places I also read about using WebRTC over WebSockets.
So I thought I could stream the video using WebSockets, along the lines of the sketch below. Is that a reasonable approach, or am I missing something?
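To make it concrete, this is roughly what I have in mind on the sending side (Android/Kotlin, using OkHttp's WebSocket client). It's only a sketch: the `onEncodedFrame` callback is hypothetical and would really come from a MediaCodec encoder wired to the camera, and the server URL is a placeholder.

```kotlin
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.Response
import okhttp3.WebSocket
import okhttp3.WebSocketListener
import okio.ByteString
import okio.ByteString.Companion.toByteString

class FrameStreamer(private val serverUrl: String) {
    private val client = OkHttpClient()
    private var socket: WebSocket? = null

    fun connect() {
        val request = Request.Builder().url(serverUrl).build()
        socket = client.newWebSocket(request, object : WebSocketListener() {
            override fun onOpen(webSocket: WebSocket, response: Response) {
                // connection is up, we can start pushing frames
            }

            override fun onMessage(webSocket: WebSocket, bytes: ByteString) {
                // encoded frames from the other party arrive here;
                // hand them to a decoder/renderer
            }
        })
    }

    // Hypothetical encoder callback: called with each encoded video frame
    // (e.g. H.264 NAL units from MediaCodec)
    fun onEncodedFrame(frame: ByteArray) {
        socket?.send(frame.toByteString())
    }
}

// usage (placeholder URL):
// val streamer = FrameStreamer("wss://example.com/stream")
// streamer.connect()
```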
I understand there is one more difference: WebRTC is directly client-to-client (peer-to-peer), whereas a WebSocket connection would be client-server-client. Is there any way to avoid the server hop, and what would that mean in terms of latency?
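For reference, by "client-server-client" I mean something like the relay below, where the server just forwards each binary frame from one caller to the other. This is only a rough sketch (assuming Ktor 2.x; the `/call` path is a placeholder, and a real server would need rooms, auth, and backpressure handling):

```kotlin
import io.ktor.server.application.install
import io.ktor.server.engine.embeddedServer
import io.ktor.server.netty.Netty
import io.ktor.server.routing.routing
import io.ktor.server.websocket.DefaultWebSocketServerSession
import io.ktor.server.websocket.WebSockets
import io.ktor.server.websocket.webSocket
import io.ktor.websocket.Frame
import java.util.Collections

fun main() {
    // all currently connected callers
    val peers = Collections.synchronizedSet(mutableSetOf<DefaultWebSocketServerSession>())

    embeddedServer(Netty, port = 8080) {
        install(WebSockets)
        routing {
            webSocket("/call") {
                peers += this
                try {
                    for (frame in incoming) {
                        if (frame is Frame.Binary) {
                            // relay the frame to every other connected peer
                            peers.filter { it != this }.forEach { peer ->
                                peer.send(Frame.Binary(true, frame.data))
                            }
                        }
                    }
                } finally {
                    peers -= this
                }
            }
        }
    }.start(wait = true)
}
```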