I'd like to stream a user's webcam (from the browser) to a server, have the server manipulate the stream (run some C algorithms on the video), and send information back to the user.
I have looked closely at WebRTC and MediaCapture, and read the examples here: https://bitbucket.org/webrtc/codelab/overview .
However, these are made for peer-to-peer video chat. From what I have understood, the MediaStream from getUserMedia is transmitted via an RTCPeerConnection (with addStream); what I'd like to know is: can I use the same mechanism, but process the video stream on the server?
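To make that concrete, here is roughly the client-side setup I have in mind. It's only a sketch: sendToServer is a placeholder for whatever signalling transport the server would expose (e.g. a WebSocket), and the "remote peer" would be the server-side endpoint.

    // Capture the webcam and hand the stream to a peer connection.
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
    });

    navigator.mediaDevices.getUserMedia({ video: true, audio: false })
      .then((stream) => {
        // addTrack is the modern replacement for the deprecated addStream
        stream.getTracks().forEach((track) => pc.addTrack(track, stream));
        return pc.createOffer();
      })
      .then((offer) => pc.setLocalDescription(offer))
      .then(() => {
        // Send the offer to the server over the signalling channel (placeholder)
        sendToServer({ type: 'offer', sdp: pc.localDescription });
      })
      .catch((err) => console.error('getUserMedia / negotiation failed:', err));

    pc.onicecandidate = (event) => {
      if (event.candidate) {
        sendToServer({ type: 'candidate', candidate: event.candidate });
      }
    };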
Thanks in advance for your help
pc.onsignalingstatechange triggers when the signalling has been done right; pc.signalingState contains the current status. The same goes for the ICE engine: pc.oniceconnectionstatechange, along with pc.iceGatheringState and pc.iceConnectionState. You can find all this in the W3C spec. – MarijnS95
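For illustration, the handlers and properties mentioned in the comment can be wired up like this (a minimal sketch; pc is an RTCPeerConnection created elsewhere):

    // Log signalling-state changes, e.g. 'stable', 'have-local-offer', ...
    pc.onsignalingstatechange = () => {
      console.log('signaling state:', pc.signalingState);
    };

    // Log ICE progress, e.g. 'new', 'checking', 'connected', 'failed', ...
    pc.oniceconnectionstatechange = () => {
      console.log('ICE connection state:', pc.iceConnectionState);
      console.log('ICE gathering state:', pc.iceGatheringState);
    };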