4 votes

I am using WebRTC for peer-to-peer video communication, and I would like to apply video filters to local webcam video before sending it to a remote peer.

The approach that I am considering is to send the local webcam video to a canvas element, where I will apply JavaScript filters to the video. Then I would like to stream the video from the canvas element to the peer using WebRTC. However, it is not clear to me whether this is possible.

Is it possible to stream video from a canvas element using WebRTC? If so, how can this be done? Alternatively, are there any other approaches that I might consider to accomplish my objective?

3
Instead of sending a processed stream, you may apply CSS3 filters on the remote peer's side, if standard filters (such as grayscale, blur, sepia, etc.) are enough for you. - Oleg
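
For example, a minimal sketch of that suggestion, assuming the incoming remote stream is already attached to a `<video>` element with the id `remoteVideo` (the element id and filter string are placeholders):

```javascript
// Minimal sketch: "remoteVideo" is assumed to be the <video> element that
// already shows the incoming remote stream.
const remoteVideo = document.getElementById('remoteVideo');

// Standard CSS filters (grayscale, blur, sepia, ...) are applied purely on
// the receiving side, so the transmitted stream itself stays untouched.
remoteVideo.style.filter = 'grayscale(100%) blur(2px)';
```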

3 Answers

1 vote

It's April 2020; you can achieve this with the canvas.captureStream() method.

There is an excellent article on how to use it, along with several demos on GitHub. See the following links:

Capture Stream

Stream from a canvas element to peer connection

So, basically, you can apply all of the transformations on the canvas and stream from the canvas to the remote peer.
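
To make the flow concrete, here is a rough sketch of that idea, assuming a hidden `<video id="local">` element, a matching-size `<canvas id="filtered">`, and that offer/answer signalling is handled elsewhere; the grayscale filter is just a placeholder:

```javascript
// Rough sketch: webcam -> hidden <video> -> filtered <canvas> -> WebRTC.
async function startFilteredStream() {
  const video = document.getElementById('local');
  const canvas = document.getElementById('filtered');
  const ctx = canvas.getContext('2d');

  // 1. Get the raw webcam feed and play it in the (hidden) <video> element.
  const rawStream = await navigator.mediaDevices.getUserMedia({ video: true });
  video.srcObject = rawStream;
  await video.play();

  // 2. Continuously redraw the video onto the canvas with a filter applied.
  function draw() {
    ctx.filter = 'grayscale(100%)';   // or any custom per-pixel processing
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    requestAnimationFrame(draw);
  }
  draw();

  // 3. Capture the canvas as a MediaStream and attach it to the peer connection.
  const filteredStream = canvas.captureStream(30); // 30 fps
  const pc = new RTCPeerConnection();
  filteredStream.getTracks().forEach(track => pc.addTrack(track, filteredStream));
  // ...continue with the usual offer/answer signalling here.
}
```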

0 votes

mozCaptureStreamUntilEnded is supported in Firefox, but the resulting stream can't be attached to a peer connection.

Playing over <canvas> is easier; however, streaming media from a <video> element requires the Media Processing API (capture-stream-until-ended) along with RTCPeerConnection (with support for all the relevant features).

We can get images from a <canvas>; however, I'm not sure if we can generate a MediaStream from a <canvas>.

So, mozCaptureStreamUntilEnded is useful only with pre-recorded media streaming.

0 votes

My solution would be: send the normal stream to the peer and also transmit how it has to be modified. Then, on the other side, instead of showing the stream in a video element directly (play the video but hide the element), you would keep drawing to a canvas (after processing each frame) with setTimeout/requestAnimationFrame, as sketched below.
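
A rough sketch of that receiving side, assuming `pc` is the existing RTCPeerConnection and that the element ids and the sepia filter are placeholders:

```javascript
// Rough sketch: the remote stream plays in a hidden <video> element and every
// frame is redrawn onto a visible canvas with the agreed modification applied.
const hiddenVideo = document.getElementById('remoteHidden'); // display: none
const canvas = document.getElementById('remoteDisplay');
const ctx = canvas.getContext('2d');

pc.ontrack = (event) => {
  hiddenVideo.srcObject = event.streams[0];
  hiddenVideo.play();
  requestAnimationFrame(drawFrame);
};

function drawFrame() {
  ctx.filter = 'sepia(100%)'; // whatever modification the sender requested
  ctx.drawImage(hiddenVideo, 0, 0, canvas.width, canvas.height);
  requestAnimationFrame(drawFrame);
}
```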