
Well, the problem is very unusual. I created an Angular app to communicate between two peers via WebRTC. The architecture was simple: two peers, each sending the video stream from its camera and receiving the stream from the other peer. Simple and working.
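
Roughly, that original working setup looked like this (a simplified sketch; names like peerConnection and remoteVideoElement are illustrative, and the signaling code is omitted):

    // Simplified sketch of the original two-peer setup (signaling omitted;
    // peerConnection and remoteVideoElement are illustrative names).
    private async startCall() {
      this.peerConnection = new RTCPeerConnection();

      // Send the local camera stream to the other peer.
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      stream.getTracks().forEach(track => this.peerConnection.addTrack(track, stream));

      // Display the stream received from the other peer.
      this.peerConnection.ontrack = (event) => {
        this.remoteVideoElement.nativeElement.srcObject = event.streams[0];
      };
    }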

Now I want to add some processing to the stream, so on the first peer I add a canvas element like this:

    this.localCameraVideoStream = document.createElement('video');
    this.localCameraVideoStream.srcObject = stream;
    this.localCameraVideoStream.muted = true;
    this.localCameraVideoStream.play();

    this.canvas = document.createElement('canvas');
    this.canvas.width = 1280;
    this.canvas.height = 720;
    this.canvasStream = this.canvas.captureStream();

    this.localVideoElement.nativeElement.srcObject = this.canvasStream;
    this.localVideoElement.nativeElement.muted = true;
    this.localVideoElement.nativeElement.play();
    this.redrawStreamToCanvas();

And the redraw method, which draws the stream from the <video> element onto the canvas:

    private redrawStreamToCanvas() {
      const ctx = this.canvas.getContext('2d');
      const width = 1280;
      const height = 720;
      const combinedVideoStream = this.localCameraVideoStream;

      function drawVideo() {
        ctx.clearRect(0, 0, width, height);
        ctx.drawImage(combinedVideoStream, 0, 0, width, height);
        requestAnimationFrame(drawVideo);
      }

      requestAnimationFrame(drawVideo);
    }

Just to clear any doubts: localCameraVideoStream is a <video> element created to hold the camera stream, so that the <canvas> can draw from a <video>. canvas is the <canvas>. localVideoElement is just a <video> in the .html template that displays the current "local" stream.

The problem is that everything works properly in the local preview. But when I send canvasStream to the other peer and display it in a <video>, I only get one frame and that's it.
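
For completeness, the canvas stream is sent the same way the camera stream was; roughly like this (a simplified sketch, with peerConnection as an illustrative name):

    // Simplified sketch of how the canvas stream is sent (peerConnection is illustrative).
    const canvasTrack = this.canvasStream.getVideoTracks()[0];
    const videoSender = this.peerConnection
      .getSenders()
      .find(s => s.track && s.track.kind === 'video');
    if (videoSender) {
      // Mid-call: swap the outgoing camera track for the canvas track.
      videoSender.replaceTrack(canvasTrack);
    } else {
      // Before the call: add the canvas track instead of the camera track.
      this.peerConnection.addTrack(canvasTrack, this.canvasStream);
    }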

Do you know what's happening? It's weird that the preview is fine, and without this canvas combination the RTC connection is also fine.

Kind of a blind shot, but what if you wait to do a first draw on the canvas before getting and doing anything with its MediaStream? IIRC canvas MediaStreams are muted if the canvas has nothing drawn on it, and I can very well see how the MediaElement initialisation could fail if requested to load a muted stream. – Kaiido

1 Answer


This is most probably because of a bug [feature] in Chrome that pauses muted <video> elements once they go out of the viewport.

If that is indeed the case, it has nothing to do with the MediaStream and already happens in the plain drawing-to-canvas operation:

    // videoin and canvas are the elements defined in the HTML below
    var ctx = canvas.getContext('2d');
    if ((videoin.buffered && !videoin.buffered.length) || videoin.paused) {
      videoin.onloadedmetadata = videoin.onplaying = begin;
    } else {
      begin();
    }

    function begin() {
      videoin.onloadedmetadata = videoin.onplaying = null;
      canvas.width = videoin.videoWidth;
      canvas.height = videoin.videoHeight;
      drawToCanvas();
    }

    function drawToCanvas() {
      ctx.drawImage(videoin, 0, 0);
      requestAnimationFrame(drawToCanvas);
    }
    /* CSS */
    body {
      margin-bottom: 100vh;
    }
    <!-- HTML -->
    <p>
      Scroll down until the &lt;video&gt; element is out of view.
    </p>
    <video crossorigin id="videoin" src="https://upload.wikimedia.org/wikipedia/commons/transcoded/2/22/Volcano_Lava_Sample.webm/Volcano_Lava_Sample.webm.360p.webm" muted autoplay></video>
    <canvas id="canvas"></canvas>

So if you didn't set the muted property, it would have worked:

    var ctx = canvas.getContext('2d');
    if ((videoin.buffered && !videoin.buffered.length) || videoin.paused) {
      videoin.onloadedmetadata = videoin.onplaying = begin;
    } else {
      begin();
    }

    function begin() {
      videoin.volume = 0; // does the same, you'd say?
      videoin.onloadedmetadata = videoin.onplaying = null;
      canvas.width = videoin.videoWidth;
      canvas.height = videoin.videoHeight;
      drawToCanvas();
    }

    function drawToCanvas() {
      ctx.drawImage(videoin, 0, 0);
      requestAnimationFrame(drawToCanvas);
    }
    /* CSS */
    body {
      margin-bottom: 100vh;
    }
    <!-- HTML: note there is no muted attribute this time -->
    <p>
      Scroll down until the &lt;video&gt; element is out of view.
    </p>
    <video crossorigin id="videoin" src="https://upload.wikimedia.org/wikipedia/commons/transcoded/2/22/Volcano_Lava_Sample.webm/Volcano_Lava_Sample.webm.360p.webm" autoplay></video>
    <canvas id="canvas"></canvas>

Or even, if you hadn't appended it to the document at all, it would also have worked (after a user gesture):

    var ctx = canvas.getContext('2d');
    var videoin = document.createElement('video'); // never appended to the document
    videoin.onloadedmetadata = videoin.onplaying = begin;
    videoin.muted = true; // even if 'muted'
    videoin.src = 'https://upload.wikimedia.org/wikipedia/commons/transcoded/2/22/Volcano_Lava_Sample.webm/Volcano_Lava_Sample.webm.360p.webm';

    function begin() {
      videoin.onloadedmetadata = videoin.onplaying = null; // avoid starting the loop twice
      canvas.width = videoin.videoWidth;
      canvas.height = videoin.videoHeight;
      drawToCanvas();
    }

    function drawToCanvas() {
      ctx.drawImage(videoin, 0, 0);
      requestAnimationFrame(drawToCanvas);
    }
    play_btn.onclick = e => {
      videoin.play();
    };
    <!-- HTML -->
    <button id="play_btn">click to start playing the video</button>
    <p>
      This contains only the canvas element
    </p>
    <canvas id="canvas"></canvas>
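
Applied to the code from your question, the smallest change would be something like this (just a sketch reusing your field names; note that a non-muted video may still require a user gesture before it is allowed to play, depending on the autoplay policy):

    // Sketch: same as your code, but without the muted flag, so Chrome
    // doesn't pause the off-screen <video>; volume = 0 keeps it silent.
    this.localCameraVideoStream = document.createElement('video');
    this.localCameraVideoStream.srcObject = stream;
    this.localCameraVideoStream.volume = 0; // instead of .muted = true
    this.localCameraVideoStream.play();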

Now, I can't refrain from saying in this answer that you must have already initialized your canvas context (and probably even have drawn on it) before calling its captureStream method. Failing to do so will result in an NS exception in Firefox, and IIRC that is in accordance with the specs.
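
In your case, that would mean something like this (a sketch reusing the names from your question):

    // Sketch: create the context and paint a first frame *before* captureStream().
    this.canvas = document.createElement('canvas');
    this.canvas.width = 1280;
    this.canvas.height = 720;

    const ctx = this.canvas.getContext('2d');
    ctx.fillRect(0, 0, this.canvas.width, this.canvas.height); // initial frame

    this.canvasStream = this.canvas.captureStream();
    this.redrawStreamToCanvas();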