3 votes

As I'm playing with WebRTC in Chrome, I'm noticing that the durability of these streams is still somewhat shaky. I need to create a video stream before the element displaying it is shown (technically I only need the audio track initially, but renegotiation without replaceTrack() seems to be an issue all its own, so I'm enabling both at once for now).

The element is then rendered dynamically by JavaScript and needs to start receiving WebRTC video. The problem is that at the time the WebRTC connection is created, the video element I want to render into does not yet exist. I don't see a way to tell WebRTC to change the video element it renders to after the stream starts; is that possible? I was mainly playing with SimpleWebRTC, but am open to using WebRTC directly; from looking at the docs I couldn't find a way to do it with raw WebRTC either. I also tried moving the original video element into the new element, but this causes the video stream to break/stop:

newElement.appendChild(originalWebRTCVideoTag);
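
What I'd prefer, conceptually, is to hand the already-running stream to the new element rather than moving the old one. A rough sketch of what I mean (names are illustrative; stream stands in for the existing MediaStream):

// attach the already-running MediaStream to a newly created video element
// instead of reparenting the original one
var newVideo = document.createElement('video');
newVideo.autoplay = true;
newVideo.muted = true; // avoid echo when previewing the local stream
newElement.appendChild(newVideo);
newVideo.srcObject = stream; // e.g. webrtc.localStreams[0]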

Short of killing the entire stream and restarting, what are my options?

UPDATE:

For both approaches, videoTag is a plain DOM video element and webrtc is the WebRTC object with a working connection, established via SimpleWebRTC (simplertc.webrtc, the object SimpleWebRTC wraps). I'm putting together a JSFiddle for anyone who wants to see the actual code, but this should be enough information to reproduce the issue.

// this doesn't seem to work in the Stack Overflow snippet, probably because it rejects camera capture

var simplertc = new SimpleWebRTC({
  localVideoEl: 'webrtc-local',
  remoteVideosEl: 'webrtc-remote',
  media: {"audio": true, "video": {
    "optional": [{"minWidth": "640"}, {"minHeight": "480"}], "mandatory": {}
  }},
  autoRequestMedia: true
});
var webrtc = simplertc.webrtc;

// this portion is overly simplified; here there is no real point in
// creating the element dynamically, but in the app I'm working on it
// is generated much later
$('#dynamic').append('<video id="dynamic-video" autoplay></video>');
var videoTag = $('#dynamic-video')[0];

simplertc.on('readyToCall', function() {
  simplertc.joinRoom('my-room-875385864'); // random name
  
  // by this time the local video should be ready, we don't need remote ones for our test
  // test case 1 (replace with logic from test case 2 if needed)
  videoTag.srcObject = webrtc.localStreams[0];
  // end test case
});
video {
  border: 1px solid red;
  width: 200px;
}

/* overlap with original video is intentional to show hardware acceleration effect */
#dynamic {
  position: absolute;
  border: 1px solid black;
  width: 200px;
  height: 200px;
  left: 100px;
  top: 50px;
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.0.0/jquery.min.js"></script>
<script src="https://simplewebrtc.com/latest-v2.js"></script>
<div id='webrtc'>
  <video id='webrtc-local'></video>
  <div id='webrtc-remote'></div>
</div>
<div id='dynamic'>
</div>

Approach 1 (stumbled upon by accident while attempting approach 2)

I tried the following; it works, but runs much slower than I'd like, at about 5 FPS:

// note that I can just as easily use remote streams here
videoTag.srcObject = webrtc.localStreams[0];

Ironically, while experimenting with this approach I accidentally overlapped the video regions of the WebRTC element and the generated one (videoTag). Even though the WebRTC element sits in the background, the corner of videoTag where they overlap runs in real time, while the rest of the element keeps running at 3-5 FPS. This leads me to believe the issue is hardware acceleration. Can I enable it for videoTag somehow?
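
If forcing the element onto its own compositing layer is what's needed, this is what I plan to try next (a guess on my part, not something I've verified makes a difference):

// untested guess: promote videoTag to its own compositing layer so the
// browser paints it on the GPU like the original WebRTC element
videoTag.style.transform = 'translateZ(0)';
// or hint the compositor directly
videoTag.style.willChange = 'transform';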

Approach 2

var media = new MediaSource();
videoTag.src = URL.createObjectURL(media);
// guessing mimetype from a few WebRTC tutorials I stumbled upon
var srcBuf = media.addSourceBuffer('video/webm; codecs="vp8, vorbis"');

// need to convert webrtc.localStreams[0] or its video track to a buffer somehow???
srcBuf.appendBuffer(/* buffer */);

FURTHER RESEARCH

This may be a bug in Chrome. A hackish workaround that seems to work is to make sure the newly generated video elements are completely overlapped by the original video element, even when the original is rendered in the background behind all other elements (and behind a non-transparent background). This seems to kick in hardware acceleration.
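
Roughly, the workaround amounts to something like this (values are illustrative; the point is that the original element fully covers the area of the dynamic one while sitting behind it):

// hackish workaround sketch: park the original video element underneath
// the dynamically created one so both end up on GPU-composited layers
var original = document.getElementById('webrtc-local');
original.style.position = 'absolute';
original.style.left = '100px'; // same box as #dynamic
original.style.top = '50px';
original.style.zIndex = '-1';  // tuck it behind the opaque page content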

In Chrome/Chromium you should be able to enable hardware acceleration in Settings. – guest271314
It's already enabled; this issue is specific to this element, not other video tags. – Alexander Tsepkov
You're doing something wrong. A stream, remote or otherwise, lives wholly independently of whatever video element(s) may be playing it, or whether any are playing it at all. This has nothing to do with WebRTC. See this example. – jib
srcObject is a DOM property, not a jQuery property; use videoTag[0].srcObject. Still not sure what the issue is? – guest271314
@guest271314 Thanks, that was a typo; I'm aware it has nothing to do with jQuery. I forgot the [0] when typing the example above; in my original code it's there. – Alexander Tsepkov

1 Answer

1 vote

You can use MediaSource: wait for the sourceopen event, then call .addSourceBuffer() and feed it with .appendBuffer(). See HTML5 audio streaming: precisely measure latency? and Unable to stream video over a websocket to Firefox.
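
Roughly along these lines (untested here; MediaRecorder is assumed as the piece that turns the MediaStream into appendable chunks, and the exact MIME string may need adjusting per browser):

// sketch: MediaRecorder turns the MediaStream into chunks that a
// MediaSource SourceBuffer can accept via appendBuffer()
var stream = webrtc.localStreams[0];
var media = new MediaSource();
videoTag.src = URL.createObjectURL(media);

media.addEventListener('sourceopen', function () {
  var mime = 'video/webm; codecs="vp8, opus"'; // adjust to what the browser supports
  var srcBuf = media.addSourceBuffer(mime);
  var recorder = new MediaRecorder(stream, { mimeType: mime });
  var queue = [];

  // a SourceBuffer only accepts one append at a time, so queue pending chunks
  srcBuf.addEventListener('updateend', function () {
    if (queue.length && !srcBuf.updating) {
      srcBuf.appendBuffer(queue.shift());
    }
  });

  recorder.ondataavailable = function (e) {
    var reader = new FileReader();
    reader.onload = function () {
      if (srcBuf.updating || queue.length) {
        queue.push(reader.result);
      } else {
        srcBuf.appendBuffer(reader.result);
      }
    };
    reader.readAsArrayBuffer(e.data); // e.data is a Blob
  };

  recorder.start(100); // emit a chunk roughly every 100 ms
});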