I've implemented a video mute and unmute feature. Muting appears to work fine, but after unmuting, the video never resumes streaming to the other peer.
To mute, I stop the track as opposed to toggling its enabled property, because stopping it turns off the "video on" indicator light:
function muteVideo() {
  localStream.getVideoTracks()[0].stop();
  peerConnection.removeTrack(peerConnection.getSenders()[0]);
}
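(For comparison, toggling enabled instead would be something like the sketch below. It pauses what the peer receives but leaves the camera indicator light on, which is why I avoided it; the function names are just illustrative.)

// Alternative I'm not using: enabled = false makes the peer see black video,
// but the camera indicator light stays on.
function muteVideoViaEnabled() {
  localStream.getVideoTracks()[0].enabled = false;
}
function unmuteVideoViaEnabled() {
  localStream.getVideoTracks()[0].enabled = true;
}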
To unmute, I request the camera again (getting a new stream):
async function unmuteVideo() {
  localStream = await navigator.mediaDevices.getUserMedia({ video: true });
  document.getElementById('local-video').srcObject = localStream;
  await peerConnection.getSenders()[0].replaceTrack(localStream.getVideoTracks()[0]);
  peerConnection.getSenders()[0].setStreams(localStream);
}
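(On the receiving side, each window attaches incoming tracks to its remote video element with a standard track handler, roughly like this; the element id here is illustrative.)

// Receiving side (sketch): attach whatever stream arrives to the remote <video>.
peerConnection.addEventListener('track', (event) => {
  document.getElementById('remote-video').srcObject = event.streams[0];
});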
And here's a CodePen link to my current WebRTC setup: https://codepen.io/robkom/pen/MWewjLm. I'm testing it in Chrome 85.
My signalling server is a simple WebSocket server built with ws and @hapi/hapi. For negotiation, I have the following set up:
peerConnection.addEventListener('negotiationneeded', async () => {
  const offer = await peerConnection.createOffer();
  await peerConnection.setLocalDescription(offer);
  webSocketConnection.send(JSON.stringify({ offer: peerConnection.localDescription }));
});
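The server itself just relays signalling messages between the connected clients. A stripped-down sketch of that relay (simplified, not my exact code; the port is arbitrary):

const Hapi = require('@hapi/hapi');
const WebSocket = require('ws');

const init = async () => {
  const server = Hapi.server({ port: 8080 });
  await server.start();

  // ws piggybacks on hapi's underlying HTTP listener
  const wss = new WebSocket.Server({ server: server.listener });

  wss.on('connection', (socket) => {
    socket.on('message', (message) => {
      // Forward every message to every other connected client
      wss.clients.forEach((client) => {
        if (client !== socket && client.readyState === WebSocket.OPEN) {
          client.send(message.toString());
        }
      });
    });
  });
};

init();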
When testing, I open the app in one Chrome window and the same app in an incognito Chrome window. Everything works as expected and video is synced between the two windows. When I press mute in one window, the stream stops and my local video element goes black; for the peer in the other window, my video appears frozen. That is also fine. The issue is that when I unmute, my local stream reappears, but the new stream is never transmitted to the peer. The peer still sees my frozen face.
What am I doing wrong?