
I am taking the MediaStream from WebRTC and doing some audio processing and monitoring on it. It works in Firefox but is silent in Chrome.

Here is a simplified version with a single gainNode as an example.

    const AudioContext = window.AudioContext || window.webkitAudioContext;
    let myAudioCtx = new AudioContext();
    let mySource = myAudioCtx.createMediaStreamSource(stream);
    let gainNode = myAudioCtx.createGain();
    gainNode.gain.value = 2;
    mySource.connect(gainNode);
    gainNode.connect(myAudioCtx.destination);

Whereas if I instead assign the stream directly to srcObject, I hear the sound.
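
For comparison, the direct assignment that does produce sound looks roughly like this (the `audio` element variable is my own naming; the original markup isn't shown):

    // Hypothetical <audio> element; playing the WebRTC stream directly works.
    const audio = document.querySelector('audio');
    audio.srcObject = stream;
    audio.play(); // or rely on autoplay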

It appears that createMediaStreamSource() is not returning any audio, because my monitoring shows silence. However, if I assign the WebRTC stream to srcObject as well as run it through my monitoring, then the monitoring detects sound.

myAudioCtx.state says 'running'
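
For completeness, this is roughly how I'm checking the state; if it ever reported 'suspended', a resume() call tied to a user gesture would be needed under Chrome's autoplay policy, but that isn't the case here:

    console.log(myAudioCtx.state); // logs 'running'
    if (myAudioCtx.state === 'suspended') {
        // Never fires here, since the context is already running.
        myAudioCtx.resume();
    }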

Can't think of where else to check. Any help would be appreciated.


1 Answer


Found the solution after a good night's sleep and looking at the MDN docs again.

You must still assign the stream to the audio element:

    audio.srcObject = stream;

But you then have to mute the element's output so it doesn't go directly to the speakers:

    audio.muted = true;

This doesn't stop your Web Audio graph from working:

    const AudioContext = window.AudioContext || window.webkitAudioContext;
    let myAudioCtx = new AudioContext();
    let mySource = myAudioCtx.createMediaStreamSource(stream);
    let gainNode = myAudioCtx.createGain();
    gainNode.gain.value = 2;
    mySource.connect(gainNode);
    gainNode.connect(myAudioCtx.destination);

This works on Chrome, Safari and Firefox.
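
Putting it together, a minimal sketch of the full setup (assuming an existing `<audio>` element and the WebRTC stream in `stream`):

    const AudioContext = window.AudioContext || window.webkitAudioContext;

    // Keep the element attached to the stream, but silence its direct output.
    const audio = document.querySelector('audio'); // hypothetical element
    audio.srcObject = stream;
    audio.muted = true;

    // The Web Audio graph still receives audio from the same stream.
    const myAudioCtx = new AudioContext();
    const mySource = myAudioCtx.createMediaStreamSource(stream);
    const gainNode = myAudioCtx.createGain();
    gainNode.gain.value = 2;
    mySource.connect(gainNode);
    gainNode.connect(myAudioCtx.destination);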