
I am using Web Audio API to do nothing more than show a microphone activity light to show a user that their microphone is connected and input is being received.

It's an Angular web project, and I'm using a timer to take a snapshot of the input using getByteTimeDomainData().

When I run it, every snapshot I take averages 128 (i.e. zero amplitude / silence), no matter how much noise I'm making.

Code:

private setupMicInputListener(stream: MediaStream): void {
  const threshold = 10;
  const timerSource = timer(0, 100); // timer imported from 'rxjs'
  timerSource.subscribe(() => {
    const audioLevel = this.getCurrentAverageMicInputLevel(stream);
    console.log(audioLevel);
    this.audioInputDetected = audioLevel > threshold;
  });
}

private getCurrentAverageMicInputLevel(stream: MediaStream): number {
  const audioContext = new window.AudioContext();
  const analyser = audioContext.createAnalyser();
  const microphone = audioContext.createMediaStreamSource(stream);
  microphone.connect(analyser);

  analyser.smoothingTimeConstant = 0.8;
  analyser.fftSize = 2048;
  const buffer = new Uint8Array(analyser.fftSize);
  analyser.getByteTimeDomainData(buffer);

  let values = 0;
  const length = buffer.length;
  for (let i = 0; i < length; i++) {
    values += buffer[i];
  }

  return values / length;
}

I've read that if it's not connected to an output, the signal has nowhere to "go", but when I try

microphone.connect(audioContext.destination)

I get crazy feedback through my speakers. I know the stream I'm passing in is valid: outside this code I enumerate the user's devices, and I can confirm the selected device is my microphone. I create a MediaStream, add my microphone's audio track to it, and then pass the stream to this component. And when I use the line of code above, I can hear the noise I'm making within all the feedback.

1 Answer


I'm not familiar with Angular, but from looking at the code it seems you periodically call getCurrentAverageMicInputLevel. Each call creates a brand-new audio context with its own AnalyserNode, and then immediately asks for 2048 samples of time-domain data. At that point it's highly unlikely the analyser has actually received any data from the microphone yet, so you just get a buffer full of the silence value (128, which represents zero amplitude for byte data).

To fix this, create the context, analyser (including setting up its parameters), and microphone source just once, perhaps in setupMicInputListener. Then have getCurrentAverageMicInputLevel do nothing but call getByteTimeDomainData and compute the average. I think you'll then see samples that deviate from 128 when the microphone is not silent.
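A minimal sketch of that restructuring (the class and helper names here, MicLevelMeter and averageOfBuffer, are hypothetical; the analyser settings are the ones from the question):

```typescript
// Pure helper: average of the byte time-domain samples.
// getByteTimeDomainData centers silence at 128, so a quiet
// mic will average approximately 128.
function averageOfBuffer(buffer: Uint8Array): number {
  let sum = 0;
  for (let i = 0; i < buffer.length; i++) {
    sum += buffer[i];
  }
  return sum / buffer.length;
}

// Sketch: create the context, analyser, and source ONCE,
// then only read from the analyser on each timer tick.
class MicLevelMeter {
  private analyser!: AnalyserNode;

  // Call once, e.g. from setupMicInputListener.
  start(stream: MediaStream): void {
    const audioContext = new AudioContext();
    this.analyser = audioContext.createAnalyser();
    this.analyser.smoothingTimeConstant = 0.8;
    this.analyser.fftSize = 2048;

    // Connect the mic to the analyser only -- no connection
    // to audioContext.destination, so no speaker feedback.
    const microphone = audioContext.createMediaStreamSource(stream);
    microphone.connect(this.analyser);
  }

  // Call on every timer tick; by then the analyser has had
  // time to receive real samples from the microphone.
  currentAverageLevel(): number {
    const buffer = new Uint8Array(this.analyser.fftSize);
    this.analyser.getByteTimeDomainData(buffer);
    return averageOfBuffer(buffer);
  }
}
```

The key change from the question's code is that the AudioContext and AnalyserNode construction moves out of the polling path; the timer callback only reads data that is already flowing.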

For Chrome, in general, you do kind of need to have a node eventually connected to the destination, but there are exceptions, and AnalyserNode is one of them. Just be sure the analyser's output is not connected to anything at all for this to work, which also avoids the feedback you described.