0
votes

I'm currently playing around with the Web Audio API in Chrome (60.0.3112.90) to possibly build a sound wave of a given file via FileReader, AudioContext, createScriptProcessor, and createAnalyser. I have the following code:

const visualize = analyser => {
  analyser.fftSize = 256;
  let bufferLength = analyser.frequencyBinCount;
  let dataArray = new Float32Array(bufferLength);
  analyser.getFloatFrequencyData(dataArray);
}    

loadAudio(file){
  // creating FileReader to convert audio file to an ArrayBuffer
  const fileReader = new FileReader();

  navigator.getUserMedia = (navigator.getUserMedia ||
                      navigator.webkitGetUserMedia ||
                      navigator.mozGetUserMedia ||
                      navigator.msGetUserMedia);

  fileReader.addEventListener('loadend', () => {
    const fileArrayBuffer = fileReader.result;

    let audioCtx = new (window.AudioContext || window.webkitAudioContext)();
    let processor = audioCtx.createScriptProcessor(4096, 1, 1);
    let analyser = audioCtx.createAnalyser();

    analyser.connect(processor);
    let data = new Float32Array(analyser.frequencyBinCount);

    let soundBuffer;
    let soundSource = audioCtx.createBufferSource();

    // loading audio track into buffer
    audioCtx.decodeAudioData( 
      fileArrayBuffer, 
      buffer => {
        soundBuffer = buffer;
        soundSource.buffer = soundBuffer;

        soundSource.connect(analyser);
        soundSource.connect(audioCtx.destination);

        processor.onaudioprocess = () => {
          // data becomes array of -Infinity values after call below
          analyser.getFloatFrequencyData(data);
        };

        visualize(analyser);
      },
      error => 'error with decoding audio data: ' + error.err
    );
  });

  fileReader.readAsArrayBuffer(file);
}

Upon loading a file, I get all the way to analyser.getFloatFrequencyData(data). Reading the Web Audio API docs, it says that the parameter is:

The Float32Array that the frequency domain data will be copied to. 
For any sample which is silent, the value is -Infinity.

In my case, I have both an MP3 and a WAV file I'm using to test this, and after invoking analyser.getFloatFrequencyData(data), both files end up giving me data that becomes an array of `-Infinity` values.

This may be due to my ignorance with Web Audio's API, but my question is why are both files, which contain loud audio, giving me an array that represents silent samples?
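As background, getFloatFrequencyData fills the array with decibel values, and -Infinity is the decibel representation of silence. The sketch below (illustrative only, not the browser's exact algorithm) shows roughly how such dB values map onto the 0–255 range that getByteFrequencyData reports, using the analyser's default minDecibels/maxDecibels of -100 and -30:

```javascript
// Illustrative sketch: map a dB value (as returned by getFloatFrequencyData)
// onto the 0..255 range used by getByteFrequencyData. Silence (-Infinity dB)
// clamps to 0, which is why an all-silent analyser reads as all zeros there.
function dbToByte(db, minDecibels = -100, maxDecibels = -30) {
  if (db <= minDecibels) return 0;   // includes -Infinity
  if (db >= maxDecibels) return 255;
  return Math.round(255 * (db - minDecibels) / (maxDecibels - minDecibels));
}
```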

2
Are you using Chrome? - Raymond Toy
Not totally clear what you're trying to do - but I don't see you actually calling start() on the audiobuffersource, unless it's inside iHateMyselfForThis.recordAudio(audioCtx, analyser)? - cwilso
@RaymondToy yes, I've updated my question with that information - antihero989
@cwilso I've revised my code to omit setting context as I'm using this in a stateful React component. As you can see, I'm not invoking start(). I'm trying to get frequencies without having to play the file itself (if possible). - antihero989
How else do you think the analyser is going to get non-zero samples to analyse if you don't actually start the source that feeds the analyser? - Raymond Toy

2 Answers

2
votes

The Web Audio AnalyserNode is designed to work only in realtime. (It used to be called RealtimeAnalyser.) Web Audio doesn't have the ability to do analysis on buffers; take a look at another library, such as DSP.js.
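If the goal is frequency data from a decoded buffer without playing it in realtime, the analysis has to run on the raw samples themselves, either with a library like DSP.js or by hand. Below is a minimal, illustrative single-bin DFT (naive, not how a real FFT library would do it) over a block of samples such as those returned by AudioBuffer.getChannelData(0):

```javascript
// Illustrative sketch: normalized magnitude of one DFT bin over a sample block.
// For a pure sine exactly on a bin, this returns amplitude/2.
function dftMagnitude(samples, bin) {
  const N = samples.length;
  let re = 0, im = 0;
  for (let n = 0; n < N; n++) {
    const angle = (2 * Math.PI * bin * n) / N;
    re += samples[n] * Math.cos(angle);
    im -= samples[n] * Math.sin(angle);
  }
  return Math.sqrt(re * re + im * im) / N;
}
```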

1
votes

Instead of:

soundSource.connect(analyser);
soundSource.connect(audioCtx.destination);

try:

soundSource.connect(analyser);
analyser.connect(audioCtx.destination);

Realising I should build a source ==> analyser ==> destination chain solved this problem when I encountered it.
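A minimal sketch of that wiring, factored into a function so it can be exercised; note that, per the comments on the question, the source must also be started or the analyser only ever sees silence. (wireChain is a hypothetical helper name; audioCtx and buffer are assumed to come from decodeAudioData as in the question.)

```javascript
// Hypothetical helper: wire source ==> analyser ==> destination and start playback.
function wireChain(audioCtx, buffer) {
  const analyser = audioCtx.createAnalyser();
  const soundSource = audioCtx.createBufferSource();
  soundSource.buffer = buffer;

  soundSource.connect(analyser);          // source feeds the analyser
  analyser.connect(audioCtx.destination); // analyser passes audio through

  // Without start(), the analyser never receives samples and
  // getFloatFrequencyData reports -Infinity (silence) for every bin.
  soundSource.start(0);
  return { analyser, soundSource };
}
```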