
I am using C# WPF to make a real-time FFT.

I am using NAudio's WaveIn and BufferedWaveProvider to capture any sound recorded by Stereo Mix. I take the FFT of the buffer many times per second and display it using a bitmap, so the display shows a real-time Fourier transform of any audio playing through the speakers.

My problem is that, as expected, the displayed FFT lags behind the audio coming from the speakers by a small amount (maybe 200 ms).

Is there any way I can capture the audio that is about to play from the speakers, perform the FFT on it, and then play it back a short time later (e.g. 200 ms) while muting the original real-time audio?

The end result would be to remove the perceived delay from the displayed FFT. Audio from a YouTube video, for example, would lag slightly behind the video while my program is running.

Here are the relevant methods from what I have right now:

    public MainWindow()
    {
        sampleSize = (int)Math.Pow(2, 13);
        BUFFERSIZE = sampleSize * 2;

        InitializeComponent();

        // get the WaveIn class started
        WaveIn wi = new WaveIn();
        wi.DeviceNumber = deviceNo;
        wi.WaveFormat = new NAudio.Wave.WaveFormat(RATE, WaveIn.GetCapabilities(wi.DeviceNumber).Channels);

        // create a wave buffer and start the recording
        wi.DataAvailable += new EventHandler<WaveInEventArgs>(wi_DataAvailable);
        bwp = new BufferedWaveProvider(wi.WaveFormat);

        bwp.BufferLength = BUFFERSIZE; //each sample is 2 bytes
        bwp.DiscardOnBufferOverflow = true;
        wi.StartRecording();
    }

    public void UpdateFFT()
    {
        // read the bytes from the stream
        byte[] buffer = new byte[BUFFERSIZE];
        bwp.Read(buffer, 0, BUFFERSIZE);

        // BufferedWaveProvider zero-pads the buffer when not enough data has
        // been captured yet (ReadFully defaults to true), so skip those reads
        if (buffer[BUFFERSIZE - 2] == 0) return;

        Ys = new double[sampleSize];
        for (int i = 0; i < sampleSize; i++)
        {
            // combine the low and high bytes into a signed 16-bit sample
            byte hByte = buffer[i * 2 + 1];
            byte lByte = buffer[i * 2 + 0];
            Ys[i] = (short)((hByte << 8) | lByte);
        }

        FFT(Ys);
    }

I am still new to audio processing - any help would be appreciated.

Comments:

"Are you using a WriteableBitmap?" – Clemens

"@Clemens Yes, and I have the WriteableBitmapEx package installed." – big infinitesimal

1 Answer


The cause of your delay is the latency of WaveIn, which is about 200 ms by default. You can reduce that, but at the risk of dropouts.
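As a rough sketch of what reducing that latency could look like: WaveIn exposes `BufferMilliseconds` (default 100) and `NumberOfBuffers` (default 3), and capture latency is roughly their product. The values below are illustrative, not a tested recommendation:

```csharp
using NAudio.Wave;

// Sketch: shrink WaveIn's internal buffers to cut capture latency.
// Smaller buffers mean lower latency but a higher risk of dropouts
// if the DataAvailable handler can't keep up.
WaveIn wi = new WaveIn();
wi.DeviceNumber = deviceNo;
wi.BufferMilliseconds = 50;   // default is 100
wi.NumberOfBuffers = 2;       // default is 3
wi.WaveFormat = new WaveFormat(RATE, WaveIn.GetCapabilities(wi.DeviceNumber).Channels);
wi.DataAvailable += wi_DataAvailable;
wi.StartRecording();
```

With these settings the capture latency drops from roughly 300 ms to roughly 100 ms; pushing the values much lower tends to produce audible glitches on most hardware.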

Whilst you can capture the audio being played by the system with WasapiLoopbackCapture, there is no way with NAudio to modify that audio or delay its playback.
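For the capture side on its own, a minimal loopback sketch might look like this. Note one assumption worth flagging: WASAPI loopback delivers the device's shared-mode mix format, which is typically 32-bit IEEE float rather than the 16-bit PCM your byte-shifting code expects, so the FFT conversion would need adjusting:

```csharp
using NAudio.Wave;

// Sketch: capture the system's playback mix directly via WASAPI loopback,
// with no dependency on a Stereo Mix device being enabled.
var capture = new WasapiLoopbackCapture();

// Buffer the captured audio for the FFT thread to read from.
var bwp = new BufferedWaveProvider(capture.WaveFormat)
{
    DiscardOnBufferOverflow = true
};

capture.DataAvailable += (s, e) => bwp.AddSamples(e.Buffer, 0, e.BytesRecorded);
capture.StartRecording();
```

This replaces the Stereo Mix route entirely, but it only solves capture; it does not give you a way to hold back the audio the user hears.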