I have an FFT algorithm in C#, and I generate a sine wave in a buffer at a frequency of 440 Hz, with FS = 1600 and a window length of 2048.
Before sending the signal to the FFT, I double the buffer length and interleave imaginary values (zeros) between the real samples. After the FFT, I compute the magnitudes, take the index of the maximum magnitude, and multiply it by the bin size. This works: it returns something like 442 Hz :)
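To make the steps concrete, here is a minimal sketch of that pipeline (this is not my exact code; the `Fft` method is a textbook in-place radix-2 transform standing in for my actual routine):

```csharp
using System;

class FftDemo
{
    // Standard in-place radix-2 FFT on an interleaved [re, im, re, im, ...]
    // buffer; a placeholder for whatever FFT implementation you actually use.
    static void Fft(double[] a)
    {
        int n = a.Length / 2; // number of complex samples (power of two)

        // Bit-reversal permutation of the complex pairs.
        for (int i = 1, j = 0; i < n; i++)
        {
            int bit = n >> 1;
            for (; (j & bit) != 0; bit >>= 1) j ^= bit;
            j ^= bit;
            if (i < j)
            {
                (a[2 * i], a[2 * j]) = (a[2 * j], a[2 * i]);
                (a[2 * i + 1], a[2 * j + 1]) = (a[2 * j + 1], a[2 * i + 1]);
            }
        }

        // Butterfly stages.
        for (int len = 2; len <= n; len <<= 1)
        {
            double ang = -2.0 * Math.PI / len;
            double wRe = Math.Cos(ang), wIm = Math.Sin(ang);
            for (int i = 0; i < n; i += len)
            {
                double curRe = 1.0, curIm = 0.0;
                for (int k = 0; k < len / 2; k++)
                {
                    int u = 2 * (i + k), v = 2 * (i + k + len / 2);
                    double tRe = a[v] * curRe - a[v + 1] * curIm;
                    double tIm = a[v] * curIm + a[v + 1] * curRe;
                    a[v] = a[u] - tRe; a[v + 1] = a[u + 1] - tIm;
                    a[u] += tRe; a[u + 1] += tIm;
                    double nRe = curRe * wRe - curIm * wIm;
                    curIm = curRe * wIm + curIm * wRe;
                    curRe = nRe;
                }
            }
        }
    }

    static void Main()
    {
        const int fs = 1600;       // sample rate (Hz)
        const int n = 2048;        // window length
        const double freq = 440.0; // test tone (Hz)

        // 1) Generate the sine and interleave zeros as imaginary parts,
        //    doubling the buffer length.
        var buf = new double[2 * n];
        for (int i = 0; i < n; i++)
        {
            buf[2 * i] = Math.Sin(2.0 * Math.PI * freq * i / fs); // real
            buf[2 * i + 1] = 0.0;                                 // imaginary
        }

        // 2) Transform in place.
        Fft(buf);

        // 3) Peak-pick over the first n/2 bins (skip bin 0 = DC) and
        //    convert the peak index to Hz via the bin size.
        int peakBin = 1;
        double peakMag = 0.0;
        for (int k = 1; k < n / 2; k++)
        {
            double re = buf[2 * k], im = buf[2 * k + 1];
            double mag = Math.Sqrt(re * re + im * im);
            if (mag > peakMag) { peakMag = mag; peakBin = k; }
        }

        double binSize = (double)fs / n;      // ≈ 0.78 Hz per bin
        Console.WriteLine(peakBin * binSize); // ≈ 440 Hz for the generated sine
    }
}
```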
Now I write the same generated sine to a .wav file with Matlab. When I run the FFT from C# on that file, it returns 884 Hz, double the expected value. Why?
I checked the .wav file with Audacity, and it shows 440 Hz, the correct value.
So, any idea why I get a doubled value?