
I have an FFT algorithm in C#, and I generate a sine wave in a buffer at a frequency of 440 Hz, with FS = 1600 and a window length of 2048.

Before sending the signal to the FFT, I double the window length and interleave imaginary values (zeros) between the buffer samples. After the FFT, I compute the amplitudes, take the index of the maximum amplitude, and multiply it by the bin size. This works: it returns something like 442 Hz :)
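For reference, a minimal sketch of that pipeline in C#, assuming an in-place complex FFT over interleaved real/imaginary data (`Fft.Transform` below is a placeholder, not a specific library call):

```csharp
using System;

const int N = 2048;        // window length
const double Fs = 1600.0;  // sample rate
const double F0 = 440.0;   // generated tone

// Pack N real samples into an interleaved [re0, im0, re1, im1, ...] buffer
// of length 2 * N, with all imaginary parts set to zero.
double[] interleaved = new double[2 * N];
for (int n = 0; n < N; n++)
{
    interleaved[2 * n]     = Math.Sin(2.0 * Math.PI * F0 * n / Fs); // real part
    interleaved[2 * n + 1] = 0.0;                                   // imaginary part
}

Fft.Transform(interleaved); // placeholder for an in-place complex FFT of length N

// Only the first N/2 bins are unique for a real input, so search those.
int peakBin = 0;
double peakMag = 0.0;
for (int k = 1; k < N / 2; k++)
{
    double re = interleaved[2 * k];
    double im = interleaved[2 * k + 1];
    double mag = Math.Sqrt(re * re + im * im);
    if (mag > peakMag) { peakMag = mag; peakBin = k; }
}

double binSize = Fs / N;                  // 1600 / 2048 ≈ 0.78 Hz per bin
double peakFrequency = peakBin * binSize; // ≈ 440 Hz for the generated tone
```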

Now I put the same generated sine into a .wav file recorded with Matlab. When I run the FFT from C# on it, it returns 884 Hz, double what I expected. Why?

I checked the .wav file with Audacity and it shows 440 Hz, the correct value.

So, any idea why I got the doubled value?

Are you performing the one-sided or the full FFT? If you're only doing the one-sided version, then the last element in the transformed vector corresponds to a frequency FS/2, not FS. - wakjah
I don't know which version it is; how can I figure it out? And if it's one-sided, why does my generated C# signal show the correct value? - Sergiu Craitoiu
Assuming your original signal is real and not complex, the two-sided Fourier transform will be symmetric about the middle element (FS/2). E.g., if your signal is a pure tone at frequency f0, there will be a spike at f0 and FS-f0. - wakjah
Yes, it is symmetric; there are two spikes in my plot, and I compute the amplitude only up to FFT length/2 because of that symmetry redundancy. - Sergiu Craitoiu
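To put numbers on that: with FS = 1600 and N = 2048, the bin size is 1600/2048 ≈ 0.78 Hz, so a 440 Hz tone peaks near bin 440/0.78 ≈ 563, and its mirror appears at bin 2048 − 563 = 1485, i.e. at FS − f0 ≈ 1160 Hz. Restricting the peak search to bins 0..N/2, as in the comments above, keeps it on the first spike.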

1 Answer


I figured it out; it seems I didn't read the .wav file correctly.
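For anyone hitting the same thing: one way to end up at exactly double the expected frequency is to skip every other sample while reading (for example, stepping through a 16-bit mono file as if it were stereo), since decimating by two doubles a tone's apparent frequency relative to the assumed sample rate. This is only a guess at the specific bug. Below is a minimal sketch of reading a 16-bit PCM mono .wav, assuming the canonical 44-byte header; the class and method names and the fixed header offset are illustrative, not the asker's actual code.

```csharp
using System.IO;

// Minimal sketch (assumed 16-bit PCM mono, canonical 44-byte header):
// read the sample data into normalized doubles, two bytes per sample.
static class WavSketch
{
    public static double[] ReadMono16BitWav(string path)
    {
        using var reader = new BinaryReader(File.OpenRead(path));
        reader.BaseStream.Seek(44, SeekOrigin.Begin);      // skip the RIFF/fmt header

        long count = (reader.BaseStream.Length - 44) / 2;  // 2 bytes per 16-bit sample
        var samples = new double[count];
        for (long i = 0; i < count; i++)
            samples[i] = reader.ReadInt16() / 32768.0;     // normalize to [-1, 1)
        return samples;
    }
}
```

Samples read this way can then go into the even (real) slots of the interleaved buffer from the question, with zeros in the odd (imaginary) slots, and the same peak search applies.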