I am using the CMSIS DSP FFT functions to convert a known signal from the time domain to the frequency domain. The signal in question is a 1 kHz sine wave with a peak-to-peak amplitude of 1 V and a DC offset of 1.25 V. I sample the input at 10 kHz with a 16-bit ADC and do the processing in floating point on a Cortex-M4F MCU.
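For context, the samples are converted to volts before the FFT, roughly like this (a simplified sketch, not the exact code: the 3.3 V reference, buffer names, and FFT length are placeholders):

```c
#include <stdint.h>
#include "arm_math.h"

#define FFT_LEN   1024U      /* 512 and 2048 in the other runs */
#define ADC_VREF  3.3f       /* placeholder reference voltage */

static float32_t timeBuf[FFT_LEN];

/* Convert raw 16-bit ADC codes to volts so that the 1.25 V offset
 * should show up directly in the time-domain buffer. */
void fill_time_buffer(const uint16_t *adcRaw)
{
    for (uint32_t i = 0; i < FFT_LEN; i++)
    {
        timeBuf[i] = (float32_t)adcRaw[i] * (ADC_VREF / 65535.0f);
    }
}
```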
When I run a 1024-point FFT, the DC value at bin 0 comes out as ~1.24. When I run the same signal through a 2048-point FFT, the DC value is ~2.5. I then ran a 512-point FFT and the value halved to ~0.62. To double-check my signal, I did the same in MATLAB, and no matter what FFT length I use, MATLAB shows bin 0 (DC) as ~1.25.
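The FFT step itself boils down to something like the sketch below (the arm_rfft_fast_f32 path is shown here as an illustration; the buffer names and wrapper function are simplified placeholders rather than the exact code):

```c
#include "arm_math.h"

#define FFT_LEN 1024U                 /* 512 / 2048 in the other runs */

static float32_t freqBuf[FFT_LEN];    /* packed complex spectrum */

/* Run a forward real FFT on the prepared time-domain buffer and
 * return the bin-0 value that I am reading as "DC". */
float32_t read_dc_bin(float32_t *timeBuf)
{
    arm_rfft_fast_instance_f32 fftInst;
    arm_rfft_fast_init_f32(&fftInst, FFT_LEN);

    /* Forward transform (ifftFlag = 0); note that arm_rfft_fast_f32
     * also uses the input buffer as scratch, so timeBuf is clobbered. */
    arm_rfft_fast_f32(&fftInst, timeBuf, freqBuf, 0);

    /* Packed output format: freqBuf[0] is the real bin-0 (DC) value,
     * freqBuf[1] is the real part of the Nyquist bin. */
    return freqBuf[0];
}
```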
It seems that the CMSIS DSP library from ARM is applying some scaling, but I stepped through the code and cannot see any scaling being executed anywhere. Any debugging ideas are welcome.