I have been developing a small piece of software in .NET that reads a signal from a sensor in real time and computes its FFT, which is also displayed in real time.
I have used the alglib library for the FFT. My goal is to observe the intensity of one particular frequency over time.
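The per-frame processing is essentially the following (a minimal sketch, assuming alglib's C# `fftr1d` real-FFT routine; the frame length, bin-selection arithmetic, and names here are illustrative rather than my exact code):

```csharp
using System;

static class FftBinTracker
{
    // Assumptions (for illustration): a frame of N real samples is
    // collected, transformed with alglib's real FFT (fftr1d from the
    // C# interface), and the magnitude of the bin nearest the target
    // frequency is plotted against time.
    const double SampleRate = 30000.0; // 30 kHz, as stated below

    static double BinMagnitude(double[] frame, double targetHz)
    {
        alglib.complex[] spectrum;
        alglib.fftr1d(frame, out spectrum);   // real-input FFT

        // Index of the bin closest to targetHz: k = f * N / fs.
        int bin = (int)Math.Round(targetHz * frame.Length / SampleRate);

        // alglib.complex exposes the real/imaginary parts as .x and .y.
        alglib.complex c = spectrum[bin];
        return Math.Sqrt(c.x * c.x + c.y * c.y);
    }
}
```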
To check the software, I fed a 1 Hz sine wave into the input. The following image is a screenshot from the software. The upper graph shows the frequency spectrum, with the peak at 1 Hz. However, when this peak is tracked over time, as shown in the lower graph, its intensity itself behaves like a sine wave.
My sampling frequency is 30 kHz. What I do not understand is where this sinusoidal variation comes from: why does the magnitude at this frequency behave like this?
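(For reference on the numbers: with sampling frequency $f_s$ and an $N$-point FFT per frame, the bin spacing is $\Delta f = f_s / N$, and a tone at $f$ Hz falls nearest bin $k = \operatorname{round}(f N / f_s)$. For example, $N = 30000$ samples per frame at $f_s = 30$ kHz, i.e. one second of data, would give exactly 1 Hz bin spacing, so the 1 Hz peak would sit in bin $k = 1$.)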