I'm making an application that displays an FFT of sound data from a microphone. One thing I need to support is compensating for the microphone's frequency response, which will be given to the program via a calibration file. The calibration file contains signed dB correction values at selected frequencies, like this:
```
20 -2.7
50 +0.5
100 +0.7
135 +0.7
190 +1.4
250 +1
370 +0.9
550 +1
700 +0.6
1000 +0.5
1500 +0.4
2000 +0.5
2800 +0.6
2900 +0.4
3000 +0.5
4000 -0.2
4300 -0.2
5600 +0.7
6150 +0.6
12000 +3.5
13000 +3.5
20000 -1.5
```
I can just apply the calibration after the FFT and before displaying it on the screen.
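For context, here is roughly how I compute the spectrum (a NumPy sketch; `samples` and `fs` are placeholders for my real capture code). Since the calibration offsets are in dB, applying them should just be an addition on the dB magnitudes:

```python
import numpy as np

def fft_magnitude_db(samples, fs):
    """Windowed FFT of a mono buffer, returned as (bin freqs, dB magnitudes)."""
    windowed = samples * np.hanning(len(samples))
    spectrum = np.fft.rfft(windowed)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    mag_db = 20.0 * np.log10(np.abs(spectrum) + 1e-12)  # epsilon avoids log(0)
    return freqs, mag_db

# Calibration is additive in dB:
#     display_db = mag_db + correction(freqs)
# where correction() is the interpolated curve this question is about.
```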
My problem is this: how should I interpolate between those values, which are just selected sample points of the microphone's full frequency response? A naive approach would be to define rigid rectangular bands around those points and, for every FFT bin inside a band, apply that band's single calibration value; however, that would produce visible steps in the FFT graph at the band edges. Another option is linear interpolation between adjacent points (sketched below), but I'm not sure that's the best approach either.
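To make that concrete, this is what I mean by the linear-interpolation option (NumPy's `np.interp`, which clamps to the endpoint values outside the 20 Hz–20 kHz range of the file):

```python
import numpy as np

# The calibration file parsed into parallel arrays (frequencies ascending).
cal_freqs = np.array([20, 50, 100, 135, 190, 250, 370, 550, 700,
                      1000, 1500, 2000, 2800, 2900, 3000, 4000,
                      4300, 5600, 6150, 12000, 13000, 20000], dtype=float)
cal_db = np.array([-2.7, 0.5, 0.7, 0.7, 1.4, 1.0, 0.9, 1.0, 0.6,
                   0.5, 0.4, 0.5, 0.6, 0.4, 0.5, -0.2,
                   -0.2, 0.7, 0.6, 3.5, 3.5, -1.5])

def correction(freqs):
    """Piecewise-linear dB correction at the given FFT bin frequencies.

    A variant I'm considering is interpolating against log10(frequency)
    instead, since the calibration points are roughly log-spaced.
    """
    return np.interp(freqs, cal_freqs, cal_db)
```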
Is there a "standard" way to do this, i.e., the approach that programs like Smaart or dedicated FFT analyzers use? What would be the best way to generate a continuous curve from those few fixed points?