I have acquired a 10-second raw PPG (photoplethysmogram) signal from my TI AFE4490. The hardware is calibrated and I'm sampling at 250 samples per second, so I end up with 2500 points in total.
I used a Butterworth bandpass filter with lowcut=0.5, highcut=15 and order=2. You can see my raw and filtered signals below:
I also tried filtering the signal with a Butterworth lowpass filter with a cutoff of 15 Hz and order=2. The raw and filtered signals are below:
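Roughly, the lowpass attempt was along these lines (only a sketch, assuming the standard scipy.signal.butter lowpass design applied with lfilter; the helper name is mine):

from scipy.signal import butter, lfilter

def butter_lowpass_filter(data, cutoff, fs, order=2):
    # Design a lowpass Butterworth filter, cutoff normalized to the Nyquist frequency.
    nyq = 0.5 * fs
    b, a = butter(order, cutoff / nyq, btype='low')
    # Apply it as a forward-only IIR filter.
    return lfilter(b, a, data)

# e.g. filtered = butter_lowpass_filter(IR, 15, 250, order=2)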
I read in some articles that 0.5 Hz and 15 Hz are good lowcut and highcut frequencies for this type of signal.
Before applying the filters, I used the Butterworth example from the SciPy docs to plot the filter's frequency response, and it looked good.
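The response check was roughly like this (a sketch based on the SciPy docs example, with my sampling rate and cutoffs plugged in):

import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import butter, freqz

fs = 250.0       # sampling rate (Hz)
lowcut = 0.5     # bandpass low cutoff (Hz)
highcut = 15.0   # bandpass high cutoff (Hz)
order = 2

# Design the bandpass filter (cutoffs normalized to the Nyquist frequency).
nyq = 0.5 * fs
b, a = butter(order, [lowcut / nyq, highcut / nyq], btype='band')

# Plot the magnitude of the frequency response against frequency in Hz.
w, h = freqz(b, a, worN=2000)
plt.plot((fs * 0.5 / np.pi) * w, np.abs(h), label='order = %d' % order)
plt.xlabel('Frequency (Hz)')
plt.ylabel('Gain')
plt.legend(loc='best')
plt.grid()
plt.show()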
My filtered signal looks good except for that "start" (the elevation at the beginning), and I don't know what causes it. Can anyone tell me whether this "start" is normal with Butterworth filters? If so, is there a method to fix it?
I appreciate your help.
My code:
import numpy as np
import matplotlib.pyplot as plt

RED, IR, nSamples, sRate = getAFESignal()
period = 1 / sRate  # Sampling period (s).
# Desired cutoff frequencies (in Hz) and filter order.
lowcut = 0.5
highcut = 15
order = 2
plt.figure(1)
x = np.linspace(0, nSamples * period, nSamples, endpoint=False)  # time axis in seconds, spaced by 1/sRate
plt.subplot(2,1,1)
y = IR
plt.xlabel('Time (s)')
plt.ylabel('Voltage (V)')
plt.plot(x, y, label='Noisy signal')
plt.legend(loc='best')
plt.subplot(2,1,2)
yf = butter_bandpass_filter(IR, lowcut, highcut, sRate, order=order)  # the filter expects the sampling rate in Hz
plt.xlabel('Time (s)')
plt.ylabel('Voltage (V)')
plt.plot(x, yf, label='Filtered signal')
plt.legend(loc='best')
plt.grid()
plt.show()
The function getAFESignal() just reads a .txt file and puts the data into two numpy arrays.
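The butter_bandpass_filter helper isn't shown above; it follows the usual SciPy cookbook bandpass recipe. A minimal sketch of that pattern (my actual helper may differ slightly):

from scipy.signal import butter, lfilter

def butter_bandpass(lowcut, highcut, fs, order=5):
    # Normalize the cutoff frequencies to the Nyquist frequency.
    nyq = 0.5 * fs
    low = lowcut / nyq
    high = highcut / nyq
    b, a = butter(order, [low, high], btype='band')
    return b, a

def butter_bandpass_filter(data, lowcut, highcut, fs, order=5):
    # Design the bandpass filter and apply it as a forward-only IIR filter.
    b, a = butter_bandpass(lowcut, highcut, fs, order=order)
    return lfilter(b, a, data)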