I am using an FIR filter in MATLAB to filter a signal. In the figure below, applying my filter to the signal in the top plot produces the bottom plot:
The filter successfully low-pass filters the data, but it also shifts everything along by 500 ms.
Here is the routine I use to low-pass filter the data:
% (Start with any vector called 'inputData')
samplingRate = 1000;  % sampling rate in Hz
filterLength = 1000;  % filter order; fir1 returns order+1 coefficients
filterCutOff = 90;    % cutoff frequency in Hz
filterCoeffs = fir1(filterLength, filterCutOff/(samplingRate/2), 'low'); % design the low-pass filter
inputData = filter(filterCoeffs, 1, inputData); % filter the data
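As a sanity check on where the 500 ms comes from: a symmetric (linear-phase) FIR filter of order N has a constant group delay of N/2 samples, which can be confirmed numerically. A minimal sketch, reusing 'filterCoeffs' and 'filterLength' from above:

expectedDelay = filterLength/2;     % group delay of a linear-phase FIR = order/2 = 500 samples
gd = grpdelay(filterCoeffs, 1);     % group delay at 512 frequency points, in samples
disp(max(abs(gd - expectedDelay))); % ~0, i.e. a flat 500-sample (500 ms) delay at all frequencies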
I'm aware that the 500 ms shift corresponds to half the filter length (1000 samples = 1000 ms at 1 kHz), but why does this happen, and am I doing something fundamentally wrong? I know that I could just delete the first 500 ms of the filtered data to realign it, but then I am still missing the last 500 ms of the data.
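For what it's worth, the workaround I have sketched so far (starting again from the unfiltered 'inputData', and assuming the variable names above) is to append half a filter length of zeros before filtering and then discard the first half filter length of the output, which realigns the data and recovers the tail:

delay = filterLength/2;                       % 500 samples = 500 ms at 1 kHz
paddedData = [inputData(:); zeros(delay, 1)]; % append zeros so the tail is not lost
filteredData = filter(filterCoeffs, 1, paddedData); % same filter as above
outputData = filteredData(delay+1:end);       % drop the group delay; same length as inputData

I have also seen filtfilt suggested for zero-phase filtering, but as I understand it that applies the filter twice (squaring the magnitude response), so I'm not sure it is equivalent.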
Note that this example requires the Signal Processing Toolbox.