0
votes

I would like to convolve a 1D signal with a first-derivative kernel of variable window size. In other words, I want the slope of a linear regression over a moving window of variable size. In Python code:

def derivative_convolution(aSignal, iWindowSize):
    """
    Derivative of a signal by window size, using a kernel operator.
    """
    import numpy as np

    aKernel = ...  # ??? -- the kernel of size iWindowSize I am looking for

    return np.convolve(aSignal, aKernel, 'same')

Where aKernel is the kernel of variable window size that I am looking for.

For example, for 1D signals the 1st-derivative kernel is [-1, 0, 1]. Can the slope be calculated for a window size of 5 ([a, b, c, d, e])?
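To illustrate what I mean: the least-squares slope over an n-point window is itself a fixed linear combination of the window's samples, so it should be expressible as a kernel. A sketch of what I am after (regression_slope_kernel is a name I made up):

    import numpy as np

    def regression_slope_kernel(iWindowSize):
        """Least-squares slope weights for a window of iWindowSize samples.

        Fitting y = a*x + b over x = 0..n-1, the slope is
        sum((x - mean(x)) * y) / sum((x - mean(x))**2),
        which is a fixed linear combination of the y values, i.e. a kernel.
        """
        x = np.arange(iWindowSize) - (iWindowSize - 1) / 2.0
        w = x / np.sum(x ** 2)
        return w[::-1]  # np.convolve flips the kernel, so reverse it first

    def derivative_convolution(aSignal, iWindowSize):
        return np.convolve(aSignal, regression_slope_kernel(iWindowSize), 'same')

For window size 3 the weights come out as [-0.5, 0, 0.5], i.e. [-1, 0, 1] up to scale; for window size 5 they are [-0.2, -0.1, 0, 0.1, 0.2].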

1
Why not use the Sobel operator (1st derivative) and then, for the second derivative, simply convolve the Sobel operator with itself, producing the kernel for the 2nd-order derivative? For the first derivative your kernel (x direction) would be [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], and for the second one: [[1, 0, -2, 0, 1], [4, 0, -8, 0, 4], [6, 0, -12, 0, 6], [4, 0, -8, 0, 4], [1, 0, -2, 0, 1]]. - jojek
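jojek's construction can be checked numerically: the Sobel x-kernel is separable, so convolving it with itself amounts to convolving each 1D factor with itself. A sketch using NumPy only:

    import numpy as np

    # Sobel x-kernel factors as an outer product: smoothing [1, 2, 1] down the
    # columns times central difference [-1, 0, 1] along the rows.
    smooth = np.array([1, 2, 1])
    deriv = np.array([-1, 0, 1])

    # Convolving the 2D kernel with itself convolves each factor with itself.
    smooth2 = np.convolve(smooth, smooth)  # [1, 4, 6, 4, 1]
    deriv2 = np.convolve(deriv, deriv)     # [1, 0, -2, 0, 1]

    sobel_2nd = np.outer(smooth2, deriv2)

The outer product reproduces the 5x5 kernel quoted in the comment, row by row.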
Perhaps I misunderstand, but surely there is a generic way to calculate the 1st-derivative kernel for different window sizes? A larger window would mean an average slope over more data points, analogous to different window sizes for the Gaussian smoothing kernel. - Roman
I guess I am asking for a kernel version of a linear regression over n data points. Hmm... - Roman

1 Answer

-1
votes

You should take a look at finite-difference approximations to the derivatives: http://en.wikipedia.org/wiki/Finite_difference

Taylor-series expansions of a function up to a given order yield formulas that approximate its derivatives at a point. Alternatively, you can simply use the standard tabulated coefficients for the first and second derivatives (using the central approximation):

http://en.wikipedia.org/wiki/Finite_difference_coefficient
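For example, the table's second-order central coefficients for the first derivative are [-1/2, 0, 1/2], and [1, -2, 1] for the second derivative. They can be applied by convolution; a sketch (apply_stencil is a helper name I chose):

    import numpy as np

    def apply_stencil(signal, coeffs, order, h=1.0):
        """Apply central finite-difference coefficients (listed left to right
        over the stencil) to estimate the order-th derivative."""
        kernel = np.asarray(coeffs, dtype=float)[::-1]  # np.convolve flips the kernel
        return np.convolve(signal, kernel, 'same') / h ** order

    x = np.linspace(0.0, 1.0, 101)
    h = x[1] - x[0]
    y = x ** 2

    dy = apply_stencil(y, [-0.5, 0.0, 0.5], order=1, h=h)  # ~ 2x away from the edges
    d2y = apply_stencil(y, [1.0, -2.0, 1.0], order=2, h=h)  # ~ 2 away from the edges

Both stencils are exact for quadratics, so away from the boundary the estimates match 2x and 2 exactly; only the edge samples, where the 'same' mode zero-pads, are off.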