
I have a set of points which, when connected by straight lines, look as shown below:

[figure: Linear Fit]

As seen in the linear fit, the points to which the curve is fit are meant to be extrema of the curve, that is, local maxima and minima.

I want to fit a spline or a smooth curve through these points such that these points still remain the local maxima/minima. I am using Python, but an algorithm for this in any language, or even plain maths, is appreciated.


I tried InterpolatedUnivariateSpline from scipy.interpolate, and the results for degrees 2 and 3 are shown below:

[figure: Order 2 Fit]

[figure: Order 3 Fit]

The order-2 curve comes close to the desired result, but is there a way to impose the condition that these points remain the extrema?

Comment (Rithwik): Hi Warren, thanks for that! That's almost exactly what I wanted, and I was excited to learn about this new function. It did have the issue of also forcing the derivatives at the end points to 0, which I didn't want. @ev-br's answer below, which uses the same Hermite method, takes care of that.

2 Answers

Answer 1 (1 vote)

For this data as shown, PchipInterpolator would probably do.

The reason is that the PCHIP algorithm forces a zero slope at a data point whenever the linear slopes on its left and right have different signs.

Demo:

>>> import numpy as np
>>> from scipy.interpolate import pchip
>>> xx = np.arange(9)
>>> yy = np.cos(xx*2*np.pi/8) + 0.4*(-1)**(xx)
>>> pch = pchip(xx, yy)
>>> pch(xx, nu=1)   # derivative at the data points
array([-1.68578644,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  1.68578644])
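
As a quick check (a sketch, not part of the original answer; it reuses pch and xx from the demo above), one can sample the spline on a dense grid and confirm that the only interior local extrema fall at the data points:

>>> xf = np.linspace(xx[0], xx[-1], 801)   # dense grid over the data range
>>> yf = pch(xf)
>>> is_max = (yf[1:-1] > yf[:-2]) & (yf[1:-1] > yf[2:])
>>> is_min = (yf[1:-1] < yf[:-2]) & (yf[1:-1] < yf[2:])
>>> xf[1:-1][is_max | is_min]   # should list exactly the interior data points 1..7

PCHIP is monotone on each interval between data points, so no new extrema can appear between them.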
Answer 2 (0 votes)

Given your constraints, you need function interpolation, not curve fitting.

So you need a polynomial (or piecewise polynomial) P with the constraints

P(x_i) = y_i
P'(x_i) = 0

at each data point (x_i, y_i), which is exactly Hermite interpolation with prescribed zero derivatives.
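
A minimal sketch of that construction with SciPy's CubicHermiteSpline (presumably the Hermite method the comment above refers to), reusing the sample data from the PCHIP demo; the interior slopes are pinned to 0 so those points stay extrema, while the endpoint slopes are left free, here set, as one arbitrary choice, to the one-sided linear slopes:

>>> import numpy as np
>>> from scipy.interpolate import CubicHermiteSpline
>>> xx = np.arange(9)
>>> yy = np.cos(xx*2*np.pi/8) + 0.4*(-1)**xx
>>> dydx = np.zeros_like(yy)    # P'(x_i) = 0 at every data point ...
>>> dydx[0] = yy[1] - yy[0]     # ... except the ends: one-sided linear
>>> dydx[-1] = yy[-1] - yy[-2]  # slopes (the spacing here is 1)
>>> spl = CubicHermiteSpline(xx, yy, dydx)
>>> spl(xx, nu=1)   # zero at the interior points, nonzero at the two ends

Between two points with zero prescribed slope, a cubic Hermite segment is monotone, so this construction cannot introduce extra extrema between the data points.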