I have a set of points which, when connected by straight lines, looks as shown below:
As seen in the linear fit, the points the curves are fit to are meant to be extrema of the curve, that is, local maxima and minima.
I want to fit a spline or a smooth curve through these points such that they remain the local maxima/minima. I am using Python, but an algorithm for this in any language, or even plain maths, is appreciated.
I tried using InterpolatedUnivariateSpline from scipy.interpolate, and the results for degrees 2 and 3 are shown below:
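For illustration, here is a minimal sketch of what I tried. The x/y values are hypothetical stand-ins for my real data (alternating high/low points):

```python
import numpy as np
from scipy.interpolate import InterpolatedUnivariateSpline

# Hypothetical alternating maxima/minima (stand-in for the real data)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 1.0, 0.2, 0.8, 0.1, 0.9])

xs = np.linspace(x[0], x[-1], 500)
for k in (2, 3):
    # Interpolating spline of degree k: passes through every point,
    # but its extrema generally do NOT coincide with the data points
    spl = InterpolatedUnivariateSpline(x, y, k=k)
    ys = spl(xs)
```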
The order-2 curve comes close to the desired result, but is there a way to impose the condition that these points remain the extrema?
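One way to impose this, a sketch of the Hermite idea mentioned in the comments (same hypothetical data as above), is to prescribe a zero derivative at every data point. A cubic Hermite segment with zero end slopes is monotone between its endpoints, so with alternating high/low values each knot stays a genuine local max or min:

```python
import numpy as np
from scipy.interpolate import CubicHermiteSpline

# Hypothetical alternating maxima/minima (stand-in for the real data)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 1.0, 0.2, 0.8, 0.1, 0.9])

# Force the derivative to zero at every data point, making each
# knot a stationary point of the resulting piecewise cubic.
spline = CubicHermiteSpline(x, y, dydx=np.zeros_like(x))

xs = np.linspace(x[0], x[-1], 1001)
ys = spline(xs)
# Each segment is monotone, so the curve never overshoots a knot:
# the knots remain the local maxima and minima.
```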
which I didn't want. @ev-br's answer below, which also uses the same Hermite method, takes care of that. – Rithwik