
Below is code for least-squares fitting of parameters of an ODE. I have tried both Python's minimize and least_squares functions, with different methods and ODE solvers/steps (scipy ode/odeint). This problem was solved easily in MATLAB, but Python keeps returning the initial guess. I hope you can find a coding mistake, or I will be disappointed in Python's optimization functions. obj computes the objective (residual sum of squares) and the ODE function firstorder contains the equations with the unknown parameters. The data set is attached.

import numpy as np
from scipy.integrate import ode
from scipy.optimize import least_squares
from scipy.optimize import minimize
from scipy.optimize import SR1
import matplotlib.pyplot as plt
import math

Minput=np.loadtxt('C:\\Users\\Ladan\\Documents\\Python Scripts\\Python\\moisturesmoothopt.txt')
Minput=Minput.flatten()
time=np.linspace(0,1800,901)

A=np.zeros(3)
XC,RC,alpha=A

#bnds=([0,0,0],[Minput[0],math.inf,math.inf])
bnds=((0,Minput[0]),(0,math.inf),(0,math.inf))

def firstorder(X,time,A):
    if X>=XC:
        dX=-RC
    if X<XC:
        dX=-RC*(X/XC)**alpha
    return dX


def obj(A):
    X0=Minput[0]
    # Xpred=odeint(firstorder,X0,time,args=(A,))
    Xpred=ode(firstorder).set_integrator('vode', method='bdf',
              order=15).set_initial_value(Minput[0],0).set_f_params(A)
    #Xpred=ode(firstorder).set_integrator('lsoda').set_initial_value(Minput[0],0).set_f_params(A)
    EPR=Xpred
    EPR2=EPR.y.flatten()
    ERRone=np.sum(np.power((EPR2-Minput),2))
    ERR=ERRone/(901-3)    # residual sum of squares divided by dof
    return ERR


XC=1
RC=0.005
alpha=1.5
A0=[XC,RC,alpha]

Parameters=minimize(obj,A0,method='SLSQP',bounds=bnds,
                    options={'ftol':1e-10, 'maxiter': 1000})

print('parameters',Parameters)

The data for the Minput array is shared online:

https://1drv.ms/t/s!AoVu1vtlAOiLasJxR7rzubDr8YE


1 Answer


Although I have used the newer ode solvers in scipy, I tend to favor the good ol' odeint function, which is a bit older but still solid, and in many cases gives better performance for reasons I don't entirely understand. Anyway, I largely restructured your analysis code to use both scipy.optimize.least_squares and scipy.integrate.odeint. Progress is made, but I do get a warning about invalid values in the power. You will have to investigate further, but this should get you on the right track.
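As a side note on why minimize kept returning the initial guess: the object-oriented scipy.integrate.ode class never actually integrates until you call integrate(t) in a loop, so the original obj read back .y while it still held the initial condition, making the objective constant. Also, ode expects the right-hand side as f(t, y, ...), with time first, the opposite of odeint's f(y, t, ...). A minimal sketch of driving ode explicitly, using the initial guesses and a stand-in X0=1.0 for Minput[0]:

```python
import numpy as np
from scipy.integrate import ode

def firstorder(t, X, A):          # ode wants f(t, y, ...): time comes first
    XC, RC, alpha = A
    return -RC if X >= XC else -RC * (X / XC) ** alpha

A = (1.0, 0.005, 1.5)             # XC, RC, alpha (the initial guesses)
time = np.linspace(0, 1800, 901)

solver = ode(firstorder).set_integrator('vode', method='bdf')
solver.set_initial_value(1.0, 0.0).set_f_params(A)

Xpred = [1.0]
for t in time[1:]:                # without this loop, solver.y never changes
    solver.integrate(t)
    Xpred.append(solver.y[0])
Xpred = np.array(Xpred)
```

With the solver actually stepped, obj is no longer constant and minimize has something to work with. (Note also that 'vode' with method='bdf' only supports orders up to 5, so order=15 is out of range.)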

%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

Minput=np.loadtxt('Data.txt').T
time=np.linspace(0,1800,901)
bnds=([0,0,0],[Minput[0],np.inf,np.inf])


def firstorder(X,time, XC, RC, alpha):
     if X >= XC:
        dX = -RC
     else:
        dX = -RC * (X/XC)**alpha
     return dX

XC=1
RC=0.005
alpha=1.5
A0=(XC, RC, alpha) 


def residuals(x0):
    Mcalc = odeint(firstorder, y0=Minput[0], t=time, args=tuple(x0))[:,0]
    return Mcalc - Minput

Mcalc = odeint(firstorder, y0=Minput[0], t=time, args=A0)[:,0]
result = least_squares(residuals, x0=A0, bounds=bnds)
print(result)
Mfit = odeint(firstorder, y0=Minput[0], t=time, args=tuple(result.x))[:,0]

plt.plot(time, Minput, label='data')
plt.plot(time, Mcalc, label='initial values')
plt.plot(time, Mfit, label='fit')
plt.legend()

prints out:

    /---/python3.6/site-packages/ipykernel_launcher.py:20: RuntimeWarning: invalid value encountered in power

 active_mask: array([0, 0, 0])
        cost: 50.571520689865935
         fun: array([ 0.00000000e+00,  8.14148814e-03,  1.61829763e-02,  2.44244644e-02,
        ...])
        grad: array([-1.18907831,         nan, -7.75389712])
         jac: array([[ 0.        ,  0.        ,  0.        ],
       [ 0.        , -2.        ,  0.        ],
       [ 0.        , -3.99999994,  0.        ],
       ...,
       [ 0.07146036,         nan,  0.1084222 ],
       [ 0.07130456,         nan,  0.10827456],
       [ 0.07114924,         nan,  0.1081272 ]])
     message: '`xtol` termination condition is satisfied.'
        nfev: 30
        njev: 12
  optimality: nan
      status: 3
     success: True
           x: array([1.0000002 , 0.00717926, 1.50000032])

[Plot of the data, the initial-value solution, and the fitted curve over time.]
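About the RuntimeWarning: one plausible cause (an assumption, not verified against the attached data) is the solver momentarily stepping X slightly below zero, where the fractional exponent makes (X/XC)**alpha evaluate to nan. A minimal sketch of a guarded right-hand side, with X0=1.0 standing in for Minput[0] and the initial guesses as parameters:

```python
import numpy as np
from scipy.integrate import odeint

def firstorder_safe(X, time, XC, RC, alpha):
    # Clip X at zero so the fractional power never sees a negative base.
    X = np.clip(X, 0.0, None)
    return np.where(X >= XC, -RC, -RC * (X / XC) ** alpha)

time = np.linspace(0, 1800, 901)
# X0=1.0 is a stand-in for Minput[0]; parameters are the initial guesses.
Xs = odeint(firstorder_safe, y0=1.0, t=time, args=(1.0, 0.005, 1.5))[:, 0]
```

Clipping keeps the power well-defined, so the residuals and the Jacobian stay finite instead of picking up nan columns.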