1
votes

I have a collection, the number of which may vary, of non-linear equations with constraints that I'd like to solve using some numerical approach.

I've been able to solve a simple (one equation) case in Excel using Solver, but haven't put anything like this together in Python before so would appreciate suggestions on approach.

Having done a bit of digging, it looks like fsolve is a popular approach for solving systems like these. For a simple, two equation case, my problem takes the following form, broken out into parts for clarity:

[equation images not preserved; per the discussion and the accepted answer, each equation takes the form $\sum_i w_i Z_i = A$, with weights $w_i = x_i S_{1i}^a S_{2i}^b / \sum_j x_j S_{1j}^a S_{2j}^b$]

And the same form for the second, b, equation.

A is a constant; Z, S, and x are known constants for each entity i; the only unknowns are the exponents a and b. Two equations, two unknowns, so there should be a single unique solution.

As I'd said, I set up the simple one-equation case in Excel and successfully solved it with Solver. Any guidance on setting this up in Python is appreciated.

You should really supply your sample inputs, as well as the constants you are using. You need to provide some context to your question. - rahlf23
@LutzL, misspecified, thank you. - Chris
Then the numerator of w_i has only S_i, as j is the bound variable of the denominator? Why can you not combine both into S_i^{a+b}? - Lutz Lehmann
Root finding isn't exactly the same as optimization (which is what Excel Solver does). If you want to add constraints, you should use optimization; root finding, as far as I know, doesn't explicitly support them. Or, in your case, you can use root finding and set zn = 1 - sum(z1, z2, ..., z(n-1)). There are a lot of methods (see here). If your function is scalar you can use simpler algorithms like brentq or fixed_point - Mstaino
Ok, now the system makes sense. However, I fail to see the second equation; the existing ones can be condensed to $$\sum_i(Z_i-A)x_iS_{1i}^aS_{2i}^b=0.$$ Is there some minimization going on? Do you seek the minimal value of $A$, or something like that? - Lutz Lehmann

3 Answers

2
votes

The problem you're describing is one of root finding: you want to find (a, b) for which f(a, b) = 0.

A simple approach would be fixed-point iteration. Since you have an analytical expression for f(a, b), you could calculate the derivatives and use Newton's method. To set this up with fsolve, you'll need to define a function:

import numpy as np

def myfunc(x):
    val1 = ...  # evaluate your first expression here using Z and S
    val2 = ...  # evaluate your second expression here
    return np.array([val1, val2])

You can optionally pass in your values for S and Z using fsolve's args argument.
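For instance, a minimal runnable sketch of that pattern (the data arrays and toy residuals below are invented purely for illustration; `args` forwards the extra arrays to each call of `myfunc`):

```python
import numpy as np
from scipy.optimize import fsolve

# Invented data, just to make the sketch executable
Z = np.array([0.5, 1.0, 1.5])
S = np.array([1.2, 0.8, 1.1])

def myfunc(p, Z, S):
    a, b = p
    # Toy residuals constructed to have known roots a = 1, b = 2
    val1 = np.dot(Z, S**a) - np.dot(Z, S)
    val2 = np.dot(Z, S**b) - np.dot(Z, S**2)
    return [val1, val2]

sol = fsolve(myfunc, x0=[0.5, 0.5], args=(Z, S))
```

Because the toy residuals have the known solution a = 1, b = 2, it's easy to check that the wiring is right before swapping in the real expressions.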

Then solve using:

fsolve(myfunc,x0)

where x0 is an initial guess.

Note that fsolve may not respect your condition on w. If that isn't satisfied identically for your problem, I'd look into a method that supports constrained optimization such as fmin_slsqp. The syntax should be very similar to what I described for fsolve in either case.
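As a hedged sketch of that constrained route with `fmin_slsqp` (the toy objective and constraint here are invented; in practice the objective would be the sum of squared residuals of your real equations, and the constraint would encode your condition on w):

```python
import numpy as np
from scipy.optimize import fmin_slsqp

# Toy system: a + b = 3 and a*b = 2, with an extra inequality constraint a <= b
def ssq(p):
    a, b = p
    return (a + b - 3.0)**2 + (a*b - 2.0)**2  # sum of squared residuals

sol = fmin_slsqp(ssq,
                 x0=[0.5, 2.5],
                 ieqcons=[lambda p: p[1] - p[0]],  # enforce b - a >= 0
                 iprint=0)
```

The unconstrained system has two roots, (1, 2) and (2, 1); the inequality constraint rules out the second, so the solver should settle on the first.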

1
votes

Was able to put together a solution with help from the above, appreciate it. I've accepted John's answer; solution code below for reference.

import pandas as pd
import numpy as np
from scipy.optimize import fsolve

Aq = .6
Av = .6

# x, qS, vS, qZ and vZ are the per-entity data arrays, loaded elsewhere (e.g. via pandas)
def eqs(p):
    a, b = p
    w = x*(qS**a*vS**b)/np.dot(x, qS**a*vS**b)  # weights sum to 1 by construction
    return (np.dot(w, qZ) - Aq,
            np.dot(w, vZ) - Av)

sol = fsolve(eqs, (1, 1), full_output=True)
a, b = sol[0]
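For anyone trying to run the snippet above: `x`, `qS`, `vS`, `qZ` and `vZ` are the OP's data arrays, loaded elsewhere. A self-contained version with made-up values (invented purely so it executes end to end) might look like:

```python
import numpy as np
from scipy.optimize import fsolve

Aq = .6
Av = .6

# Invented data for three entities i; the real values would come from the dataset
x  = np.array([0.2, 0.5, 0.3])
qS = np.array([1.1, 0.9, 1.3])
vS = np.array([0.8, 1.2, 1.0])
qZ = np.array([0.5, 0.7, 0.6])
vZ = np.array([0.5, 0.6, 0.7])

def eqs(p):
    a, b = p
    w = x * qS**a * vS**b / np.dot(x, qS**a * vS**b)  # weights sum to 1
    return (np.dot(w, qZ) - Aq, np.dot(w, vZ) - Av)

sol, info, ier, msg = fsolve(eqs, (1, 1), full_output=True)
a, b = sol
```

With `full_output=True`, `ier == 1` signals convergence and `msg` explains any failure, which is worth checking before trusting the exponents.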
0
votes

Here is an example of how to set up a Python solution for non-linear equations:

import numpy as np
from scipy.optimize import fsolve
from math import cos

# non-linear equations:
#   x0*cos(x1) = 4
#   x0*x1 - x1 = 5
def func2(x):
    out = [x[0]*cos(x[1]) - 4]
    out.append(x[1]*x[0] - x[1] - 5)
    return out


x02 = fsolve(func2, [1, 1])
print("x02: "+str(x02))

Prints: x02: [6.50409711 0.90841421]
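A worthwhile sanity check on any fsolve result is to substitute the solution back into the residual function and confirm both entries are near zero; restating the example with that check added:

```python
import numpy as np
from math import cos
from scipy.optimize import fsolve

def func2(x):
    return [x[0]*cos(x[1]) - 4, x[1]*x[0] - x[1] - 5]

x02 = fsolve(func2, [1, 1])
# Plug the solution back in: both residuals should be ~0 at a true root
print(func2(x02))
```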