1
votes

I am trying to solve the problem of finding the roots of a function using the Newton-Raphson (NR) method in the C language. The functions whose roots I need to find are mostly polynomials, but they may also contain trigonometric and logarithmic terms.

The NR method requires finding the differential of the function. There are 3 ways to implement differentiation:

  • Symbolic
  • Numerical
  • Automatic (with sub types being forward mode and reverse mode. For this particular question, I would like to focus on forward mode)

I have thousands of these functions all requiring finding roots in the quickest time possible.

From the little that I do know, automatic differentiation is in general quicker than symbolic differentiation because it handles the problem of "expression swell" a lot more efficiently.

My question therefore is: all other things being equal, which method of differentiation is more computationally efficient, automatic differentiation (specifically, forward mode) or numerical differentiation?

2
I might be missing something, but if you only handle polynomial functions, it's straightforward to find the derivative: a·x^n -> n·a·x^(n-1) - 4386427
My apologies, post edited. They are mostly polynomial but can contain other, more complex expressions - Dean P
How do the trigs and logs appear? Are there trig functions of polynomials, or logarithms of polynomials, etc.? (e.g. is this the complete space of functions involving x, sin(a+b·x), cos(a+b·x) and log(a+b·x) under a finite number of addition and multiplication steps?) Or are they simple trigs and logs -- maybe the space of functions of powers of x, simple trigs, and simple logs, complete under addition? - jwimberley
In short, yes to all of the above, i.e. the chain rule will need to be applied. That is, trig functions can contain polynomials, e.g. sin(2x^2 + 3x), and logarithmic functions may contain polynomials, e.g. ln|2x^2|. Correct: a complete space of functions involving polynomials, trig and log - Dean P
Identify pure polynomial functions and handle them as a special case, since their derivative is easy to calculate. For more complex functions I'm not sure, but I guess I would start out trying the numerical approach - 4386427

2 Answers

2
votes

If your functions are truly all polynomials, then symbolic derivatives are dirt simple. Letting the coefficients of the polynomial be stored in an array with entries p[k] = a_k, where index k corresponds to the coefficient of x^k, the derivative is represented by the array with entries dp[k] = (k+1) p[k+1]. For multivariable polynomials, this extends straightforwardly to multidimensional arrays. If your polynomials are not in standard form, e.g. if they include terms like (x-a)^2 or ((x-a)^2-b)^3, a little bit of work is needed to transform them into standard form, but this is something you should probably be doing anyway.

1
votes

If the derivative is not available, you should consider using the secant or regula falsi methods. They have very decent convergence speed (order φ ≈ 1.618 instead of quadratic). An additional benefit of regula falsi is that the iterates remain confined to the initial interval, which allows reliable root separation (something Newton's method does not guarantee).

Also note that numerical evaluation of the derivative requires several function evaluations per iteration, at best two of them. The effective convergence order per function evaluation then drops to √2 ≈ 1.414, which is outperformed by the derivative-free methods above.

Also note that the symbolic expression of the derivative is often more costly to evaluate than the function itself, so one Newton iteration costs at least two function evaluations, spoiling the benefit of the higher convergence rate.