I need to optimize a set of variables with respect to an objective function. I have the analytical gradient of the function and would like to use it in the optimization routine. The objective and gradient share some common computations, and I would like to define the functions as efficiently as possible. The example below demonstrates the issue.
Let f_obj, f_grad and f_common be functions for the objective, gradient and common computations, respectively. The optimization is over the vector x. The code below finds a root of the polynomial y^3 - 3*y^2 + 6*y + 1, where y is a function of c(x[1], x[2]). Note that f_common is called in both f_obj and f_grad. In my actual problem the common computation is much longer, so I'm looking for a way to define f_obj and f_grad so that the number of calls to f_common is minimized.
f_common <- function(x) x[1]^3 * x[2]^3 - x[2]

f_obj <- function(x) {
  y <- f_common(x)
  return( (y^3 - 3*y^2 + 6*y + 1)^2 )
}

f_grad <- function(x) {
  y <- f_common(x)
  return( 2 * (y^3 - 3*y^2 + 6*y + 1) * (3*y^2 - 6*y + 6) *
            c(3*x[1]^2*x[2]^3, 3*x[1]^3*x[2]^2 - 1) )
}
optim(par = c(100,100), fn = f_obj, gr = f_grad, method = "BFGS")
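A sketch of one possible workaround (not part of the question above): wrap the objective and gradient in a closure that memoizes the most recent f_common evaluation, so that when optim() asks for the objective and the gradient at the same x, the common computation runs only once. The name make_funs and the call counter are illustrative.

```r
make_funs <- function() {
  last_x  <- NULL   # last point at which the common computation ran
  last_y  <- NULL   # cached f_common value at last_x
  n_calls <- 0      # how many times the common computation actually ran
  common <- function(x) {
    if (is.null(last_x) || !identical(x, last_x)) {
      last_x  <<- x
      last_y  <<- x[1]^3 * x[2]^3 - x[2]
      n_calls <<- n_calls + 1
    }
    last_y
  }
  list(
    obj = function(x) {
      y <- common(x)
      (y^3 - 3*y^2 + 6*y + 1)^2
    },
    grad = function(x) {
      y <- common(x)
      2 * (y^3 - 3*y^2 + 6*y + 1) * (3*y^2 - 6*y + 6) *
        c(3*x[1]^2*x[2]^3, 3*x[1]^3*x[2]^2 - 1)
    },
    calls = function() n_calls
  )
}

fns <- make_funs()
fns$obj(c(1, 2))    # computes the common quantity y
fns$grad(c(1, 2))   # reuses the cached y for the same x
fns$calls()         # 1: the common computation ran once for both calls
```

The closures obtained from make_funs() can be passed straight to optim(par = c(100, 100), fn = fns$obj, gr = fns$grad, method = "BFGS"); the caching relies only on base R environments.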
UPDATE
I have found that the package nloptr allows the objective function and its gradient to be supplied together as a list. Is there a way to use other optimizers (optim, optimx, nlminb, etc.) in a similar manner?
Thanks.
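For reference, a minimal sketch of the nloptr interface mentioned above: eval_f may return a list containing both the objective and its gradient, so the common computation runs once per point. The call is guarded with requireNamespace so the snippet still runs when nloptr is not installed; the algorithm and tolerance choices are illustrative.

```r
eval_f <- function(x) {
  y <- x[1]^3 * x[2]^3 - x[2]          # the common computation, done once
  p <- y^3 - 3*y^2 + 6*y + 1
  list(objective = p^2,
       gradient  = 2 * p * (3*y^2 - 6*y + 6) *
                     c(3*x[1]^2*x[2]^3, 3*x[1]^3*x[2]^2 - 1))
}

if (requireNamespace("nloptr", quietly = TRUE)) {
  res <- nloptr::nloptr(x0 = c(100, 100), eval_f = eval_f,
                        opts = list(algorithm = "NLOPT_LD_LBFGS",
                                    xtol_rel = 1e-8, maxeval = 1000))
}
```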
Comment: If we evaluate f_obj at a point x and find its gradient at that point, it would need to call f_common twice, but we only need to compute y once. – user3294195