2
votes

I'm an R novice, but I'm looking for a way to estimate the three parameters A, B and C in the following function in R:

y = A * (x1^B) * (x2^C)

Can someone give me some hints about R methods that would help me achieve such a fit?

3
What is the error distribution that you can assume? I.e., is it normal, log-normal, Cauchy, etc.? Are errors in different observations correlated with each other? While nls may fit your bill, it may also give you biased and inefficient estimates. Without the error model, you're groping in the dark. – Deer Hunter

3 Answers

6
votes

One option is the nls function, as @SvenHohenstein suggested. Another option is to convert your nonlinear regression into a linear regression: in the case of this equation, just take the log of both sides, do a little algebra, and you have a linear equation. You can run the regression with something like:

fit <- lm(log(y) ~ log(x1) + log(x2), data = mydata)

The intercept will be log(A), so use exp to recover A; the B and C parameters will be the two slopes.
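For example, assuming fit is the model above, the parameters can be recovered from the coefficients like this (the coefficient names follow lm's default naming for this formula):

A <- exp(coef(fit)["(Intercept)"])  # intercept is log(A), so back-transform
B <- coef(fit)["log(x1)"]           # slope on log(x1)
C <- coef(fit)["log(x2)"]           # slope on log(x2)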

The big difference is that nls fits the model with normal errors added to the original equation, while the lm fit on logs assumes the errors in the original model are lognormal and multiplicative rather than additive. Many datasets will give similar results with the two methods.
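To make the distinction concrete, here is a sketch of the two data-generating models the fits assume (all values below are made-up placeholders, just for illustration):

A <- 2; B <- 0.5; C <- 1.5                      # hypothetical parameter values
x1 <- 3; x2 <- 4                                # hypothetical predictor values
eps <- rnorm(1, sd = 0.1)                       # one normal error draw
y_additive <- A * x1^B * x2^C + eps             # additive error, what nls assumes
y_multiplicative <- A * x1^B * x2^C * exp(eps)  # multiplicative error, what lm on logs assumes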

5
votes

You can fit a nonlinear least-squares model with the function nls. It works best if you also supply starting values for the parameters, for example:

nls(y ~ A * (x1^B) * (x2^C), data = mydata, start = list(A = 1, B = 1, C = 1))
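A minimal self-contained sketch with simulated data (the true values A = 2, B = 0.5, C = 1.5 and the data frame name are made up, just to show the workflow):

set.seed(1)
mydata <- data.frame(x1 = runif(100, 1, 10), x2 = runif(100, 1, 10))
mydata$y <- 2 * mydata$x1^0.5 * mydata$x2^1.5 + rnorm(100, sd = 0.5)
fit <- nls(y ~ A * (x1^B) * (x2^C), data = mydata,
           start = list(A = 1, B = 1, C = 1))
summary(fit)  # estimated A, B and C with standard errors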
-1
votes

Why don't you use SVM (Support Vector Machine) regression? There's a package on CRAN named e1071 that can handle regression with SVM.

You can check this tutorial: http://www.svm-tutorial.com/2014/10/support-vector-regression-r/
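A minimal sketch with e1071, assuming the same hypothetical mydata data frame used in the other answers:

library(e1071)
svm_fit <- svm(y ~ x1 + x2, data = mydata)  # defaults to eps-regression for a numeric response
pred <- predict(svm_fit, newdata = mydata)  # predicted y values

Note that this is a nonparametric fit: it predicts y from x1 and x2 but does not return the parameters A, B and C.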

I hope it helps.