I've been using nls() to fit a custom model to my data, but I'm not happy with the fit and would like to use an approach that minimizes residuals in both the x and y directions.
I've done a lot of searching and have found solutions for fitting linear models, such as the deming package (http://cran.r-project.org/web/packages/deming/index.html) and these Stack Overflow posts: Total Least Square method using R and How to calculate Total least squares in R? (Orthogonal regression). I've also found a MATLAB solution (https://stats.stackexchange.com/questions/110772/total-least-squares-curve-fit-problem), but it fits a second-order polynomial rather than a custom, user-defined model.
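For reference, the linear case seems straightforward; a minimal sketch on the example data defined in the EDIT below, assuming the deming package's deming() formula interface, would be something like:

library(deming)
# Linear errors-in-both-variables (Deming) regression on df from the EDIT below;
# this only fits a straight line, not a custom model like y ~ a^b^x.
deming(y ~ x, data = df)

But as far as I can tell it cannot take a user-defined nonlinear model.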
What I would like is something similar to nls() that minimizes residuals in both x and y, so that I can supply my own model formula. Is anyone aware of a solution in R?
Many thanks!
EDIT
Here's an example, but please note that I'm looking for suggestions on a general solution for nonlinear total least squares regression, not something specific to this dataset (these are just example data based on Modifying a curve to prevent singular gradient matrix at initial parameter estimates):
df <- structure(list(x = c(3, 4, 5, 6, 7, 8, 9, 10, 11), y = c(1.0385,
1.0195, 1.0176, 1.01, 1.009, 1.0079, 1.0068, 1.0099, 1.0038)), .Names = c("x",
"y"), row.names = c(NA, -9L), class = "data.frame")
(nlsfit <- nls(y ~ a^b^x, data = df, start = c(a=0.9, b=0.6)))
library(ggplot2)
ggplot(df, aes(x=x, y=y)) +
geom_point() +
  geom_smooth(method = "nls", formula = y ~ a^b^x, se = FALSE,
              method.args = list(start = list(a = 0.9, b = 0.6)))
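To make clearer what I mean by minimizing residuals in both directions, here is a rough hand-rolled sketch (the names f and obj are mine, not from any package): for each data point, the inner optimize() finds the nearest point on the curve, and optim() then minimizes the total squared perpendicular distance. Note that x and y are on very different scales here, so in practice some rescaling or weighting would probably be needed; I'd much rather use a general, packaged solution than this.

f <- function(x, a, b) a^b^x
obj <- function(par) {
  a <- par[1]; b <- par[2]
  # total squared Euclidean (perpendicular) distance from each point to the curve
  sum(sapply(seq_len(nrow(df)), function(i) {
    optimize(function(t) (t - df$x[i])^2 + (f(t, a, b) - df$y[i])^2,
             interval = range(df$x) + c(-2, 2))$objective
  }))
}
(tlsfit <- optim(coef(nlsfit), obj))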
Comment from G. Grothendieck:
library(robustbase); nlrob(y ~ a^b^x, df, start = coef(nlsfit), method = "tau", lower = c(a = 0, b = 0), upper = 2)
For orthogonal least squares you could try the onls package but I am not so sure that would give a fit much different than that shown in the question.
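Following up on that comment, something like the sketch below is what I had in mind, assuming onls() in the onls package accepts an nls-style formula and start list (as its documentation describes):

library(onls)
# orthogonal (total) nonlinear least squares with the same formula/start interface as nls()
(onlsfit <- onls(y ~ a^b^x, data = df, start = list(a = 0.9, b = 0.6)))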