With this post I want to introduce my newly bred ‘onls’ package, which conducts Orthogonal Nonlinear Least-Squares Regression (ONLS):
Orthogonal nonlinear least squares (ONLS) is a not so frequently applied and maybe overlooked regression technique that comes into question when one encounters an “error in variables” problem. While classical nonlinear least squares (NLS) aims to minimize the sum of squared vertical residuals, ONLS minimizes the sum of squared orthogonal residuals. The method is based on finding points on the fitted curve that are orthogonal to the data by minimizing, for each data point (x_i, y_i), the Euclidean distance to some point (x0_i, y0_i) on the fitted curve. There is a 25-year-old FORTRAN implementation of ONLS available (ODRPACK, http://www.netlib.org/toms/869.zip), which has been included in the ‘scipy’ package for Python (http://docs.scipy.org/doc/scipy-0.14.0/reference/odr.html). Here, onls has been developed for easy future algorithm tweaking in R. The results obtained from onls are essentially identical to those of the original implementation [1, 2]. It is based on an inner loop that uses optimize to find, for each data point (x_i, y_i), the point (x0_i, y0_i) on the curve with minimal Euclidean distance within some search border, and an outer loop for the fit parameters using nls.lm of the ‘minpack.lm’ package. Sensible starting parameters for onls are obtained by a prior fit with standard nls, as the ONLS parameter values are usually fairly similar to those from NLS.
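This inner/outer structure can be sketched in a few lines of plain Python (a toy illustration of the idea, not the package's actual code; the function name and toy model are made up): for fixed parameters, the inner step finds, for each data point, the closest point on the curve by a one-dimensional minimization, here a golden-section search.

```python
import math

def orth_sq_dist(f, x, y, lo, hi, tol=1e-10):
    """Inner step: squared Euclidean distance from (x, y) to the curve
    y = f(t), minimized over t in [lo, hi] by golden-section search."""
    g = (math.sqrt(5) - 1) / 2
    d = lambda t: (t - x) ** 2 + (f(t) - y) ** 2
    a, b = lo, hi
    c1, c2 = b - g * (b - a), a + g * (b - a)
    while b - a > tol:
        if d(c1) < d(c2):
            b, c2 = c2, c1
            c1 = b - g * (b - a)
        else:
            a, c1 = c1, c2
            c2 = a + g * (b - a)
    return d((a + b) / 2)

# Toy model f(t) = t^2 and one noisy data point
f = lambda t: t * t
x, y = 1.0, 1.5
orth = orth_sq_dist(f, x, y, 0.0, 3.0)  # squared orthogonal distance
vert = (f(x) - y) ** 2                  # squared vertical residual
print(orth <= vert)  # -> True
```

An outer optimizer would then adjust the model parameters to minimize the sum of these squared distances over all points. Note that the orthogonal distance of a point can never exceed its vertical residual, since the vertical foot point (x, f(x)) also lies on the curve.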
There is a package vignette available with more details in the “/onls/inst” folder, especially on what to do if fitting fails or not all points are orthogonal. I will work through one example here, the famous DNase 1 dataset of the nls documentation, with 10% added error. The semantics are exactly as in nls, albeit with a (somewhat) different output:
> DNase1 <- subset(DNase, Run == 1)
> DNase1$density <- sapply(DNase1$density, function(x) rnorm(1, x, 0.1 * x))
> mod1 <- onls(density ~ Asym/(1 + exp((xmid - log(conc))/scal)),
data = DNase1, start = list(Asym = 3, xmid = 0, scal = 1))
Obtaining starting parameters from ordinary NLS...
Relative error in the sum of squares is at most `ftol'.
Optimizing orthogonal NLS...
Passed... Relative error in the sum of squares is at most `ftol'.
The print.onls method gives, as in nls, the parameter values and the vertical residual sum-of-squares. However, the orthogonal residual sum-of-squares is also returned and, MOST IMPORTANTLY, information on how many points are actually orthogonal to the fitted curve after fitting:
Nonlinear orthogonal regression model
model: density ~ Asym/(1 + exp((xmid - log(conc))/scal))
Asym xmid scal
2.422 1.568 1.099
vertical residual sum-of-squares: 0.2282
orthogonal residual sum-of-squares: 0.2234
PASSED: 16 out of 16 fitted points are orthogonal.
Number of iterations to convergence: 2
Achieved convergence tolerance: 1.49e-08
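For cross-checking, the ODRPACK port in ‘scipy’ mentioned above fits the same kind of model from Python. A sketch with simulated data (the data, noise level, and variable names are my own, not the DNase 1 example):

```python
import numpy as np
from scipy import odr

rng = np.random.default_rng(1)

# Three-parameter logistic model, as in the post:
# density = Asym / (1 + exp((xmid - log(conc)) / scal))
def logistic(beta, x):
    Asym, xmid, scal = beta
    return Asym / (1 + np.exp((xmid - np.log(x)) / scal))

# Simulated concentration/density data with 5% relative noise
conc = np.linspace(0.05, 12.0, 30)
true_beta = np.array([2.4, 1.5, 1.1])
density = logistic(true_beta, conc) * (1 + 0.05 * rng.standard_normal(conc.size))

# ODRPACK minimizes the orthogonal distances, like onls
fit = odr.ODR(odr.Data(conc, density), odr.Model(logistic),
              beta0=[3.0, 0.0, 1.0]).run()
print(np.round(fit.beta, 2))  # estimates should lie close to true_beta
```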
Checking all points for orthogonality is accomplished using the independent checking routine check_o, which calculates the angle between the slope m_i of the tangent obtained from the first derivative at (x0_i, y0_i) and the slope n_i of the onls-minimized Euclidean distance between (x0_i, y0_i) and (x_i, y_i):

alpha_i = tan^-1 |(m_i - n_i) / (1 + m_i * n_i)|

which should be 90°, if the Euclidean distance has been minimized.
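The geometry behind this check can be illustrated in plain Python (a toy parabola, not the package routine): at the distance-minimizing foot point, the connecting segment is perpendicular to the tangent, so the calculated angle comes out at 90°.

```python
import math

# Toy curve y = t^2 with derivative y' = 2t
f = lambda t: t * t
df = lambda t: 2 * t

x, y = 1.0, 1.5  # data point off the curve

# Find the foot point t0 by solving d/dt [(t-x)^2 + (f(t)-y)^2] = 0,
# i.e. g(t) = (t - x) + (f(t) - y) * df(t) = 0, via bisection
g = lambda t: (t - x) + (f(t) - y) * df(t)
a, b = 0.5, 2.0  # g(a) < 0 < g(b)
for _ in range(80):
    m = (a + b) / 2
    if g(a) * g(m) <= 0:
        b = m
    else:
        a = m
t0 = (a + b) / 2

m_i = df(t0)                  # slope of the tangent at the foot point
n_i = (y - f(t0)) / (x - t0)  # slope of the connecting segment
alpha = math.degrees(math.atan2(abs(m_i - n_i), abs(1 + m_i * n_i)))
print(round(alpha, 6))  # -> 90.0
```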
When plotting an ONLS model with the
plot.onls function, it is important to know that orthogonality is only evident with equal scaling of both axes:
> plot(mod1, xlim = c(0, 0.5), ylim = c(0, 0.5))
As in nls, all generics work:
print(mod1), plot(mod1), summary(mod1), predict(mod1, newdata = data.frame(conc = 6)), logLik(mod1), deviance(mod1), formula(mod1), weights(mod1), df.residual(mod1), fitted(mod1), residuals(mod1), vcov(mod1), coef(mod1), confint(mod1).
deviance and residuals deliver the vertical, standard NLS values. To calculate the orthogonal deviance and obtain the orthogonal residuals, use deviance_o(mod1) and residuals_o(mod1).
[1] Boggs PT, Donaldson JR, Byrd RH and Schnabel RB. ALGORITHM 676 ODRPACK: Software for Weighted Orthogonal Distance Regression. ACM Trans Math Soft (1989), 15: 348-364.
[2] Boggs PT, Byrd RH, Rogers JE and Schnabel RB. User's Reference Guide for ODRPACK Version 2.01. Software for Weighted Orthogonal Distance Regression. NISTIR (1992), 4834: 1-113.