

25.4 Linear Least Squares

Octave also supports linear least squares minimization. That is, Octave can find the parameter b such that the model y = x*b fits data (x,y) as well as possible, assuming zero-mean Gaussian noise. If the noise is assumed to be isotropic the problem can be solved using the ‘\’ or ‘/’ operators, or the ols function. In the general case where the noise is assumed to be anisotropic, the gls function is needed.
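For instance, the following sketch (using made-up data and illustrative variable names) fits a straight line with the ‘\’ operator:

     t = (1:10)';                        # predictor
     x = [ones(10,1), t];                # design matrix with intercept column
     y = 3 + 2*t + 0.1*randn (10,1);     # noisy observations
     b = x \ y;                          # least squares estimate of [3; 2]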

— Function File: [beta, sigma, r] = ols (y, x)

Ordinary least squares estimation for the multivariate model y = x*b + e with mean (e) = 0 and cov (vec (e)) = kron (s, I), where y is a t by p matrix, x is a t by k matrix, b is a k by p matrix, and e is a t by p matrix.

Each row of y and x is an observation and each column a variable.

The return values beta, sigma, and r are defined as follows.

beta
The OLS estimator for b. beta is calculated directly via inv (x'*x) * x' * y if the matrix x'*x is of full rank. Otherwise, beta = pinv (x) * y where pinv (x) denotes the pseudoinverse of x.
sigma
The OLS estimator for the matrix s,
               sigma = (y-x*beta)' * (y-x*beta) / (t - rank (x))

r
The matrix of OLS residuals, r = y - x*beta.

See also: gls, pinv.
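As an illustration, a brief sketch of ols on simulated data (the numbers are invented for the example):

     x = [ones(20,1), randn(20,2)];      # 20 observations, 3 regressors
     b_true = [1; -2; 0.5];
     y = x*b_true + 0.2*randn (20,1);    # data generated from the model
     [beta, sigma, r] = ols (y, x);      # beta approximates b_true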

— Function File: [beta, v, r] = gls (y, x, o)

Generalized least squares estimation for the multivariate model y = x*b + e with mean (e) = 0 and cov (vec (e)) = (s^2) o, where y is a t by p matrix, x is a t by k matrix, b is a k by p matrix, e is a t by p matrix, and o is a t*p by t*p matrix.

Each row of y and x is an observation and each column a variable. The return values beta, v, and r are defined as follows.

beta
The GLS estimator for b.
v
The GLS estimator for s^2.
r
The matrix of GLS residuals, r = y - x*beta.

See also: ols.
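For example, a sketch of gls with an assumed (invented) error covariance structure, here heteroskedastic noise with p = 1:

     x = [ones(5,1), (1:5)'];            # design matrix
     o = diag ([1 1 2 2 4]);             # assumed error covariance, up to the scale s^2
     y = x*[2; 0.5] + chol (o)' * 0.1*randn (5,1);
     [beta, v, r] = gls (y, x, o);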

— Function File: x = lsqnonneg (c, d)
— Function File: x = lsqnonneg (c, d, x0)
— Function File: x = lsqnonneg (c, d, x0, options)
— Function File: [x, resnorm] = lsqnonneg (...)
— Function File: [x, resnorm, residual] = lsqnonneg (...)
— Function File: [x, resnorm, residual, exitflag] = lsqnonneg (...)
— Function File: [x, resnorm, residual, exitflag, output] = lsqnonneg (...)
— Function File: [x, resnorm, residual, exitflag, output, lambda] = lsqnonneg (...)

Minimize norm (c*x - d) subject to x >= 0. c and d must be real. x0 is an optional initial guess for x. Currently, lsqnonneg recognizes these options: "MaxIter", "TolX". For a description of these options, see optimset.

Outputs:

resnorm
The squared 2-norm of the residual, norm (c*x - d)^2.
residual
The residual, d - c*x.
exitflag
An indicator of convergence: positive if the algorithm converged, 0 if the iteration limit was reached without convergence.
output
A structure with information about the optimization process.
lambda
The Lagrange multipliers at the solution.

See also: optimset, pqpnonneg.
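A small sketch (with invented numbers) where the nonnegativity constraint becomes active:

     c = [1 0; 0 1; 1 1];
     d = [2; -1; 1];
     [x, resnorm] = lsqnonneg (c, d);    # x = [1.5; 0], resnorm = 1.5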

— Function File: optimset ()
— Function File: optimset (par, val, ...)
— Function File: optimset (old, par, val, ...)
— Function File: optimset (old, new)

Create options struct for optimization functions.

Valid parameters are:

AutoScaling
ComplexEqn
Display
Request verbose display of results from optimizations. Values are:
"off" [default]
No display.
"iter"
Display intermediate results for every loop iteration.
"final"
Display the result of the final loop iteration.
"notify"
Display the result of the final loop iteration if the function has failed to converge.

FinDiffType
FunValCheck
When enabled, an error is displayed if the objective function returns an invalid value (a complex number, NaN, or Inf). Must be set to "on" or "off" [default]. Note: the functions fzero and fminbnd correctly handle Inf values and only complex values or NaN will cause an error in this case.
GradObj
When set to "on", the function to be minimized must return a second argument which is the gradient, or first derivative, of the function at the point x. If set to "off" [default], the gradient is computed via finite differences.
Jacobian
When set to "on", the function to be minimized must return a second argument which is the Jacobian, or first derivative, of the function at the point x. If set to "off" [default], the Jacobian is computed via finite differences.
MaxFunEvals
Maximum number of function evaluations before optimization stops. Must be a positive integer.
MaxIter
Maximum number of algorithm iterations before optimization stops. Must be a positive integer.
OutputFcn
A user-defined function executed once per algorithm iteration.
TolFun
Termination criterion for the function output. If the difference in the calculated objective function between one algorithm iteration and the next is less than TolFun the optimization stops. Must be a positive scalar.
TolX
Termination criterion for the function input. If the difference in x, the current search point, between one algorithm iteration and the next is less than TolX the optimization stops. Must be a positive scalar.
TypicalX
Updating
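
For example, a minimal sketch of building an options structure and passing it to a solver (the function handle and tolerance values are only illustrative):

     opts = optimset ("TolX", 1e-6, "MaxIter", 200, "Display", "iter");
     x = fzero (@(p) cos (p) - p, 1, opts);   # use the options with fzero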

— Function File: optimget (options, parname)
— Function File: optimget (options, parname, default)

Return a specific option from a structure created by optimset. If parname is not a field of the options structure, return default if supplied, otherwise return an empty matrix.
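
For example (the parameter values here are arbitrary):

     opts = optimset ("MaxIter", 50);
     optimget (opts, "MaxIter")           # returns 50
     optimget (opts, "TolX", 1e-8)        # "TolX" is not set, so 1e-8 is returned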