Tuesday, February 27, 2007

Levenberg-Marquardt

Marquardt's original paper appeared in the Journal of the Society for Industrial and Applied Mathematics in 1963 (Levenberg's earlier version of the idea dates to 1944).
As I recall:
Basically this is a compromise between steepest descent (very slow convergence, but converges from almost any starting point) and Newton's method (very fast convergence, but only when started close to the optimum). Newton's method has the further disadvantage of requiring the Hessian matrix (the matrix of second derivatives with respect to all pairs of variables x_i, x_j), or rather its inverse, which is very expensive to compute.
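To make the contrast concrete, here are the two update rules in standard notation (my notation, not from the original sources): ∇f is the gradient of the objective f and H its Hessian.

```latex
% Steepest descent: step along the negative gradient, with some step size \alpha.
\Delta x_{\text{SD}} = -\alpha \, \nabla f(x)

% Newton: rescale the gradient by the inverse Hessian. Fast near the optimum,
% but forming H and solving with it is the expensive part.
\Delta x_{\text{N}} = -H(x)^{-1} \, \nabla f(x)
```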
The Newton direction is often not the same as the steepest-descent direction. Levenberg-Marquardt tries to use the Newton step (or a variant of it that substitutes a positive definite approximation for the Hessian) wherever it gives good descent, and steepest descent where it doesn't. A single damping parameter blends the update direction between the two and also controls the step size: the algorithm adjusts this parameter continuously, keeping descent as fast as possible while ensuring that each step actually is a descent. A sketch of this loop follows.
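Here is a minimal sketch of that damping loop for a least-squares objective f(x) = ½‖r(x)‖², using the standard Gauss-Newton approximation H ≈ JᵀJ as the positive definite Hessian substitute. The names (levenberg_marquardt, r, jac) and the factor-of-10 damping schedule are illustrative choices on my part, not anything specified above.

```python
import numpy as np

def levenberg_marquardt(r, jac, x0, lam=1e-3, max_iter=100, tol=1e-8):
    """Minimize 0.5 * ||r(x)||^2 with a simple adaptive damping schedule."""
    x = np.asarray(x0, dtype=float)
    rx = r(x)
    cost = 0.5 * rx @ rx
    for _ in range(max_iter):
        J = jac(x)                            # Jacobian of the residuals at x
        g = J.T @ rx                          # gradient of the cost
        if np.linalg.norm(g) < tol:           # gradient ~ 0: converged
            break
        H = J.T @ J                           # positive semidefinite Hessian approximation
        # Damped normal equations: large lam -> short, steepest-descent-like step;
        # small lam -> full (Gauss-)Newton step.
        step = np.linalg.solve(H + lam * np.eye(x.size), -g)
        r_new = r(x + step)
        new_cost = 0.5 * r_new @ r_new
        if new_cost < cost:                   # step descended: accept, trust Newton more
            x, rx, cost = x + step, r_new, new_cost
            lam = max(lam / 10.0, 1e-12)
        else:                                 # step failed: reject, lean toward steepest descent
            lam *= 10.0
    return x
```

For example, fitting a two-parameter exponential model y = a * exp(b * t) to synthetic, made-up data:

```python
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)

r = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t),
                                 p[0] * t * np.exp(p[1] * t)])

print(levenberg_marquardt(r, jac, x0=[1.0, 0.0]))  # should approach [2.0, -1.5]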