Preconditioning and Globalizing Conjugate Gradients in Dual Space
for Quadratically Penalized Nonlinear-Least Squares Problems
S. Gratton, S. Gurol and Ph. L. Toint
Report NAXYS-10-2010, 15 December 2010
Abstract.
When solving nonlinear least-squares problems, it is often useful to
regularize the problem using a quadratic term, a practice which is especially
common in applications arising in inverse calculations. A solution method
derived from a trust-region Gauss-Newton algorithm is analyzed for such
applications, in which, unlike in the standard algorithm, the least-squares
subproblem solved at each iteration is rewritten as a quadratic minimization
subject to linear equality constraints. This reformulation allows the duality
properties of the associated linearized problems to be exploited.
This paper considers a recent conjugate-gradient-like method which performs
the quadratic minimization in the dual space and produces, in exact
arithmetic, the same iterates as those produced by a standard
conjugate-gradients method in the primal space. This dual algorithm is
computationally interesting whenever the dimension of the dual space is
significantly smaller than that of the primal space, yielding gains in terms
of both memory usage and computational cost. The relation between this dual
space solver and PSAS (Physical-space Statistical Analysis System), another
well-known dual space technique used in data assimilation problems, is
explained.
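To illustrate the primal/dual equivalence on which such solvers rest, the following sketch sets up a quadratically penalized linear least-squares problem of the kind arising in data assimilation and solves it by both routes; the variable names (H, B, R, x_b) and the use of direct solves are illustrative assumptions, not the paper's iterative dual-space algorithm, whose point is precisely to work with the small m-by-m system via conjugate gradients.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 5  # state (primal) and observation (dual) dimensions, m << n

H = rng.standard_normal((m, n))   # linearized observation operator
B = np.eye(n)                     # background-error covariance (identity for simplicity)
R = np.eye(m)                     # observation-error covariance
x_b = rng.standard_normal(n)      # background state
y = rng.standard_normal(m)        # observations
d = y - H @ x_b                   # innovation vector

# Primal route: minimize 0.5*||x - x_b||_{B^-1}^2 + 0.5*||Hx - y||_{R^-1}^2,
# i.e. solve the n-by-n system (B^-1 + H^T R^-1 H)(x - x_b) = H^T R^-1 d.
A_primal = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
x_primal = x_b + np.linalg.solve(A_primal, H.T @ np.linalg.inv(R) @ d)

# Dual (PSAS-style) route: solve the m-by-m system (H B H^T + R) w = d,
# then map back to the primal space via x = x_b + B H^T w.
A_dual = H @ B @ H.T + R
w = np.linalg.solve(A_dual, d)
x_dual = x_b + B @ H.T @ w

print(np.allclose(x_primal, x_dual))  # True: both routes give the same minimizer
```

The dual system has dimension m rather than n, which is the source of the memory and cost savings mentioned above when m is much smaller than n.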
An effective preconditioning technique is proposed and refined
convergence bounds are derived, resulting in a practical solution method.
Finally, stopping rules suitable for a trust-region solver are
proposed in the dual space, yielding iterates equivalent to those
obtained with the Steihaug-Toint truncated conjugate-gradient method in the
primal space.
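For reference, the primal-space Steihaug-Toint truncated conjugate-gradient method that the dual stopping rules are matched against can be sketched as follows; this is a minimal NumPy rendering of the standard algorithm (function and variable names are assumptions), not the paper's dual-space variant.

```python
import numpy as np

def steihaug_toint_cg(Hmat, g, delta, tol=1e-8, maxiter=100):
    """Truncated CG for min g^T s + 0.5 s^T H s subject to ||s|| <= delta."""
    s = np.zeros_like(g)
    r = g.copy()
    p = -r
    for _ in range(maxiter):
        Hp = Hmat @ p
        kappa = p @ Hp
        if kappa <= 0:  # negative curvature: follow p to the trust-region boundary
            return s + _to_boundary(s, p, delta) * p
        alpha = (r @ r) / kappa
        if np.linalg.norm(s + alpha * p) >= delta:  # step leaves the region
            return s + _to_boundary(s, p, delta) * p
        s = s + alpha * p
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) <= tol:  # interior convergence
            return s
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
    return s

def _to_boundary(s, p, delta):
    """Positive tau such that ||s + tau * p|| = delta."""
    sp, pp, ss = s @ p, p @ p, s @ s
    return (-sp + np.sqrt(sp**2 + pp * (delta**2 - ss))) / pp
```

The two early exits (negative curvature and boundary crossing) are exactly the stopping rules whose dual-space counterparts the abstract refers to: they truncate the CG iteration while keeping the iterate inside the trust region.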