In this paper we discuss Anderson-type acceleration methods for numerical optimization. Most mathematical modeling problems can be formulated as constrained optimization problems, and the necessary optimality condition can be written as a fixed point problem in a Banach space. Anderson's acceleration method improves the convergence of the standard fixed point iteration by minimizing the total residual and updating the solution through an optimal linear combination of a sequence of iterates. It is thus an iterative method with memory that uses the history of iterates as a reduced order model (ROM). The weights are determined optimally by a least squares problem based on the total residual. We analyze Anderson's method and the reduced order model (ROM) for nonlinear least squares problems of minimizing $|F(x)|^2$. That is, the solution is approximated by a linear combination of sequentially generated iterates, and the equation error is minimized on the linear manifold spanned by those iterates. We use a reduced order Gauss-Newton method to solve the least squares problem for $|F(x)|^2$ on the linear solution manifold; for the linear equation case it is similar to Anderson's method, and Anderson's method approximates the solution of the nonlinear ROM. We consider variable step size gradient and quasi-Newton methods, as well as a variable fixed point iteration, to generate the solution basis, and combine these with ROM acceleration. For the linear equation case, a system $Ax = b$, the accelerated iteration converges very rapidly when the condition number of $A$ is not large; the variable iterate with ROM acceleration is nearly optimal, and the ROM also regularizes and stabilizes the convergence. We also consider a randomized, overlapping Kaczmarz-type method and develop an acceleration approach for large scale ill posed linear systems. Finally, we analyze the convergence of the ROM solution to the solution of the operator equation.
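For reference, the standard form of the Anderson step described above can be stated as follows (a sketch of the common formulation; the symbols $g$, $r_j$, and the memory depth $m$ are our notation, and the paper's exact variants may differ). Given the fixed point problem $x = g(x)$, the last $m+1$ iterates $x_{k-m},\dots,x_k$, and their residuals $r_j = g(x_j) - x_j$, the weights solve
\[
\min_{\alpha_0+\cdots+\alpha_m = 1} \Big\| \sum_{j=0}^{m} \alpha_j \, r_{k-m+j} \Big\|^2,
\qquad
x_{k+1} = \sum_{j=0}^{m} \alpha_j \, g(x_{k-m+j}),
\]
which is exactly an optimal linear combination of the stored iterates with weights determined by a small least squares problem on the residuals.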
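The following is a minimal numerical sketch of that standard scheme, assuming only a user-supplied fixed point map g; it is meant to illustrate the acceleration idea and is not the specific ROM or Kaczmarz variants analyzed in the paper.

import numpy as np

def anderson_accelerate(g, x0, m=5, tol=1e-10, max_iter=200):
    """Accelerate the fixed point iteration x <- g(x) (standard Anderson scheme).

    Each new iterate is an optimal linear combination of the stored values
    g(x_j); the weights minimize the norm of the combined residuals
    r_j = g(x_j) - x_j subject to the weights summing to one.
    """
    xs, gs = [np.asarray(x0, dtype=float)], [np.asarray(g(x0), dtype=float)]
    x = gs[0]
    for _ in range(max_iter):
        xs.append(x)
        gs.append(np.asarray(g(x), dtype=float))
        xs, gs = xs[-(m + 1):], gs[-(m + 1):]        # keep a memory of m+1 iterates
        R = np.column_stack([gk - xk for gk, xk in zip(gs, xs)])
        n = R.shape[1]
        # Constrained least squares: min ||R a||^2 s.t. sum(a) = 1, via the KKT system.
        K = np.block([[R.T @ R, np.ones((n, 1))],
                      [np.ones((1, n)), np.zeros((1, 1))]])
        rhs = np.append(np.zeros(n), 1.0)
        alpha = np.linalg.lstsq(K, rhs, rcond=None)[0][:n]
        x_new = np.column_stack(gs) @ alpha          # accelerated update
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: accelerate a damped Richardson iteration for a small SPD system Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = anderson_accelerate(lambda x: x + 0.1 * (b - A @ x), np.zeros(2))

The least squares subproblem is solved here through the constrained normal equations for robustness when the stored residuals become nearly linearly dependent; this mirrors the regularizing role of the ROM discussed in the abstract.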