Title: Hessian matrix vs. Gauss-Newton Hessian matrix
--- why the LM method is preferred in nonlinear least-squares problems
Speaker: Prof. Pei Chen, Sun Yat-sen University
In this talk, we investigate how the Gauss–Newton Hessian matrix affects the basin of convergence in Newton-type methods. Although the Newton algorithm is theoretically superior to the Gauss–Newton algorithm and the Levenberg–Marquardt (LM) method in terms of asymptotic convergence rate, the LM method is often preferred in practice for nonlinear least-squares problems. This talk presents a theoretical analysis of the advantage of the Gauss–Newton Hessian matrix. It is proved that the Gauss–Newton approximation is the only nonnegative convex quadratic approximation that retains a critical property of the original objective function: attaining its minimal value of zero on an (n − 1)-dimensional manifold (or affine subspace). Because of this property, the Gauss–Newton approximation preserves the zero-on-(n − 1)-D "structure" of the original problem, which explains why the Gauss–Newton Hessian matrix is preferred for nonlinear least-squares problems, especially when the initial point is far from the solution.
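To make the distinction concrete, the sketch below (a toy illustration, not taken from the talk) solves a small zero-residual nonlinear least-squares problem with LM steps. The key point is that the Gauss–Newton Hessian J^T J drops the second-order term sum_i r_i ∇²r_i that the full Newton Hessian would include; the residual function, initial point, and damping schedule here are all illustrative assumptions.

```python
import numpy as np

def residual(x):
    # r(x) = 0 on the intersection of the unit circle and the line x0 = x1,
    # a zero-residual problem of the kind discussed above (toy example).
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

def jacobian(x):
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0,        -1.0]])

def lm_step(x, lam):
    # LM step: (J^T J + lam I) delta = -J^T r.
    # J^T J is the Gauss-Newton Hessian; the full Newton Hessian would
    # add the term sum_i r_i * Hess(r_i), omitted here.
    J, r = jacobian(x), residual(x)
    H_gn = J.T @ J
    delta = np.linalg.solve(H_gn + lam * np.eye(2), -J.T @ r)
    return x + delta

def lm_solve(x0, lam=1e-2, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = lm_step(x, lam)
        # Simple damping update: accept the step only if the cost decreased.
        if np.sum(residual(x_new)**2) < np.sum(residual(x)**2):
            x, lam = x_new, max(lam * 0.5, 1e-12)
        else:
            lam *= 2.0
        if np.sum(residual(x)**2) < 1e-14:
            break
    return x

# Deliberately start far from the solutions (+-1/sqrt(2), +-1/sqrt(2)).
x_star = lm_solve([5.0, -3.0])
print(x_star, np.sum(residual(x_star)**2))
```

Even from a distant initial point, the damped Gauss–Newton (LM) iteration drives the residual norm to zero, illustrating the large basin of convergence the talk analyzes.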
Bio: Prof. Chen worked as a postdoctoral researcher at Monash University for about half a year, as a senior research engineer at Motorola Labs for about two years, and then as a research professor at the Shenzhen Institute of Advanced Integration Technology, Chinese Academy of Sciences, for about two years. Since September 2008, he has been a professor at the School of Information Science & Technology, Sun Yat-sen University.