Who: Mokhwa Lee
Abstract: For large-scale optimization problems, quasi-Newton (QN) methods provide an efficient alternative to second-order techniques by approximating the curvature of the target function. QN methods typically construct a Hessian estimate via low-rank updates, such as rank-2 updates, built from successive differences of the iterates and their gradients. This approach reduces computational complexity by relying solely on first-order information while satisfying the secant condition(s). We focus on multisecant (MS) extensions of QN methods, which enhance the Hessian approximation at low cost for both quadratic and non-quadratic problems. By imposing multiple secant conditions, we aim to improve the inverse-Hessian estimates of traditional QN methods. This presentation emphasizes multisecant extensions of QN methods that achieve better stability and effectiveness, proposing three key strategies: a rejection method, positive semidefinite updates, and limited-memory extensions.
Where: Zoom https://stonybrook.zoom.us/j/3777993918?pwd=SUwwanQ0ZjNVTS9FN0tScHBwb0V5dz09
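As background for the talk, the rank-2 update mentioned in the abstract can be illustrated with the standard BFGS inverse-Hessian formula, which uses only the step difference s and gradient difference y and enforces the (single) secant condition H y = s. This is a minimal sketch, not the speaker's multisecant method; the function name and the small quadratic test problem are illustrative.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One rank-2 BFGS update of an inverse-Hessian estimate H.

    s = x_{k+1} - x_k (step), y = grad_{k+1} - grad_k (gradient change).
    The returned matrix satisfies the secant condition H_new @ y == s.
    """
    rho = 1.0 / (y @ s)                     # curvature scalar 1 / (y^T s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Check on a small quadratic f(x) = 0.5 x^T A x, whose gradient is A x
rng = np.random.default_rng(0)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
x0, x1 = rng.standard_normal(2), rng.standard_normal(2)
s, y = x1 - x0, A @ (x1 - x0)
H = bfgs_inverse_update(np.eye(2), s, y)
assert np.allclose(H @ y, s)               # secant condition holds
```

A multisecant extension, by contrast, asks the estimate to satisfy H Y = S for matrices S and Y collecting several past step and gradient differences, which is what motivates the stability strategies (rejection, PSD updates, limited memory) in the abstract.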