Operator Splitting Methods for Convex and Nonconvex Optimization
- Liu, Yanli
- Advisor(s): Yin, Wotao
Abstract
This dissertation focuses on a family of optimization methods called operator splitting methods. They solve complicated problems by decomposing the problem into simpler pieces and making progress on each piece separately. Over the past two decades, there has been a resurgence of interest in these methods as the demand for solving structured large-scale problems has grown. One of the major challenges for splitting methods is their sensitivity to ill-conditioning, which often prevents them from reaching high accuracy. Furthermore, their classical analyses are restricted to well-behaved settings where solutions exist and everything is convex. Much less is known when either of these assumptions breaks down.
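As an illustrative sketch of the splitting idea (not taken from the dissertation itself), consider minimizing a sum f(x) + g(x), where f is smooth and g admits an inexpensive proximal map. The forward-backward (proximal gradient) splitting handles the two terms separately in each iteration, with an assumed step size \gamma > 0:

\[
x^{k+1} = \operatorname{prox}_{\gamma g}\!\bigl(x^k - \gamma \nabla f(x^k)\bigr),
\qquad
\operatorname{prox}_{\gamma g}(v) = \operatorname*{arg\,min}_{x}\Bigl\{\, g(x) + \tfrac{1}{2\gamma}\lVert x - v\rVert^2 \Bigr\}.
\]

Each iteration thus takes a gradient step on the smooth piece f and a proximal step on the nonsmooth piece g, without ever requiring the two to be handled jointly.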
This work aims to address the issues above. Specifically, we propose a novel acceleration technique called inexact preconditioning, which exploits second-order information at relatively low computational cost. We also show that certain splitting methods still work on problems without solutions, in the sense that their iterates provide information on what goes wrong and how to fix it. Finally, for nonconvex problems with saddle points, we show that, under certain assumptions, splitting methods almost surely converge only to local minimizers.