Nonconvex Lagrangian-based optimization: Monitoring schemes and global convergence

Bolte, Jérôme, Sabach, Shoham, and Teboulle, Marc (2018). Nonconvex Lagrangian-based optimization: Monitoring schemes and global convergence. Mathematics of Operations Research, vol. 43, no. 4, pp. 1051-1404.

Full text not available from this repository.

We introduce a novel approach to the global analysis of a difficult class of nonconvex nonsmooth optimization problems within the important framework of Lagrangian-based methods. This genuinely nonlinear class captures many problems in modern, disparate fields of application. It features complex geometries, and qualification conditions and other regularity properties may fail to hold everywhere. To address these issues we work along several research lines to develop an original, general Lagrangian methodology that can deal with all of the above obstacles at once. A first innovative feature of our approach is to introduce the concept of Lagrangian sequences for a broad class of algorithms. Central to this methodology is the idea of turning an arbitrary descent method into a multiplier method. Second, we equip these methods with a transitional regime that allows us to identify, in finitely many steps, a zone where the step sizes of the algorithm can be tuned for the final converging regime. Then, despite the min-max nature of Lagrangian methods, we use an original Lyapunov method to prove that every bounded sequence generated by the resulting monitoring schemes converges globally to a critical point for some fundamental Lagrangian-based methods in the broad semialgebraic setting; to the best of our knowledge, these are the first results of this kind.
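To illustrate the general idea of a multiplier method that the abstract alludes to (not the paper's monitored scheme itself), the following is a minimal sketch of a classical augmented Lagrangian iteration on a toy equality-constrained problem. The inner descent loop, the penalty parameter `rho`, the step size `lr`, and the iteration counts are all illustrative assumptions; the paper's actual algorithms, step-size tuning, and convergence analysis are considerably more general.

```python
# Illustrative augmented Lagrangian (multiplier) method on a toy problem:
#   minimize (x1 - 1)^2 + (x2 - 2)^2  subject to  x1 + x2 = 1.
# All parameter choices below are assumptions for this sketch, not from the paper.
import numpy as np

def f_grad(x):
    # Gradient of the smooth objective (x1 - 1)^2 + (x2 - 2)^2.
    return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])

def h(x):
    # Equality constraint residual: h(x) = x1 + x2 - 1.
    return x[0] + x[1] - 1.0

def h_grad(x):
    # Gradient of the (linear) constraint.
    return np.array([1.0, 1.0])

def augmented_lagrangian(x0, rho=10.0, outer=50, inner=200, lr=0.01):
    """Outer loop: multiplier (dual) update; inner loop: gradient descent
    on the augmented Lagrangian f(x) + y*h(x) + (rho/2)*h(x)**2."""
    x, y = np.array(x0, dtype=float), 0.0
    for _ in range(outer):
        for _ in range(inner):
            # Gradient of the augmented Lagrangian in x.
            g = f_grad(x) + (y + rho * h(x)) * h_grad(x)
            x -= lr * g
        # Multiplier update: dual ascent step on the constraint residual.
        y += rho * h(x)
    return x, y

x_star, y_star = augmented_lagrangian([0.0, 0.0])
# For this toy problem the solution is x = (0, 1) with multiplier y = 2.
```

This sketch shows the "descent method turned into a multiplier method" pattern in its simplest convex form; the paper's contribution lies precisely in making such schemes work, with global convergence guarantees, in the nonconvex nonsmooth semialgebraic setting.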

Item Type: Article
Language: English
Date: November 2018
Refereed: Yes
Divisions: TSE-R (Toulouse)
Site: UT1
Date Deposited: 18 Jan 2019 08:21
Last Modified: 24 Jul 2019 11:42
