2024-06-11-latafat24a.md

File metadata and controls

48 lines (48 loc) · 1.77 KB
---
title: "On the convergence of adaptive first order methods: Proximal gradient and alternating minimization algorithms"
abstract: Building upon recent works on linesearch-free adaptive proximal gradient methods, this paper proposes AdaPG, a framework that unifies and extends existing results by providing larger stepsize policies and improved lower bounds. Different choices of the parameters are discussed and the efficacy of the resulting methods is demonstrated through numerical simulations. In an attempt to better understand the underlying theory, its convergence is established in a more general setting that allows for time-varying parameters. Finally, an adaptive alternating minimization algorithm is presented by exploring the dual setting. This algorithm not only incorporates additional adaptivity but also expands its applicability beyond standard strongly convex settings.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: latafat24a
month: 0
tex_title: "On the convergence of adaptive first order methods: proximal gradient and alternating minimization algorithms"
firstpage: 197
lastpage: 208
page: 197-208
order: 197
cycles: false
bibtex_author: Latafat, Puya and Themelis, Andreas and Patrinos, Panagiotis
author:
- given: Puya
  family: Latafat
- given: Andreas
  family: Themelis
- given: Panagiotis
  family: Patrinos
date: 2024-06-11
address:
container-title: Proceedings of the 6th Annual Learning for Dynamics & Control Conference
volume: 242
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 6
  - 11
pdf:
extras:
---
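The abstract describes AdaPG, a linesearch-free adaptive proximal gradient framework for composite problems of the form min_x f(x) + g(x). As a rough illustration of that algorithmic template only, here is a minimal Python sketch: the Barzilai-Borwein-style curvature estimate used to adapt the stepsize is an assumption for illustration, not the paper's AdaPG stepsize policy, and all names (`adaptive_prox_grad`, `soft_threshold`, `gamma0`) are hypothetical.

```python
# Minimal sketch of a linesearch-free adaptive proximal gradient method for
# min_x f(x) + g(x), with f(x) = 0.5*||Ax - b||^2 and g(x) = lam*||x||_1.
# The stepsize update below is a simple Barzilai-Borwein-style curvature
# estimate, NOT the AdaPG policy from the paper; it only illustrates the
# template (gradient step + prox, stepsize adapted from past iterates).
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def adaptive_prox_grad(A, b, lam, x0, iters=500, gamma0=1e-2):
    grad = lambda x: A.T @ (A @ x - b)   # gradient of the smooth part f
    x, g = x0, grad(x0)
    gamma = gamma0
    for _ in range(iters):
        # Proximal gradient step with the current stepsize.
        x_new = soft_threshold(x - gamma * g, gamma * lam)
        g_new = grad(x_new)
        dx, dg = x_new - x, g_new - g
        # Local curvature estimate <dx, dg>/||dx||^2; its inverse is used
        # as the next stepsize, guarding against degenerate steps.
        denom = dx @ dx
        if denom > 0 and dx @ dg > 0:
            gamma = denom / (dx @ dg)
        x, g = x_new, g_new
    return x

# Usage example on a small synthetic sparse recovery problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
x_hat = adaptive_prox_grad(A, b, lam=0.1, x0=np.zeros(100))
```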