Acceleration Methods
Alexandre d'Aspremont, Damien Scieur, Adrien Taylor
This monograph covers some recent advances in a range of acceleration
techniques frequently used in convex optimization. We first use quadratic
optimization problems to introduce two key families of methods, momentum and
nested optimization schemes, which coincide in the quadratic case to form the
Chebyshev method, whose complexity is analyzed using Chebyshev polynomials.
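For concreteness, the sketch below (illustrative code, not taken from the
monograph) implements the classical Chebyshev semi-iterative method on a
quadratic $f(x) = \tfrac{1}{2} x^\top A x - b^\top x$, assuming the
eigenvalues of $A$ are known to lie in an interval $[\mu, L]$; the test
problem and all parameter names are placeholder assumptions.
\begin{verbatim}
# Illustrative sketch: Chebyshev iteration on a quadratic, assuming a
# known spectral interval [mu, L]; not code from the monograph.
import numpy as np

def chebyshev_quadratic(A, b, x0, mu, L, iters=100):
    # Minimize 0.5*x'Ax - b'x (i.e., solve A x = b), with the spectrum
    # of A contained in [mu, L] and 0 < mu <= L.
    d = (L + mu) / 2.0          # center of the spectral interval
    c = (L - mu) / 2.0          # half-width of the spectral interval
    x = x0.copy()
    r = b - A @ x               # residual = minus the gradient
    p = np.zeros_like(b)
    alpha = 0.0
    for k in range(iters):
        if k == 0:
            p = r.copy()
            alpha = 1.0 / d
        else:
            beta = 0.5 * (c * alpha) ** 2 if k == 1 else (c * alpha / 2.0) ** 2
            alpha = 1.0 / (d - beta / alpha)
            p = r + beta * p    # momentum-like combination of directions
        x = x + alpha * p
        r = r - alpha * (A @ p)
    return x

# Illustrative use: random A with spectrum in [1, 100].
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
A = Q @ np.diag(np.linspace(1.0, 100.0, 50)) @ Q.T
b = rng.standard_normal(50)
x = chebyshev_quadratic(A, b, np.zeros(50), mu=1.0, L=100.0)
print(np.linalg.norm(A @ x - b))   # residual norm after 100 iterations
\end{verbatim}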
We discuss momentum methods in detail, starting with the seminal work of
Nesterov (1983), and structure convergence proofs using a few master
templates, such as that of \emph{optimized gradient methods}, which have
the key benefit of showing how momentum methods maximize convergence rates.
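As a simple illustration of the momentum template, the following sketch
uses one common choice of step size and extrapolation sequence for an
$L$-smooth convex function; these choices are our own assumptions, not
the monograph's code.
\begin{verbatim}
# Illustrative sketch of an accelerated (momentum) gradient method,
# with an assumed step size 1/L and extrapolation sequence t_k.
import numpy as np

def accelerated_gradient(grad_f, x0, L, iters=500):
    #   x_{k+1} = y_k - (1/L) grad f(y_k)                        (gradient step)
    #   y_{k+1} = x_{k+1} + ((t_k - 1)/t_{k+1}) (x_{k+1} - x_k)  (extrapolation)
    # with t_0 = 1 and t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2.
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(iters):
        x = y - grad_f(y) / L
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev, t = x, t_next
    return x_prev

# Illustrative use on least squares: f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 20))
b = rng.standard_normal(100)
L = np.linalg.norm(A, 2) ** 2            # smoothness constant
x_hat = accelerated_gradient(lambda x: A.T @ (A @ x - b), np.zeros(20), L)
print(np.linalg.norm(A.T @ (A @ x_hat - b)))   # gradient norm at the output
\end{verbatim}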
We further cover proximal acceleration techniques, at the heart of the
\emph{Catalyst} and \emph{Accelerated Hybrid Proximal Extragradient}
frameworks, using similar algorithmic patterns.
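The generic pattern behind these schemes can be sketched as an outer
accelerated loop over regularized subproblems
$\min_x f(x) + \tfrac{\kappa}{2}\|x - y\|^2$, each solved only
approximately by an inner first-order method; the parameter choices and
fixed inner budget below are placeholder assumptions, not the actual
Catalyst or A-HPE rules.
\begin{verbatim}
# Illustrative sketch of accelerated inexact proximal iterations;
# kappa, the inner solver, and its budget are assumed, not prescribed.
import numpy as np

def inexact_prox(grad_f, y, kappa, L, inner_iters=30):
    # Approximately minimize f(x) + (kappa/2)*||x - y||^2 by gradient
    # steps; the subproblem is (L + kappa)-smooth and kappa-strongly convex.
    z = y.copy()
    step = 1.0 / (L + kappa)
    for _ in range(inner_iters):
        z = z - step * (grad_f(z) + kappa * (z - y))
    return z

def accelerated_proximal_point(grad_f, x0, L, kappa=1.0, outer_iters=50):
    # Outer accelerated loop over inexact proximal steps.
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(outer_iters):
        x = inexact_prox(grad_f, y, kappa, L)             # inexact prox step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)       # extrapolation
        x_prev, t = x, t_next
    return x_prev
\end{verbatim}
In these frameworks, $\kappa$ and the accuracy required of each inner
solve are tied to the regularity of $f$ and to the outer extrapolation
sequence, which is precisely what the analyses make rigorous.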
Common acceleration techniques rely directly on knowledge of some
regularity parameters of the problem at hand. We conclude by discussing
\emph{restart} schemes, a set of simple techniques for reaching nearly
optimal convergence rates while adapting to unobserved regularity
parameters.
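As one simple example in this family, momentum can be reset whenever the
objective stops decreasing; the sketch below uses this function-value
restart heuristic with our own parameter choices, and is not the specific
restart scheme analyzed in the text.
\begin{verbatim}
# Illustrative sketch: accelerated gradient method with a simple
# function-value restart heuristic; assumed parameters throughout.
import numpy as np

def restarted_accelerated_gradient(f, grad_f, x0, L, iters=500):
    # Whenever f increases, discard the accumulated momentum and restart
    # from the previous iterate with a plain gradient step.
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(iters):
        x = y - grad_f(y) / L
        if f(x) > f(x_prev):
            t = 1.0                          # reset the momentum sequence
            y = x_prev
            x = y - grad_f(y) / L            # guaranteed descent step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev, t = x, t_next
    return x_prev
\end{verbatim}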