Acceleration with a Ball Optimization Oracle

Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian

Consider an oracle which takes a point $x$ and returns the minimizer of a
convex function $f$ in an $\ell_2$ ball of radius $r$ around $x$. It is
straightforward to show that roughly $r^{-1}\log\frac{1}{\epsilon}$ calls to
the oracle suffice to find an $\epsilon$-approximate minimizer of $f$ in an
$\ell_2$ unit ball. Perhaps surprisingly, this is not optimal: we design an
accelerated algorithm which attains an $\epsilon$-approximate minimizer with
roughly $r^{-2/3}\log\frac{1}{\epsilon}$ oracle queries, and give a matching
lower bound. Further, we implement ball optimization oracles for functions with
locally stable Hessians using a variant of Newton's method. The resulting
algorithm applies to a number of problems of practical and theoretical import,
improving upon previous results for logistic and $\ell_\infty$ regression and
achieving guarantees comparable to the state-of-the-art for $\ell_p$
regression.
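
To make the baseline rate concrete: re-centering the ball at each oracle output moves up to a distance $r$ toward the minimizer per call, so on the order of $r^{-1}$ calls reduce the optimality gap by a constant factor, giving $r^{-1}\log\frac{1}{\epsilon}$ overall. Below is a minimal Python sketch of this scheme; the oracle itself is approximated by projected gradient descent, and all names and parameter choices (`ball_oracle`, `naive_ball_descent`, step sizes, iteration counts) are illustrative assumptions rather than the paper's method.

```python
import numpy as np

def project_ball(y, center, r):
    """Euclidean projection onto the l2 ball of radius r around center."""
    d = y - center
    n = np.linalg.norm(d)
    return y if n <= r else center + (r / n) * d

def ball_oracle(grad, x, r, step=0.1, iters=500):
    """Illustrative ball optimization oracle: approximately minimizes f over
    B(x, r) via projected gradient descent (an assumption; any solver works)."""
    y = x.copy()
    for _ in range(iters):
        y = project_ball(y - step * grad(y), x, r)
    return y

def naive_ball_descent(grad, x0, r, calls):
    """Baseline scheme from the abstract: re-center the ball at each oracle
    output. Roughly r^{-1} log(1/eps) calls suffice for convex f."""
    x = x0
    for _ in range(calls):
        x = ball_oracle(grad, x, r)
    return x

# Toy check: f(y) = 0.5 * ||y - y_star||^2 with minimizer in the unit ball.
y_star = np.array([0.6, -0.3])
grad = lambda y: y - y_star
x = naive_ball_descent(grad, np.zeros(2), r=0.2, calls=10)
print(np.linalg.norm(x - y_star))  # ~0 after roughly 1/r re-centering calls
```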
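The oracle implementation via Newton's method can likewise be sketched. For a convex $f$ whose Hessian is stable on $B(x, r)$, the local quadratic model is accurate across the whole ball, so iterating ball-constrained model minimization (a classical trust-region subproblem) converges quickly. The sketch below captures this spirit only: the subproblem solver (eigendecomposition plus bisection on the Lagrange multiplier) and the fixed iteration count are assumptions, not the paper's exact algorithm, and the accelerated $r^{-2/3}$ scheme itself is not reproduced here.

```python
import numpy as np

def trust_region_subproblem(g, H, r):
    """min_{||s|| <= r} g^T s + 0.5 s^T H s for symmetric PSD H, via an
    eigendecomposition and bisection on the Lagrange multiplier."""
    w, V = np.linalg.eigh(H)
    gt = V.T @ g
    def step_norm(lam):
        return np.linalg.norm(gt / (w + lam))
    if w.min() > 0 and step_norm(0.0) <= r:   # interior Newton step fits
        return -V @ (gt / w)
    lo, hi = 1e-12, 1.0
    while step_norm(hi) > r:                  # bracket the multiplier
        hi *= 2.0
    for _ in range(100):                      # bisect until ||s(lam)|| = r
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if step_norm(mid) > r else (lo, mid)
    return -V @ (gt / (w + hi))

def newton_ball_oracle(grad, hess, x, r, iters=20):
    """Sketch of a Newton-type ball oracle: iterate y by minimizing the local
    quadratic model of f over the fixed ball B(x, r). When the Hessian is
    stable on the ball, each model is accurate and the iterates converge."""
    y = x.copy()
    for _ in range(iters):
        g, H = grad(y), hess(y)
        g_eff = g + H @ (x - y)               # re-center the model at x
        y = x + trust_region_subproblem(g_eff, H, r)
    return y

# Toy check on a convex quadratic f(y) = 0.5 y^T A y - b^T y.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -1.0])
x0 = np.zeros(2)
y = newton_ball_oracle(lambda y: A @ y - b, lambda y: A, x0, r=0.5)
print(np.linalg.norm(y - x0))  # = 0.5: constrained minimizer lies on the boundary
```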