Accelerating Perturbed Stochastic Iterates in Asynchronous Lock-Free Optimization
Kaiwen Zhou, Anthony Man-Cho So, James Cheng
We show that stochastic acceleration can be achieved under the perturbed
iterate framework (Mania et al., 2017) in asynchronous lock-free optimization,
which leads to the optimal incremental gradient complexity for finite-sum
objectives. We prove that our new accelerated method requires the same linear
speed-up condition as the existing non-accelerated methods. Our core
algorithmic discovery is a new accelerated SVRG variant with sparse updates.
Empirical results are presented to verify our theoretical findings.
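For readers unfamiliar with the variance-reduction scheme the abstract builds on, the following is a minimal serial, dense SVRG sketch for a least-squares finite-sum objective. It is background only, not the paper's accelerated, sparse, asynchronous variant; the function name `svrg`, the problem data `A`, `b`, and the hyperparameters `step`, `epochs`, `inner` are illustrative assumptions.

```python
import numpy as np

def svrg(A, b, step=1e-2, epochs=20, inner=None, seed=0):
    """Plain serial SVRG for f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.

    Illustrative only: the paper's method additionally uses acceleration,
    sparse updates, and asynchronous lock-free execution.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    inner = inner or n
    x = np.zeros(d)
    for _ in range(epochs):
        x_snap = x.copy()                        # snapshot point
        full_grad = A.T @ (A @ x_snap - b) / n   # full gradient at the snapshot
        for _ in range(inner):
            i = rng.integers(n)
            a_i = A[i]
            # variance-reduced stochastic gradient:
            # grad_i(x) - grad_i(x_snap) + full_grad
            g = a_i * (a_i @ x - b[i]) - a_i * (a_i @ x_snap - b[i]) + full_grad
            x -= step * g
    return x

# usage sketch: x_hat = svrg(A, b) approximately minimizes (1/2n) * ||A x - b||^2
```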