Parallelizing Stochastic Approximation Through Mini-Batching and Averaging
Wednesday, May 18, 2016 - 10:00am - 10:50am
We consider the problem of stochastic approximation, which involves minimizing a function available only through access to an unbiased estimate of its gradient. In the particular case of least squares regression, we seek to understand how far we can parallelize these algorithms using simple averaging and mini-batching procedures. We provide sharp rates and practical algorithms for this problem, which we believe may be helpful more generally. In the process, we address questions such as: how much does mini-batching help, on an instance-by-instance basis?
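To make the setting concrete, here is a minimal sketch (not from the talk itself; all parameter choices and names are illustrative) of mini-batch SGD with Polyak-Ruppert iterate averaging for least squares, where each mini-batch gradient is an unbiased estimate of the full gradient:

```python
import numpy as np

def minibatch_sgd_averaged(X, y, batch_size=8, step=0.1, n_steps=2000, seed=0):
    """Mini-batch SGD on the least squares objective 0.5*||Xw - y||^2 / n,
    returning the running average of the iterates (Polyak-Ruppert averaging).
    Hyperparameters here are illustrative, not tuned or taken from the talk."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    w_avg = np.zeros(d)
    for t in range(n_steps):
        idx = rng.integers(0, n, size=batch_size)     # sample a mini-batch
        Xb, yb = X[idx], y[idx]
        # Unbiased estimate of the full gradient X^T (X w - y) / n
        grad = Xb.T @ (Xb @ w - yb) / batch_size
        w -= step * grad
        w_avg += (w - w_avg) / (t + 1)                # running iterate average
    return w_avg

# Synthetic instance: recover w_true from noisy linear observations.
rng = np.random.default_rng(1)
n, d = 500, 5
w_true = rng.standard_normal(d)
X = rng.standard_normal((n, d))
y = X @ w_true + 0.01 * rng.standard_normal(n)

w_hat = minibatch_sgd_averaged(X, y)
print(np.linalg.norm(w_hat - w_true))  # small when SGD has converged
```

Larger mini-batches reduce gradient variance and admit larger step sizes (the parallelism the abstract refers to), while averaging the iterates damps the residual noise of the final steps.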