Some Recent Developments in Large-Scale Convex Optimization

Tuesday, February 24, 2015 - 3:15pm - 4:05pm
Keller 3-180
Niao He (Georgia Institute of Technology)
In the modern era of large-scale machine learning and high-dimensional statistics, mixed regularization and kernelization have become increasingly popular and important modeling strategies. However, they often lead to very complex optimization models with extremely large-scale, nonsmooth objective functions, which pose new challenges to traditional first-order methods due to the expensive computation or memory cost of proximity operators and even gradients. In this talk, I will discuss some recent algorithmic advances that cope with these challenges by exploiting the underlying problem structures and using randomization techniques. I will present (i) my work on the composite mirror prox algorithm for a broad class of variational inequalities, which covers composite minimization problems with multiple nonsmooth regularization terms, and (ii) my work on the doubly stochastic gradient descent algorithm for stochastic optimization problems over reproducing kernel Hilbert spaces. These algorithms exhibit optimal convergence rates and make it practical to handle problems with extremely large dimensions and large datasets. Beyond their theoretical efficiency, the algorithms have also proven useful in a wide range of applications in machine learning, image processing, and statistical inference.
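To make the second idea concrete, below is a minimal, illustrative sketch of a doubly stochastic gradient method for regression with an RBF kernel: at every step it samples both a random data point and a random Fourier feature, so neither the full dataset nor the kernel matrix is ever materialized. The setup (squared loss, kernel ridge regression) and all names and parameters (fit_dsg, gamma, reg, the step-size schedule) are assumptions made for illustration, not details from the talk.

```python
import numpy as np

def fit_dsg(X, y, T=2000, gamma=1.0, reg=1e-4, eta0=0.5, seed=0):
    """Illustrative doubly stochastic gradient sketch for kernel ridge
    regression with k(x, x') = exp(-gamma * ||x - x'||^2) and squared loss.
    Each iteration samples one data point AND one random Fourier feature."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    omegas = np.zeros((T, d))  # sampled feature directions
    phases = np.zeros(T)       # sampled phases
    alphas = np.zeros(T)       # coefficients: f(x) = sum_t alphas[t] * phi_t(x)

    def features(x, t):
        # Evaluate the first t random Fourier features at point x.
        return np.sqrt(2.0) * np.cos(omegas[:t] @ x + phases[:t])

    for t in range(T):
        i = rng.integers(n)  # random data point
        # Random feature drawn from the spectral density of the RBF kernel.
        omegas[t] = rng.normal(0.0, np.sqrt(2.0 * gamma), size=d)
        phases[t] = rng.uniform(0.0, 2.0 * np.pi)
        eta = eta0 / np.sqrt(t + 1.0)            # decaying step size (illustrative choice)
        f_xi = alphas[:t] @ features(X[i], t)    # current prediction at the sampled point
        alphas[:t] *= 1.0 - eta * reg            # shrinkage from the ridge penalty
        # New coefficient from the stochastic gradient of the squared loss.
        alphas[t] = -eta * (f_xi - y[i]) * np.sqrt(2.0) * np.cos(omegas[t] @ X[i] + phases[t])

    def predict(x):
        return alphas @ features(x, T)

    return predict

# Tiny usage example on synthetic 1-D data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
    f = fit_dsg(X, y, T=3000, gamma=0.5)
    print(f(np.array([0.5])), np.sin(0.5))
```

Note the design point this sketch is meant to surface: memory grows only with the number of iterations (one coefficient, direction, and phase per step), not with the dataset size or an n-by-n kernel matrix.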

Based on recent joint work with (1) A. Juditsky and A. Nemirovski, and (2) B. Dai, B. Xie, Y. Liang, M.F. Balcan, and L. Song.
MSC Code: 47N10