Tutorial - Algorithms for Sparse Optimization and Machine Learning

Tuesday, March 27, 2012 - 10:00am - 10:45am
Keller 3-180
Stephen Wright (University of Wisconsin, Madison)
Machine learning, computational statistics, and signal reconstruction
in compressed sensing have proved to be a rich source of interesting
and challenging optimization problems in recent years. In these and
other fields, the optimization formulations require the computed
solution to satisfy some structural requirements (such as sparsity or
low total variation), as well as the traditional requirements of
feasibility and low objective value. Although the optimization
problems arising from these areas are quite diverse, there are common
features such as large underlying data sets and high-dimensional
variable spaces that influence the choice of algorithmic tools most
suitable for solving them.

In this tutorial, we start by surveying the space of optimization
problems that arise in machine learning and sparse optimization,
highlighting the features of each application and the requirements
they place on algorithms. We then sketch several fundamental tools
from optimization that have proved useful in these applications. These
include first-order methods and their accelerated variants, stochastic
gradient and incremental gradient methods, augmented Lagrangian
methods, shrinkage/thresholding approaches, coordinate relaxation, and
higher-order approaches. We also discuss the role of duality and the
opportunities for parallel computing.