Survey/Tutorial lecture - Sparsity, Regularization, and Applications

Thursday, September 8, 2011 - 2:00pm - 3:00pm
Keller 3-180
Joel Tropp (California Institute of Technology)
The purpose of this tutorial is to describe the intellectual apparatus that supports some modern techniques in statistics, machine learning, signal processing, and related areas. The main ingredient is the observation that many types of data admit parsimonious representations, i.e., there are far fewer degrees of freedom in the data than the ambient dimension would suggest. The second ingredient is a collection of tractable algorithms that can effectively search for a parsimonious solution to a data analysis problem, even though these types of constraints tend to be nonconvex. Together, the theory of sparsity and sparse regularization can be viewed as a framework for treating a huge variety of computational problems in data analysis. We conclude with some applications where these two ideas play a dominant role.
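The "tractable algorithms" mentioned above typically replace the nonconvex sparsity constraint with a convex surrogate such as the l1 norm. As a minimal illustration (not part of the talk, and only one of many such methods), the sketch below uses iterative soft-thresholding (ISTA) to solve the l1-regularized least-squares problem and recover a sparse vector from a small number of random linear measurements; all names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iters=2000):
    # Iterative soft-thresholding for: min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1.
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the
    # gradient of the smooth term.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)          # gradient of the least-squares term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Illustrative setup: recover a 5-sparse vector in 100 dimensions
# from 40 random measurements (far fewer than the ambient dimension).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[rng.choice(100, size=5, replace=False)] = rng.standard_normal(5)
b = A @ x_true

x_hat = ista(A, b, lam=0.01)
```

Even though counting nonzeros is a nonconvex constraint, the l1 penalty makes the problem convex, and under suitable conditions on A the minimizer coincides (up to a small bias) with the true sparse vector.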