Abstracts and Talk Materials

Douglas N. Arnold (University of Minnesota) http://www.ima.umn.edu/~arnold/

Welcome and introduction

Mon Jun 04 08:50:00 - 09:00:00

Leon Axel (New York University) , Steen Moeller (University of Minnesota)

Introduction to MRI

Tue Jun 05 14:00:00 - 15:00:00

Richard Baraniuk (Rice University) http://dsp.rice.edu/~richb

An introduction to transform coding

Mon Jun 11 14:00:00 - 15:00:00

Richard Baraniuk (Rice University) http://dsp.rice.edu/~richb

Compressive sensing for time signals: Analog to information conversion

Mon Jun 11 15:30:00 - 16:30:00

Richard Baraniuk (Rice University) http://dsp.rice.edu/~richb

Compressive sensing for detection and classification problems

Tue Jun 12 14:00:00 - 15:00:00

Richard Baraniuk (Rice University) http://dsp.rice.edu/~richb

Multi-signal, distributed compressive sensing

Tue Jun 12 15:30:00 - 16:30:00

Richard Baraniuk (Rice University) http://dsp.rice.edu/~richb

Compressive imaging with a single pixel camera

Wed Jun 13 11:00:00 - 12:30:00

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

Sparsity

Mon Jun 04 09:00:00 - 10:30:00

After a rapid and glossy introduction to compressive sampling (or compressed sensing, as it is also called), the lecture will introduce sparsity as a key modeling tool and review the crucial role played by sparsity in various areas such as data compression, statistical estimation, and scientific computing.

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

Sparsity and the l1 norm

Tue Jun 05 09:00:00 - 10:30:00

In many applications, one often has fewer equations than unknowns. While this seems hopeless, we will show that the premise that the object we wish to recover is sparse or compressible radically changes the problem, making the search for solutions feasible. This lecture discusses the importance of the l1-norm as a sparsity promoting functional and will go through a series of examples touching on many areas of data processing.
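The recovery idea sketched in this abstract can be tried numerically. The toy example below (my own illustration, not part of the course materials; all parameter choices are arbitrary) solves an underdetermined system with a sparse ground truth by minimizing the l1 norm, cast as a linear program via the standard split x = u - v:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 50, 100, 5          # 50 equations, 100 unknowns, 5 nonzeros

# Sparse ground truth and an underdetermined Gaussian system A x = b.
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

# Basis pursuit: minimize ||x||_1 subject to A x = b, written as a linear
# program over (u, v) with x = u - v and u, v >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print(np.max(np.abs(x_hat - x_true)))  # tiny: the sparse solution is found
```

Despite having twice as many unknowns as equations, the l1 solution typically coincides with the sparse ground truth in this regime.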

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

Compressive sampling: sparsity and incoherence

Wed Jun 06 09:00:00 - 10:30:00

Compressed sensing essentially relies on two tenets: the first is that the object we wish to recover is compressible in the sense that it has a sparse expansion in a set of basis functions; the second is that the measurements we make (the sensing waveforms) must be incoherent with these basis functions. This lecture will introduce key results in the field, such as a new kind of sampling theorem which states that one can sample a spectrally sparse signal at a rate close to the information rate, without information loss.

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

The uniform uncertainty principle

Thu Jun 07 09:00:00 - 10:30:00

We introduce a strong form of uncertainty relation and discuss its fundamental role in the theory of compressive sampling. We give examples of random sensing matrices obeying this strong uncertainty principle; e.g. Gaussian matrices.
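The property behind this strong uncertainty relation can be observed empirically. The sketch below (my own illustration; dimensions are arbitrary) draws a Gaussian matrix with N(0, 1/m) entries and checks that it acts as a near-isometry on random sparse vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k, trials = 60, 256, 6, 200

# An m x n Gaussian matrix with N(0, 1/m) entries approximately preserves
# the Euclidean norm of k-sparse vectors.
A = rng.standard_normal((m, n)) / np.sqrt(m)

ratios = []
for _ in range(trials):
    x = np.zeros(n)
    x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    ratios.append(np.linalg.norm(A @ x) / np.linalg.norm(x))

print(min(ratios), max(ratios))  # both concentrate near 1
```

The observed norm ratios cluster tightly around 1, which is the qualitative content of the uniform uncertainty principle for Gaussian ensembles.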

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

The role of probability in compressive sampling

Fri Jun 08 09:00:00 - 10:30:00

This lecture will discuss the crucial role played by probability in compressive sampling; we will discuss techniques for obtaining nonasymptotic results about extremal eigenvalues of random matrices. Of special interest is the role played by high-dimensional convex geometry and techniques from geometric functional analysis, such as Rudelson's selection lemma, and by powerful results in the probabilistic theory of Banach spaces, such as Talagrand's concentration inequality.

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

Robust compressive sampling and connections with statistics

Mon Jun 11 09:00:00 - 10:30:00

We show that compressive sampling is, perhaps surprisingly, robust with respect to modeling and measurement errors.

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

Robust compressive sampling and connections with statistics (continued)

Tue Jun 12 09:00:00 - 10:30:00

We show that accurate estimation from noisy undersampled data is sometimes possible and connect our results with a large literature in statistics concerned with high dimensionality; that is, situations in which the number of observations is less than the number of parameters.
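Estimation from noisy undersampled data can be sketched with an l1-penalized least-squares (lasso) fit. The example below (my own illustration, not from the lecture; the penalty weight and iteration count are arbitrary choices) uses plain iterative soft thresholding:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, k = 80, 200, 5

# Sparse signal with well-separated nonzeros, observed with noise.
x_true = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x_true[idx] = rng.uniform(1.0, 2.0, k) * rng.choice([-1.0, 1.0], k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true + 0.01 * rng.standard_normal(m)

# Iterative soft thresholding for min 0.5 ||A x - b||^2 + lam ||x||_1.
lam = 0.05
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(1000):
    z = x - (A.T @ (A @ x - b)) / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

print(np.linalg.norm(x - x_true))  # small despite noise and undersampling
```

With far fewer observations than parameters, the estimate still lands close to the truth, which is the phenomenon the lecture connects to the high-dimensional statistics literature.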

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

Connections with information and coding theory

Wed Jun 13 09:00:00 - 10:30:00

We morph compressive sampling into an error-correcting code, and explore the implications of this sampling theory for lossy compression and its relationship with universal source coding.

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

Modern convex optimization

Thu Jun 14 09:00:00 - 10:30:00

We will survey the literature on interior-point methods, which are very efficient numerical algorithms for solving large-scale convex optimization problems.

Emmanuel J. Candès (California Institute of Technology) http://www.acm.caltech.edu/~emmanuel/

Applications, experiments and open problems

Fri Jun 15 09:00:00 - 10:30:00

We discuss several applications of compressive sampling in the areas of analog-to-digital conversion and biomedical imaging, and review some numerical experiments in new directions. We conclude by exposing the participants to some important open problems.

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

Signal encoding

Mon Jun 04 11:00:00 - 12:30:00

Shannon-Nyquist Theory, Pulse Code Modulation, Sigma-Delta Modulation, Kolmogorov entropy, optimal encoding.
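Of the encoding schemes listed, Sigma-Delta modulation is easy to demonstrate. The sketch below (my own illustration; the signal and window length are arbitrary) implements a first-order modulator: a one-bit quantizer with the quantization error fed back, so that a running average of the bit stream tracks the oversampled input:

```python
import numpy as np

# First-order Sigma-Delta modulation: encode an oversampled signal into a
# one-bit stream whose local averages track the input.
def sigma_delta(x):
    bits = np.empty_like(x)
    u = 0.0                        # internal state: accumulated error
    for i, s in enumerate(x):
        bits[i] = 1.0 if u + s >= 0 else -1.0
        u = u + s - bits[i]        # feed the quantization error back
    return bits

t = np.linspace(0.0, 1.0, 2000)
x = 0.5 * np.sin(2 * np.pi * 3 * t)      # band-limited input, |x| < 1
bits = sigma_delta(x)

# A length-N moving average reconstructs the signal; for a first-order
# scheme the quantization error decays like 1/N.
N = 50
recon = np.convolve(bits, np.ones(N) / N, mode="same")
err = np.max(np.abs(recon[N:-N] - x[N:-N]))
print(err)
```

The reconstruction error here is dominated by the 1/N decay of the averaged quantization noise, the trade-off between oversampling and bit depth that the lecture's entropy discussion makes precise.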

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

Compression

Tue Jun 05 11:00:00 - 12:30:00

Best k-term approximation for bases and dictionaries, decay rates, approximation classes, application to image compression via wavelet decompositions.
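Best k-term approximation in an orthonormal basis amounts to keeping the k largest coefficients and zeroing the rest. The sketch below (my own illustration; the test signal and the DCT basis are arbitrary choices) shows the resulting error decaying with k for a smooth, hence compressible, signal:

```python
import numpy as np
from scipy.fft import dct, idct

# Best k-term approximation: keep the k largest-magnitude coefficients
# of an orthonormal expansion and zero out the rest.
n = 512
t = np.linspace(0.0, 1.0, n)
x = np.exp(-5 * t) * np.sin(2 * np.pi * 4 * t)   # smooth => compressible

c = dct(x, norm="ortho")
errs = []
for k in (10, 20, 40):
    thresh = np.sort(np.abs(c))[-k]
    ck = np.where(np.abs(c) >= thresh, c, 0.0)   # k largest survive
    xk = idct(ck, norm="ortho")
    errs.append(np.linalg.norm(x - xk) / np.linalg.norm(x))

print(errs)  # relative error shrinks as k grows
```

The decay rate of these errors as a function of k is exactly what the approximation classes in the lecture quantify.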

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

Discrete compressed sensing

Wed Jun 06 11:00:00 - 12:30:00

The problem, best matrices for classes, Gelfand widths and their connection to compressed sensing.

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

The restricted isometry property (RIP)

Thu Jun 07 11:00:00 - 12:30:00

Performance of compressed sensing under RIP.

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

Construction of CS matrices with best RIP

Fri Jun 08 11:00:00 - 12:30:00

Bernoulli and Gaussian random variables.

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

Performance of CS matrices revisited

Mon Jun 11 11:00:00 - 12:30:00

Proofs of the Kashin-Gluskin theorems.

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

Performance in probability

Tue Jun 12 11:00:00 - 12:30:00

Examples of performance for Gaussian and Bernoulli ensembles.

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

Decoders

Wed Jun 13 14:00:00 - 15:00:00

l1 minimization, greedy algorithms, iterated least squares.
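One of the greedy decoders listed can be sketched in a few lines. The example below (my own illustration, not from the lecture; dimensions and coefficient magnitudes are arbitrary) implements orthogonal matching pursuit: repeatedly pick the column most correlated with the residual, then re-fit by least squares on the selected support:

```python
import numpy as np

# Orthogonal matching pursuit: greedy support selection plus a least-squares
# re-fit on the chosen columns at every step.
def omp(A, b, k):
    support, r = [], b.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))      # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        r = b - A[:, support] @ coef             # updated residual
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(4)
m, n, k = 80, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = (
    rng.uniform(1.0, 2.0, k) * rng.choice([-1.0, 1.0], k))
b = A @ x_true

x_hat = omp(A, b, k)
print(np.max(np.abs(x_hat - x_true)))  # near machine precision when the support is found
```

Once the correct support is identified, the least-squares step recovers the coefficients exactly from noiseless data, which is why greedy decoders are competitive with l1 minimization in this regime.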

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

Performance of iterated least squares

Thu Jun 14 11:00:00 - 12:30:00

Convergence and exponential convergence.

Ronald DeVore (University of South Carolina) http://www.math.sc.edu/~devore/

Deterministic constructions of CS Matrices

Fri Jun 15 10:45:00 - 11:45:00

Constructions from finite fields, circulant matrices.

Anna Gilbert (University of Michigan)

Algorithms for Compressed Sensing, I

Thu Jun 07 14:00:00 - 15:00:00

What algorithmic problem do we mean by Compressed Sensing? There are a variety of alternatives, each with different algorithmic solutions (both theoretical and practical). I will discuss some of the different types of results, from the combinatorial to the probabilistic.

Anna Gilbert (University of Michigan)

Algorithms for Compressed Sensing, II

Fri Jun 08 14:00:00 - 15:00:00

What do these algorithms all have in common? What are the common goals of the problems and how do they achieve them? I will discuss several known techniques and open problems.

Short presentations by participants

Mon Jun 04 14:00:00 - 15:30:00

Discussion

Wed Jun 06 15:00:00 - 15:30:00

Discussion

Thu Jun 07 15:00:00 - 15:30:00

Short presentations by participants

Thu Jun 14 14:00:00 - 15:00:00