Abstracts and Talk Materials
Compressive Sampling and Frontiers in Signal Processing
June 4 - 15, 2007


Douglas N. Arnold (University of Minnesota, Twin Cities)
http://umn.edu/~arnold/

Welcome and introduction
June 4, 2007


Leon Axel (New York University)
Steen Moeller (University of Minnesota, Twin Cities)

Introduction to MRI
June 5, 2007


Richard G. Baraniuk (Rice University)
http://dsp.rice.edu/~richb

Compressive sensing for time signals: Analog-to-information conversion
June 11, 2007


Richard G. Baraniuk (Rice University)
http://dsp.rice.edu/~richb

An introduction to transform coding
June 11, 2007


Richard G. Baraniuk (Rice University)
http://dsp.rice.edu/~richb

Compressive sensing for detection and classification problems
June 12, 2007


Richard G. Baraniuk (Rice University)
http://dsp.rice.edu/~richb

Multi-signal, distributed compressive sensing
June 12, 2007


Richard G. Baraniuk (Rice University)
http://dsp.rice.edu/~richb

Compressive imaging with a single pixel camera
June 13, 2007


Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

Connections with information and coding theory
June 13, 2007

We morph compressive sampling into an error-correcting code and explore the implications of this sampling theory for lossy compression, along with its relationship with universal source coding.
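
One concrete form of the morph, in its standard formulation (a sketch; the lecture's exact setup may differ): if a codeword Ax is corrupted by an error e that is sparse but otherwise arbitrary, the message can be recovered by solving

    \hat{x} = \arg\min_x \|y - Ax\|_{\ell_1}, \qquad y = Ax_0 + e,

and the recovery is exact whenever e is sufficiently sparse relative to the properties of A.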

Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

Modern convex optimization
June 14, 2007

We will survey the literature on interior-point methods, which are very efficient numerical algorithms for solving large-scale convex optimization problems.
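
As a sketch of the basic idea behind such methods (the classical log-barrier scheme; the lecture may cover other variants): a constrained problem min f_0(x) subject to g_i(x) <= 0 is replaced by a family of smooth unconstrained problems

    \min_x \; t\, f_0(x) - \sum_i \log\big(-g_i(x)\big),

solved by Newton's method for increasing values of t; the minimizers trace the central path toward the constrained optimum.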

Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

Applications, experiments and open problems
June 15, 2007

We discuss several applications of compressive sampling in the areas of analog-to-digital conversion and biomedical imaging, and review some numerical experiments in new directions. We conclude by exposing the participants to some important open problems.

Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

Sparsity
June 4, 2007

After a rapid and glossy introduction to compressive sampling (or compressed sensing, as it is also called), the lecture will introduce sparsity as a key modeling tool and review the crucial role played by sparsity in various areas such as data compression, statistical estimation, and scientific computing.

Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

Robust compressive sampling and connections with statistics (continued)
June 12, 2007

We show that accurate estimation from noisy undersampled data is sometimes possible and connect our results with a large literature in statistics concerned with high dimensionality; that is, situations in which the number of observations is less than the number of parameters.
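
A representative estimator from that literature (an illustrative choice, not necessarily the one emphasized in the lecture) is the LASSO,

    \hat{x} = \arg\min_x \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda\|x\|_1,

which succeeds precisely in the high-dimensional regime, with far fewer observations than parameters, provided the parameter vector is sparse.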

Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

Sparsity and the l1 norm
June 5, 2007

In many applications one has fewer equations than unknowns. While this seems hopeless, we will show that the premise that the object we wish to recover is sparse or compressible radically changes the problem, making the search for solutions feasible. This lecture discusses the importance of the l1 norm as a sparsity-promoting functional and will go through a series of examples touching on many areas of data processing.
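
As a small illustration of the l1 program in action, here is a minimal Python sketch that recasts min ||x||_1 subject to Ax = b as a linear program; the dimensions, random seed, and solver are illustrative assumptions, not taken from the lecture.

    # Basis pursuit as a linear program: variables z = [x; t], with |x_i| <= t_i.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    n, m, k = 128, 48, 5                      # unknowns, equations, sparsity
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true

    c = np.concatenate([np.zeros(n), np.ones(n)])   # minimize sum(t)
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])            # x - t <= 0, -x - t <= 0
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # Ax = b
    bounds = [(None, None)] * n + [(0, None)] * n

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b, bounds=bounds)
    x_hat = res.x[:n]
    print("recovery error:", np.linalg.norm(x_hat - x_true))

Despite having only 48 equations for 128 unknowns, the 5-sparse vector is typically recovered exactly.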

Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

Robust compressive sampling and connections with statistics
June 11, 2007

We show that compressive sampling is, perhaps surprisingly, robust vis-à-vis modeling and measurement errors.
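
Robustness to measurement noise is usually formalized by relaxing the equality constraint (a standard formulation, stated here for reference): with data b = Ax + z and a known noise level ||z||_2 <= epsilon, one solves

    \min_x \|x\|_1 \quad \text{subject to} \quad \|Ax - b\|_2 \le \varepsilon,

and under suitable conditions on A the reconstruction error is bounded by a constant times epsilon.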

Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

Compressive sampling: sparsity and incoherence
June 6, 2007

Compressed sensing essentially relies on two tenets: the first is that the object we wish to recover is compressible in the sense that it has a sparse expansion in a set of basis functions; the second is that the measurements we make (the sensing waveforms) must be incoherent with these basis functions. This lecture will introduce key results in the field, such as a new kind of sampling theorem which states that one can sample a spectrally sparse signal at a rate close to the information rate, and this without information loss.
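
The two tenets are commonly quantified as follows (a standard formulation, included for reference). With Phi the sensing basis and Psi the sparsity basis, both orthonormal in R^n, the mutual coherence is

    \mu(\Phi, \Psi) = \sqrt{n}\, \max_{k,j} |\langle \varphi_k, \psi_j \rangle| \in [1, \sqrt{n}],

and on the order of mu^2(Phi, Psi) * S * log n incoherent samples suffice, with high probability, to recover an S-sparse object exactly.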

Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

The uniform uncertainty principle
June 7, 2007

We introduce a strong form of uncertainty relation and discuss its fundamental role in the theory of compressive sampling. We give examples of random sensing matrices obeying this strong uncertainty principle, e.g., Gaussian matrices.
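
In its usual quantitative form (as commonly stated in the literature), the principle asks that the restricted isometry constant delta_S, the smallest delta such that

    (1 - \delta)\|x\|_2^2 \le \|Ax\|_2^2 \le (1 + \delta)\|x\|_2^2 \quad \text{for all } S\text{-sparse } x,

be small; Gaussian matrices with i.i.d. N(0, 1/m) entries achieve this with high probability for sparsity levels S on the order of m / log(n/m).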

Emmanuel J. Candès (California Institute of Technology)
http://www.acm.caltech.edu/~emmanuel/

The role of probability in compressive sampling
June 8, 2007

This lecture will discuss the crucial role played by probability in compressive sampling, including techniques for obtaining nonasymptotic results about the extremal eigenvalues of random matrices. Of special interest are high-dimensional convex geometry, techniques from geometric functional analysis such as Rudelson's selection lemma, and powerful results from the probabilistic theory of Banach spaces such as Talagrand's concentration inequality.
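
A representative statement of the kind of nonasymptotic result involved (a sketch of the standard concentration step, not a summary of the lecture): for a fixed vector x and a suitably normalized Gaussian or Bernoulli matrix A with m rows,

    \mathbb{P}\big(\, \big|\, \|Ax\|_2^2 - \|x\|_2^2 \,\big| \ge \delta \|x\|_2^2 \,\big) \le 2\, e^{-c(\delta)\, m},

and a union bound over a covering of the sparse vectors upgrades this pointwise bound to a uniform one.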

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

Discrete compressed sensing
June 6, 2007

The problem, best matrices for classes, Gelfand widths and their connection to compressed sensing.
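
For reference, the Gelfand width of a class K in a norm X (the standard definition used in this connection) is

    d^m(K)_X = \inf_{\mathrm{codim}(Y) \le m} \; \sup \{\, \|x\|_X : x \in K \cap Y \,\},

and the best achievable error of any m-measurement encoder-decoder pair over K is comparable to this width, which is what ties widths to compressed sensing.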

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

Performance of iterated least squares
June 14, 2007

Convergence and exponential convergence.
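
A minimal Python sketch of an iteration of this type, assuming a weighted least-squares update under the constraint Ax = b and a shrinking smoothing parameter; the weight rule, the epsilon update, and the sparsity input k are illustrative assumptions, not the lecture's exact scheme.

    import numpy as np

    def irls_l1(A, b, k, iters=50):
        """IRLS sketch for min ||x||_1 s.t. Ax = b; k is a presumed sparsity."""
        x = np.linalg.lstsq(A, b, rcond=None)[0]   # least-squares start
        eps = 1.0
        for _ in range(iters):
            d = np.sqrt(x**2 + eps**2)             # d_i = 1/w_i
            # minimizer of sum_i w_i x_i^2 subject to Ax = b:
            #   x = D A^T (A D A^T)^{-1} b,  D = diag(d)
            x = d * (A.T @ np.linalg.solve(A @ (d[:, None] * A.T), b))
            # shrink smoothing using the (k+1)-st largest entry of |x|
            eps = min(eps, np.sort(np.abs(x))[::-1][k] / len(x))
        return x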

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

Compression
June 5, 2007

Best k-term approximation for bases and dictionaries, decay rates, approximation classes, application to image compression via wavelet decompositions.
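
The central quantity, in its usual notation: the best k-term approximation error of x in a given basis or dictionary is

    \sigma_k(x)_X = \inf \{\, \|x - z\|_X : \#\,\mathrm{supp}(z) \le k \,\},

and the decay rate of sigma_k(x) as k grows is what defines the approximation classes and quantifies compressibility.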

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

Performance in probability
June 12, 2007

Examples of performance for Gaussian and Bernoulli ensembles.

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

The restricted isometry property (RIP)
June 7, 2007

Performance of compressed sensing under RIP.
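
One representative guarantee of this kind (stated in a common form; the constants vary across the literature): if the restricted isometry constant delta_{2k} of A is small enough, the l1 minimizer satisfies

    \|x - \hat{x}\|_{\ell_2} \le C\, \sigma_k(x)_{\ell_1} / \sqrt{k},

so k-sparse vectors are recovered exactly and compressible vectors nearly so.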

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

Performance of CS matrices revisited
June 11, 2007

Proofs of the Kashin-Gluskin theorems.

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

Decoders
June 13, 2007

l1 minimization, greedy algorithms, iterated least squares.
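
As an example of the greedy family, a minimal Python sketch of orthogonal matching pursuit; taking the sparsity k as known in advance is an illustrative stopping rule.

    import numpy as np

    def omp(A, b, k):
        """Recover a k-sparse x from b = Ax by greedy column selection."""
        n = A.shape[1]
        support, r = [], b.copy()
        for _ in range(k):
            j = int(np.argmax(np.abs(A.T @ r)))            # most correlated column
            support.append(j)
            As = A[:, support]
            coef, *_ = np.linalg.lstsq(As, b, rcond=None)  # least squares on support
            r = b - As @ coef                              # update residual
        x = np.zeros(n)
        x[support] = coef
        return x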

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

Construction of CS matrices with best RIP
June 8, 2007

Bernoulli and Gaussian random variables.
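
A sketch of the two ensembles in code, normalized so that E||Ax||_2^2 = ||x||_2^2; the dimensions are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 64, 256
    A_gauss = rng.standard_normal((m, n)) / np.sqrt(m)          # i.i.d. N(0, 1/m)
    A_bern = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)  # i.i.d. +-1/sqrt(m)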

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

Deterministic constructions of CS Matrices
June 15, 2007

Constructions from finite fields, circulant matrices.

Ronald DeVore (University of South Carolina)
http://www.math.tamu.edu/~rdevore/

Signal encoding
June 4, 2007

Shannon-Nyquist Theory, Pulse Code Modulation, Sigma-Delta Modulation, Kolmogorov entropy, optimal encoding.
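
A minimal Python sketch of the simplest of these schemes, first-order Sigma-Delta modulation; the test signal, oversampling factor, and reconstruction filter are illustrative assumptions.

    import numpy as np

    def sigma_delta_1(x):
        """Quantize samples x in [-1, 1] to +-1 with a single accumulator."""
        u, q = 0.0, np.empty_like(x)
        for i, xi in enumerate(x):
            q[i] = 1.0 if u + xi >= 0 else -1.0    # one-bit quantizer
            u += xi - q[i]                         # state: accumulated error
        return q

    t = np.linspace(0, 1, 4000)                    # heavy oversampling
    x = 0.5 * np.sin(2 * np.pi * 5 * t)
    bits = sigma_delta_1(x)
    # low-pass averaging reconstructs x; the error decays with oversampling
    x_hat = np.convolve(bits, np.ones(100) / 100, mode="same")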

Anna Gilbert (University of Michigan)
http://www.math.lsa.umich.edu/~annacg/

Algorithms for Compressed Sensing, II
June 8, 2007

What do these algorithms all have in common? What are the common goals of the problems and how do they achieve them? I will discuss several known techniques and open problems.

Anna Gilbert (University of Michigan)
http://www.math.lsa.umich.edu/~annacg/

Algorithms for Compressed Sensing, I
June 7, 2007

What algorithmic problem do we mean by Compressed Sensing? There are a variety of alternatives, each with different algorithmic solutions (both theoretical and practical). I will discuss some of the different types of results, from the combinatorial to the probabilistic.
