Entropy, inference, and channel capacity

Wednesday, June 22, 2005 - 2:00pm - 3:00pm
EE/CS 3-180
Sean Meyn (University of Illinois at Urbana-Champaign)
The goal of these three lectures is twofold:

1. We will explore issues surrounding the capacity of non-coherent,
memoryless communication channels. It is now known that the
capacity-achieving input distribution is typically discrete, with a
finite number of mass points. Even for the additive white Gaussian
noise channel, an input distribution of this form is very nearly
optimal at SNR values of unity or below (see the first sketch
following this list).

2. These concepts generalize to worst-case channel models and to
worst-case hypothesis testing. To see why, we will explore
applications and theory of mutual information and relative entropy. In
each of the applications considered, a discrete distribution is the
optimizer of a linear program over the space of probability measures,
or of a convex program that admits a linear approximation (see the
second sketch following this list).
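
To make the first point concrete, here is a minimal numerical sketch
(not material from the lectures): it compares the mutual information
achieved by a two-point antipodal input on the additive white Gaussian
noise channel at unit SNR with the Gaussian-input capacity
0.5*log2(1 + SNR). The integration grid and its width are arbitrary
choices.

    import numpy as np

    snr = 1.0                       # E[X^2] / noise variance
    a = np.sqrt(snr)                # mass points at +a and -a, weight 1/2 each
    y = np.linspace(-12.0, 12.0, 24001)
    dy = y[1] - y[0]

    def phi(t):                     # standard normal density
        return np.exp(-t**2 / 2.0) / np.sqrt(2.0 * np.pi)

    # Output density of Y = X + N for the two-point input
    p_y = 0.5 * (phi(y - a) + phi(y + a))

    h_y = -np.sum(p_y * np.log2(p_y)) * dy      # differential entropy of Y
    h_n = 0.5 * np.log2(2.0 * np.pi * np.e)     # h(Y | X): entropy of the noise
    i_two_point = h_y - h_n                     # mutual information, in bits
    c_gauss = 0.5 * np.log2(1.0 + snr)          # Gaussian-input capacity

    print(f"two-point input: {i_two_point:.3f} bits; "
          f"Gaussian capacity: {c_gauss:.3f} bits")

The two-point input comes within a few hundredths of a bit of the
0.5-bit Gaussian capacity at SNR = 1, consistent with the claim above.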
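
For the second point, here is a toy finite-grid version of such a
linear program, assuming SciPy's linprog is available. The grid, the
objective E_p[X^2], and the moment bound E_p[X] <= 1 are hypothetical
stand-ins, chosen only to show that the optimizer is a discrete
distribution with a small number of mass points.

    import numpy as np
    from scipy.optimize import linprog

    # Maximize E_p[X^2] over probability vectors p on a grid in [0, 5],
    # subject to the moment constraint E_p[X] <= 1.
    x = np.linspace(0.0, 5.0, 501)
    res = linprog(
        c=-(x**2),                            # linprog minimizes, so negate
        A_ub=[x], b_ub=[1.0],                 # E_p[X] <= 1
        A_eq=[np.ones_like(x)], b_eq=[1.0],   # total mass 1
        bounds=(0, None),                     # p >= 0
    )
    keep = res.x > 1e-9
    print("mass points:", x[keep], "weights:", res.x[keep])

The solver returns an extreme point of the feasible set: a
distribution supported on just two mass points (x = 0 and x = 5), in
line with the discreteness phenomenon described above.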

These findings lead to a range of new, practical techniques, including
improved methods for signal constellation design, new methods for
computing optimal input distributions (a classical baseline is
sketched below), and a simple and effective approach to robust
hypothesis testing.
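
As a point of comparison for the computation of optimal input
distributions, here is a sketch of the classical Blahut-Arimoto
iteration for a discrete memoryless channel with a fixed finite input
alphabet. This is a standard baseline, not the new methods referred to
above, and the channel matrix W is a made-up example.

    import numpy as np

    # Rows of W are the conditional laws P(y | x) of a made-up channel.
    W = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.2, 0.7]])
    p = np.full(W.shape[0], 1.0 / W.shape[0])   # start from the uniform input

    for _ in range(200):
        q = p @ W                               # induced output distribution
        # d[x] = D(W(.|x) || q), relative entropy in nats
        d = np.sum(W * np.log(W / q), axis=1)
        p = p * np.exp(d)                       # multiplicative update
        p /= p.sum()

    capacity_bits = float(p @ np.sum(W * np.log(W / (p @ W)), axis=1)) / np.log(2)
    print("input distribution:", np.round(p, 4),
          "capacity (bits):", round(capacity_bits, 4))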