Information and Statistics

Monday, April 13, 2015 - 9:00am - 9:50am
Keller 3-180
Andrew Barron (Yale University)
This presentation explores information theory and statistics themes in two or three topics with which I have been involved, selected from among the following:

- penalized-likelihood concentration and risk bounds derivable from the information-theoretic properties of penalized likelihood;
- greedy algorithms for vertex selection for projection onto convex hulls (and associated algorithms for term selection in regression);
- provably fast capacity-achieving codes for communication in the presence of Gaussian noise;
- information-theoretic characterization of minimax risk asymptotics;
- entropy and Fisher information inequalities as consequences of variance concentration of subset sums of independent random variables (and the resulting general entropy power inequalities and entropic central limit theorems);
- information-theoretic understanding of the monotonicity of relative entropy in Markov chains, and speculations concerning a discrete-time, collision-based theory of entropy in statistical mechanics.
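To give a flavor of the greedy term-selection schemes mentioned above, here is a minimal orthogonal-greedy sketch in Python. It is an illustrative assumption, not the speaker's exact algorithm: at each step it picks the column most correlated with the current residual, then refits by least squares on the selected columns. The function name and the toy setup are invented for illustration.

```python
import numpy as np

def greedy_term_selection(X, y, n_terms):
    """Orthogonal greedy (forward) selection sketch.

    At each step, choose the column of X most correlated with the
    current residual, then refit y by least squares on all columns
    selected so far. Illustrative only; not the speaker's algorithm.
    """
    n, p = X.shape
    residual = y.copy()
    fit = np.zeros(n)
    selected = []
    for _ in range(n_terms):
        # correlation of each (normalized) column with the residual
        norms = np.linalg.norm(X, axis=0)
        corr = (X.T @ residual) / norms
        j = int(np.argmax(np.abs(corr)))
        selected.append(j)
        # least-squares refit on the selected columns (the "orthogonal" step)
        Xs = X[:, selected]
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        fit = Xs @ coef
        residual = y - fit
    return selected, fit
```

With orthonormal columns the procedure recovers the true sparse terms exactly after as many steps as there are active terms, since each refit removes the selected components from the residual.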