Sampling for Linear-Gaussian Inverse Problems (is faster than regularized inversion)
Friday, June 19, 2015 - 11:00am - 12:30pm
This is a technical talk on the recent marginal-then-conditional sampler for hierarchical Bayesian models. Bayesian models for inverse problems naturally have a hierarchical structure in which the data model depends on a high-dimensional latent structure, which in turn depends on a low-dimensional hyperparameter vector. In the linear-Gaussian case, of which image deblurring is a canonical example, the full conditional for the latent structure is Gaussian, and so can be sampled using efficient methods from numerical linear algebra. We show that the marginal posterior over hyperparameters may be sampled cheaply, so that the cost of an independent posterior sample is dominated by the cost of a single regularized inversion. The computational budget required for selecting a regularizing parameter therefore allows many independent posterior samples to be computed, providing robust posterior estimates and error bars.
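To make the structure concrete, below is a minimal sketch of a marginal-then-conditional sampler for a simple linear-Gaussian model. All names and the specific model are illustrative assumptions, not the talk's actual implementation: data y = A x + noise with noise precision lam, prior x ~ N(0, dlt⁻¹ I), and a flat prior on (log lam, log dlt). The low-dimensional hyperparameters are sampled from their marginal posterior (here by random-walk Metropolis), and the high-dimensional latent x is then drawn exactly from its Gaussian full conditional via a Cholesky factorization, so each iteration costs roughly one regularized inversion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small synthetic "blurring" problem (illustrative stand-in for deblurring).
m, n = 40, 30
A = np.exp(-0.5 * (np.arange(m)[:, None] - np.arange(n)[None, :]) ** 2 / 4.0)
x_true = np.sin(np.linspace(0, 3 * np.pi, n))
y = A @ x_true + 0.05 * rng.standard_normal(m)

AtA, Aty = A.T @ A, A.T @ y

def log_marginal(log_lam, log_dlt):
    """log p(y | lam, dlt) with the latent x integrated out (up to a constant).

    Assumes a flat prior on (log lam, log dlt), so this is also the
    log marginal posterior over the hyperparameters.
    """
    lam, dlt = np.exp(log_lam), np.exp(log_dlt)
    H = lam * AtA + dlt * np.eye(n)          # posterior precision of x
    L = np.linalg.cholesky(H)
    mu = lam * np.linalg.solve(H, Aty)       # posterior mean of x
    logdetH = 2.0 * np.sum(np.log(np.diag(L)))
    return (0.5 * m * log_lam + 0.5 * n * log_dlt - 0.5 * logdetH
            - 0.5 * lam * (y @ y) + 0.5 * mu @ H @ mu)

# Marginal step: random-walk Metropolis on the 2-D hyperparameter vector.
theta = np.array([np.log(100.0), np.log(1.0)])
lp = log_marginal(*theta)
num_samples, step = 200, 0.3
samples_x = np.empty((num_samples, n))
samples_theta = np.empty((num_samples, 2))
accepted = 0
for i in range(num_samples):
    prop = theta + step * rng.standard_normal(2)
    lp_prop = log_marginal(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
        accepted += 1
    # Conditional step: exact Gaussian draw x | theta, y via Cholesky,
    # at the cost of a single regularized inversion.
    lam, dlt = np.exp(theta)
    H = lam * AtA + dlt * np.eye(n)
    L = np.linalg.cholesky(H)
    mu = lam * np.linalg.solve(H, Aty)
    samples_x[i] = mu + np.linalg.solve(L.T, rng.standard_normal(n))
    samples_theta[i] = theta

x_mean = samples_x.mean(axis=0)
```

Because the conditional draws are exact and the Metropolis chain runs only on the two hyperparameters, the latent samples are independent given the hyperparameter samples; the posterior mean and pointwise spread of `samples_x` then give the robust estimates and error bars referred to above.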