# Adaptive Approximations and Dimension Reduction for Bayesian Inference in Partial Differential Equations

Friday, January 18, 2013 - 10:15am - 11:05am

Keller 3-180

Youssef Marzouk (Massachusetts Institute of Technology)

The interplay of experimental observations with mathematical models often requires conditioning models on data---for example, inferring the coefficients or boundary conditions of partial differential equations from noisy functionals of the solution field. The Bayesian approach to these problems in principle requires posterior sampling in high- or infinite-dimensional parameter spaces, where generating each sample requires the numerical solution of a deterministic PDE.
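To make the cost concrete, here is a hypothetical toy sketch (not from the talk): inferring a scalar diffusion coefficient in a 1D Poisson problem by random-walk Metropolis, where every log-posterior evaluation requires a deterministic PDE solve. The discretization, prior, and sampler settings are all illustrative assumptions.

```python
import numpy as np

# Toy setup: -k u'' = f on (0,1), u(0) = u(1) = 0; infer the scalar
# coefficient k from noisy point observations of u. Each likelihood
# evaluation below requires one deterministic PDE solve -- the cost
# that surrogates and dimension reduction aim to reduce.
n = 100
h = 1.0 / n
f = np.ones(n - 1)                      # constant source term

def solve_pde(k):
    # second-order finite differences for -k u'' = f with zero BCs
    A = k * (np.diag(2 * np.ones(n - 1))
             - np.diag(np.ones(n - 2), 1)
             - np.diag(np.ones(n - 2), -1)) / h**2
    return np.linalg.solve(A, f)

rng = np.random.default_rng(1)
k_true, sigma = 2.0, 1e-3
obs_idx = [20, 50, 80]                  # three interior observation points
data = solve_pde(k_true)[obs_idx] + sigma * rng.standard_normal(3)

def log_post(log_k):
    # Gaussian likelihood plus a standard-normal prior on log k
    resid = solve_pde(np.exp(log_k))[obs_idx] - data
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * log_k**2

# random-walk Metropolis: one PDE solve per proposed sample
chain, cur, cur_lp = [], 0.0, log_post(0.0)
for _ in range(2000):
    prop = cur + 0.05 * rng.standard_normal()
    lp = log_post(prop)
    if np.log(rng.random()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    chain.append(cur)

k_est = np.exp(np.mean(chain[500:]))    # posterior-mean estimate of k
```

Even in this tiny example the sampler performs thousands of linear solves; for a large-scale PDE each of those becomes an expensive forward simulation.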

We present two developments designed to reduce the significant computational costs of the Bayesian approach. First, we consider local polynomial approximations or surrogates for the parameter-to-observable map. While surrogates can substantially accelerate Bayesian inference in inverse problems, the construction of globally accurate surrogates for complex models can be prohibitive and in a sense unnecessary, as the posterior distribution may concentrate on a small fraction of the prior support. We present a new approach that uses stochastic optimization to construct polynomial approximations over a sequence of measures adaptively determined from the data, eventually concentrating on the posterior distribution.

Second, while the posterior distribution may appear high-dimensional, the intrinsic dimensionality of the inference problem is affected by prior information, limited data, and the smoothing properties of the forward operator. Often only a few directions are needed to capture the change from prior to posterior. We describe a method for identifying these directions through the solution of a generalized eigenvalue problem, and extend it to nonlinear problems where the Hessian of the log-likelihood varies over parameter space. Identifying these directions leads to more efficient Rao-Blackwellized posterior sampling schemes.
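The dimension-reduction idea can be illustrated with a hedged sketch for the linear-Gaussian case (an assumption for illustration; the talk extends this to nonlinear problems): with forward operator G, the Hessian of the negative log-likelihood is H = Gᵀ Γ_noise⁻¹ G, and the data-informed directions solve the generalized eigenvalue problem H v = λ Γ_prior⁻¹ v. All dimensions and matrices below are synthetic placeholders.

```python
import numpy as np
from scipy.linalg import eigh

# Synthetic linear-Gaussian inverse problem: y = G u + noise.
# Data-informed directions solve  H v = lam * Gamma_prior^{-1} v,
# where H = G^T Gamma_noise^{-1} G is the log-likelihood Hessian.
rng = np.random.default_rng(0)
d = 50                                   # parameter dimension
G = rng.standard_normal((10, d)) / d     # smoothing forward map, 10 observations
noise_prec = 1e2 * np.eye(10)            # Gamma_noise^{-1}
prior_prec = np.eye(d)                   # Gamma_prior^{-1}

H = G.T @ noise_prec @ G
lam, V = eigh(H, prior_prec)             # generalized eigenvalue problem
lam, V = lam[::-1], V[:, ::-1]           # sort eigenpairs in descending order

# Eigenvalues well above 1 mark directions where the data dominate the
# prior; the complementary directions can be handled analytically, which
# underlies the Rao-Blackwellized sampling schemes mentioned above.
r = int(np.sum(lam > 1.0))
informed_basis = V[:, :r]                # rank-r data-informed subspace
```

Because the forward map has only 10 observations here, at most 10 eigenvalues are nonzero, so the 50-dimensional problem is effectively low-rank.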

Joint work with Jinglai Li, James Martin, Tiangang Cui, and Tarek Moselhy.

MSC Code: 62F15
