# Bayesian inference

Friday, September 8, 2017 - 10:40am - 11:15am

Luis Tenorio (Colorado School of Mines)

Since in Bayesian inversion the data are often informative only on a low-dimensional subspace of the parameter space, significant computational savings can be achieved by using this subspace to characterize and approximate the posterior distribution of the parameters. We study approximations of the posterior covariance matrix defined as low-rank updates of the prior covariance matrix, and prove their optimality for a broad class of loss functions that includes the Förstner metric for symmetric positive-definite matrices.
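For a linear-Gaussian model, such a low-rank update can be computed from the leading eigenpairs of the prior-preconditioned data-misfit Hessian. The following is a minimal NumPy sketch; the matrices `H`, `G_pr`, and the noise variance `s2` are illustrative placeholders, not quantities from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-Gaussian inverse problem: y = H x + noise,
# prior x ~ N(0, G_pr), noise ~ N(0, s2 * I).
n, m, s2 = 50, 10, 0.1                  # parameter dim, data dim, noise variance
H = rng.standard_normal((m, n))
A = rng.standard_normal((n, n))
G_pr = A @ A.T / n + np.eye(n)          # SPD prior covariance

# Exact posterior covariance: (H^T H / s2 + G_pr^{-1})^{-1}
G_pos = np.linalg.inv(H.T @ H / s2 + np.linalg.inv(G_pr))

# Prior-preconditioned data-misfit Hessian S^T H^T H S / s2, with S S^T = G_pr
S = np.linalg.cholesky(G_pr)
lam, V = np.linalg.eigh(S.T @ (H.T @ H / s2) @ S)
order = np.argsort(lam)[::-1]           # leading eigenpairs carry the data information
lam, V = lam[order], V[:, order]

# Rank-r update of the prior covariance:
# G_pos ~= G_pr - (S V_r) diag(lam_i / (1 + lam_i)) (S V_r)^T
r = m                                   # the data inform at most m directions
W = S @ V[:, :r]
G_lr = G_pr - W @ np.diag(lam[:r] / (1 + lam[:r])) @ W.T

rel_err = np.linalg.norm(G_lr - G_pos) / np.linalg.norm(G_pos)
print(rel_err)                          # exact (up to round-off) at r = m
```

At rank `r` equal to the number of observations the update reproduces the exact posterior covariance; smaller `r` trades accuracy for storage and computation.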

Wednesday, September 6, 2017 - 2:55pm - 3:30pm

Youssef Marzouk (Massachusetts Institute of Technology)

Many inverse problems may involve a large number of observations. Yet these observations are seldom equally informative; moreover, practical constraints on storage, communication, and computational costs may limit the number of observations that one wishes to employ. We introduce strategies for selecting subsets of the data that yield accurate approximations of the inverse solution. This goal can also be understood in terms of optimal experimental design.
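As a toy illustration of choosing an informative subset of observations, one can greedily maximize the gain in the posterior log-determinant for a linear-Gaussian model, a standard D-optimal design heuristic (this is a generic sketch, not the strategies of the talk; all names and dimensions below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: m candidate linear observations of an n-dim parameter,
# unit observation noise, standard normal prior. Greedily pick the k rows
# that most increase the posterior log-determinant (D-optimal selection).
m, n, k = 100, 8, 5
H = rng.standard_normal((m, n))

selected, P = [], np.eye(n)          # P = current posterior precision
for _ in range(k):
    Pinv = np.linalg.inv(P)
    # Rank-one update: log det(P + h h^T) - log det(P) = log(1 + h^T P^{-1} h)
    gains = np.log1p(np.einsum('ij,jk,ik->i', H, Pinv, H))
    gains[selected] = -np.inf        # never pick the same row twice
    best = int(np.argmax(gains))
    selected.append(best)
    P = P + np.outer(H[best], H[best])

print(sorted(selected))              # indices of the chosen observations
```

Each step costs one small matrix inverse plus a scan of the candidates, so the loop stays cheap even when `m` is large.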

Thursday, June 18, 2015 - 11:00am - 12:30pm

Colin Fox (University of Otago)

This talk gives a guided tour through examples and indicative theory that highlight two important topics typically omitted from applied mathematics or engineering training. The first is an appreciation for the calculus of distributions, which differs from the calculus of functions. This implies that inference in the presence of uncertainty is fundamentally different from inversion, and that the notion of 'best fit' can be misleading, or just plain wrong.

Tuesday, June 16, 2015 - 2:00pm - 3:30pm

Matthew Parno (Massachusetts Institute of Technology)

The goal of my sessions is to help attendees hit the ground running by working through simple case studies with the MIT Uncertainty Quantification (MUQ) library. Attendees will be given interactive IPython notebooks to allow for hands-on experimentation. The first session will cover forward UQ, focusing on Monte Carlo, polynomial chaos, and global sensitivities. The second session will focus on Bayesian inference and the use of different MCMC algorithms for solving such problems.
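To give a flavor of what such an MCMC session covers, here is a minimal random-walk Metropolis sampler in plain NumPy (a generic sketch, not MUQ's API):

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_samples, step=1.0, seed=0):
    """Minimal random-walk Metropolis sampler (a generic sketch, not MUQ)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    chain = np.empty((n_samples, x.size))
    for i in range(n_samples):
        prop = x + step * rng.standard_normal(x.size)  # Gaussian proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Smoke test on a standard normal "posterior": after burn-in, the sample
# mean and standard deviation should be roughly 0 and 1.
chain = metropolis_hastings(lambda x: -0.5 * np.sum(x**2), np.zeros(1), 50000)
print(chain[10000:].mean(), chain[10000:].std())
```

The same loop works for any target that exposes a log-density; libraries such as MUQ add adaptive, gradient-based, and dimension-robust variants on top of this basic scheme.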

Monday, December 16, 2013 - 11:35am - 12:05pm

In this talk, two fully Bayesian methods (a Bayesian uncertainty method and a Bayesian mixture procedure) will be introduced that can evaluate generalized polynomial chaos (gPC) expansions in both the stochastic and spatial domains when the number of available basis functions is significantly larger than the size of the training data set. The method relies on suitably modeling the PC coefficients and simultaneously performing basis selection and coefficient evaluation via a fully Bayesian stochastic procedure we have developed, called the mixed shrinkage prior (MSP).
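The MSP itself is more elaborate, but the basic setting, estimating gPC coefficients when basis functions outnumber training samples, can be sketched with a simple Gaussian shrinkage (ridge/MAP) stand-in; all names and sizes below are illustrative:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

# Illustrative setup: fit a 1-D gPC (probabilists' Hermite) expansion of
# f(x) = x^2 using more basis functions than training samples.
n_samples, degree = 15, 24              # 25 basis functions > 15 samples
x = rng.standard_normal(n_samples)
y = x**2 + 0.01 * rng.standard_normal(n_samples)

Phi = hermevander(x, degree)            # Hermite design matrix
Phi = Phi / np.linalg.norm(Phi, axis=0) # normalize the columns

# MAP estimate under an i.i.d. Gaussian shrinkage prior on the coefficients
# (plain ridge regression; the actual MSP uses a richer mixed shrinkage
# prior that also performs basis selection):
a = 1e-4
c = np.linalg.solve(Phi.T @ Phi + a * np.eye(degree + 1), Phi.T @ y)

resid = np.linalg.norm(Phi @ c - y) / np.linalg.norm(y)
print(resid)                            # relative training residual
```

Without the shrinkage term the normal equations would be singular here, since the 25-column Gram matrix has rank at most 15; the prior is what makes the underdetermined fit well posed.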

Friday, January 18, 2013 - 10:15am - 11:05am

Youssef Marzouk (Massachusetts Institute of Technology)

The interplay of experimental observations with mathematical models often requires conditioning models on data---for example, inferring the coefficients or boundary conditions of partial differential equations from noisy functionals of the solution field. The Bayesian approach to these problems in principle requires posterior sampling in high- or infinite-dimensional parameter spaces, where generating each sample requires the numerical solution of a deterministic PDE.
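For such Gaussian-prior posteriors, a standard dimension-robust sampler is preconditioned Crank-Nicolson (pCN), whose acceptance ratio involves only the likelihood. Below is a minimal sketch with a cheap stand-in likelihood in place of a PDE solve; all names are illustrative:

```python
import numpy as np

def pcn(log_like, prior_sample, x0, n_steps, beta=0.2, seed=0):
    """Preconditioned Crank-Nicolson MCMC for a Gaussian prior: the
    acceptance ratio involves only the likelihood, so the sampler stays
    well defined as the discretization dimension grows."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    ll = log_like(x)
    chain = np.empty((n_steps, x.size))
    for i in range(n_steps):
        xi = prior_sample(rng)                         # fresh prior draw
        prop = np.sqrt(1.0 - beta**2) * x + beta * xi  # pCN proposal
        ll_prop = log_like(prop)
        if np.log(rng.uniform()) < ll_prop - ll:       # prior terms cancel
            x, ll = prop, ll_prop
        chain[i] = x
    return chain

# Smoke test: N(0, I) prior and a Gaussian likelihood centered at 1 give a
# posterior N(0.5, 0.5 I) in every coordinate; in a real application the
# log-likelihood would wrap a deterministic PDE solve.
d = 5
chain = pcn(lambda u: -0.5 * np.sum((u - 1.0)**2),
            lambda rng: rng.standard_normal(d), np.zeros(d), 40000)
print(chain[10000:].mean())
```

Because the proposal preserves the prior exactly, the acceptance rate does not collapse as the parameter discretization is refined, which is what makes pCN attractive for PDE-constrained Bayesian inverse problems.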