# Predictability and the Quantification of Uncertainty

Friday, September 17, 1999 - 11:20am - 12:00pm

Lind 409

James Glimm (University at Albany (SUNY))

Prediction involves a forward step, typically the solution of PDEs, and an inverse problem that limits the degrees of freedom within the equations and data, given partial information concerning the solution. Prediction with the quantification of uncertainty is needed to take advantage of the opportunities created by modern simulation. As more stages of scientific inquiry become computationally based, there is an increased need to automate some of the decision processes associated with the computation.

The scientific tools needed for uncertainty quantification are very broadly distributed within the mathematical and quantitative sciences: numerical analysis, statistical data analysis, modeling, and simulation being prominent examples. The requirements for this technology are also broadly distributed.

This talk will develop a general stochastic framework for prediction, illustrated by problems of turbulent mixing and flow in porous media. The framework will impose requirements on both the forward (simulation) problem and the inverse (history matching) problem. Recent results of the speaker and collaborators will be presented.

Uncertainty concerning the problem formulation (that is, the scientific model, comprising the equations, boundary conditions, and the physics and data parameters appearing in them) will be expressed in terms of a prior distribution. Observations limit this uncertainty: comparing observations to simulations determines a likelihood for the validity of the model. This likelihood is computed from the mismatch between observation and simulation, using a probability model for observational and simulation errors. Bayes' theorem then combines the likelihood with the prior to yield the posterior distribution for the model parameters. Any function of the solution can then be evaluated; for example, its mean and variance are determined through integration with respect to the posterior distribution. Probability error models for numerical solutions are uncommon, and even for observations they are unusual, so we discuss how they can be constructed.
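The Bayesian update described above can be sketched in a few lines. This is a minimal illustration, not the speaker's method: it assumes a single scalar model parameter, a toy linear forward model standing in for the PDE solve, and a Gaussian probability model for the combined observation and simulation error. All names and numerical values here are hypothetical.

```python
import numpy as np

def simulate(theta, t):
    """Toy forward model: stands in for a PDE simulation."""
    return theta * t

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 20)
sigma = 0.5  # assumed error scale in the Gaussian error model

# Synthetic observation generated with "true" parameter 2.0
y_obs = simulate(2.0, t) + rng.normal(0.0, sigma, t.size)

# Prior distribution for the parameter, discretized on a grid
thetas = np.linspace(0.0, 4.0, 401)
prior = np.ones_like(thetas) / thetas.size

# Likelihood from the mismatch between observation and simulation,
# under the Gaussian error model
mismatch = np.array([np.sum((y_obs - simulate(th, t)) ** 2) for th in thetas])
likelihood = np.exp(-mismatch / (2.0 * sigma ** 2))

# Bayes' theorem: posterior proportional to likelihood times prior
posterior = likelihood * prior
posterior /= posterior.sum()

# Moments of the parameter via integration against the posterior
mean = np.sum(thetas * posterior)
var = np.sum((thetas - mean) ** 2 * posterior)
print(f"posterior mean = {mean:.3f}, posterior std = {np.sqrt(var):.3f}")
```

In the hierarchy of the abstract, the grid over `thetas` plays the role of the prior on the problem formulation, `mismatch` quantifies the disagreement between observation and simulation, and the final sums are the posterior integrations that deliver the mean and variance of any quantity of interest.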