Abstracts and Talk Materials
Uncertainty Quantification in Industrial and Energy Applications: Experiences and Challenges
June 2-4, 2011


Mihai Anitescu (Argonne National Laboratory)
http://www.mcs.anl.gov/~anitescu/

Gradient-Enhanced Uncertainty Propagation
June 2, 2011

Keywords of the presentation: Gaussian process; derivative; universal Kriging; nuclear engineering

In this work we discuss an approach for uncertainty propagation through computationally expensive physics simulation codes. Our approach incorporates gradient information to provide a higher-quality surrogate with fewer simulation runs than derivative-free approaches.

We use this information in two ways: to fit a polynomial model of the system response, and to fit a Gaussian process model ("surrogate"). A third approach hybridizes the two, fitting a Gaussian process with a polynomial mean, which improves on both techniques. The surrogate, coupled with input uncertainty information, provides a complete uncertainty-propagation approach when the physics simulation code can be run only a small number of times. We discuss various algorithmic choices, such as the polynomial basis and the covariance kernel. We demonstrate our findings on synthetic functions as well as nuclear reactor models.
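As a rough illustration of the idea (not the authors' implementation), the sketch below fits a zero-mean Gaussian process surrogate in one dimension that conditions on both function values and gradients. The squared-exponential kernel, the length scale, and the test function sin(x) are all illustrative assumptions; the kernel's cross-covariances with the derivative are obtained by differentiating the kernel.

```python
import numpy as np

def k(a, b, ell=1.0):
    # squared-exponential kernel k(a, b) = exp(-(a-b)^2 / (2 ell^2))
    d = a[:, None] - b[None, :]
    return np.exp(-d**2 / (2 * ell**2))

def k_fd(a, b, ell=1.0):
    # cov(f(a), f'(b)) = d/db k(a, b)
    d = a[:, None] - b[None, :]
    return (d / ell**2) * np.exp(-d**2 / (2 * ell**2))

def k_dd(a, b, ell=1.0):
    # cov(f'(a), f'(b)) = d^2/(da db) k(a, b)
    d = a[:, None] - b[None, :]
    return (1 / ell**2 - d**2 / ell**4) * np.exp(-d**2 / (2 * ell**2))

def gp_gradient_predict(x, y, dy, xs, ell=1.0, jitter=1e-10):
    # joint covariance of the observations [f(x); f'(x)]
    K = np.block([[k(x, x, ell),      k_fd(x, x, ell)],
                  [k_fd(x, x, ell).T, k_dd(x, x, ell)]])
    K += jitter * np.eye(K.shape[0])
    # cross-covariance between f(xs) and the observations
    Ks = np.hstack([k(xs, x, ell), k_fd(xs, x, ell)])
    return Ks @ np.linalg.solve(K, np.concatenate([y, dy]))

# three "simulation runs", each returning a value and a gradient
x = np.array([0.0, 1.5, 3.0])
y, dy = np.sin(x), np.cos(x)
xs = np.linspace(0.0, 3.0, 7)
pred = gp_gradient_predict(x, y, dy, xs)
```

Each run contributes two pieces of information instead of one, which is why gradient-enhanced surrogates can reach a given accuracy with fewer simulations, provided gradients are cheap (e.g. via an adjoint solver).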

Florian Augustin (TU München)
http://www-m2.ma.tum.de/bin/view/Allgemeines/FlorianAugustinEn

Poster - Algorithm Class ARODE

Ordinary differential equations with uncertain parameters are a vast field of research. Monte Carlo simulation techniques are widely used to approximate quantities of interest of the solution of random ordinary differential equations. Over the last decades, however, methods based on spectral expansions of the solution process have drawn great interest; they are promising methods for efficiently approximating the solution of random ordinary differential equations. Although global approaches on the parameter domain prove to be very inaccurate in many cases, an element-wise approach can be proven to converge. This poster presents an algorithm based on the stochastic Galerkin Runge-Kutta method, which incorporates adaptive stepsize control in time and adaptive partitioning of the parameter domain.
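To make the spectral idea concrete, here is a minimal sketch (my own toy example, not the ARODE algorithm) of a stochastic Galerkin Runge-Kutta step for u'(t) = -(1 + 0.3 eta) u with eta uniform on [-1, 1]: the solution is expanded in Legendre polynomials, Galerkin projection yields a coupled deterministic ODE system for the coefficients, and classical RK4 integrates it. A single global expansion is used, which is adequate here only because this toy solution is smooth in eta; the poster's adaptive partitioning addresses the cases where it is not.

```python
import numpy as np
from numpy.polynomial import legendre as L

P = 5                                  # polynomial chaos order
nodes, weights = L.leggauss(20)
weights = weights / 2                  # uniform density on [-1, 1]

# Legendre basis P_0..P_P evaluated at the quadrature nodes
phi = np.stack([L.legval(nodes, np.eye(P + 1)[k]) for k in range(P + 1)])
norms = (phi**2 * weights).sum(axis=1)          # <P_j^2>

# Galerkin projection of u' = -(1 + 0.3*eta) u onto the basis:
# A[j, k] = -<xi P_k P_j> / <P_j^2>, computed by quadrature
xi = 1.0 + 0.3 * nodes
A = -(phi * (xi * weights)) @ phi.T / norms[:, None]

# RK4 on the coupled coefficient system du/dt = A u, u(0) = 1
u = np.zeros(P + 1)
u[0] = 1.0                             # deterministic initial condition
dt, T = 0.01, 1.0
for _ in range(int(T / dt)):
    k1 = A @ u
    k2 = A @ (u + dt / 2 * k1)
    k3 = A @ (u + dt / 2 * k2)
    k4 = A @ (u + dt * k3)
    u = u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

mean = u[0]                            # E[u(1)] is the first PC coefficient
```

For this linear test problem the exact mean is E[exp(-(1 + 0.3 eta))] = exp(-1) sinh(0.3)/0.3, and the low-order expansion already matches it closely; a Monte Carlo estimate of comparable accuracy would need far more solves.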

Andrew J. Booker (The Boeing Company)

Uncertainty Quantification and Optimization Under Uncertainty: Experience and Challenges
June 4, 2011

Keywords of the presentation: Uncertainty quantification, optimization under uncertainty, functional ANOVA, stochastic collocation

This talk will describe experiences and challenges at Boeing with Uncertainty Quantification (UQ) and Optimization Under Uncertainty (OUU) in conceptual design problems that use complex computer simulations. The talk will describe tools and methods that have been developed and used by the Applied Math group at Boeing and their perceived strengths and limitations. Application of the tools and methods will be illustrated with an example in conceptual design of a hypersonic vehicle. Finally I will discuss future development plans and needs in UQ and OUU.

Roger G. Ghanem (University of Southern California)
http://venus.usc.edu/

The Curse of Dimensionality, Model Validation, and UQ.
June 3, 2011

Keywords of the presentation: Polynomial chaos, Curse of Dimensionality, Model Validation, Uncertainty Quantification.

The curse of dimensionality is a ubiquitous challenge in uncertainty quantification. It usually comes about as the complexity of analysis is controlled by the complexity of input parameters. In most cases of practical relevance, the output quantity of interest (QoI) is some integral of the input quantities and can thus be described in a much lower dimensional setting. This talk will describe novel procedures for honoring the low-dimensional character of the QoI without any loss of information. The talk will also describe the range of QoI that can be addressed using this formalism.

The role of UQ as the engine behind model validation puts a burden of rigor on UQ formulations. The ability to explore the effect of particular probabilistic choices on model validity is paramount for practical applications in general, and data-poor applications in particular. The talk will also address achievable and meaningful definitions of the validation process and demonstrate their relevance in the context of industrial problems.

Albert B. Gilg (Siemens AG)
http://www-m2.ma.tum.de/bin/view/Allgemeines/ProfessorGilg
Utz Wever (Siemens AG)

Poster - Robust Design for Industrial Applications

Industrial product and process designs often exploit physical limits to improve performance. In this regime uncertainty originating from fluctuations during fabrication and small disturbances in system operations severely impacts product performance and quality. Design robustness becomes a key issue in optimizing industrial designs. We present examples of challenges and solution approaches implemented in our robust design tool RoDeO.

Albert B. Gilg (Siemens AG)
http://www-m2.ma.tum.de/bin/view/Allgemeines/ProfessorGilg

Mastering Impact of Uncertainties by Robust Design Optimization Techniques for Turbo-Machinery
June 4, 2011

Keywords of the presentation: Robust Design Optimization, turbo charger design, polynomial chaos expansions

Deterministic design optimization approaches are no longer satisfactory for industrial high-technology products. Product and process designs often exploit physical limits to improve performance. In this regime, uncertainty originating from fluctuations during fabrication and small disturbances in system operations severely impacts product performance and quality. Design robustness becomes a key issue in optimizing industrial designs. We present challenges and solution approaches implemented in our robust design tool RoDeO, applied to turbocharger design. Beyond the challenges familiar from electricity-generating turbines, turbochargers have to work efficiently over a wide range of rotation frequencies. Time-consuming aerodynamic (CFD) and mechanical (FEM) computations for large sets of frequencies became a severely limiting factor even for deterministic optimization. Furthermore, constrained deterministic optimization could not guarantee critical design limits under the impact of fabrication uncertainty. In particular, the treatment of design constraints in terms of thresholds on von Mises stress or modal frequencies became crucial. We introduce an efficient approach for the numerical treatment of such chance constraints that requires no additional CFD or FEM calculations in our robust design tool set. An outlook on further design challenges concludes the presentation. The contents of this presentation are joint work of U. Wever, M. Klaus, M. Paffrath and A. Gilg.
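The general shape of such a chance constraint can be sketched as follows. This is a generic illustration, not the RoDeO method: a hypothetical polynomial chaos surrogate of a stress response (coefficients that would, in practice, come from a handful of FEM runs) is sampled cheaply to estimate the probability that a von Mises stress threshold is respected, with no further FEM calls. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical PCE surrogate of a stress response in one standard-normal
# variable xi (Hermite basis: H0 = 1, H1 = xi, H2 = xi^2 - 1).
c = np.array([300.0, 25.0, 5.0])        # MPa, illustrative coefficients
def stress(xi):
    return c[0] + c[1] * xi + c[2] * (xi**2 - 1)

limit, alpha = 400.0, 0.05              # stress threshold and risk level

# Chance constraint P(stress <= limit) >= 1 - alpha, estimated by
# Monte Carlo on the cheap surrogate -- no additional FEM evaluations.
xi = rng.standard_normal(200_000)
p_ok = np.mean(stress(xi) <= limit)
feasible = p_ok >= 1 - alpha
```

Inside a robust optimization loop, only the surrogate coefficients change between design iterates, so the probability estimate stays inexpensive even when the underlying CFD/FEM model is not.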

Charles S. Jackson (University of Texas, Austin)
http://www.ig.utexas.edu/people/staff/charles/

Scientific and statistical challenges to quantifying uncertainties in climate projections
June 2, 2011

Keywords of the presentation: climate, Bayesian inference, MCMC, biases

The problem of estimating uncertainties in climate prediction is not well defined. While one can express its solution within a Bayesian statistical framework, the solution is not necessarily correct. One must confront the scientific issues of how observational data are used to test various hypotheses for the physics of climate. Moreover, one must also confront the computational challenges of estimating the posterior distribution without the help of a statistical emulator of the forward model. I will present results of a recently completed estimate of the uncertainty in specifying 15 parameters important to clouds, convection, and radiation in the Community Atmosphere Model. I learned that the maximum posterior probability is not in the same region of parameter space as the minimum log-likelihood. I attribute these differences to the existence of model biases and to the potential that minimum log-likelihood solutions, which are often the desired solutions to data inversion problems, over-fit the data. Such a result highlights the need for a combination of scientific and computational thinking to begin to address uncertainties for complex multi-physics phenomena.
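Sampling a posterior directly through the forward model, with no emulator, typically means one expensive model run per MCMC step. The toy sketch below (my illustration, not the speaker's setup) shows the basic random-walk Metropolis loop for a scalar parameter with a Gaussian likelihood and prior; the trivial `forward` function stands in for a climate model run.

```python
import numpy as np

rng = np.random.default_rng(2)

def forward(theta):
    # stand-in for an expensive forward-model evaluation
    return theta

data, noise, prior_sd = 1.0, 0.5, 1.0

def log_post(theta):
    # Gaussian likelihood plus Gaussian prior (toy choices)
    return (-0.5 * ((data - forward(theta)) / noise)**2
            - 0.5 * (theta / prior_sd)**2)

# random-walk Metropolis: each step costs one forward-model run
theta, lp = 0.0, log_post(0.0)
samples = []
for _ in range(50_000):
    prop = theta + 0.5 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

samples = np.array(samples[5_000:])     # discard burn-in
```

For this linear-Gaussian toy the posterior is known in closed form (mean 0.8, standard deviation 1/sqrt(5)), which makes the chain easy to check; with a real multi-physics model each of the 50,000 steps would be a full simulation, which is exactly the computational challenge the talk describes.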

Charles S. Jackson (University of Texas, Austin)
http://www.ig.utexas.edu/people/staff/charles/

Poster - Scientific and statistical challenges to quantifying uncertainties in climate projections

The problem of estimating uncertainties in climate prediction is not well defined. While one can express its solution within a Bayesian statistical framework, the solution is not necessarily correct. One must confront the scientific issues of how observational data are used to test various hypotheses for the physics of climate. Moreover, one must also confront the computational challenges of estimating the posterior distribution without the help of a statistical emulator of the forward model. I will present results of a recently completed estimate of the uncertainty in specifying 15 parameters important to clouds, convection, and radiation in the Community Atmosphere Model. I learned that the maximum posterior probability is not in the same region of parameter space as the minimum log-likelihood. I attribute these differences to the existence of model biases and to the potential that minimum log-likelihood solutions, which are often the desired solutions to data inversion problems, over-fit the data. Such a result highlights the need for a combination of scientific and computational thinking to begin to address uncertainties for complex multi-physics phenomena.

Gardar Johannesson (Lawrence Livermore National Laboratory)

Poster - The Uncertainty Quantification Project at Lawrence Livermore National Laboratory: Sensitivities and Uncertainties of the Community Atmosphere Model

A team at Lawrence Livermore National Laboratory is currently undertaking an uncertainty analysis of the Community Earth System Model (CESM), as part of a larger effort to advance the science of Uncertainty Quantification (UQ). The Climate UQ effort has three major phases: UQ of the Community Atmosphere Model (CAM) component of CESM, UQ of CAM coupled to a simple slab ocean model, and UQ of the fully coupled CESM (CAM + 3D ocean). In this poster we describe the first phase of the Climate UQ effort: the generation of an ensemble of CAM simulations for sensitivity and uncertainty analysis.

Donald R. Jones (General Motors Corporation)

Improved Quantification of Prediction Error for Kriging Response Surfaces
June 2, 2011

Keywords of the presentation: kriging, standard error, mean squared error, global optimization

Kriging response surfaces are now widely used to optimize design parameters in industrial applications where assessing a design's performance requires long computer simulations. The typical approach starts by running the computer simulations at points in an experiment design and then fitting kriging surfaces to the resulting data. One then proceeds iteratively: calculations are made on the surfaces to select new point(s); the simulations are run at these points; and the surfaces are updated to reflect the results. The most advanced approaches for selecting new points for sampling balance sampling where the kriging predictor is good (local search) with sampling where the kriging mean squared error is high (global search). Putting some emphasis on searching where the error is high ensures that we improve the accuracy of the surfaces between iterations and also makes the search global.

A potential problem with these approaches, however, is that the classic formula for the kriging mean squared error underestimates the true error, especially in small samples. The reason is that the formula is derived under the assumption that the parameters of the underlying stochastic process are known, but in reality they are estimated. In this paper, we show how to fix this underestimation problem and explore how doing so affects the performance of kriging-based optimization methods.
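The underestimation itself is easy to see in a self-contained Monte Carlo experiment (an illustration of the problem, not of the paper's correction). Below, data are drawn from a Gaussian process with known correlation length but unknown process variance; the classic plug-in kriging mean squared error, using the estimated variance, yields nominal 95% prediction intervals whose actual small-sample coverage is noticeably lower.

```python
import numpy as np

rng = np.random.default_rng(0)
ell = 0.5
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])    # small training design
xs = 0.4                                     # prediction site
n, reps = len(x), 4000

def corr(a, b):
    # squared-exponential correlation, known length scale
    return np.exp(-(a[:, None] - b[None, :])**2 / (2 * ell**2))

R = corr(x, x) + 1e-10 * np.eye(n)
r = corr(x, np.array([xs]))[:, 0]
Rinv = np.linalg.inv(R)
s2 = 1.0 - r @ Rinv @ r                      # normalized kriging MSE

# exact joint draws at training + test sites (unit process variance)
X = np.concatenate([x, [xs]])
Lc = np.linalg.cholesky(corr(X, X) + 1e-10 * np.eye(n + 1))

hits = 0
for _ in range(reps):
    z = Lc @ rng.standard_normal(n + 1)
    y, ystar = z[:n], z[n]
    pred = r @ Rinv @ y                      # simple-kriging predictor
    sigma2_hat = y @ Rinv @ y / n            # estimated process variance
    mse = sigma2_hat * s2                    # classic plug-in MSE formula
    hits += abs(ystar - pred) <= 1.96 * np.sqrt(mse)

coverage = hits / reps   # nominal 95% intervals cover noticeably less
```

With only five observations the standardized prediction error follows a Student-t rather than a normal distribution, so the plug-in normal intervals are too narrow; this is the small-sample effect the paper sets out to fix.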

Guang Lin (Battelle Pacific Northwest Laboratories)
http://www.pnl.gov/science/staff/staff_info.asp?staff_num=7095

Poster - Error Reduction and Optimal Parameter Estimation in the Convective Cloud Scheme in a Climate Model

In this work, we studied the sensitivity of physical processes and simulations to parameters in a climate model, reduced errors, and derived optimal parameters for the cloud convection scheme. The MVFSA method is employed to derive optimal parameters and quantify the climate uncertainty. Through this study, we observe that parameters such as downdraft, entrainment, and CAPE consumption time have a very important impact on convective precipitation. Although only precipitation is constrained in this study, other climate variables are also controlled by the selected parameters and so could benefit from the optimal parameters used in the convective cloud scheme.

Gabriela Martínez (University of Minnesota, Twin Cities)
http://sites.google.com/site/mgmlhome/

Poster - Stochastic Two-Stage Problems with Stochastic Dominance Constraint

We analyze stochastic two-stage optimization problems with a stochastic dominance constraint on the recourse function. The dominance constraint provides risk control on the future cost. The dominance relation is represented by either the Lorenz functions or by the expected excess functions of the random variables. We propose two decomposition methods to solve the problem and prove their convergence. Our methods exploit the decomposition structure of the expected value two-stage problems and construct successive approximations of the stochastic dominance constraint.
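To illustrate the expected-excess representation on its own (a toy check, not the poster's decomposition methods), the sketch below tests whether one discrete cost distribution is preferable to a benchmark in the increasing convex order: E[(X - eta)_+] <= E[(Y - eta)_+] for all eta. Since the expected-excess functions of discrete random variables are piecewise linear with kinks at the support points, checking eta over the union of the supports suffices. The distributions are invented for illustration.

```python
import numpy as np

def expected_excess(vals, probs, eta):
    # E[(X - eta)_+] for a discrete random variable
    return float(np.sum(probs * np.maximum(vals - eta, 0.0)))

def dominated_in_icx(x, px, y, py, tol=1e-12):
    # X preferable to benchmark Y (increasing convex order) iff
    # E[(X - eta)_+] <= E[(Y - eta)_+] for all eta; piecewise
    # linearity makes checking the support points sufficient.
    etas = np.concatenate([x, y])
    return all(expected_excess(x, px, e) <= expected_excess(y, py, e) + tol
               for e in etas)

# toy recourse costs: X is a mean-preserving contraction of benchmark Y
x, px = np.array([1.0, 2.0, 3.0]), np.full(3, 1 / 3)
y, py = np.array([0.0, 2.0, 4.0]), np.full(3, 1 / 3)

ok = dominated_in_icx(x, px, y, py)      # X carries less risk than Y
rev = dominated_in_icx(y, py, x, px)     # the reverse relation fails
```

In a two-stage problem the values of X would be recourse costs across scenarios, so each such comparison becomes a family of inequality constraints on the first-stage decision, which is what the decomposition methods in the poster exploit.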

George C. Papanicolaou