uncertainty quantification

Tuesday, June 26, 2018 - 10:00am - 11:00am
Markos Katsoulakis (University of Massachusetts)
We discuss new information-based Uncertainty Quantification (UQ) methods able to assess and improve the predictive ability of computational models in applications ranging from materials design, optimizing catalysts and fuel cells, to risk assessment in subsurface flows. These models are typically multi-scale and draw on any available data, e.g. from electronic structure calculations or observational/experimental data.
Tuesday, May 29, 2018 - 10:00am - 10:50am
Ching-Shan Chou (The Ohio State University)
Mathematical models in systems biology often have many parameters, such as biochemical reaction rates, whose true values are unknown. When the number of parameters is large, it becomes computationally difficult to analyze their effects and to estimate parameter values from experimental data. This is especially challenging when the model is expensive to evaluate, as is the case for large spatial models. In this work, we introduce a methodology for using surrogate models to drastically reduce the cost of parameter analysis in such models.
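The idea sketched in this abstract can be illustrated with a toy example: fit a cheap polynomial surrogate to a handful of expensive model runs, then use the surrogate for parameter sweeps. The model, the parameter range, and the quadratic form are all hypothetical stand-ins, not the speaker's actual method.

```python
import math

def expensive_model(k):
    # Stand-in for a costly simulation: steady-state output of a
    # hypothetical reaction with rate parameter k.
    return 1.0 - math.exp(-2.0 * k)

def fit_quadratic_surrogate(xs, ys):
    # Least-squares fit of y ~ a + b*x + c*x^2 via the normal equations.
    S = [sum(x ** p for x in xs) for p in range(5)]
    T = [sum(y * x ** p for x, y in zip(xs, ys)) for p in range(3)]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 4):
                A[r][c] -= f * A[i][c]
    coef = [0.0, 0.0, 0.0]
    for i in reversed(range(3)):
        coef[i] = (A[i][3] - sum(A[i][j] * coef[j]
                                 for j in range(i + 1, 3))) / A[i][i]
    return lambda x: coef[0] + coef[1] * x + coef[2] * x * x

# Train on a handful of expensive runs, then sweep the parameter cheaply.
train_k = [0.0, 0.125, 0.25, 0.375, 0.5]
surrogate = fit_quadratic_surrogate(train_k,
                                    [expensive_model(k) for k in train_k])
```

Once fitted, `surrogate(k)` can be evaluated thousands of times for sensitivity analysis or parameter estimation at negligible cost, which is the point of the approach described above.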
Monday, March 14, 2016 - 3:00pm - 4:00pm
Matthias Heinkenschloss (Rice University)
Many science and engineering problems lead to optimization problems governed by partial differential equations (PDEs), and in many of these problems some of the problem data are not known exactly. I focus on a class of such optimization problems where the uncertain data are modeled by random variables or random fields, and where decision variables (controls/designs) are deterministic and have to be computed before the uncertainty is observed. It is important that the uncertainty in problem data is adequately incorporated into the formulation of the optimization problem.
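The structure described above — a deterministic control chosen before the random data is observed — can be sketched with a scalar stand-in for the PDE-constrained objective, solved by sample average approximation (SAA). The objective, the noise model, and the regularization weight are illustrative assumptions, not the formulation from the talk.

```python
import random

def objective(u, xi):
    # Hypothetical scalar stand-in for a PDE-constrained objective:
    # the state depends on the random coefficient xi, while the
    # control u is fixed before xi is observed.
    return (u - xi) ** 2 + 0.1 * u * u

def saa_optimal_control(n_samples, seed=0):
    # Sample average approximation: replace E[objective(u, xi)] by an
    # empirical mean over fixed samples, then minimize over u.
    rng = random.Random(seed)
    xis = [rng.gauss(1.0, 0.2) for _ in range(n_samples)]
    # The SAA objective is quadratic in u, so the minimizer is closed-form:
    # d/du mean[(u - xi)^2 + 0.1 u^2] = 0  =>  u = mean(xi) / 1.1
    return sum(xis) / len(xis) / 1.1
```

As the sample count grows, the SAA control converges to the minimizer of the true expected objective; in realistic PDE-constrained settings each sample requires a PDE solve, which is what makes these problems computationally demanding.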
Tuesday, February 23, 2016 - 11:55am - 12:40pm
Timothy Wildey (Sandia National Laboratories)
Uncertainty and error are ubiquitous in predictive modeling and simulation due to unknown model parameters, boundary conditions and various sources of numerical error. Consequently, there is considerable interest in developing efficient and accurate methods to quantify the uncertainty in the outputs of a computational model. Monte Carlo techniques are the standard approach due to their relative ease of implementation and the fact that they effectively circumvent the curse of dimensionality.
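The dimension-independence claimed for Monte Carlo above can be seen in a minimal sketch: the statistical error of the estimator decays like 1/sqrt(N) regardless of the number of uncertain parameters. The model function below is a hypothetical placeholder for an expensive simulation.

```python
import random

def model(x):
    # Hypothetical stand-in for an expensive computational model,
    # here a smooth function of a d-dimensional uncertain input.
    return sum(xi * xi for xi in x)

def monte_carlo_mean(n_samples, dim, seed=0):
    # Plain Monte Carlo estimate of E[model(X)] with X ~ U(0,1)^dim.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = [rng.uniform(0.0, 1.0) for _ in range(dim)]
        total += model(x)
    return total / n_samples

# For this model E[model(X)] = dim / 3, and the estimator's standard
# error shrinks like 1/sqrt(N) whether dim is 10 or 10,000 -- the sense
# in which Monte Carlo circumvents the curse of dimensionality.
estimate = monte_carlo_mean(100_000, dim=10)
```

The trade-off, of course, is the slow 1/sqrt(N) rate itself, which motivates the more efficient alternatives the abstract alludes to.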
Monday, February 22, 2016 - 9:15am - 10:00am
Loren Miller (DataMetric Innovations LLC)
In 1993, The Goodyear Tire & Rubber Company ventured into an improbable, close-working partnership with Sandia National Laboratories, a U.S. nuclear weapons lab. We will discuss, from the industry perspective, the lessons learned while initiating, managing, and gaining value from the relationship, including pitfalls and how to surmount them, in what became one of the Department of Energy’s showcase Cooperative Research and Development Agreements.
Tuesday, February 23, 2016 - 10:30am - 10:55am
Ben Lee (Siemens AG)
Lead-acid batteries are capable of providing large amounts of power for short durations. Unlike lithium-ion batteries, they age well when stored at (near) full capacity, and they are relatively inexpensive per unit of storage capacity.
Monday, December 16, 2013 - 8:45am - 9:15am
Peter Schultz (Sandia National Laboratories)
How does one go from good science to good engineering, and conversely, how does one reach into sub-continuum scale physics to add greater fidelity to an engineering scale analysis meant to inform high-consequence decisions? For scientific investigations at the atomistic scale, the numerical analyses that feed into quantitative assessments of uncertainties at the engineering scale can seem inaccessibly remote.