Bayesian problems

Friday, June 19, 2015 - 11:00am - 12:30pm
Colin Fox (University of Otago)
This is a technical talk on the recent marginal-then-conditional sampler for hierarchical Bayesian models. Bayesian models for inverse problems naturally have a hierarchical structure in which the data model depends on a high-dimensional latent structure, which in turn depends on a low-dimensional hyperparameter vector. In the linear-Gaussian case, of which image deblurring is a canonical example, the full conditional for the latent structure is Gaussian and so can be sampled using efficient methods from numerical linear algebra.
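As an illustrative sketch of the last point (not the speaker's implementation; the deblurring setup, sizes, and hyperparameter values below are invented for the example), a Gaussian full conditional with precision matrix Q can be sampled exactly with one Cholesky factorization and two triangular solves:

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(0)

# Toy 1D linear-Gaussian deblurring setup (illustrative only)
n = 50
A = 0.6 * np.eye(n) + 0.2 * np.eye(n, k=1) + 0.2 * np.eye(n, k=-1)  # blur operator
x_true = np.sin(np.linspace(0, 3 * np.pi, n))
sigma2 = 0.01
y = A @ x_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Gaussian prior precision: second-difference smoothness penalty,
# scaled by a (here fixed) hyperparameter delta
delta = 1.0
D = np.diff(np.eye(n), 2, axis=0)
P = delta * (D.T @ D) + 1e-6 * np.eye(n)

# Full conditional for the latent x is Gaussian with precision Q, mean Q^{-1} b
Q = A.T @ A / sigma2 + P
b = A.T @ y / sigma2

# One exact draw via the Cholesky factorization Q = L L^T:
#   mean: solve Q mu = b;  fluctuation: solve L^T w = z with z ~ N(0, I),
#   so x = mu + w has covariance L^{-T} L^{-1} = Q^{-1}
L = cholesky(Q, lower=True)
mu = solve_triangular(L.T, solve_triangular(L, b, lower=True))
z = rng.standard_normal(n)
x_sample = mu + solve_triangular(L.T, z)
print(np.linalg.norm(mu - x_true) / np.linalg.norm(x_true))
```

For large images the dense factorization would be replaced by sparse or iterative linear algebra, but the structure of the draw is the same.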
Wednesday, June 17, 2015 - 9:00am - 10:30am
Youssef Marzouk (Massachusetts Institute of Technology), Luis Tenorio (Colorado School of Mines)
Tuesday, June 16, 2015 - 9:00am - 10:30am
Youssef Marzouk (Massachusetts Institute of Technology), Luis Tenorio (Colorado School of Mines)
Wednesday, May 9, 2012 - 1:30pm - 2:30pm
Fabian Wauthier (University of California, Berkeley)
Biased labelers are a systemic problem in crowdsourcing, and a
comprehensive toolbox for handling their responses is still being
developed. A typical crowdsourcing application can be divided into
three steps: data collection, data curation, and learning. At present
these steps are often treated separately. We present Bayesian Bias
Mitigation for Crowdsourcing (BBMC), a Bayesian model to unify all
three. Most data curation methods account for the effects of
labeler bias by modeling all labels as coming from a single latent ground truth.
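To make the single-latent-truth assumption concrete, here is a minimal sketch (not the BBMC model itself; the accuracies and votes are invented) of the posterior over one binary item's true label when each labeler's response is modeled as a noisy copy of that latent truth:

```python
import numpy as np

# Hypothetical setup: one binary item, five labelers with known accuracies.
# Under the single-latent-truth assumption, each observed label equals the
# true label with probability given by that labeler's accuracy.
accuracies = np.array([0.9, 0.8, 0.55, 0.6, 0.7])
labels = np.array([1, 1, 0, 1, 0])  # observed votes
prior_p1 = 0.5  # prior probability that the true label is 1

# Likelihood of the votes under true label = 1 and true label = 0
lik1 = np.prod(np.where(labels == 1, accuracies, 1 - accuracies))
lik0 = np.prod(np.where(labels == 0, accuracies, 1 - accuracies))

# Bayes' rule: accurate labelers' votes dominate the noisy ones
posterior_p1 = prior_p1 * lik1 / (prior_p1 * lik1 + (1 - prior_p1) * lik0)
print(posterior_p1)
```

Note how the posterior sides with the two most accurate labelers even though the raw vote is 3 to 2.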
Tuesday, October 6, 2009 - 3:30pm - 4:00pm
Lawrence Carin (Duke University)
Non-parametric Bayesian techniques are considered for learning dictionaries for
sparse image representations, with applications in denoising, inpainting and
compressive sensing (CS). The beta process is employed as a prior for learning
the dictionary, and this non-parametric method naturally infers an appropriate
dictionary size. The Dirichlet process and a probit stick-breaking process are
also considered to exploit structure within an image. The proposed method can
learn a sparse dictionary in situ; training images may be exploited if available.
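As a sketch of how a beta process prior lets the dictionary size be inferred (this is only the prior draw in a finite approximation, not the full dictionary-learning model; K, N, a, and b are illustrative), one can use the standard K-atom beta-Bernoulli construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Finite (K-atom) beta-Bernoulli approximation to the beta process:
#   pi_k ~ Beta(a/K, b(K-1)/K) is the usage probability of dictionary atom k,
#   z_nk ~ Bernoulli(pi_k) indicates whether patch n uses atom k.
# Most pi_k are driven toward zero, so only a small effective dictionary
# is used, which is how the prior infers an appropriate dictionary size.
K, N = 200, 500        # truncation level, number of image patches
a, b = 5.0, 1.0
pi = rng.beta(a / K, b * (K - 1) / K, size=K)
Z = rng.random((N, K)) < pi  # binary atom-usage matrix

atoms_used = int(np.sum(Z.any(axis=0)))
mean_atoms_per_patch = Z.sum(axis=1).mean()
print(atoms_used, mean_atoms_per_patch)
```

The expected number of atoms per patch is K·(a/K)/(a/K + b(K−1)/K) = a/(a + b(K−1)/K·K/a)… more simply, roughly a/(a+b) scaled by the construction, here around 5, independent of how large K is made.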
Wednesday, June 8, 2011 - 1:00pm - 2:00pm
David Higdon (Los Alamos National Laboratory)
A Bayesian formulation adapted from Kennedy and O'Hagan (2001) and
Higdon et al. (2008) is used to give parameter constraints from
physical observations and a limited number of simulations. The framework
is based on the idea of replacing the simulator by an emulator which
can then be used to facilitate computations required for the analysis.
In this talk I'll describe the details of this approach and apply it
to an example that uses large scale structure of the universe to
inform about a subset of the parameters controlling a cosmological model.
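The emulator idea can be sketched as follows (a minimal Gaussian-process surrogate fitted to a handful of simulator runs; the toy simulator, design, and kernel hyperparameters are all assumptions, not the talk's actual setup):

```python
import numpy as np

rng = np.random.default_rng(2)

def simulator(theta):
    # Stand-in for an expensive simulation code (illustrative only)
    return np.sin(3 * theta) + 0.5 * theta

# A small design of simulator runs over the parameter range
theta_design = np.linspace(0.0, 2.0, 8)
y_design = simulator(theta_design)

# Squared-exponential GP emulator fitted to the runs
def k(x1, x2, ell=0.4, var=1.0):
    return var * np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell**2)

nugget = 1e-8  # jitter for numerical stability
Kdd = k(theta_design, theta_design) + nugget * np.eye(len(theta_design))
alpha = np.linalg.solve(Kdd, y_design)

def emulate(theta_new):
    # GP predictive mean: a cheap surrogate for simulator(theta_new),
    # used inside the Bayesian calibration instead of the simulator
    return k(np.atleast_1d(theta_new), theta_design) @ alpha

theta_test = np.array([0.37, 1.2, 1.9])
print(np.abs(emulate(theta_test) - simulator(theta_test)).max())
```

In the Kennedy-O'Hagan framework this predictive mean (and its variance, omitted here) replaces the simulator inside the likelihood, making MCMC over the parameters affordable.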
Friday, June 10, 2011 - 8:30am - 9:30am
Mark Berliner (The Ohio State University)
After a brief review of the hierarchical Bayesian viewpoint, I will present examples of interest in the geosciences. The first is a paleoclimate setting. The problem is to use observed temperatures at various depths and the heat equation to infer surface temperature history. The second combines an elementary physical model with observational data in modeling the flow of the Northeast Ice-Stream in Greenland.
Wednesday, June 8, 2011 - 2:30pm - 3:30pm
Youssef Marzouk (Massachusetts Institute of Technology)
Bayesian inference provides a natural framework for quantifying
uncertainty in PDE-constrained inverse problems, for fusing
heterogeneous sources of information, and for conditioning successive
predictions on data. In this setting, simulating from the posterior
via Markov chain Monte Carlo (MCMC) constitutes a fundamental
computational bottleneck. We present a new technique that entirely
avoids Markov chain-based simulation, by constructing a map under
which the posterior becomes the pushforward measure of the prior.
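A one-dimensional caricature of the map idea (not the talk's multivariate construction; the Gamma "posterior" and grid are invented for illustration) is the monotone map T = F⁻¹∘Φ, which pushes a standard normal forward to a target distribution with no Markov chain involved:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Toy 1D "posterior": a Gamma(3, 1) density, standing in for a
# distribution we can evaluate but would rather not sample with MCMC.
grid = np.linspace(1e-6, 20.0, 4000)
unnorm = grid**2 * np.exp(-grid)  # unnormalized Gamma(3,1) density
cdf = np.cumsum(unnorm)
cdf /= cdf[-1]

# Monotone map T = F^{-1} o Phi: draw z ~ N(0,1), set x = T(z).
# Independent draws from the target, no burn-in, no autocorrelation.
z = rng.standard_normal(20000)
u = norm.cdf(z)
x = np.interp(u, cdf, grid)

print(x.mean(), x.var())  # Gamma(3,1) has mean 3 and variance 3
```

In higher dimensions the inverse CDF is unavailable, which is exactly why the talk's approach constructs the map by solving an optimization problem instead.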
Thursday, June 9, 2011 - 2:30pm - 3:30pm
Bani Mallick (Texas A & M University)
We present a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a random field (spatial or temporal). The Bayesian approach contains a natural mechanism for regularization in the form of prior information, can incorporate information from heterogeneous sources, and provides a quantitative assessment of uncertainty in the inverse solution. The Bayesian setting casts the inverse solution as a posterior probability distribution over the model parameters. A Karhunen-Loève expansion is used for dimension reduction of the random field.
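The dimension-reduction step can be sketched as follows (a truncated Karhunen-Loève expansion of a Gaussian field on a 1D grid; the squared-exponential covariance, lengthscale, and truncation level are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(4)

# Gaussian random field on a 1D grid with squared-exponential covariance
n = 200
s = np.linspace(0.0, 1.0, n)
C = np.exp(-0.5 * (s[:, None] - s[None, :]) ** 2 / 0.1**2)

# Karhunen-Loeve expansion: eigendecomposition of the covariance,
# keeping the m leading modes that capture most of the variance
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

m = 15
captured = eigvals[:m].sum() / eigvals.sum()

# A field realization now needs only m independent standard normals,
# reducing the inverse problem's unknowns from n to m
xi = rng.standard_normal(m)
field = eigvecs[:, :m] @ (np.sqrt(np.maximum(eigvals[:m], 0.0)) * xi)
print(captured, field.shape)
```

The posterior is then explored over the m expansion coefficients ξ rather than the n grid values of the field.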
Thursday, June 9, 2011 - 9:00am - 10:00am
Roger Ghanem (University of Southern California)
Recent developments in polynomial chaos expansions with random coefficients make it possible to account for subscale features not captured in standard probabilistic models. These representations provide a geometric characterization of random variables and processes, which is quite distinct from the characterizations (in terms of probability density functions) typically adopted in Bayesian analysis. Given the importance of Bayes' theorem within probability theory, it is important to synthesize the connection between these two representations.
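As background for the polynomial chaos representation itself, here is a minimal sketch (standard Galerkin projection, not the talk's random-coefficient extension) that computes the chaos coefficients of Y = exp(ξ), ξ ~ N(0,1), in the probabilists' Hermite basis, where the closed form c_k = e^{1/2}/k! is known:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, exp

# Probabilists' Gauss-Hermite quadrature (weight exp(-x^2/2));
# rescale weights so they integrate against the N(0,1) density
pts, wts = hermegauss(40)
wts = wts / np.sqrt(2 * np.pi)

# Chaos coefficients of Y = exp(xi):  c_k = E[Y He_k(xi)] / k!
# (He_k are orthogonal under N(0,1) with E[He_k^2] = k!)
order = 8
coeffs = []
for kdeg in range(order + 1):
    basis = np.zeros(kdeg + 1)
    basis[kdeg] = 1.0
    Hk = hermeval(pts, basis)        # evaluate He_k at quadrature points
    c = np.sum(wts * np.exp(pts) * Hk) / factorial(kdeg)
    coeffs.append(c)

print(coeffs[0], exp(0.5))  # c_0 should match e^{1/2}
```

The geometric picture in the abstract corresponds to viewing Y through this coefficient vector rather than through its density.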