Propagating Surrogate Uncertainty in Bayesian Inverse Problems
About
Standard Bayesian inference schemes are infeasible for inverse problems with computationally expensive forward models. A common solution is to replace the model with a cheaper surrogate. To avoid overconfident conclusions, it is essential to acknowledge the surrogate approximation by propagating its uncertainty. A variety of distinct uncertainty propagation methods have been suggested, but there is little understanding of how they relate to one another. To fill this gap, we propose a mixture distribution termed the expected posterior (EP) as a general baseline for uncertainty-aware posterior approximation, justified by decision-theoretic and modular Bayesian inference arguments. We compare this distribution to popular alternatives, present an approximate Markov chain Monte Carlo sampler for EP-based inference, and highlight future directions.
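As an informal sketch of the mixture idea (the notation below is illustrative, not taken from the talk): if the surrogate is itself uncertain, say a random realization $u$ with distribution $p(u)$ (for instance, a Gaussian process fit to a limited number of forward-model runs), then each realization induces its own posterior $\pi_u(\theta \mid y)$, and an expected-posterior-style approximation averages these posteriors over the surrogate's uncertainty,

$$
\pi_{\mathrm{EP}}(\theta \mid y) \;=\; \mathbb{E}_{u \sim p(u)}\big[\pi_u(\theta \mid y)\big] \;=\; \int \pi_u(\theta \mid y)\, p(u)\, \mathrm{d}u .
$$

The precise definition, its decision-theoretic justification, and the sampler are given in the talk.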
Speaker

Andrew Roberts
Andrew Roberts is a PhD student in Computing and Data Sciences at Boston University, working with Professor Jonathan Huggins and Professor Michael Dietze. He is broadly interested in scientific machine learning, Bayesian modeling, and uncertainty quantification, with the goal of developing new methodologies for environmental and ecological applications. Andrew’s current work focuses on developing statistical and computational methods to better utilize process-based models of the terrestrial carbon cycle. Learn more about Andrew here: arob5.github.io