Speaker
Jiaoyang Huang, University of Pennsylvania
We consider Bayesian inference for large-scale inverse problems, where the computational challenge is the repeated evaluation of an expensive forward model that is often given as a black box or is impractical to differentiate. In this talk I will propose a new derivative-free algorithm, Unscented Kalman Inversion, which uses ideas from the Kalman filter to solve these inverse problems efficiently. First, I will explain some basics of variational inference under general metric tensors; in particular, under the Fisher-Rao metric, Gaussian variational inference leads to natural gradient descent. Next, I will discuss two views of our algorithm: it can be obtained from a Gaussian approximation of the filtering distribution of a novel mean-field dynamical system, and it can also be viewed as a derivative-free approximation of natural gradient descent. I will also discuss theoretical properties for linear inverse problems. Finally, I will discuss an extension of our algorithm based on a Gaussian mixture approximation, which leads to Gaussian Mixture Kalman Inversion, an efficient derivative-free Bayesian inference approach capable of capturing multiple modes. I will demonstrate the effectiveness of this approach in several numerical experiments with multimodal posterior distributions, where the algorithm typically converges within O(10) iterations. This is based on joint work with Yifan Chen, Daniel Zhengyu Huang, Sebastian Reich and Andrew Stuart.
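To make the derivative-free idea concrete, here is a minimal sketch of one unscented-Kalman-style inversion step. It is an illustration under simplifying assumptions, not the speaker's exact algorithm: the sigma points are the standard symmetric set with equal weights, the forward map `G` is treated purely as a black box, and the inflation factor `alpha` is a hypothetical stand-in for the regularization used in practice.

```python
import numpy as np

def uki_step(m, C, G, y, Sigma_nu, alpha=1.0):
    """One simplified Unscented-Kalman-Inversion-style iteration.

    m, C     : current Gaussian approximation N(m, C) of the unknown parameter
    G        : black-box forward map, theta -> predicted data (no derivatives)
    y        : observed data
    Sigma_nu : observation noise covariance
    alpha    : covariance inflation factor (hypothetical regularization knob)
    """
    d = m.size
    # Prediction step: optional covariance inflation keeps the sigma points spread.
    C_hat = alpha * C
    # Symmetric sigma points of N(m, C_hat): m +/- columns of chol(d * C_hat),
    # each with equal weight 1/(2d), which reproduces mean m and covariance C_hat.
    L = np.linalg.cholesky(d * C_hat)
    sigma = np.vstack([m + L[:, j] for j in range(d)] +
                      [m - L[:, j] for j in range(d)])
    g = np.array([G(s) for s in sigma])      # forward evaluations only
    g_mean = g.mean(axis=0)
    dth = sigma - m
    dg = g - g_mean
    C_tg = dth.T @ dg / (2 * d)              # cov(theta, G(theta))
    C_gg = dg.T @ dg / (2 * d)               # cov(G(theta), G(theta))
    # Kalman gain and analysis step.
    K = np.linalg.solve((C_gg + Sigma_nu).T, C_tg.T).T
    m_new = m + K @ (y - g_mean)
    C_new = C_hat - K @ C_tg.T
    C_new = 0.5 * (C_new + C_new.T)          # symmetrize for numerical robustness
    return m_new, C_new
```

For a linear forward map the unscented transform is exact, so iterating this update drives the mean toward the data-consistent parameter; the multimodal and mean-field aspects of the talk are beyond this sketch.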
In-person seminars will be held at Dunham Lab, 10 Hillhouse Ave., Room 220, with an option of remote participation via Zoom.
Password: 24
Or telephone: 203-432-9666 (2-ZOOM if on campus) or 646 568 7788
Meeting ID: 924 1107 7917