Talk Title: Variational Bayesian Inference with Stochastic Search
Speaker Bio: John Paisley received the B.S.E. (2004), M.S. (2007), and Ph.D. (2010) degrees from Duke University in Electrical & Computer Engineering, where he worked with Prof. Lawrence Carin. He was then a postdoctoral fellow at Princeton University with Prof. David Blei. He is now a postdoctoral fellow at UC Berkeley working with Prof. Michael I. Jordan in the Department of Electrical Engineering and Computer Science. His research interests include Bayesian nonparametrics and statistical modeling.
After receiving his Ph.D. from the Department of Electrical and Computer Engineering at Duke University in 2010, John Paisley held postdoctoral positions in the group of Prof. David Blei (originator of the Latent Dirichlet Allocation model) at Princeton University and in the group of Prof. Michael I. Jordan (a leading international expert in machine learning) at UC Berkeley. Dr. Paisley's main research interests include Bayesian nonparametric models, machine learning, and statistical signal processing. He has published nearly 30 papers in top international journals and conferences, including PAMI, TSP, TIP, BMC Bioinformatics, ICML, and NIPS.
Abstract: Mean-field variational inference is a method for approximate posterior inference in Bayesian models. It approximates the full posterior distribution with a factorized set of distributions by optimizing the variational objective function. Often, this objective function does not have a closed-form expression because the necessary integrals cannot be solved, in which case tractable lower-bound approximations are usually introduced. We present an alternative algorithm based on stochastic optimization that allows for direct optimization of the variational lower bound. We use control variates to reduce the variance of the proposed stochastic search algorithm, in which existing lower bounds can play an important role. We demonstrate the approach on logistic regression.
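To make the abstract's idea concrete, the following is a minimal sketch (not the speaker's actual implementation) of stochastic-search variational inference for Bayesian logistic regression: a factorized Gaussian q(w) is fit by stochastic gradient ascent on the variational lower bound, using score-function ("REINFORCE"-style) gradient estimates with a simple baseline as the control variate. The data, sample sizes, and step sizes are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data for Bayesian logistic regression
N, D = 100, 3
X = rng.normal(size=(N, D))
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.random(N) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

def log_joint(w):
    """log p(y, w): Bernoulli log-likelihood + standard-normal log-prior (up to a constant)."""
    logits = X @ w
    ll = np.sum(y * logits - np.logaddexp(0.0, logits))
    return ll - 0.5 * np.sum(w ** 2)

def log_q(w, mu, log_sig):
    """log density of the factorized Gaussian q(w) = N(mu, diag(exp(log_sig)^2))."""
    sig = np.exp(log_sig)
    return np.sum(-0.5 * ((w - mu) / sig) ** 2 - log_sig - 0.5 * np.log(2 * np.pi))

def grad_log_q(w, mu, log_sig):
    """Score function: gradient of log q w.r.t. the variational parameters (mu, log_sig)."""
    sig = np.exp(log_sig)
    g_mu = (w - mu) / sig ** 2
    g_ls = ((w - mu) / sig) ** 2 - 1.0
    return np.concatenate([g_mu, g_ls])

# Variational parameters, Monte Carlo batch size, step size
mu, log_sig = np.zeros(D), np.zeros(D)
S, lr = 32, 0.02

for t in range(2000):
    sig = np.exp(log_sig)
    w_samp = mu + sig * rng.normal(size=(S, D))
    f = np.array([log_joint(w) - log_q(w, mu, log_sig) for w in w_samp])
    scores = np.array([grad_log_q(w, mu, log_sig) for w in w_samp])
    # Control variate: the score has zero expectation under q, so subtracting
    # the batch-mean baseline leaves the gradient (nearly) unbiased while
    # substantially reducing its variance.
    grad = np.mean(scores * (f - f.mean())[:, None], axis=0)
    mu += lr * grad[:D]
    log_sig += lr * grad[D:]

print("posterior mean estimate:", np.round(mu, 2))
```

The key variance-reduction step is subtracting the baseline before multiplying by the score; the talk's point is that more sophisticated control variates, such as existing analytic lower bounds on the intractable terms, can reduce the variance of this estimator much further than a scalar baseline.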