[Translation] Variational Inference: A Review for Statisticians


David M. Blei, Alp Kucukelbir, Jon D. McAuliffe

One of the core problems of modern statistics is to approximate difficult-to-compute probability distributions. This problem is especially important in Bayesian statistics, which frames all inference about unknown quantities as a calculation about the posterior. In this paper, we review variational inference (VI), a method from machine learning that approximates probability distributions through optimization. VI has been used in myriad applications and tends to be faster than classical methods, such as Markov chain Monte Carlo sampling. The idea behind VI is to first posit a family of distributions and then to find the member of that family which is close to the target. Closeness is measured by Kullback-Leibler divergence. We review the ideas behind mean-field variational inference, discuss the special case of VI applied to exponential family models, present a full example with a Bayesian mixture of Gaussians, and derive a variant that uses stochastic optimization to scale up to massive data. We discuss modern research in VI and highlight important open problems. VI is powerful, but it is not yet well understood. Our hope in writing this paper is to catalyse statistical research on this widely-used class of algorithms.

One of the core problems of modern statistics is how to approximate probability distributions that are very hard to compute. This problem is especially important in Bayesian statistics, because all inference about unknown quantities amounts to a computation involving the posterior distribution. In this paper, we review variational inference (VI), a method from machine learning that approximates probability distributions through optimization. VI has been used in countless applications and tends to be faster than classical methods such as MCMC sampling. The idea behind VI is to first posit a family of distributions and then to find the member of that family closest to the target; closeness between distributions is measured with the Kullback-Leibler divergence. We review the ideas behind mean-field VI, discuss the special form that VI takes for exponential-family models, present a full example with a Bayesian mixture of Gaussians, and derive a variant that uses stochastic optimization to scale to massive data. We then discuss current research in VI and point out several important open problems. VI is powerful, but it is not yet well understood. Our hope is that this paper will spur statistical research on this widely used class of algorithms.
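To make the idea in the abstract concrete, here is a brief sketch of the optimization problem that VI solves and the objective it maximizes in practice. This sketch is not part of the paper's text at this point; it uses the standard notation of latent variables z, observed data x, and a variational family Q.

```latex
% Variational inference as optimization: choose the member of the
% family Q closest in KL divergence to the exact posterior.
q^{*}(\mathbf{z}) = \operatorname*{arg\,min}_{q(\mathbf{z}) \in \mathcal{Q}}
  \operatorname{KL}\!\left( q(\mathbf{z}) \,\|\, p(\mathbf{z} \mid \mathbf{x}) \right).

% The KL divergence involves the intractable evidence \log p(\mathbf{x}),
% so in practice one maximizes the evidence lower bound (ELBO):
\operatorname{ELBO}(q)
  = \mathbb{E}_{q}\!\left[ \log p(\mathbf{z}, \mathbf{x}) \right]
  - \mathbb{E}_{q}\!\left[ \log q(\mathbf{z}) \right],
\qquad
\log p(\mathbf{x})
  = \operatorname{ELBO}(q)
  + \operatorname{KL}\!\left( q(\mathbf{z}) \,\|\, p(\mathbf{z} \mid \mathbf{x}) \right).

% Mean-field VI restricts Q to fully factorized distributions:
q(\mathbf{z}) = \prod_{j=1}^{m} q_{j}(z_{j}).
```

Because log p(x) is constant with respect to q, maximizing the ELBO is equivalent to minimizing the KL divergence to the posterior, which lets the optimization proceed without ever computing the intractable evidence. The mean-field factorization in the last line is the restriction of the family reviewed later in the paper.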

[To be continued]
