Paper Title
Towards Improved Learning in Gaussian Processes: The Best of Two Worlds
Paper Authors
Paper Abstract
Gaussian process training decomposes into inference of the (approximate) posterior and learning of the hyperparameters. For non-Gaussian (non-conjugate) likelihoods, two common choices for approximate inference are Expectation Propagation (EP) and Variational Inference (VI), which have complementary strengths and weaknesses. While VI's lower bound on the marginal likelihood is a suitable objective for inferring the approximate posterior, this does not automatically make it a good learning objective for hyperparameter optimization. We design a hybrid training procedure where the inference leverages conjugate-computation VI and the learning uses an EP-like marginal likelihood approximation. We empirically demonstrate on binary classification that this provides a good learning objective and generalizes better.
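To make the hybrid recipe concrete, the sketch below illustrates the two ingredients on a toy binary-classification problem: conjugate-computation VI fits a Gaussian approximate posterior via natural-gradient updates of per-datapoint Gaussian "sites", and the learned sites are then scored with an EP-flavoured Gaussian pseudo-observation marginal that could serve as a hyperparameter objective. This is not the paper's implementation: the data, the logistic likelihood, and the simplified EP-like objective (which omits the per-site cavity corrections of the full EP energy) are all illustrative assumptions.

```python
import numpy as np

# Toy 1-D binary-classification data (hypothetical; not from the paper).
X = np.array([-2.0, -1.2, -0.4, 0.3, 1.1, 2.0])
y = np.array([-1, -1, -1, 1, 1, 1], dtype=float)

def rbf_kernel(x, lengthscale, variance=1.0):
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gauss-Hermite nodes/weights for 1-D Gaussian expectations of the log-likelihood.
T, W = np.polynomial.hermite.hermgauss(30)

def cvi_fit(K, y, iters=200, rho=0.5):
    """Conjugate-computation VI: damped natural-gradient updates of Gaussian
    site parameters (alpha, beta), using the Opper-Archambeau gradient
    identities g1 = dE[log p(y|f)]/dm and g2 = dE[log p(y|f)]/dv."""
    n = len(y)
    alpha = np.zeros(n)        # site first natural parameters
    beta = 1e-2 * np.ones(n)   # site precisions (kept positive)
    I = np.eye(n)
    for _ in range(iters):
        Sigma = np.linalg.inv(np.linalg.inv(K + 1e-9 * I) + np.diag(beta))
        m, v = Sigma @ alpha, np.diag(Sigma)
        # Quadrature points f ~ N(m_i, v_i) for each datapoint.
        f = m[:, None] + np.sqrt(2.0 * v)[:, None] * T[None, :]
        # Logistic likelihood: d/df log sigma(y f) = y sigma(-y f),
        # d^2/df^2 log sigma(y f) = -sigma(f) sigma(-f).
        g1 = (W * (y[:, None] * sigmoid(-y[:, None] * f))).sum(1) / np.sqrt(np.pi)
        g2 = 0.5 * (W * (-sigmoid(f) * sigmoid(-f))).sum(1) / np.sqrt(np.pi)
        beta = (1 - rho) * beta + rho * np.maximum(-2.0 * g2, 1e-8)
        alpha = (1 - rho) * alpha + rho * (g1 - 2.0 * g2 * m)
    return alpha, beta

def ep_like_objective(K, alpha, beta):
    """Simplified EP-flavoured marginal: score the learned sites as Gaussian
    pseudo-observations y_tilde ~ N(f, 1/beta) under the GP prior.
    The full EP energy would add per-site cavity correction terms."""
    y_tilde = alpha / beta
    C = K + np.diag(1.0 / beta)
    _, logdet = np.linalg.slogdet(2.0 * np.pi * C)
    return -0.5 * (logdet + y_tilde @ np.linalg.solve(C, y_tilde))

# Hybrid loop sketch: infer with CVI, then evaluate the EP-like objective
# as a candidate hyperparameter score for different lengthscales.
for ell in (0.5, 2.0):
    K = rbf_kernel(X, lengthscale=ell)
    a, b = cvi_fit(K, y)
    print(f"lengthscale={ell}: EP-like objective = {ep_like_objective(K, a, b):.3f}")
```

In a full training procedure, the inner CVI loop and an outer gradient step on the EP-like objective with respect to the kernel hyperparameters would alternate; here the objective is only evaluated on a small grid to keep the sketch short.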