Paper Title

Stochastic First-Order Learning for Large-Scale Flexibly Tied Gaussian Mixture Model

Paper Authors

Mohammad Pasande, Reshad Hosseini, Babak Nadjar Araabi

Paper Abstract

Gaussian Mixture Models (GMMs) are among the most potent parametric density models and are used extensively in many applications. Flexibly-tied factorization of the covariance matrices in GMMs is a powerful approach for coping with the challenges that common GMMs face on high-dimensional data and complex densities, which often demand a large number of Gaussian components. However, the expectation-maximization (EM) algorithm for fitting flexibly-tied GMMs still encounters difficulties with streaming and very high-dimensional data. To overcome these challenges, this paper suggests the use of first-order stochastic optimization algorithms. Specifically, we propose a new stochastic optimization algorithm on the manifold of orthogonal matrices. Through extensive empirical results on both synthetic and real datasets, we observe that stochastic optimization methods can outperform the EM algorithm: they attain better likelihood, need fewer epochs to converge, and spend less time per epoch.
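The abstract names two ingredients: a flexibly-tied factorization of the covariance matrices and stochastic optimization of the shared factor on the manifold of orthogonal matrices. As a minimal NumPy sketch of the idea (not the paper's implementation), assume each covariance is parameterized as Sigma_k = U diag(d_k) U^T with one orthogonal matrix U shared across all components; the helper names `flexibly_tied_gmm_loglik` and `riemannian_sgd_step` are hypothetical, and the update shown is a generic Riemannian SGD step with QR retraction standing in for the paper's exact algorithm.

```python
import numpy as np

def flexibly_tied_gmm_loglik(X, weights, means, U, diag_vars):
    """Total log-likelihood of a GMM whose covariances share one orthogonal factor.

    Assumed parameterization (the abstract does not spell it out):
        Sigma_k = U @ diag(diag_vars[k]) @ U.T,  with U orthogonal, shared over k.
    Orthogonality makes both log det Sigma_k and Sigma_k^{-1} essentially free.
    """
    n, d = X.shape
    K = len(weights)
    Z = X @ U                   # rotate the data once, reuse for every component
    M = means @ U               # rotated means, shape (K, d)
    log_probs = np.empty((n, K))
    for k in range(K):
        diff = Z - M[k]                                  # rows of U^T (x - mu_k)
        maha = np.sum(diff ** 2 / diag_vars[k], axis=1)  # Mahalanobis distances
        logdet = np.sum(np.log(diag_vars[k]))            # log det Sigma_k
        log_probs[:, k] = np.log(weights[k]) - 0.5 * (
            d * np.log(2.0 * np.pi) + logdet + maha)
    # log-sum-exp over components, for numerical stability
    m = log_probs.max(axis=1, keepdims=True)
    return float(np.sum(m[:, 0] + np.log(np.sum(np.exp(log_probs - m), axis=1))))

def riemannian_sgd_step(U, euclid_grad, lr):
    """One SGD step on the manifold of orthogonal matrices.

    A generic projection-plus-retraction recipe, not the paper's exact update.
    """
    # Project the Euclidean gradient onto the tangent space at U:
    # P_U(G) = G - U * sym(U^T G), with sym(A) = (A + A^T) / 2.
    UtG = U.T @ euclid_grad
    rgrad = euclid_grad - U @ ((UtG + UtG.T) / 2.0)
    # QR retraction: map the stepped point back onto the manifold.
    Q, R = np.linalg.qr(U - lr * rgrad)
    return Q * np.sign(np.diag(R))  # sign fix keeps the retraction well-defined
```

Two design points follow directly from this parameterization: rotating the data by U once amortizes the dominant cost across all K components, and the gradient of the log-likelihood can be estimated from a minibatch, which is what makes first-order methods applicable to the streaming and very high-dimensional settings where full-batch EM struggles.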
