Paper Title
Exemplar VAE: Linking Generative Models, Nearest Neighbor Retrieval, and Data Augmentation
Paper Authors
Paper Abstract
We introduce Exemplar VAEs, a family of generative models that bridge the gap between parametric and non-parametric, exemplar based generative models. Exemplar VAE is a variant of VAE with a non-parametric prior in the latent space based on a Parzen window estimator. To sample from it, one first draws a random exemplar from a training set, then stochastically transforms that exemplar into a latent code and a new observation. We propose retrieval augmented training (RAT) as a way to speed up Exemplar VAE training by using approximate nearest neighbor search in the latent space to define a lower bound on log marginal likelihood. To enhance generalization, model parameters are learned using exemplar leave-one-out and subsampling. Experiments demonstrate the effectiveness of Exemplar VAEs on density estimation and representation learning. Importantly, generative data augmentation using Exemplar VAEs on permutation invariant MNIST and Fashion MNIST reduces classification error from 1.17% to 0.69% and from 8.56% to 8.16%.
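
The sampling procedure described in the abstract (draw a random exemplar, stochastically transform it into a latent code, then decode a new observation) can be written as a minimal sketch. This is illustrative only, not the authors' implementation: the PyTorch-style encoder (assumed to return Gaussian parameters) and decoder names and interfaces are assumptions.

    import torch

    def sample_exemplar_vae(train_set, encoder, decoder, num_samples=16):
        # Minimal sketch of Exemplar VAE sampling; `train_set` is assumed to be
        # a tensor of training examples, `encoder`/`decoder` are assumed networks.

        # 1. Draw random exemplars from the training set.
        idx = torch.randint(len(train_set), (num_samples,))
        exemplars = train_set[idx]

        # 2. Stochastically transform each exemplar into a latent code by
        #    sampling from an (assumed Gaussian) exemplar-conditioned distribution.
        mu, log_var = encoder(exemplars)
        z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()

        # 3. Decode the latent code into a new observation.
        return decoder(z)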