Paper Title


Adaptive Dimension Reduction and Variational Inference for Transductive Few-Shot Classification

Paper Authors

Yuqing Hu, Stéphane Pateux, Vincent Gripon

Paper Abstract


Transductive few-shot learning has gained increasing attention in recent years, given the cost of data annotation and the accuracy gains that unlabelled samples provide in the few-shot regime. In Few-Shot Classification (FSC) in particular, recent works model the feature distributions, aiming to maximize likelihoods or posteriors with respect to the unknown parameters. Following this vein, and considering the parallel between FSC and clustering, we seek to better account for the estimation uncertainty caused by the lack of data, as well as for the statistical properties of the clusters associated with each class. We therefore propose a new clustering method based on variational Bayesian inference, further improved by adaptive dimension reduction based on Probabilistic Linear Discriminant Analysis. When applied to features used in previous studies, our method significantly improves accuracy in the realistic, unbalanced transductive setting on various few-shot benchmarks, with gains of up to $6\%$ in accuracy. In addition, when applied to the balanced setting, it obtains very competitive results without relying on the class-balance artefact, whose use is disputable in practical applications. We also report the performance of our method on a high-performing pretrained backbone, with results that further surpass the current state-of-the-art accuracy, suggesting the generality of the proposed method.
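The abstract's core idea of clustering support and query features jointly under parameter uncertainty can be illustrated with a minimal sketch. This uses scikit-learn's `BayesianGaussianMixture` (a variational-inference GMM) as a generic stand-in for the authors' method; the synthetic features, class count, and shot counts are illustrative assumptions, not the paper's actual pipeline or benchmarks.

```python
# Hedged sketch: transductive few-shot classification via variational
# Bayesian clustering. Synthetic 3-way, 1-shot task with 20 queries/class.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
n_ways, dim = 3, 16                                 # illustrative task size
means = rng.normal(scale=3.0, size=(n_ways, dim))   # hypothetical class means

# A few labelled "support" features and many unlabelled "query" features.
support = np.vstack([m + rng.normal(size=(1, dim)) for m in means])
query = np.vstack([m + rng.normal(size=(20, dim)) for m in means])

# Cluster support + query jointly (the transductive step): the variational
# GMM infers cluster assignments while keeping uncertainty over parameters.
vb = BayesianGaussianMixture(
    n_components=n_ways, covariance_type="diag", random_state=0
).fit(np.vstack([support, query]))

# Map each cluster to the class of the support sample it absorbed.
cluster_to_class = {c: i for i, c in enumerate(vb.predict(support))}
pred = [cluster_to_class.get(c, -1) for c in vb.predict(query)]
print(len(pred))  # one predicted label per query sample
```

The key design point mirrored here is that unlabelled queries participate in parameter estimation, which is what distinguishes the transductive setting from inductive nearest-prototype classification.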
