Paper Title
Lessons learned from the NeurIPS 2021 MetaDL challenge: Backbone fine-tuning without episodic meta-learning dominates for few-shot learning image classification
Paper Authors
Paper Abstract
Although deep neural networks are capable of achieving performance superior to humans on various tasks, they are notorious for requiring large amounts of data and computing resources, restricting their success to domains where such resources are available. Meta-learning methods can address this problem by transferring knowledge from related tasks, thus reducing the amount of data and computing resources needed to learn new tasks. We organized the MetaDL competition series, which provides opportunities for research groups all over the world to create and experimentally assess new meta-(deep)learning solutions for real problems. In this paper, authored collaboratively by the competition organizers and the top-ranked participants, we describe the design of the competition, the datasets, the best experimental results, as well as the top-ranked methods in the NeurIPS 2021 challenge, which attracted 15 active teams who made it to the final phase (by outperforming the baseline) and made over 100 code submissions during the feedback phase. The solutions of the top participants have been open-sourced. The lessons learned include that learning good representations is essential for effective transfer learning.
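To make the headline lesson concrete, the sketch below illustrates the general "backbone fine-tuning without episodic meta-learning" idea: take a backbone pre-trained on a large dataset, attach a fresh linear head for the N-way task, and fine-tune on the few labelled support examples before classifying the query set. This is a minimal illustrative example, not the code of any competition entrant; the backbone choice (ResNet-18), optimizer, and hyperparameters are assumptions.

```python
# Minimal sketch of backbone fine-tuning for a single few-shot episode.
# Assumptions (not from the paper): ResNet-18 backbone, Adam optimizer,
# 50 gradient steps, learning rate 1e-3.

import torch
import torch.nn as nn
from torchvision import models


def fine_tune_backbone(support_x, support_y, n_way, steps=50, lr=1e-3):
    """Fine-tune a pre-trained backbone on the support set of one episode.

    support_x: tensor of shape (n_support, 3, H, W)
    support_y: tensor of shape (n_support,) with labels in [0, n_way)
    """
    # Pre-trained backbone; the original classification layer is replaced
    # by a randomly initialised head sized for the current N-way task.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, n_way)

    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()

    model.train()
    for _ in range(steps):
        optimizer.zero_grad()
        logits = model(support_x)
        loss = criterion(logits, support_y)
        loss.backward()
        optimizer.step()
    return model


def predict(model, query_x):
    """Classify query images with the fine-tuned model."""
    model.eval()
    with torch.no_grad():
        return model(query_x).argmax(dim=1)
```

In contrast to episodic meta-learning, nothing here simulates training episodes at meta-training time: the transferable knowledge lives entirely in the pre-trained representation, which is consistent with the lesson that learning good representations is essential for effective transfer learning.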