Paper Title

NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy

Paper Authors

Yash Mehta, Colin White, Arber Zela, Arjun Krishnakumar, Guri Zabergja, Shakiba Moradian, Mahmoud Safari, Kaicheng Yu, Frank Hutter

Paper Abstract

The release of tabular benchmarks, such as NAS-Bench-101 and NAS-Bench-201, has significantly lowered the computational overhead for conducting scientific research in neural architecture search (NAS). Although they have been widely adopted and used to tune real-world NAS algorithms, these benchmarks are limited to small search spaces and focus solely on image classification. Recently, several new NAS benchmarks have been introduced that cover significantly larger search spaces over a wide range of tasks, including object detection, speech recognition, and natural language processing. However, substantial differences among these NAS benchmarks have so far prevented their widespread adoption, limiting researchers to using just a few benchmarks. In this work, we present an in-depth analysis of popular NAS algorithms and performance prediction methods across 25 different combinations of search spaces and datasets, finding that many conclusions drawn from a few NAS benchmarks do not generalize to other benchmarks. To help remedy this problem, we introduce NAS-Bench-Suite, a comprehensive and extensible collection of NAS benchmarks, accessible through a unified interface, created with the aim of facilitating reproducible, generalizable, and rapid NAS research. Our code is available at https://github.com/automl/naslib.
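To give a feel for the unified interface the abstract describes, here is a minimal sketch of querying a tabular benchmark through NASLib. It is based on usage patterns shown in the NASLib repository, not on this paper's text; the module paths, the `get_dataset_api` helper, the `Metric` enum, and the method signatures are assumptions that may differ across NASLib versions.

```python
# Minimal sketch: querying a tabular NAS benchmark through NASLib's
# unified interface. Module paths and signatures are assumptions based
# on the NASLib repository and may vary between versions.
from naslib.search_spaces import NasBench201SearchSpace
from naslib.search_spaces.core.query_metrics import Metric
from naslib.utils import get_dataset_api

# Load the precomputed benchmark data for a search-space/dataset pair.
dataset_api = get_dataset_api(search_space="nasbench201", dataset="cifar10")

# Instantiate the search space and sample a random architecture from it.
graph = NasBench201SearchSpace()
graph.sample_random_architecture(dataset_api=dataset_api)

# Look up the architecture's validation accuracy in the tabular benchmark;
# no training is needed, which is what makes this kind of NAS research cheap.
val_acc = graph.query(
    metric=Metric.VAL_ACCURACY,
    dataset="cifar10",
    dataset_api=dataset_api,
)
print(f"Validation accuracy: {val_acc}")
```

Under this interface, switching to another of the collected benchmarks would amount to swapping the search-space class and the `search_space`/`dataset` strings, which is the point of bundling them in NAS-Bench-Suite.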
