Paper Title
EC-NAS: Energy Consumption Aware Tabular Benchmarks for Neural Architecture Search
Paper Authors
Abstract
Energy consumption from the selection, training, and deployment of deep learning models has seen a significant uptick recently. This work aims to facilitate the design of energy-efficient deep learning models that require fewer computational resources and prioritize environmental sustainability by focusing on energy consumption. Neural architecture search (NAS) benefits from tabular benchmarks, which evaluate NAS strategies cost-effectively through precomputed performance statistics. We advocate for including energy efficiency as an additional performance criterion in NAS. To this end, we introduce an enhanced tabular benchmark encompassing data on energy consumption for varied architectures. The benchmark, designated EC-NAS, has been made available in an open-source format to advance research in energy-conscious NAS. EC-NAS incorporates a surrogate model to predict energy consumption, helping to reduce the energy expenditure of dataset creation. Our findings emphasize the potential of EC-NAS by leveraging multi-objective optimization algorithms, revealing a balance between energy usage and accuracy. This suggests the feasibility of identifying energy-lean architectures with little or no compromise in performance.
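The trade-off the abstract describes can be made concrete with a small sketch: given precomputed (energy, accuracy) statistics for candidate architectures, as a tabular benchmark like EC-NAS provides, a multi-objective search reduces to finding the Pareto front. The function and data below are purely illustrative assumptions, not EC-NAS's API or actual benchmark values.

```python
# Hypothetical sketch: Pareto-front selection over precomputed
# (energy, accuracy) entries, illustrating the kind of multi-objective
# analysis a tabular benchmark with energy data enables.
# Architecture names and numbers are made up for illustration.

def pareto_front(entries):
    """Return entries not dominated by any other entry.

    Entry `o` dominates entry `e` if it uses no more energy and is at
    least as accurate, and is strictly better on one of the two.
    """
    front = []
    for e in entries:
        dominated = any(
            o["energy_kwh"] <= e["energy_kwh"]
            and o["accuracy"] >= e["accuracy"]
            and (o["energy_kwh"] < e["energy_kwh"]
                 or o["accuracy"] > e["accuracy"])
            for o in entries
        )
        if not dominated:
            front.append(e)
    return front

architectures = [
    {"arch": "A", "energy_kwh": 1.2, "accuracy": 0.94},
    {"arch": "B", "energy_kwh": 0.6, "accuracy": 0.93},
    {"arch": "C", "energy_kwh": 0.9, "accuracy": 0.92},  # dominated by B
    {"arch": "D", "energy_kwh": 0.3, "accuracy": 0.88},
]

for e in pareto_front(architectures):
    print(e["arch"], e["energy_kwh"], e["accuracy"])
```

In this toy example, architecture B illustrates an "energy-lean" choice: it halves the energy of A while giving up only one accuracy point, and it dominates C outright.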