EC-NAS: Energy Consumption Aware Tabular Benchmarks for Neural Architecture Search
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Standard
EC-NAS: Energy Consumption Aware Tabular Benchmarks for Neural Architecture Search. / Bakhtiarifard, Pedram; Igel, Christian; Selvan, Raghavendra.
2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Proceedings. IEEE, 2024. pp. 5660-5664.
RIS
TY - GEN
T1 - EC-NAS
T2 - 49th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024
AU - Bakhtiarifard, Pedram
AU - Igel, Christian
AU - Selvan, Raghavendra
N1 - Publisher Copyright: © 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Energy consumption from the selection, training, and deployment of deep learning models has risen significantly in recent years. This work aims to facilitate the design of energy-efficient deep learning models that require fewer computational resources and prioritize environmental sustainability by focusing on energy consumption. Neural architecture search (NAS) benefits from tabular benchmarks, which evaluate NAS strategies cost-effectively using pre-computed performance statistics. We advocate for including energy efficiency as an additional performance criterion in NAS. To this end, we introduce an enhanced tabular benchmark encompassing data on energy consumption for varied architectures. The benchmark, designated EC-NAS, has been made available in an open-source format to advance research in energy-conscious NAS. EC-NAS incorporates a surrogate model to predict energy consumption, helping to reduce the energy expenditure of creating the dataset. Our findings highlight the potential of EC-NAS by leveraging multi-objective optimization algorithms, revealing the trade-off between energy usage and accuracy. This suggests the feasibility of identifying energy-lean architectures with little or no compromise in performance.
AB - Energy consumption from the selection, training, and deployment of deep learning models has risen significantly in recent years. This work aims to facilitate the design of energy-efficient deep learning models that require fewer computational resources and prioritize environmental sustainability by focusing on energy consumption. Neural architecture search (NAS) benefits from tabular benchmarks, which evaluate NAS strategies cost-effectively using pre-computed performance statistics. We advocate for including energy efficiency as an additional performance criterion in NAS. To this end, we introduce an enhanced tabular benchmark encompassing data on energy consumption for varied architectures. The benchmark, designated EC-NAS, has been made available in an open-source format to advance research in energy-conscious NAS. EC-NAS incorporates a surrogate model to predict energy consumption, helping to reduce the energy expenditure of creating the dataset. Our findings highlight the potential of EC-NAS by leveraging multi-objective optimization algorithms, revealing the trade-off between energy usage and accuracy. This suggests the feasibility of identifying energy-lean architectures with little or no compromise in performance.
KW - Energy-aware benchmark
KW - multi-objective optimization
KW - neural architecture search
KW - sustainable machine learning
U2 - 10.1109/ICASSP48485.2024.10448303
DO - 10.1109/ICASSP48485.2024.10448303
M3 - Article in proceedings
AN - SCOPUS:85188929712
SP - 5660
EP - 5664
BT - 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Proceedings
PB - IEEE
Y2 - 14 April 2024 through 19 April 2024
ER -
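The abstract describes querying pre-computed performance statistics and applying multi-objective optimization to expose the trade-off between energy usage and accuracy. Below is a minimal sketch of that idea, not the authors' code: the architecture names and (energy, accuracy) values are invented stand-ins for EC-NAS's tabular records, used only to illustrate Pareto-front selection.

# Minimal sketch (hypothetical data, not the EC-NAS API): select
# architectures that are Pareto-optimal in (lower energy, higher accuracy)
# from a table of pre-computed statistics.

from typing import Dict, Tuple

# arch_id -> (training energy in kWh, validation accuracy); values invented
RECORDS: Dict[str, Tuple[float, float]] = {
    "arch_a": (1.8, 0.942),
    "arch_b": (0.6, 0.931),
    "arch_c": (2.5, 0.944),
    "arch_d": (0.9, 0.940),
    "arch_e": (1.2, 0.925),
}

def pareto_front(records: Dict[str, Tuple[float, float]]):
    """Return architectures not dominated by any other record.

    A record dominates another if it is no worse in both objectives
    (energy minimized, accuracy maximized) and strictly better in one.
    """
    front = []
    for name, (energy, acc) in records.items():
        dominated = any(
            (e2 <= energy and a2 >= acc) and (e2 < energy or a2 > acc)
            for other, (e2, a2) in records.items()
            if other != name
        )
        if not dominated:
            front.append((name, energy, acc))
    return sorted(front, key=lambda t: t[1])  # order by energy

for name, energy, acc in pareto_front(RECORDS):
    print(f"{name}: {energy:.1f} kWh, {acc:.1%} accuracy")

Because the benchmark is tabular, this selection touches no training runs; in the invented data above, arch_d sits on the front alongside cheaper and more accurate extremes, which is the kind of energy-lean, near-top-accuracy architecture the abstract refers to.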