Energy Consumption-Aware Tabular Benchmarks for Neural Architecture Search

Publication: Working paper › Preprint › Research

Standard

Energy Consumption-Aware Tabular Benchmarks for Neural Architecture Search. / Bakhtiarifard, Pedram; Igel, Christian; Selvan, Raghavendra.

arxiv.org, 2022.


Harvard

Bakhtiarifard, P, Igel, C & Selvan, R 2022 'Energy Consumption-Aware Tabular Benchmarks for Neural Architecture Search' arxiv.org.

APA

Bakhtiarifard, P., Igel, C., & Selvan, R. (2022). Energy Consumption-Aware Tabular Benchmarks for Neural Architecture Search. arxiv.org.

Vancouver

Bakhtiarifard P, Igel C, Selvan R. Energy Consumption-Aware Tabular Benchmarks for Neural Architecture Search. arxiv.org. 2022 Oct 12.

Author

Bakhtiarifard, Pedram ; Igel, Christian ; Selvan, Raghavendra. / Energy Consumption-Aware Tabular Benchmarks for Neural Architecture Search. arxiv.org, 2022.

BibTeX

@techreport{0600321a5a244103b10ee1e6071ebef1,
title = "Energy Consumption-Aware Tabular Benchmarks for Neural Architecture Search",
abstract = "The demand for large-scale computational resources for Neural Architecture Search (NAS) has been lessened by tabular benchmarks for NAS. Evaluating NAS strategies is now possible on extensive search spaces and at a moderate computational cost. But so far, NAS has mainly focused on maximising performance on some hold-out validation/test set. However, energy consumption is a partially conflicting objective that should not be neglected. We hypothesise that constraining NAS to include the energy consumption of training the models could reveal a sub-space of undiscovered architectures that are more computationally efficient with a smaller carbon footprint. To support the hypothesis, an existing tabular benchmark for NAS is augmented with the energy consumption of each architecture. We then perform multi-objective optimisation that includes energy consumption as an additional objective. We demonstrate the usefulness of multi-objective NAS for uncovering the trade-off between performance and energy consumption as well as for finding more energy-efficient architectures. The updated tabular benchmark, EC-NAS-Bench, is open-sourced to encourage the further exploration of energy consumption-aware NAS.",
keywords = "cs.LG, stat.ML",
author = "Pedram Bakhtiarifard and Christian Igel and Raghavendra Selvan",
note = "Source code at https://github.com/PedramBakh/EC-NAS-Bench",
year = "2022",
month = oct,
day = "12",
language = "English",
publisher = "arxiv.org",
type = "WorkingPaper",
institution = "arxiv.org",
}

RIS

TY - UNPB

T1 - Energy Consumption-Aware Tabular Benchmarks for Neural Architecture Search

AU - Bakhtiarifard, Pedram

AU - Igel, Christian

AU - Selvan, Raghavendra

N1 - Source code at https://github.com/PedramBakh/EC-NAS-Bench

PY - 2022/10/12

Y1 - 2022/10/12

N2 - The demand for large-scale computational resources for Neural Architecture Search (NAS) has been lessened by tabular benchmarks for NAS. Evaluating NAS strategies is now possible on extensive search spaces and at a moderate computational cost. But so far, NAS has mainly focused on maximising performance on some hold-out validation/test set. However, energy consumption is a partially conflicting objective that should not be neglected. We hypothesise that constraining NAS to include the energy consumption of training the models could reveal a sub-space of undiscovered architectures that are more computationally efficient with a smaller carbon footprint. To support the hypothesis, an existing tabular benchmark for NAS is augmented with the energy consumption of each architecture. We then perform multi-objective optimisation that includes energy consumption as an additional objective. We demonstrate the usefulness of multi-objective NAS for uncovering the trade-off between performance and energy consumption as well as for finding more energy-efficient architectures. The updated tabular benchmark, EC-NAS-Bench, is open-sourced to encourage the further exploration of energy consumption-aware NAS.

AB - The demand for large-scale computational resources for Neural Architecture Search (NAS) has been lessened by tabular benchmarks for NAS. Evaluating NAS strategies is now possible on extensive search spaces and at a moderate computational cost. But so far, NAS has mainly focused on maximising performance on some hold-out validation/test set. However, energy consumption is a partially conflicting objective that should not be neglected. We hypothesise that constraining NAS to include the energy consumption of training the models could reveal a sub-space of undiscovered architectures that are more computationally efficient with a smaller carbon footprint. To support the hypothesis, an existing tabular benchmark for NAS is augmented with the energy consumption of each architecture. We then perform multi-objective optimisation that includes energy consumption as an additional objective. We demonstrate the usefulness of multi-objective NAS for uncovering the trade-off between performance and energy consumption as well as for finding more energy-efficient architectures. The updated tabular benchmark, EC-NAS-Bench, is open-sourced to encourage the further exploration of energy consumption-aware NAS.

KW - cs.LG

KW - stat.ML

M3 - Preprint

BT - Energy Consumption-Aware Tabular Benchmarks for Neural Architecture Search

PB - arxiv.org

ER -
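
Example

The abstract describes multi-objective NAS over a tabular benchmark augmented with the training energy of each architecture. Below is a minimal Python sketch of that core idea under stated assumptions: the benchmark table, metric names, and architectures are hypothetical placeholders, not the EC-NAS-Bench API (see the repository linked in the note for the actual data format). An architecture is kept when no other architecture is at least as accurate and at least as energy-efficient.

# Minimal sketch of energy consumption-aware multi-objective NAS on a
# tabular benchmark. The table, metric names, and architectures below
# are hypothetical illustrations, not the EC-NAS-Bench API.

# A tabular benchmark maps each architecture to precomputed metrics, so
# "evaluating" an architecture is a table lookup rather than a training run.
BENCHMARK = {
    "arch_a": {"val_accuracy": 0.941, "train_energy_kwh": 1.8},
    "arch_b": {"val_accuracy": 0.948, "train_energy_kwh": 3.6},
    "arch_c": {"val_accuracy": 0.933, "train_energy_kwh": 0.9},
    "arch_d": {"val_accuracy": 0.929, "train_energy_kwh": 1.5},
}

def dominates(m1, m2):
    # m1 dominates m2 if it is no worse in both objectives (accuracy is
    # maximised, training energy is minimised) and strictly better in one.
    no_worse = (m1["val_accuracy"] >= m2["val_accuracy"]
                and m1["train_energy_kwh"] <= m2["train_energy_kwh"])
    better = (m1["val_accuracy"] > m2["val_accuracy"]
              or m1["train_energy_kwh"] < m2["train_energy_kwh"])
    return no_worse and better

def pareto_front(benchmark):
    # The non-dominated set is the performance/energy trade-off the
    # abstract refers to.
    return [arch for arch, m in benchmark.items()
            if not any(dominates(other, m)
                       for name, other in benchmark.items() if name != arch)]

print(pareto_front(BENCHMARK))
# ['arch_a', 'arch_b', 'arch_c'] -- arch_d is dominated by arch_c,
# which is both more accurate and cheaper to train.

In this toy table, constraining the search to the Pareto front discards arch_d, which no single-objective accuracy search would flag, illustrating why adding energy as an objective can surface a different sub-space of architectures.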
