Regularization in network optimization via trimmed stochastic gradient descent with noisy label

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

Regularization in network optimization via trimmed stochastic gradient descent with noisy label. / Nakamura, Kensuke; Sohn, Bong Soo; Won, Kyoung Jae; Hong, Byung Woo.

In: IEEE Access, Vol. 10, 2022, pp. 34706-34715.


Harvard

Nakamura, K, Sohn, BS, Won, KJ & Hong, BW 2022, 'Regularization in network optimization via trimmed stochastic gradient descent with noisy label', IEEE Access, vol. 10, pp. 34706-34715. https://doi.org/10.1109/ACCESS.2022.3171910

APA

Nakamura, K., Sohn, B. S., Won, K. J., & Hong, B. W. (2022). Regularization in network optimization via trimmed stochastic gradient descent with noisy label. IEEE Access, 10, 34706-34715. https://doi.org/10.1109/ACCESS.2022.3171910

Vancouver

Nakamura K, Sohn BS, Won KJ, Hong BW. Regularization in network optimization via trimmed stochastic gradient descent with noisy label. IEEE Access. 2022;10:34706-34715. https://doi.org/10.1109/ACCESS.2022.3171910

Author

Nakamura, Kensuke ; Sohn, Bong Soo ; Won, Kyoung Jae ; Hong, Byung Woo. / Regularization in network optimization via trimmed stochastic gradient descent with noisy label. In: IEEE Access. 2022 ; Vol. 10. pp. 34706-34715.

Bibtex

@article{d751976a4fe6456db77f5e5e83807a22,
title = "Regularization in network optimization via trimmed stochastic gradient descent with noisy label",
abstract = "Regularization is essential for avoiding over-fitting to training data in network optimization, leading to better generalization of the trained networks. Label noise provides strong implicit regularization by replacing the target ground-truth labels of training examples with uniform random labels, but it can cause undesirable, misleading gradients due to the large loss associated with incorrect labels. We propose a first-order optimization method (Label-Noised Trim-SGD) that combines label noise with example trimming, removing outliers based on their loss. The proposed algorithm is simple yet enables us to impose larger label noise and thereby obtain a stronger regularization effect than the original methods. A quantitative analysis compares the behavior of label noise, example trimming, and the proposed algorithm. We also present empirical results on major benchmarks with fundamental networks, demonstrating that our method outperforms state-of-the-art optimization methods.",
keywords = "Data models, data trimming, label noise, Loss measurement, network optimization, Neural networks, Noise measurement, Optimization, regularization, Stochastic processes, Training",
author = "Kensuke Nakamura and Sohn, {Bong Soo} and Won, {Kyoung Jae} and Hong, {Byung Woo}",
note = "Publisher Copyright: Author",
year = "2022",
doi = "10.1109/ACCESS.2022.3171910",
language = "English",
volume = "10",
pages = "34706--34715",
journal = "IEEE Access",
issn = "2169-3536",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}
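
Code sketch

The abstract describes a procedure that is easy to illustrate: inject uniform random labels into part of each mini-batch, then trim the highest-loss examples before taking the gradient step. The PyTorch-style sketch below is one interpretation of that description only, not the authors' released code; the function name label_noised_trim_step, the noise_rate and trim_rate parameters, and their default values are assumptions made for illustration.

import torch
import torch.nn.functional as F

def label_noised_trim_step(model, optimizer, inputs, labels,
                           num_classes, noise_rate=0.5, trim_rate=0.1):
    # Label noise: replace a random fraction of the mini-batch targets
    # with labels drawn uniformly at random over all classes.
    # (noise_rate is a hypothetical knob, not a value from the paper.)
    labels = labels.clone()
    batch_size = labels.size(0)
    noisy = torch.rand(batch_size, device=labels.device) < noise_rate
    labels[noisy] = torch.randint(num_classes, (int(noisy.sum()),),
                                  device=labels.device)

    # Per-example losses, so that high-loss outliers can be identified.
    logits = model(inputs)
    losses = F.cross_entropy(logits, labels, reduction="none")

    # Trimming: drop the examples with the largest loss (the misleading
    # gradients the abstract warns about) and average the rest.
    keep = batch_size - int(trim_rate * batch_size)
    kept_losses, _ = torch.topk(losses, keep, largest=False)
    loss = kept_losses.mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

Calling this once per mini-batch inside an ordinary SGD training loop reproduces the combination the abstract describes: because the trimming discards the examples whose (possibly randomized) labels produce the largest loss, a larger noise rate can be tolerated than with label noise alone.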

RIS

TY - JOUR

T1 - Regularization in network optimization via trimmed stochastic gradient descent with noisy label

AU - Nakamura, Kensuke

AU - Sohn, Bong Soo

AU - Won, Kyoung Jae

AU - Hong, Byung Woo

N1 - Publisher Copyright: Author

PY - 2022

Y1 - 2022

N2 - Regularization is essential for avoiding over-fitting to training data in network optimization, leading to better generalization of the trained networks. Label noise provides strong implicit regularization by replacing the target ground-truth labels of training examples with uniform random labels, but it can cause undesirable, misleading gradients due to the large loss associated with incorrect labels. We propose a first-order optimization method (Label-Noised Trim-SGD) that combines label noise with example trimming, removing outliers based on their loss. The proposed algorithm is simple yet enables us to impose larger label noise and thereby obtain a stronger regularization effect than the original methods. A quantitative analysis compares the behavior of label noise, example trimming, and the proposed algorithm. We also present empirical results on major benchmarks with fundamental networks, demonstrating that our method outperforms state-of-the-art optimization methods.

AB - Regularization is essential for avoiding over-fitting to training data in network optimization, leading to better generalization of the trained networks. Label noise provides strong implicit regularization by replacing the target ground-truth labels of training examples with uniform random labels, but it can cause undesirable, misleading gradients due to the large loss associated with incorrect labels. We propose a first-order optimization method (Label-Noised Trim-SGD) that combines label noise with example trimming, removing outliers based on their loss. The proposed algorithm is simple yet enables us to impose larger label noise and thereby obtain a stronger regularization effect than the original methods. A quantitative analysis compares the behavior of label noise, example trimming, and the proposed algorithm. We also present empirical results on major benchmarks with fundamental networks, demonstrating that our method outperforms state-of-the-art optimization methods.

KW - Data models

KW - data trimming

KW - label noise

KW - Loss measurement

KW - network optimization

KW - Neural networks

KW - Noise measurement

KW - Optimization

KW - regularization

KW - Stochastic processes

KW - Training

U2 - 10.1109/ACCESS.2022.3171910

DO - 10.1109/ACCESS.2022.3171910

M3 - Journal article

AN - SCOPUS:85129585599

VL - 10

SP - 34706

EP - 34715

JO - IEEE Access

JF - IEEE Access

SN - 2169-3536

ER -
