Evolutionary kernel density regression

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

Evolutionary kernel density regression. / Kramer, Oliver; Gieseke, Fabian.

In: Expert Systems with Applications, Vol. 39, No. 10, 2012, pp. 9246-9254.

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Harvard

Kramer, O & Gieseke, F 2012, 'Evolutionary kernel density regression', Expert Systems with Applications, vol. 39, no. 10, pp. 9246-9254. https://doi.org/10.1016/j.eswa.2012.02.080

APA

Kramer, O., & Gieseke, F. (2012). Evolutionary kernel density regression. Expert Systems with Applications, 39(10), 9246-9254. https://doi.org/10.1016/j.eswa.2012.02.080

Vancouver

Kramer O, Gieseke F. Evolutionary kernel density regression. Expert Systems with Applications. 2012;39(10):9246-9254. https://doi.org/10.1016/j.eswa.2012.02.080

Author

Kramer, Oliver ; Gieseke, Fabian. / Evolutionary kernel density regression. In: Expert Systems with Applications. 2012 ; Vol. 39, No. 10. pp. 9246-9254.

Bibtex

@article{e8c680c57c294df9b7ef541ceb734799,
title = "Evolutionary kernel density regression",
abstract = "The Nadaraya-Watson estimator, also known as kernel regression, is a density-based regression technique. It weights output values with the relative densities in input space. The density is measured with kernel functions that depend on bandwidth parameters. In this work we present an evolutionary bandwidth optimizer for kernel regression. The approach is based on a robust loss function, leave-one-out cross-validation, and the CMSA-ES as optimization engine. A variant with local parameterized Nadaraya-Watson models enhances the approach, and allows the adaptation of the model to local data space characteristics. The unsupervised counterpart of kernel regression is an approach to learn principal manifolds. The learning problem of unsupervised kernel regression (UKR) is based on optimizing the latent variables, which is a multimodal problem with many local optima. We propose an evolutionary framework for optimization of UKR based on scaling of initial local linear embedding solutions, and minimization of the cross-validation error. Both methods are analyzed experimentally.",
keywords = "Bandwidth optimization, Evolution strategies, Kernel regression, Manifold learning, Unsupervised kernel regression",
author = "Oliver Kramer and Fabian Gieseke",
year = "2012",
doi = "10.1016/j.eswa.2012.02.080",
language = "English",
volume = "39",
pages = "9246--9254",
journal = "Expert Systems with Applications",
issn = "0957-4174",
publisher = "Pergamon Press",
number = "10",
}

RIS

TY - JOUR

T1 - Evolutionary kernel density regression

AU - Kramer, Oliver

AU - Gieseke, Fabian

PY - 2012

Y1 - 2012

N2 - The Nadaraya-Watson estimator, also known as kernel regression, is a density-based regression technique. It weights output values with the relative densities in input space. The density is measured with kernel functions that depend on bandwidth parameters. In this work we present an evolutionary bandwidth optimizer for kernel regression. The approach is based on a robust loss function, leave-one-out cross-validation, and the CMSA-ES as optimization engine. A variant with local parameterized Nadaraya-Watson models enhances the approach, and allows the adaptation of the model to local data space characteristics. The unsupervised counterpart of kernel regression is an approach to learn principal manifolds. The learning problem of unsupervised kernel regression (UKR) is based on optimizing the latent variables, which is a multimodal problem with many local optima. We propose an evolutionary framework for optimization of UKR based on scaling of initial local linear embedding solutions, and minimization of the cross-validation error. Both methods are analyzed experimentally.

AB - The Nadaraya-Watson estimator, also known as kernel regression, is a density-based regression technique. It weights output values with the relative densities in input space. The density is measured with kernel functions that depend on bandwidth parameters. In this work we present an evolutionary bandwidth optimizer for kernel regression. The approach is based on a robust loss function, leave-one-out cross-validation, and the CMSA-ES as optimization engine. A variant with local parameterized Nadaraya-Watson models enhances the approach, and allows the adaptation of the model to local data space characteristics. The unsupervised counterpart of kernel regression is an approach to learn principal manifolds. The learning problem of unsupervised kernel regression (UKR) is based on optimizing the latent variables, which is a multimodal problem with many local optima. We propose an evolutionary framework for optimization of UKR based on scaling of initial local linear embedding solutions, and minimization of the cross-validation error. Both methods are analyzed experimentally.

KW - Bandwidth optimization

KW - Evolution strategies

KW - Kernel regression

KW - Manifold learning

KW - Unsupervised kernel regression

UR - http://www.scopus.com/inward/record.url?scp=84859218135&partnerID=8YFLogxK

U2 - 10.1016/j.eswa.2012.02.080

DO - 10.1016/j.eswa.2012.02.080

M3 - Journal article

AN - SCOPUS:84859218135

VL - 39

SP - 9246

EP - 9254

JO - Expert Systems with Applications

JF - Expert Systems with Applications

SN - 0957-4174

IS - 10

ER -
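
Illustration

The abstract above describes selecting the bandwidth of the Nadaraya-Watson estimator by minimizing a leave-one-out cross-validation error with an evolution strategy. The following is a minimal Python sketch of that idea, not the authors' implementation: it computes Nadaraya-Watson predictions with a Gaussian kernel, scores a bandwidth by leave-one-out cross-validation, and tunes it with a simple (1+1)-style mutation loop standing in for the CMSA-ES and robust loss used in the paper. Function names and the toy data are assumptions made for illustration.

import numpy as np

def nw_predict(X_train, y_train, X_query, bandwidth):
    # Nadaraya-Watson estimator with a Gaussian kernel: each prediction is a
    # weighted average of training outputs, with weights given by kernel
    # densities in input space.
    d2 = np.sum((X_query[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return (K @ y_train) / np.maximum(K.sum(axis=1), 1e-12)

def loo_cv_error(X, y, bandwidth):
    # Leave-one-out cross-validation: predict each point from all others.
    n = len(X)
    errors = []
    for i in range(n):
        mask = np.arange(n) != i
        pred = nw_predict(X[mask], y[mask], X[i:i + 1], bandwidth)
        errors.append((pred[0] - y[i]) ** 2)
    return float(np.mean(errors))

# Toy 1-D data (an assumption for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(60, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=60)

# Simple (1+1)-style mutation loop over the bandwidth; the paper uses the
# CMSA-ES with a robust loss instead, so this loop is only a stand-in.
h, step = 1.0, 0.5
best = loo_cv_error(X, y, h)
for _ in range(50):
    candidate = max(abs(h + step * rng.normal()), 1e-3)
    err = loo_cv_error(X, y, candidate)
    if err < best:
        h, best = candidate, err
    else:
        step *= 0.95  # shrink the mutation step after an unsuccessful trial

print("selected bandwidth:", round(h, 3), "LOO-CV error:", round(best, 4))

A faithful reproduction would replace the scalar mutation loop with the CMSA-ES, use the robust loss function mentioned in the abstract, and, for the unsupervised variant, optimize the latent variables of UKR starting from scaled locally linear embedding solutions.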
