Cost-sensitive AdaBoost algorithm for ordinal regression based on extreme learning machine

Annalisa Riccardi, Francisco Fernández-Navarro, Sante Carloni

Research output: Contribution to journal › Article

34 Citations (Scopus)

Abstract

In this paper, the well-known stagewise additive modeling using a multiclass exponential loss (SAMME) boosting algorithm is extended with a cost-sensitive approach to address problems in which the targets have a natural order. The proposed ensemble model uses an extreme learning machine (ELM) with a Gaussian kernel and an additional regularization parameter as its base classifier. The closed form of the derived weighted least squares problem is provided and employed to estimate analytically, at each iteration of the boosting algorithm, the parameters connecting the hidden layer to the output layer. Unlike state-of-the-art boosting algorithms, in particular those using ELM as the base classifier, the proposed technique does not require the generation of a new training dataset at each iteration. The weighted least squares formulation is presented as an unbiased alternative to existing ELM boosting techniques. Moreover, a cost model that weights the patterns according to the order of the targets enables the classifier to tackle ordinal regression problems. The proposed method is validated through an experimental study comparing it with existing ensemble methods and ELM techniques for ordinal regression, showing competitive results.
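A minimal sketch of the idea described above, not the authors' implementation: a SAMME-style boosting loop whose base learner is an ELM fitted in closed form by regularized weighted least squares, with the absolute rank distance between predicted and true labels serving as a simple ordinal cost for reweighting patterns. For brevity it uses a random tanh feature map in place of the paper's Gaussian kernel; all function names and parameters are illustrative.

```python
import numpy as np

def elm_wls_fit(X, T, w, n_hidden=50, lam=1e-3, rng=None):
    """ELM base learner: random hidden layer, output weights solved in
    closed form from the regularized *weighted* least squares problem
    beta = (H' W H + lam I)^-1 H' W T."""
    rng = np.random.default_rng(rng)
    A = rng.standard_normal((X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                 # random hidden biases
    H = np.tanh(X @ A + b)                            # hidden-layer activations
    W = np.diag(w)                                    # pattern weights from boosting
    beta = np.linalg.solve(H.T @ W @ H + lam * np.eye(n_hidden), H.T @ W @ T)
    return A, b, beta

def elm_predict(X, A, b, beta):
    return np.tanh(X @ A + b) @ beta

def samme_ordinal(X, y, n_classes, n_rounds=10, rng=None):
    """SAMME loop with an ordinal cost: misclassified patterns are
    up-weighted in proportion to |predicted rank - true rank|."""
    rng = np.random.default_rng(rng)
    n = len(X)
    w = np.ones(n) / n
    cost = np.abs(np.subtract.outer(np.arange(n_classes), np.arange(n_classes)))
    T = np.eye(n_classes)[y]                          # one-hot regression targets
    models, alphas = [], []
    for _ in range(n_rounds):
        A, b, beta = elm_wls_fit(X, T, w, rng=rng)    # no resampling: weights enter the WLS fit
        pred = elm_predict(X, A, b, beta).argmax(axis=1)
        err = np.clip(np.sum(w * (pred != y)) / np.sum(w), 1e-10, 1 - 1e-10)
        alpha = np.log((1 - err) / err) + np.log(n_classes - 1)   # SAMME learner weight
        w *= np.exp(alpha * cost[y, pred] * (pred != y))          # ordinal cost-sensitive update
        w /= w.sum()
        models.append((A, b, beta))
        alphas.append(alpha)
    return models, alphas

def samme_predict(X, models, alphas, n_classes):
    """Weighted vote of the base ELMs."""
    score = np.zeros((len(X), n_classes))
    for (A, b, beta), a in zip(models, alphas):
        score[np.arange(len(X)), elm_predict(X, A, b, beta).argmax(axis=1)] += a
    return score.argmax(axis=1)
```

Because the pattern weights `w` enter the least squares fit directly as the diagonal matrix `W`, each boosting round reuses the same training set, which mirrors the paper's claim that no new dataset need be generated per iteration.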
Language: English
Pages: 1898-1909
Number of pages: 12
Journal: IEEE Transactions on Cybernetics
Volume: 44
Issue number: 10
Early online date: 22 Jan 2014
DOI: 10.1109/TCYB.2014.2299291
Publication status: Published - 12 Sep 2014

Keywords

  • regression analysis
  • boosting
  • artificial neural networks
  • prediction algorithms

Cite this

Riccardi, Annalisa ; Fernández-Navarro, Francisco ; Carloni, Sante. / Cost-sensitive adaboost algorithm for ordinal regression based on extreme learning machine. In: IEEE Transactions on Cybernetics. 2014 ; Vol. 44, No. 10. pp. 1898-1909.
@article{1e0da4d3222b47f7ad8da8789de3db73,
title = "Cost-sensitive adaboost algorithm for ordinal regression based on extreme learning machine",
abstract = "In this paper, the well-known stagewise additive modeling using a multiclass exponential loss (SAMME) boosting algorithm is extended with a cost-sensitive approach to address problems in which the targets have a natural order. The proposed ensemble model uses an extreme learning machine (ELM) with a Gaussian kernel and an additional regularization parameter as its base classifier. The closed form of the derived weighted least squares problem is provided and employed to estimate analytically, at each iteration of the boosting algorithm, the parameters connecting the hidden layer to the output layer. Unlike state-of-the-art boosting algorithms, in particular those using ELM as the base classifier, the proposed technique does not require the generation of a new training dataset at each iteration. The weighted least squares formulation is presented as an unbiased alternative to existing ELM boosting techniques. Moreover, a cost model that weights the patterns according to the order of the targets enables the classifier to tackle ordinal regression problems. The proposed method is validated through an experimental study comparing it with existing ensemble methods and ELM techniques for ordinal regression, showing competitive results.",
keywords = "regression analysis, boosting, artificial neural networks, prediction algorithms",
author = "Annalisa Riccardi and Francisco Fern{\'a}ndez-Navarro and Sante Carloni",
note = "{\textcopyright} 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.",
year = "2014",
month = "9",
day = "12",
doi = "10.1109/TCYB.2014.2299291",
language = "English",
volume = "44",
pages = "1898--1909",
journal = "IEEE Transactions on Cybernetics",
issn = "2168-2267",
number = "10",

}
