Sparse support vector regression based on orthogonal forward selection for the generalised kernel model

X. X. Wang, S. Chen*, D. Lowe, C. J. Harris

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This paper considers sparse regression modelling using a generalised kernel model in which each kernel regressor has its individually tuned centre vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to select the regressors one by one, so as to determine the model structure. After the regressor selection, the corresponding model weight parameters are calculated from the Lagrange dual problem of the original regression problem with the regularised ε-insensitive loss function. Unlike support vector regression, this stage of the procedure involves neither reproducing kernel Hilbert space nor Mercer decomposition concepts. As the regressors used are not restricted to be positioned at training input points and each regressor has its own diagonal covariance matrix, a sparser representation can be obtained. Experiments involving one simulated example and three real data sets are used to demonstrate the effectiveness of the proposed novel regression modelling approach.
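The forward selection stage described above can be illustrated with a minimal sketch. The code below assumes Gaussian regressors and, as a simplification, fixes the diagonal covariances and places the candidate centres at the training inputs (the paper tunes each regressor's centre and covariance individually, and follows selection with a Lagrange-dual weight computation that is omitted here). All function names are hypothetical; the greedy criterion is the standard OLS error-reduction ratio.

```python
import numpy as np

def gaussian_regressor(X, centre, diag_cov):
    # phi(x) = exp(-0.5 * sum_j (x_j - c_j)^2 / sigma_j^2), diagonal covariance
    return np.exp(-0.5 * ((X - centre) ** 2 / diag_cov).sum(axis=1))

def ols_forward_select(X, y, centres, diag_covs, n_terms):
    """Greedy OLS forward selection: at each step pick the candidate
    regressor with the largest error-reduction ratio, then orthogonalise
    the remaining candidates against it (modified Gram-Schmidt)."""
    # candidate regressor matrix, one column per (centre, covariance) pair
    P = np.column_stack([gaussian_regressor(X, c, s)
                         for c, s in zip(centres, diag_covs)])
    Q = P.copy()                 # candidates, orthogonalised so far
    yy = y @ y
    selected = []
    for _ in range(n_terms):
        norms = (Q ** 2).sum(axis=0)
        # error-reduction ratio [err]_k = (q_k^T y)^2 / (q_k^T q_k * y^T y)
        err = (Q.T @ y) ** 2 / np.maximum(norms * yy, 1e-12)
        err[selected] = -np.inf  # never re-select a chosen regressor
        k = int(np.argmax(err))
        selected.append(k)
        q = Q[:, k].copy()
        # remove q's component from every remaining candidate column
        Q = Q - np.outer(q, (q @ Q) / max(q @ q, 1e-12))
    return selected

# toy usage: 1-D noisy sinc regression, candidates at the training points
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, (60, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(60)
sel = ols_forward_select(X, y, centres=X,
                         diag_covs=np.full((60, 1), 0.25), n_terms=6)
print(sel)
```

The error-reduction ratio measures the fraction of the output energy explained by each orthogonalised candidate, so the greedy choice is cheap to evaluate at every step; the paper's contribution is to free the candidates from the training points and give each its own covariance, which this fixed-grid sketch does not do.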

Original language: English
Pages (from-to): 462-474
Number of pages: 13
Journal: Neurocomputing
Volume: 70
Issue number: 1-3
DOIs
Publication status: Published - 1 Dec 2006

Keywords

  • Generalised kernel model
  • Orthogonal least squares forward selection
  • Regression
  • Sparse modelling
  • Support vector machine
