Parsimonious least squares support vector regression using orthogonal forward selection with the generalised kernel model

Xunxian Wang, David Lowe, Sheng Chen*, Chris J. Harris

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

A sparse regression modelling technique is developed using a generalised kernel model in which each kernel regressor has its own individually tuned position (centre) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append the regressors one by one. Once the model structure has been determined, that is, once an appropriate number of regressors has been selected, the model weight parameters are calculated from the Lagrange dual of the original least squares problem. Unlike least squares support vector regression, this modelling procedure involves neither reproducing kernel Hilbert space nor Mercer decomposition concepts. Because the regressors are not restricted to be positioned at training input points and each regressor has its own diagonal covariance matrix, a very sparse representation with excellent generalisation capability can be obtained. Experimental results involving two real data sets demonstrate the effectiveness of the proposed regression modelling approach.
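The following is a minimal NumPy sketch of the orthogonal least squares forward-selection idea summarised above, not the authors' exact algorithm: the paper tunes each regressor's centre and diagonal covariance individually and obtains the final weights from the Lagrange dual of the least squares problem, whereas this sketch draws candidate (centre, covariance) pairs from a fixed pool and solves regularised primal normal equations. All names (`gauss_regressor`, `ols_forward_selection`, the pool sizes and widths) are illustrative assumptions.

```python
import numpy as np

def gauss_regressor(X, centre, diag_cov):
    """Generalised Gaussian regressor with its own centre and diagonal covariance."""
    d = (X - centre) ** 2 / diag_cov          # per-dimension Mahalanobis terms, (n, dim)
    return np.exp(-0.5 * d.sum(axis=1))       # regressor response, (n,)

def ols_forward_selection(X, y, candidates, n_terms, reg=1e-6):
    """OLS forward selection over a candidate pool of (centre, diag_cov) pairs.

    Hypothetical simplification: the paper tunes each regressor's centre and
    covariance individually rather than picking from a fixed pool.
    """
    Phi = np.column_stack([gauss_regressor(X, c, s) for c, s in candidates])
    selected, Q = [], []                       # chosen column indices, orthogonal basis
    for _ in range(n_terms):
        best_err, best_j, best_q = -np.inf, None, None
        for j in range(Phi.shape[1]):
            if j in selected:
                continue
            q = Phi[:, j].copy()
            for qk in Q:                       # Gram-Schmidt against the chosen basis
                q -= (qk @ Phi[:, j]) / (qk @ qk) * qk
            qq = q @ q
            if qq < 1e-12:                     # candidate already spanned; skip it
                continue
            err = (q @ y) ** 2 / qq            # error-reduction contribution of column j
            if err > best_err:
                best_err, best_j, best_q = err, j, q
        if best_j is None:
            break
        selected.append(best_j)
        Q.append(best_q)
    # Weights via regularised least squares on the selected regressors (the paper
    # instead derives them from the Lagrange dual of the LS problem).
    P = Phi[:, selected]
    w = np.linalg.solve(P.T @ P + reg * np.eye(len(selected)), P.T @ y)
    return selected, w

# Toy usage: fit y = sin(x) with a small candidate pool of centres and two widths.
rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, (80, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)
pool = [(X[i], np.array([s])) for i in rng.choice(80, 20, replace=False)
        for s in (0.25, 1.0)]
idx, w = ols_forward_selection(X, y, pool, n_terms=6)
print("selected", len(idx), "regressors")
```

Because each candidate carries its own width vector, the selection step can trade off broad and narrow regressors, which is what allows the final model to stay sparse.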

Original language: English
Pages (from-to): 245-256
Number of pages: 12
Journal: International Journal of Modelling, Identification and Control
Volume: 1
Issue number: 4
DOIs
Publication status: Published - 1 Dec 2006

Keywords

  • Generalised kernel model
  • Least squares support vector machine
  • Orthogonal least squares forward selection
  • Regression
  • Sparse modelling
