Parsimonious least squares support vector regression using orthogonal forward selection with the generalised kernel model
Abstract
A sparse regression modelling technique is developed using a generalised kernel model in which each kernel regressor has its own individually tuned position (centre) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append the regressors one by one. After the model structure has been determined, namely an appropriate number of regressors has been selected, the model weight parameters are calculated from the Lagrange dual of the original least squares problem. Unlike least squares support vector regression, this modelling procedure involves neither reproducing kernel Hilbert space nor Mercer decomposition concepts. Because the regressors are not restricted to be positioned at training input points and each regressor has its own diagonal covariance matrix, a very sparse representation can be obtained with excellent generalisation capability. Experimental results on two real data sets demonstrate the effectiveness of the proposed regression modelling approach.
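The selection step described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's exact algorithm: candidate Gaussian regressors with per-candidate centres and diagonal covariances are drawn at random rather than individually tuned, selection uses the classical orthogonal least squares error-reduction criterion, and the final weights are obtained by an ordinary least squares solve instead of the Lagrange dual formulation. All function and variable names here are illustrative.

```python
import numpy as np

def gaussian_regressor(X, centre, diag_cov):
    # Generalised kernel: each regressor carries its own centre vector
    # and diagonal covariance matrix (here a vector of per-dimension variances).
    return np.exp(-0.5 * ((X - centre) ** 2 / diag_cov).sum(axis=1))

def ols_forward_select(X, y, candidates, n_terms):
    # Greedy orthogonal least squares forward selection: at each step,
    # append the candidate whose component orthogonal to the already-chosen
    # regressors explains the most remaining output energy.
    cols = [gaussian_regressor(X, c, s) for c, s in candidates]
    selected, Q = [], []          # chosen indices and orthogonalised columns
    for _ in range(n_terms):
        best_j, best_w, best_err = None, None, -np.inf
        for j, phi in enumerate(cols):
            if j in selected:
                continue
            w = phi.astype(float).copy()
            for q in Q:           # modified Gram-Schmidt against chosen columns
                w -= (q @ w) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:     # candidate is (numerically) dependent; skip
                continue
            err = (w @ y) ** 2 / denom   # error-reduction contribution
            if err > best_err:
                best_j, best_w, best_err = j, w, err
        selected.append(best_j)
        Q.append(best_w)
    # Weights for the selected regressors by a direct least squares solve
    # (a simplification of the paper's Lagrange dual computation).
    Phi = np.column_stack([cols[j] for j in selected])
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return selected, weights, Phi

# Toy usage: fit a noisy 1-D sine wave with a handful of regressors.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(100, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(100)
cands = [(rng.uniform(-3.0, 3.0, size=1), rng.uniform(0.2, 2.0, size=1))
         for _ in range(30)]
sel, w, Phi = ols_forward_select(X, y, cands, n_terms=6)
rmse = np.sqrt(np.mean((Phi @ w - y) ** 2))
```

Note how sparsity arises: only six of the thirty candidate regressors are retained, and because candidate centres need not coincide with training inputs, far fewer regressors are typically needed than in a standard support vector formulation.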
| Original language | English |
|---|---|
| Pages (from-to) | 245-256 |
| Number of pages | 12 |
| Journal | International Journal of Modelling, Identification and Control |
| Volume | 1 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 1 Dec 2006 |
Keywords
- Generalised kernel model
- Least squares support vector machine
- Orthogonal least squares forward selection
- Regression
- Sparse modelling