TY - JOUR
T1 - Parsimonious least squares support vector regression using orthogonal forward selection with the generalised kernel model
AU - Wang, Xunxian
AU - Lowe, David
AU - Chen, Sheng
AU - Harris, Chris J.
PY - 2006/12/1
Y1 - 2006/12/1
N2 - A sparse regression modelling technique is developed using a generalised kernel model in which each kernel regressor has its own individually tuned position (centre) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append the regressors one by one. After the model structure is determined, namely an appropriate number of regressors is selected, the model weight parameters are calculated from the Lagrange dual of the original least squares problem. Unlike least squares support vector regression, this modelling procedure involves neither reproducing kernel Hilbert space nor Mercer decomposition concepts. As the regressors are not restricted to lie at the training input points and each regressor has its own diagonal covariance matrix, a very sparse representation can be obtained with excellent generalisation capability. Experimental results on two real data sets demonstrate the effectiveness of the proposed regression modelling approach.
AB - A sparse regression modelling technique is developed using a generalised kernel model in which each kernel regressor has its own individually tuned position (centre) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append the regressors one by one. After the model structure is determined, namely an appropriate number of regressors is selected, the model weight parameters are calculated from the Lagrange dual of the original least squares problem. Unlike least squares support vector regression, this modelling procedure involves neither reproducing kernel Hilbert space nor Mercer decomposition concepts. As the regressors are not restricted to lie at the training input points and each regressor has its own diagonal covariance matrix, a very sparse representation can be obtained with excellent generalisation capability. Experimental results on two real data sets demonstrate the effectiveness of the proposed regression modelling approach.
KW - Generalised kernel model
KW - Least squares support vector machine
KW - Orthogonal least squares forward selection
KW - Regression
KW - Sparse modelling
UR - http://www.scopus.com/inward/record.url?scp=34248656230&partnerID=8YFLogxK
U2 - 10.1504/IJMIC.2006.012612
DO - 10.1504/IJMIC.2006.012612
M3 - Article
AN - SCOPUS:34248656230
SN - 1746-6172
VL - 1
SP - 245
EP - 256
JO - International Journal of Modelling, Identification and Control
JF - International Journal of Modelling, Identification and Control
IS - 4
ER -