Regularisation of mixture density networks

Lars U. Hjorth, Ian T. Nabney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Mixture Density Networks are a principled method for modelling conditional probability density functions which are non-Gaussian. This is achieved by modelling the conditional distribution for each pattern with a Gaussian mixture model whose parameters are generated by a neural network. This paper presents a novel method for introducing regularisation in this context for the special case where the means and variances of the spherical Gaussian kernels in the mixtures are fixed to predetermined values. Guidelines for how these parameters can be initialised are given, and it is shown how to apply the evidence framework to mixture density networks to achieve regularisation. This also provides an objective stopping criterion that can replace the 'early stopping' methods that have previously been used. If the neural network used is an RBF network with fixed centres, this opens up new opportunities for improved initialisation of the network weights, which are exploited to start training relatively close to the optimum. The new method is demonstrated on two data sets. The first is a simple synthetic data set, while the second is a real-life data set: satellite scatterometer data used to infer the wind speed and wind direction near the ocean surface. For both data sets the regularisation method performs well in comparison with earlier published results. Ideas on how the constraint on the kernels may be relaxed to allow fully adaptable kernels are presented.
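As a concrete illustration of the fixed-kernel special case described in the abstract, the sketch below evaluates the conditional density of a mixture density network whose spherical Gaussian kernels have fixed means and a fixed shared variance, so the network only has to produce the mixing coefficients. All names, shapes, and values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch of the fixed-kernel mixture density network:
# kernel centres `mu` and shared variance `sigma2` are fixed in advance,
# and the network (here reduced to one linear layer W on top of some
# hidden-unit activations phi_x, e.g. RBF basis values) outputs only the
# mixing coefficients via a softmax.

def spherical_gaussian(t, mu, sigma2):
    """Density of spherical Gaussians N(t; mu_j, sigma2 * I), one per row of mu."""
    d = t.shape[-1]
    norm = (2.0 * np.pi * sigma2) ** (-d / 2.0)
    return norm * np.exp(-np.sum((t - mu) ** 2, axis=-1) / (2.0 * sigma2))

def mdn_density(phi_x, W, mu, sigma2, t):
    """Conditional density p(t | x) for one input pattern.

    phi_x : (H,)   hidden-unit activations for input x
    W     : (M, H) output weights, one linear output per mixture kernel
    mu    : (M, D) fixed kernel centres
    t     : (D,)   target vector
    """
    a = W @ phi_x                       # one activation per kernel
    pi = np.exp(a - a.max())            # numerically stable softmax ...
    pi /= pi.sum()                      # ... gives the mixing coefficients
    kernels = spherical_gaussian(t, mu, sigma2)  # (M,) kernel densities at t
    return float(pi @ kernels)

# Toy usage: 3 fixed kernels on a 1-D target, 4 hidden units.
rng = np.random.default_rng(0)
phi_x = rng.random(4)
W = rng.standard_normal((3, 4))
mu = np.array([[-1.0], [0.0], [1.0]])
density = mdn_density(phi_x, W, mu, sigma2=0.25, t=np.array([0.1]))
```

Because the kernel parameters are fixed, only the weights W would be adapted during training, which is what makes the evidence-framework treatment of the weights tractable in this setting.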
Original language: English
Title of host publication: Ninth International Conference on Artificial Neural Networks, 1999
Subtitle of host publication: ICANN 99
Publisher: IET
Pages: 521-526
Number of pages: 6
Volume: 2
ISBN (Print): 0-85296-721-7
DOI: 10.1049/cp:19991162
Publication status: Published - 1999
Event: 9th International Conference on Artificial Neural Networks - Edinburgh, United Kingdom
Duration: 7 Sep 1999

Publication series

Name: IET conference publications
Publisher: IET
Number: 470
ISSN (Print): 0537-9989

Conference

Conference: 9th International Conference on Artificial Neural Networks
Abbreviated title: ICANN 99
Country: United Kingdom
City: Edinburgh
Period: 7/09/99

Bibliographical note

This paper is a postprint of a paper submitted to and accepted for publication in IET conference publications, and is subject to Institution of Engineering and Technology copyright. The copy of record is available at the IET Digital Library (http://digital-library.theiet.org/content/conferences/10.1049/cp_19991162).

Keywords

  • NCRG
  • neural nets
  • Bayesian regularisation
  • maximum likelihood estimation
  • mixture density networks
  • multivalued functions
  • neural networks
  • probability

Cite this

Hjorth, L. U., & Nabney, I. T. (1999). Regularisation of mixture density networks. In Ninth International Conference on Artificial Neural Networks, 1999: ICANN 99 (Vol. 2, pp. 521-526). (IET conference publications; No. 470). IET. https://doi.org/10.1049/cp:19991162