Bayesian training of mixture density networks

Lars U. Hjorth, Ian T. Nabney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Mixture Density Networks (MDNs) are a well-established method for modelling conditional probability densities, which is useful for complex multi-valued functions where conventional regression methods (such as MLPs) fail. In this paper we extend earlier research on a regularisation method for a special case of MDNs to the general case using evidence-based regularisation, and we show how the Hessian of the MDN error function can be evaluated using R-propagation. The method is tested on two data sets and compared with early stopping.
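To make the abstract concrete, the following is a minimal sketch (not the authors' code) of the standard MDN set-up: a network outputs mixing coefficients, centres, and widths of a spherical Gaussian mixture, and the error function is the negative log-likelihood of the targets under that conditional mixture. All class, function, and variable names here are hypothetical illustrations.

```python
import torch
import torch.nn as nn

class MDN(nn.Module):
    """Network whose outputs parameterise a Gaussian mixture over the target."""
    def __init__(self, n_in, n_hidden, n_kernels, t_dim):
        super().__init__()
        self.m, self.d = n_kernels, t_dim
        self.hidden = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Tanh())
        self.z_pi = nn.Linear(n_hidden, n_kernels)            # mixing coefficients
        self.z_mu = nn.Linear(n_hidden, n_kernels * t_dim)    # kernel centres
        self.z_sigma = nn.Linear(n_hidden, n_kernels)         # log kernel widths

    def forward(self, x):
        h = self.hidden(x)
        log_pi = torch.log_softmax(self.z_pi(h), dim=-1)      # priors: positive, sum to one
        mu = self.z_mu(h).view(-1, self.m, self.d)
        sigma = torch.exp(self.z_sigma(h))                    # widths kept positive
        return log_pi, mu, sigma

def mdn_error(log_pi, mu, sigma, t):
    """Negative log-likelihood of targets t under the conditional mixture."""
    t = t.unsqueeze(1)                                        # (batch, 1, d) for broadcasting
    sq_dist = ((t - mu) ** 2).sum(-1)                         # (batch, m)
    log_norm = -0.5 * mu.shape[-1] * torch.log(2 * torch.pi * sigma ** 2)
    log_prob = log_pi + log_norm - 0.5 * sq_dist / sigma ** 2
    return -torch.logsumexp(log_prob, dim=-1).sum()
```

The evidence-based regularisation described in the abstract requires the Hessian of this error function, which the paper obtains via R-propagation. The sketch below instead uses a generic automatic-differentiation double backward (not the paper's hand-derived R-propagation) to show how the same Hessian can be assembled column by column from Hessian-vector products; it is an assumed stand-in, not the authors' method.

```python
def full_hessian(loss, params):
    """Assemble the Hessian of `loss` w.r.t. `params` from Hessian-vector products."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    flat_grad = torch.cat([g.reshape(-1) for g in grads])
    n = flat_grad.numel()
    H = torch.zeros(n, n)
    for i in range(n):
        v = torch.zeros(n)
        v[i] = 1.0                                            # i-th unit vector
        hv = torch.autograd.grad(flat_grad @ v, params, retain_graph=True)
        H[:, i] = torch.cat([h.reshape(-1) for h in hv])
    return H
```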
Original language: English
Title of host publication: Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, 2000. IJCNN 2000
Place of publication: Piscataway, NJ, United States
Publisher: IEEE
Pages: 455-460
Number of pages: 6
Volume: 4
ISBN (Print): 9780769506197
DOIs: https://doi.org/10.1109/IJCNN.2000.860813
Publication status: Published - 13 Aug 2000
Event: International Joint Conference on Neural Networks - Como, Italy
Duration: 24 Jul 2000 - 27 Jul 2000

Conference

Conference: International Joint Conference on Neural Networks
Abbreviated title: IJCNN 2000
Country: Italy
City: Como
Period: 24/07/00 - 27/07/00

Keywords

  • Bayes methods
  • learning
  • artificial intelligence
  • neural nets
  • Bayesian training
  • MDN error function
  • mixture density networks
  • R-propagation
  • conditional probability density

Cite this

Hjorth, L. U., & Nabney, I. T. (2000). Bayesian training of mixture density networks. In Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, 2000. IJCNN 2000 (Vol. 4, pp. 455-460). IEEE. https://doi.org/10.1109/IJCNN.2000.860813