Bayesian training of mixture density networks

Lars U. Hjorth, Ian T. Nabney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Mixture Density Networks (MDNs) are a well-established method for modelling conditional probability densities, and are useful for complex multi-valued mappings where standard regression methods (such as MLPs) fail. In this paper we extend earlier research on a regularisation method for a special case of MDNs to the general case using evidence-based regularisation, and we show how the Hessian of the MDN error function can be evaluated using R-propagation. The method is tested on two data sets and compared with early stopping.
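The MDN error function mentioned above is the negative log-likelihood of the targets under a mixture model whose parameters (mixing coefficients, means, and widths) are produced by a neural network's output layer. A minimal NumPy sketch, assuming a 1-D target and Gaussian components; the function and argument names are illustrative, not taken from the paper:

```python
import numpy as np

def mdn_nll(y, logits, means, log_sigmas):
    """MDN error function E = -sum_n log sum_m alpha_m phi_m(y_n).

    y          : (N,)   target values
    logits     : (N, M) unnormalised mixing-coefficient outputs
    means      : (N, M) component means
    log_sigmas : (N, M) log standard deviations (exp keeps sigma > 0)
    """
    # Softmax over components gives mixing coefficients alpha that sum to 1
    a = np.exp(logits - logits.max(axis=1, keepdims=True))
    alpha = a / a.sum(axis=1, keepdims=True)
    sigma = np.exp(log_sigmas)
    # Gaussian kernel phi_m for each component and data point
    phi = np.exp(-0.5 * ((y[:, None] - means) / sigma) ** 2) \
        / (sigma * np.sqrt(2.0 * np.pi))
    # Negative log-likelihood of the mixture
    return -np.log((alpha * phi).sum(axis=1)).sum()
```

In a Bayesian treatment as described in the paper, a regularisation (weight-decay) term would be added to this error, with its coefficient set by the evidence procedure; evaluating the Hessian of E with respect to the network weights is what R-propagation provides.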
Original language: English
Title of host publication: Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, 2000. IJCNN 2000
Place of publication: Piscataway, NJ, United States
Publisher: IEEE
Pages: 455-460
Number of pages: 6
Volume: 4
ISBN (Print): 9780769506197
DOIs: 10.1109/IJCNN.2000.860813
Publication status: Published - 13 Aug 2000
Event: International Joint Conference on Neural Networks - Como, Italy
Duration: 24 Jul 2000 – 27 Jul 2000

Conference

Conference: International Joint Conference on Neural Networks
Abbreviated title: IJCNN 2000
Country: Italy
City: Como
Period: 24/07/00 – 27/07/00

Keywords

  • Bayes methods
  • learning
  • artificial intelligence
  • neural nets
  • Bayesian training
  • MDN error function
  • mixture density networks
  • R-propagation
  • conditional probability density

Cite this

Hjorth, L. U., & Nabney, I. T. (2000). Bayesian training of mixture density networks. In Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, 2000. IJCNN 2000 (Vol. 4, pp. 455-460). Piscataway, NJ, United States: IEEE. https://doi.org/10.1109/IJCNN.2000.860813