Computation with infinite neural networks

Christopher K.I. Williams

Research output: Contribution to journal › Article › peer-review

Abstract

For neural networks with a wide class of weight priors, it can be shown that in the limit of an infinite number of hidden units, the prior over functions tends to a Gaussian process. In this article, analytic forms are derived for the covariance function of the Gaussian processes corresponding to networks with sigmoidal and Gaussian hidden units. This allows predictions to be made efficiently using networks with an infinite number of hidden units and shows, somewhat paradoxically, that it may be easier to carry out Bayesian prediction with infinite networks than with finite ones.
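
As a concrete illustration of the idea in the abstract, the sketch below implements the covariance function of an infinite network of erf (sigmoidal) hidden units with a Gaussian prior over the input-to-hidden weights, and uses it as a Gaussian process kernel for prediction. It is a minimal sketch, not the paper's code: the weight-prior covariance `Sigma`, the noise variance, and the toy data are illustrative assumptions.

```python
import numpy as np

def erf_network_kernel(X1, X2, Sigma):
    """Covariance of an infinite network of erf hidden units.

    X1: (n, d) and X2: (m, d) inputs; Sigma: (d+1, d+1) covariance of the
    Gaussian prior over input-to-hidden weights, including the bias weight.
    """
    # Augment each input with a constant 1 so the bias is handled by Sigma.
    X1a = np.hstack([np.ones((X1.shape[0], 1)), X1])
    X2a = np.hstack([np.ones((X2.shape[0], 1)), X2])
    num = 2.0 * X1a @ Sigma @ X2a.T
    d1 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', X1a, Sigma, X1a)
    d2 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', X2a, Sigma, X2a)
    # (2/pi) * arcsin( 2 x~' Sigma x~ / sqrt((1 + 2 x~' Sigma x~)(1 + 2 x~'' Sigma x~')) )
    return (2.0 / np.pi) * np.arcsin(num / np.sqrt(np.outer(d1, d2)))

def gp_predict(X_train, y_train, X_test, Sigma, noise_var=0.1):
    """Posterior mean of GP regression with the infinite-network covariance."""
    K = erf_network_kernel(X_train, X_train, Sigma) + noise_var * np.eye(len(X_train))
    K_star = erf_network_kernel(X_test, X_train, Sigma)
    return K_star @ np.linalg.solve(K, y_train)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(20, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)
    Sigma = np.diag([10.0, 2.0])          # assumed bias/weight prior variances
    X_test = np.linspace(-3, 3, 5)[:, None]
    print(gp_predict(X, y, X_test, Sigma))
```

This is the sense in which prediction with the infinite network is easy: rather than sampling or integrating over weights of a finite network, one evaluates the analytic covariance function and solves a single linear system of the size of the training set.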

Original language: English
Pages (from-to): 1203-1216
Number of pages: 14
Journal: Neural Computation
Volume: 10
Issue number: 5
DOIs
Publication status: Published - Jul 1998

Bibliographical note

Copyright of the Massachusetts Institute of Technology Press (MIT Press)

Keywords

  • neural networks
  • Gaussian process
  • hidden units
  • infinite networks
  • finite networks
