Computing with infinite networks

Christopher K. I. Williams

Research output: Chapter in Book / Published conference output › Conference publication

Abstract

For neural networks with a wide class of weight-priors, it can be shown that in the limit of an infinite number of hidden units the prior over functions tends to a Gaussian process. In this paper analytic forms are derived for the covariance function of the Gaussian processes corresponding to networks with sigmoidal and Gaussian hidden units. This allows predictions to be made efficiently using networks with an infinite number of hidden units, and shows that, somewhat paradoxically, it may be easier to compute with infinite networks than finite ones.
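The abstract's central result, a closed-form covariance function for the Gaussian process that an infinite network converges to, can be illustrated with a short sketch. This is a minimal NumPy illustration, not the paper's code: it assumes the arcsine-form covariance that Williams derives for erf (sigmoidal) hidden units with independent Gaussian weight-priors, and the variance settings and function names here are illustrative choices.

```python
import numpy as np

def arcsine_kernel(X1, X2, sigma2_b=1.0, sigma2_w=10.0):
    """Covariance of the GP corresponding to an infinite network of
    erf hidden units (arcsine form; variances are illustrative)."""
    # Augment each input with a bias component: x~ = (1, x)
    X1a = np.hstack([np.ones((X1.shape[0], 1)), X1])
    X2a = np.hstack([np.ones((X2.shape[0], 1)), X2])
    # Diagonal covariance of the input-to-hidden weight prior
    Sigma = np.diag([sigma2_b] + [sigma2_w] * X1.shape[1])
    cross = 2.0 * X1a @ Sigma @ X2a.T
    d1 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', X1a, Sigma, X1a)
    d2 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', X2a, Sigma, X2a)
    return (2.0 / np.pi) * np.arcsin(cross / np.sqrt(np.outer(d1, d2)))

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean of standard GP regression with the kernel above,
    i.e. prediction with an 'infinite network' at finite cost."""
    K = arcsine_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_star = arcsine_kernel(X_test, X_train)
    return K_star @ np.linalg.solve(K, y_train)
```

The point of the abstract is visible here: the infinite-width limit is handled analytically inside the kernel, so prediction costs only a linear solve in the number of training points, with no hidden units ever instantiated.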
Original language: English
Title of host publication: Advances in Neural Information Processing Systems
Editors: M. C. Mozer, M. I. Jordan, T. Petsche
Place of Publication: Cambridge, US
Publisher: MIT
Pages: 265-301
Number of pages: 37
ISBN (Print): 0262100657
Publication status: Published - 1996
Event: 10th Annual Conference on Neural Information Processing Systems, NIPS 1996 - Denver, CO, United States
Duration: 2 Dec 1996 - 5 Dec 1996

Publication series

Name: Proceedings of the 1996 conference
Publisher: Massachusetts Institute of Technology Press (MIT Press)

Conference

Conference: 10th Annual Conference on Neural Information Processing Systems, NIPS 1996
Country/Territory: United States
City: Denver, CO
Period: 2/12/96 - 5/12/96

Bibliographical note

Copyright of the Massachusetts Institute of Technology Press (MIT Press)

Keywords

  • neural networks
  • weight-priors
  • Gaussian process
  • sigmoidal
