The learning dynamics of a universal approximator

Ansgar H. L. West, David Saad, Ian T. Nabney, Michael C. Mozer (Editor), Thomas Petsche (Editor), Michael I. Jordan (Editor)

Research output: Contribution to journal › Article

Abstract

The learning properties of a universal approximator, a normalized committee machine with adjustable biases, are studied for on-line back-propagation learning. Within a statistical mechanics framework, numerical studies show that this model has features which do not exist in previously studied two-layer network models without adjustable biases, e.g., attractive suboptimal symmetric phases even for realizable cases and noiseless data.
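As context for the abstract, the model studied is a soft committee machine whose output is the normalized sum of K hidden units, each with its own adjustable bias, trained example-by-example (on-line) with back-propagation. The following is a minimal illustrative sketch, not the authors' code: it assumes a tanh activation (the statistical-mechanics literature often uses erf), squared error, and a teacher-student setup with a realizable rule and noiseless data, matching the setting the abstract describes.

```python
import numpy as np

def committee_output(x, W, theta):
    # Normalized committee machine: average of K hidden-unit activations,
    # each hidden unit k having its own adjustable bias theta[k].
    K = W.shape[0]
    return np.tanh(W @ x + theta).sum() / K

def online_backprop_step(x, y, W, theta, eta=0.05):
    # One on-line (per-example) gradient-descent update on E = (out - y)^2 / 2.
    K = W.shape[0]
    h = W @ x + theta
    out = np.tanh(h).sum() / K
    delta = (out - y) / K                 # dE / d(sum of activations)
    g = delta * (1.0 - np.tanh(h) ** 2)   # back-propagate through tanh
    W -= eta * np.outer(g, x)             # weight update
    theta -= eta * g                      # bias update (the adjustable biases)
    return W, theta

# Teacher-student setup: the student learns a realizable rule from noiseless data.
rng = np.random.default_rng(0)
N, K = 20, 3
W_teacher = rng.normal(size=(K, N))
th_teacher = rng.normal(size=K)
W = 0.1 * rng.normal(size=(K, N))
theta = np.zeros(K)
for _ in range(20000):
    x = rng.normal(size=N)
    y = committee_output(x, W_teacher, th_teacher)   # noiseless teacher label
    W, theta = online_backprop_step(x, y, W, theta)
```

In the statistical-mechanics analysis the interesting behavior (e.g. the attractive suboptimal symmetric phases the abstract mentions) appears in the thermodynamic limit of large input dimension N; this finite-size sketch only illustrates the update rule itself.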
Original language: English
Pages (from-to): 288-294
Number of pages: 7
Journal: Advances in Neural Information Processing Systems
Volume: 9
Publication status: Published - May 1997

Bibliographical note

Copyright of the Massachusetts Institute of Technology Press (MIT Press)

Keywords

  • approximator
  • back-propagation
  • symmetric phases
  • realizable cases
  • noiseless data

Cite this

West, A. H. L., Saad, D., Nabney, I. T., Mozer, M. C. (Ed.), Petsche, T. (Ed.), & Jordan, M. I. (Ed.) (1997). The learning dynamics of a universal approximator. Advances in Neural Information Processing Systems, 9, 288-294.
West, Ansgar H. L. ; Saad, David ; Nabney, Ian T. ; Mozer, Michael C. (Editor) ; Petsche, Thomas (Editor) ; Jordan, Michael I. (Editor). / The learning dynamics of a universal approximator. In: Advances in Neural Information Processing Systems. 1997 ; Vol. 9. pp. 288-294.
@article{daa79ae7900145ad9d24505409375016,
title = "The learning dynamics of a universal approximator",
abstract = "The learning properties of a universal approximator, a normalized committee machine with adjustable biases, are studied for on-line back-propagation learning. Within a statistical mechanics framework, numerical studies show that this model has features which do not exist in previously studied two-layer network models without adjustable biases, e.g., attractive suboptimal symmetric phases even for realizable cases and noiseless data.",
keywords = "approximator, back-propagation, symmetric phases, realizable cases, noiseless data",
author = "West, {Ansgar H. L.} and David Saad and Nabney, {Ian T.} and Mozer, {Michael C.} and Thomas Petsche and Jordan, {Michael I.}",
note = "Copyright of the Massachusetts Institute of Technology Press (MIT Press)",
year = "1997",
month = "5",
language = "English",
volume = "9",
pages = "288--294",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",

}

West, AHL, Saad, D, Nabney, IT, Mozer, MC (ed.), Petsche, T (ed.) & Jordan, MI (ed.) 1997, 'The learning dynamics of a universal approximator', Advances in Neural Information Processing Systems, vol. 9, pp. 288-294.

The learning dynamics of a universal approximator. / West, Ansgar H. L.; Saad, David; Nabney, Ian T.; Mozer, Michael C. (Editor); Petsche, Thomas (Editor); Jordan, Michael I. (Editor).

In: Advances in Neural Information Processing Systems, Vol. 9, 05.1997, p. 288-294.

Research output: Contribution to journal › Article

TY - JOUR

T1 - The learning dynamics of a universal approximator

AU - West, Ansgar H. L.

AU - Saad, David

AU - Nabney, Ian T.

A2 - Mozer, Michael C.

A2 - Petsche, Thomas

A2 - Jordan, Michael I.

N1 - Copyright of the Massachusetts Institute of Technology Press (MIT Press)

PY - 1997/5

Y1 - 1997/5

N2 - The learning properties of a universal approximator, a normalized committee machine with adjustable biases, are studied for on-line back-propagation learning. Within a statistical mechanics framework, numerical studies show that this model has features which do not exist in previously studied two-layer network models without adjustable biases, e.g., attractive suboptimal symmetric phases even for realizable cases and noiseless data.

AB - The learning properties of a universal approximator, a normalized committee machine with adjustable biases, are studied for on-line back-propagation learning. Within a statistical mechanics framework, numerical studies show that this model has features which do not exist in previously studied two-layer network models without adjustable biases, e.g., attractive suboptimal symmetric phases even for realizable cases and noiseless data.

KW - approximator

KW - back-propagation

KW - symmetric phases

KW - realizable cases

KW - noiseless data

UR - http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=3990

UR - http://www.scopus.com/inward/record.url?scp=84899020045&partnerID=8YFLogxK

M3 - Article

VL - 9

SP - 288

EP - 294

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -

West AHL, Saad D, Nabney IT, Mozer MC, (ed.), Petsche T, (ed.), Jordan MI, (ed.). The learning dynamics of a universal approximator. Advances in Neural Information Processing Systems. 1997 May;9:288-294.