Time trials on second-order and variable-learning-rate algorithms

Richard Rohwer

    Research output: Chapter in Book / Published conference output › Conference publication

    Abstract

    The performance of seven minimization algorithms is compared on five neural network problems. These include a variable-step-size algorithm, conjugate gradient, and several methods with explicit analytic or numerical approximations to the Hessian.
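    As a point of reference for one of the methods compared, here is a minimal sketch of linear conjugate gradient minimization of a quadratic objective 0.5 xᵀAx − bᵀx. This is an illustration of the general technique only; the function name and setup are assumptions, not taken from the paper.

    ```python
    import numpy as np

    def cg_minimize(A, b, x0, tol=1e-10, max_iter=100):
        """Conjugate gradient for the quadratic 0.5 x'Ax - b'x (A symmetric positive definite)."""
        x = x0.copy()
        r = b - A @ x          # residual = negative gradient at x
        d = r.copy()           # initial search direction is steepest descent
        for _ in range(max_iter):
            if r @ r < tol:
                break
            Ad = A @ d
            alpha = (r @ r) / (d @ Ad)          # exact line search for a quadratic
            x = x + alpha * d
            r_new = r - alpha * Ad
            beta = (r_new @ r_new) / (r @ r)    # Fletcher-Reeves update
            d = r_new + beta * d                # new direction, conjugate to previous ones
            r = r_new
        return x

    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 2.0])
    x_min = cg_minimize(A, b, np.zeros(2))   # exact in at most 2 steps for a 2-D quadratic
    ```

    For a quadratic in n dimensions, exact-line-search conjugate gradient converges in at most n steps, which is why the method is attractive relative to plain gradient descent on ill-conditioned problems.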
    Original language: English
    Title of host publication: Proceedings of the conference on Advances in neural information processing systems 3
    Editors: R. Lippmann, J. Moody, D. Touretzky
    Place of publication: San Francisco, CA, USA
    Publisher: Morgan Kaufmann
    Pages: 977-983
    Number of pages: 7
    Volume: 3
    ISBN (Print): 1-55860-184-8
    Publication status: Published - 1990
    Event: Advances in Neural Information Processing Systems 3
    Duration: 1 Jan 1990 – 1 Jan 1990

    Conference

    Conference: Advances in Neural Information Processing Systems 3
    Period: 1/01/90 – 1/01/90

    Bibliographical note

    © The Author

    Keywords

    • minimization algorithms
    • neural network
    • Hessian
