An exploration of the effect of super large n-tuples on single layer RAMnets

Richard Rohwer, Alex Lamb

    Research output: Chapter in Book/Report/Conference proceeding › Chapter

    Abstract

    N-tuple recognition systems (RAMnets) are normally modeled using a small number of input lines to each RAM, because the address space grows exponentially with the number of inputs. It is impossible to implement an arbitrarily-large address space as physical memory. But given modest amounts of training data, correspondingly modest numbers of bits will be set in that memory. Hash arrays can therefore be used instead of a direct implementation of the required address space. This paper describes some exploratory experiments using the hash array technique to investigate the performance of RAMnets with very large numbers of input lines. An argument is presented which concludes that performance should peak at a relatively small n-tuple size, but the experiments carried out so far contradict this. Further experiments are needed to confirm this unexpected result.
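    The core idea in the abstract — that a sparse hash structure can stand in for a 2^n-entry RAM because training only ever sets a modest number of bits — can be illustrated with a minimal sketch. This is not the authors' implementation; the class name, parameters, and tuple-sampling scheme are illustrative assumptions, showing only how a hash set per (class, tuple) pair replaces a directly addressed memory.

    ```python
    # Illustrative sketch (not the paper's code): a single-layer RAMnet in which
    # each "RAM" is backed by a hash set of seen addresses, so memory cost scales
    # with the training data rather than with the 2**n address space.
    import random

    class HashRAMnet:
        def __init__(self, input_bits, n, num_tuples, seed=0):
            rng = random.Random(seed)
            # Each n-tuple samples n input positions; the bits at those
            # positions form the address presented to that tuple's RAM.
            self.tuples = [rng.sample(range(input_bits), n)
                           for _ in range(num_tuples)]
            # (class_label, tuple_index) -> set of addresses seen in training.
            self.memory = {}

        def _address(self, pattern, positions):
            addr = 0
            for p in positions:
                addr = (addr << 1) | pattern[p]
            return addr

        def train(self, pattern, label):
            # Setting a bit in the RAM becomes inserting the address into a set.
            for i, positions in enumerate(self.tuples):
                self.memory.setdefault((label, i), set()).add(
                    self._address(pattern, positions))

        def score(self, pattern, label):
            # Count how many tuple addresses were set during training.
            return sum(
                self._address(pattern, pos) in self.memory.get((label, i), set())
                for i, pos in enumerate(self.tuples))

        def classify(self, pattern, labels):
            return max(labels, key=lambda c: self.score(pattern, c))
    ```

    Because only addresses that actually occur in the training set are stored, n can be made very large — the regime the experiments explore — without the exponential memory cost of a direct implementation.
    
    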
    Original language: English
    Title of host publication: Proceedings of the Weightless Neural Network Workshop 1993, Computing with Logical Neurons
    Editors: Nigel Allinson
    Publisher: University of York
    Pages: 33-37
    Number of pages: 5
    Publication status: Published - 1993
    Event: Proceedings of the Weightless Neural Network Workshop '93, Computing with Logical Neurons
    Duration: 1 Jan 1993 – 1 Jan 1993

    Workshop

    Workshop: Proceedings of the Weightless Neural Network Workshop '93, Computing with Logical Neurons
    Period: 1/01/93 – 1/01/93

    Keywords

    • N-tuple recognition systems
    • RAMnets
    • memory
    • hash array technique

    Cite this

    Rohwer, R., & Lamb, A. (1993). An exploration of the effect of super large n-tuples on single layer RAMnets. In N. Allinson (Ed.), Proceedings of the Weightless Neural Network Workshop 1993, Computing with Logical Neurons (pp. 33-37). University of York.