N-tuple recognition systems (RAMnets) are normally modeled using a small number of input lines to each RAM, because the address space grows exponentially with the number of inputs. It is impossible to implement an arbitrarily-large address space as physical memory. But given modest amounts of training data, correspondingly modest numbers of bits will be set in that memory. Hash arrays can therefore be used instead of a direct implementation of the required address space. This paper describes some exploratory experiments using the hash array technique to investigate the performance of RAMnets with very large numbers of input lines. An argument is presented which concludes that performance should peak at a relatively small n-tuple size, but the experiments carried out so far contradict this. Further experiments are needed to confirm this unexpected result.
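The hash-array idea described in the abstract can be illustrated with a short sketch. This is not the authors' code; it is a minimal single-layer n-tuple classifier (RAMnet) in which each RAM's set bits are stored in a hash set of seen addresses instead of a dense 2^n memory, so the tuple size n can grow without exhausting physical memory. The class name `HashRAMnet` and all parameters are illustrative assumptions.

```python
import random


class HashRAMnet:
    """Sketch of a RAMnet whose RAMs are hash sets, not 2^n arrays."""

    def __init__(self, input_bits, n, num_classes, seed=0):
        rng = random.Random(seed)
        bits = list(range(input_bits))
        rng.shuffle(bits)
        # Randomly partition the input lines into n-tuples (one "RAM" each).
        self.tuples = [bits[i:i + n] for i in range(0, input_bits, n)]
        # One hash set of stored addresses per (class, tuple) pair; only
        # addresses actually seen in training occupy memory.
        self.memory = [[set() for _ in self.tuples]
                       for _ in range(num_classes)]

    def _address(self, pattern, tup):
        # The n sampled bits form the RAM address, kept as an int key.
        addr = 0
        for b in tup:
            addr = (addr << 1) | pattern[b]
        return addr

    def train(self, pattern, cls):
        # "Set a bit" by inserting the address into the class's hash set.
        for mem, tup in zip(self.memory[cls], self.tuples):
            mem.add(self._address(pattern, tup))

    def classify(self, pattern):
        # Score each class by how many RAMs recognise their address.
        scores = [sum(self._address(pattern, tup) in mem
                      for mem, tup in zip(class_mem, self.tuples))
                  for class_mem in self.memory]
        return max(range(len(scores)), key=scores.__getitem__)
```

As in the paper's argument, memory use here scales with the amount of training data rather than with 2^n, which is what makes very large n-tuple sizes feasible to explore at all.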
|Title of host publication||Proceedings of the Weightless Neural Network Workshop 1993, Computing with Logical Neurons|
|Publisher||University of York|
|Number of pages||5|
|Publication status||Published - 1993|
|Event||Weightless Neural Network Workshop '93, Computing with Logical Neurons|
|Duration||1 Jan 1993 → 1 Jan 1993|
- N-tuple recognition systems
- hash array technique
Rohwer, R., & Lamb, A. (1993). An exploration of the effect of super large n-tuples on single layer RAMnets. In N. Allinson (Ed.), Proceedings of the Weightless Neural Network Workshop 1993, Computing with Logical Neurons (pp. 33-37). University of York.