The role of stochasticity in an information-optimal neural population code

Nigel G. Stocks, Robert Morse, Mark D. McDonnell

Research output: Unpublished contribution to conference › Unpublished Conference Paper › peer-review

Abstract

In this paper we consider the optimisation of Shannon mutual information (MI) in the context of two model neural systems. The first is a stochastic pooling network (population) of McCulloch-Pitts (MP) neurons (logical threshold units) subject to stochastic forcing; the second is, in a rate-coding paradigm, a population of neurons that each displays Poisson statistics (the so-called 'Poisson neuron'). The mutual information is optimised as a function of a parameter that characterises the 'noise level': in the MP array this parameter is the standard deviation of the noise, while in the population of Poisson neurons it is the window length used to determine the spike count. In both systems we find that the emergent neural architecture, and hence the code, that maximises the MI is strongly influenced by the noise level. Low noise levels lead to a heterogeneous distribution of neural parameters (diversity), whereas medium to high noise levels result in the clustering of neural parameters into distinct groups that can be interpreted as subpopulations. In both cases the number of subpopulations increases as the noise level decreases. Our results suggest that subpopulations are a generic feature of an information-optimal neural population.
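As an illustration of the first model class, the sketch below gives a minimal Monte Carlo estimate of the mutual information between a Gaussian stimulus and the pooled output of a population of McCulloch-Pitts threshold units with independent additive Gaussian noise. The chosen thresholds, noise standard deviation, stimulus distribution, and the histogram-based MI estimator are illustrative assumptions, not the authors' actual method.

```python
# A minimal sketch (assumed setup, not the authors' code): estimate I(X; Y)
# between a Gaussian stimulus X and the pooled output
# Y = sum_i 1[X + eta_i > theta_i] of N McCulloch-Pitts threshold units,
# each with independent additive Gaussian noise eta_i.
import numpy as np

rng = np.random.default_rng(0)

def pooled_mi(thresholds, noise_sd, n_samples=200_000, n_x_bins=60):
    n_units = len(thresholds)
    x = rng.standard_normal(n_samples)                         # stimulus ~ N(0, 1)
    eta = noise_sd * rng.standard_normal((n_samples, n_units))
    y = (x[:, None] + eta > thresholds[None, :]).sum(axis=1)   # pooled spike count

    def entropy(counts):
        p = counts[counts > 0] / counts.sum()
        return -(p * np.log2(p)).sum()

    # Plug-in estimate: I(X; Y) = H(Y) - H(Y | X), with X discretised into bins.
    h_y = entropy(np.bincount(y, minlength=n_units + 1).astype(float))

    edges = np.quantile(x, np.linspace(0, 1, n_x_bins + 1)[1:-1])
    x_bins = np.digitize(x, edges)
    h_y_given_x = 0.0
    for b in range(n_x_bins):
        mask = x_bins == b
        if mask.any():
            h_y_given_x += mask.mean() * entropy(
                np.bincount(y[mask], minlength=n_units + 1).astype(float))
    return h_y - h_y_given_x

# Identical thresholds (a single cluster) versus spread-out thresholds
# (diversity), compared at one moderate noise level.
print(pooled_mi(np.zeros(7), noise_sd=0.5))
print(pooled_mi(np.linspace(-1.5, 1.5, 7), noise_sd=0.5))
```

Sweeping `noise_sd` and numerically optimising the threshold vector at each value would, under these assumptions, expose the behaviour described in the abstract: at low noise the optimal thresholds spread out, while at higher noise they collapse into a small number of clusters (subpopulations).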
Original language: English
Number of pages: 11
DOIs
Publication status: Published - 13 Sept 2009
Event: International Workshop on Statistical-Mechanical Informatics - Kyoto, Japan
Duration: 13 Sept 2009 – 16 Sept 2009

Conference

Conference: International Workshop on Statistical-Mechanical Informatics
Country/Territory: Japan
City: Kyoto
Period: 13/09/09 – 16/09/09
