Abstract
We study the space of functions computed by random layered machines, including deep neural networks and Boolean circuits. Comparing the distributions of Boolean functions computed by recurrent and layer-dependent architectures, we find that they are identical in both models. Depending on the initial conditions and the computing elements used, we characterize the space of functions computed in the large-depth limit and show that the macroscopic entropy of Boolean functions either monotonically increases or monotonically decreases with growing depth.
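The depth dependence of the function-space entropy described above can be illustrated with a small Monte Carlo sketch. All concrete choices here (NAND gates, layer width, number of inputs, sample size) are illustrative assumptions for the sketch, not the paper's actual model: we sample random layered circuits, record the Boolean function each computes via its truth table, and estimate the Shannon entropy of the resulting distribution over functions at several depths.

```python
import itertools
import math
import random
from collections import Counter

def random_layered_circuit(n_inputs, width, depth, rng):
    """Sample a random layered NAND circuit: each gate in a layer
    takes two wires chosen uniformly from the previous layer."""
    layers = []
    prev = n_inputs
    for d in range(depth):
        w = width if d < depth - 1 else 1  # final layer: single output gate
        layers.append([(rng.randrange(prev), rng.randrange(prev)) for _ in range(w)])
        prev = w
    return layers

def truth_table(layers, n_inputs):
    """Evaluate the circuit on all 2^n_inputs input patterns."""
    table = []
    for bits in itertools.product([0, 1], repeat=n_inputs):
        vals = list(bits)
        for layer in layers:
            vals = [1 - (vals[a] & vals[b]) for a, b in layer]  # NAND gate
        table.append(vals[0])
    return tuple(table)

def function_entropy(n_inputs, width, depth, samples, rng):
    """Estimate the Shannon entropy (in bits) of the distribution of
    Boolean functions computed by random circuits of the given depth."""
    counts = Counter(
        truth_table(random_layered_circuit(n_inputs, width, depth, rng), n_inputs)
        for _ in range(samples)
    )
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

if __name__ == "__main__":
    rng = random.Random(0)
    for depth in (1, 2, 4, 8):
        print(depth, round(function_entropy(3, 8, depth, 2000, rng), 3))
```

Tracking how the estimated entropy changes as `depth` grows mirrors, in miniature, the monotone depth dependence analyzed in the paper; the sampled estimate is biased low when the number of samples is small relative to the number of reachable functions.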
| Original language | English |
| --- | --- |
| Article number | 168301 |
| Number of pages | 6 |
| Journal | Physical Review Letters |
| Volume | 125 |
| Issue number | 16 |
| DOIs | |
| Publication status | Published - 12 Oct 2020 |
Bibliographical note
© 2020 American Physical Society. Space of Functions Computed by Deep-Layered Machines. Alexander Mozeika, Bo Li, and David Saad. Phys. Rev. Lett. 125, 168301 – Published 12 October 2020. Funding: B. L. and D. S. acknowledge support from the Leverhulme Trust (RPG-2018-092) and from the European Union's Horizon 2020 research and innovation program under Marie Skłodowska-Curie Grant Agreement No. 835913. D. S. acknowledges support from the EPSRC program grant TRANSNET (EP/R035342/1).