Theory of Recurrent Neural Network with Common Synaptic Inputs

  • Kawamura Masaki
    Faculty of Science, Yamaguchi University
  • Yamana Michiko
    System Engineering Research Laboratory, Central Research Institute of Electric Power Industry
  • Okada Masato
Department of Complexity Science and Engineering, Graduate School of Frontier Sciences, The University of Tokyo; Laboratory for Mathematical Neuroscience, RIKEN Brain Science Institute; "Intelligent Cooperation and Control," PRESTO, JST

Abstract

We discuss the effects of common synaptic inputs in a recurrent neural network. Because of these common synaptic inputs, the correlations between neural inputs cannot be ignored, and the network therefore exhibits sample dependence. Networks of this type have no well-defined thermodynamic limit, and self-averaging breaks down. We therefore need to develop a suitable theory that does not rely on these properties. While the effects of common synaptic inputs have been analyzed in layered neural networks, they have been difficult to analyze in recurrent neural networks because of the feedback connections. We investigated a sequential associative memory model as an example of a recurrent network and succeeded in deriving a macroscopic dynamical description in the form of a recurrence relation for a probability density function.
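
The sample dependence described in the abstract can be observed directly in simulation. Below is a minimal numerical sketch (not the paper's analytical theory) of a standard sequential associative memory model: asymmetric couplings J_ij = (1/N) Σ_μ ξ_i^{μ+1} ξ_j^μ store a sequence of random patterns, and the overlap trajectory is measured across several network realizations. The parameter values (N, p, T) and all function names are illustrative assumptions.

```python
import numpy as np

def run_sample(N=500, p=25, T=15, rng=None):
    """Simulate one network realization; return the overlap trajectory."""
    rng = np.random.default_rng() if rng is None else rng
    xi = rng.choice([-1, 1], size=(p, N))       # p random binary patterns
    # Asymmetric couplings storing the sequence xi^0 -> xi^1 -> ... -> xi^{p-1}
    J = (xi[1:].T @ xi[:-1]) / N                # J_ij = (1/N) sum_mu xi_i^{mu+1} xi_j^mu
    s = xi[0].copy()                            # initialize at the first pattern
    m = np.empty(T)
    for t in range(T):
        s = np.sign(J @ s)                      # synchronous, deterministic update
        s[s == 0] = 1                           # break ties (measure-zero event)
        m[t] = xi[min(t + 1, p - 1)] @ s / N    # overlap with the pattern due at step t+1
    return m

# Sample dependence: at finite loading the trajectories differ across realizations,
# so a single deterministic order-parameter equation cannot capture the dynamics.
trajs = np.stack([run_sample(rng=np.random.default_rng(seed)) for seed in range(10)])
print("mean overlap per step:", np.round(trajs.mean(axis=0), 3))
print("std across samples:  ", np.round(trajs.std(axis=0), 3))
```

The nonzero standard deviation across samples illustrates why the paper's macroscopic description must track a probability density of order parameters rather than a single self-averaged trajectory.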
