A Learning Algorithm for Continually Running Fully Recurrent Neural Networks

  • Ronald J. Williams
    College of Computer Science, Northeastern University, Boston, MA 02115, USA
  • David Zipser
    Institute for Cognitive Science, University of California, La Jolla, CA 92093, USA

Abstract

The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks. These algorithms have (1) the advantage that they do not require a precisely defined training interval, operating while the network runs; and (2) the disadvantage that they require nonlocal communication in the network being trained and are computationally expensive. These algorithms allow networks having recurrent connections to learn complex tasks that require the retention of information over time periods having either fixed or indefinite length.
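The algorithm derived in the paper is what is now known as real-time recurrent learning (RTRL): each unit's output sensitivities ∂y_k/∂w_ij are carried forward in time by a recurrence, which is exactly the nonlocal, expensive bookkeeping the abstract flags. The NumPy sketch below illustrates one RTRL step under stated assumptions; the logistic activation, the function name rtrl_step, and the array layout are choices made for this sketch, not the paper's notation.

```python
import numpy as np

def rtrl_step(W, y, x, p, target, mask, lr=0.1):
    """One step of real-time recurrent learning (RTRL) for a fully
    recurrent network of n units with m external input lines.

    W      : (n, n+m+1) weights over [recurrent | input | bias]
    y      : (n,) unit outputs at the current time step
    x      : (m,) external input at the current time step
    p      : (n, n, n+m+1) sensitivities p[k, i, j] ~ dy_k / dW[i, j]
    target : (n,) desired outputs; used only where mask is True
    mask   : (n,) bool, True at units that receive a teacher signal
    """
    n = W.shape[0]
    z = np.concatenate([y, x, [1.0]])          # combined activity vector
    s = W @ z                                   # net input to each unit
    y_new = 1.0 / (1.0 + np.exp(-s))            # logistic units (an assumption)
    fprime = y_new * (1.0 - y_new)

    # Sensitivity recurrence:
    #   p'[k,i,j] = f'(s_k) * ( sum_l W[k,l] * p[l,i,j] + delta(k,i) * z[j] )
    p_new = np.einsum('kl,lij->kij', W[:, :n], p)
    p_new[np.arange(n), np.arange(n), :] += z   # the delta(k,i) * z[j] term
    p_new *= fprime[:, None, None]

    # Gradient-following weight change from the error at teacher-driven units
    e = np.where(mask, target - y_new, 0.0)
    W = W + lr * np.einsum('k,kij->ij', e, p_new)
    return W, y_new, p_new

# Minimal usage: 3 units, 2 input lines, sensitivities start at zero
n, m = 3, 2
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((n, n + m + 1))
y, p = np.zeros(n), np.zeros((n, n, n + m + 1))
for t in range(5):
    x = rng.standard_normal(m)
    target = np.array([1.0, 0.0, 0.0])
    mask = np.array([True, False, False])       # teacher on unit 0 only
    W, y, p = rtrl_step(W, y, x, p, target, mask)
```

Because p holds n²(n+m+1) entries updated at every time step, storage grows roughly as O(n³) and per-step time as O(n⁴); this is the computational expense the abstract notes, traded for the ability to update weights while the network runs, with no fixed training interval.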
