First- and Second-Order Methods for Learning: Between Steepest Descent and Newton's Method

  • Roberto Battiti
    Dipartimento di Matematica, Università di Trento, 38050 Povo (Trento), Italy

Abstract

On-line first-order backpropagation is sufficiently fast and effective for many large-scale classification problems, but for very high-precision mappings, batch processing may be the method of choice. This paper reviews first- and second-order optimization methods for learning in feedforward neural networks. The viewpoint is that of optimization: many methods can be cast in the language of optimization techniques, allowing the transfer to neural nets of detailed results about computational complexity and of safety procedures that ensure convergence and avoid numerical problems. The review is not intended to deliver detailed prescriptions for the most appropriate methods in specific applications, but to illustrate the main characteristics of the different methods and their mutual relations.
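
To make the contrast between the two endpoints of the review concrete, here is a minimal sketch (not from the paper; the quadratic loss, learning rate, and iteration count are illustrative assumptions) comparing a first-order steepest-descent update with a second-order Newton update on a toy quadratic, where Newton's method reaches the minimizer in a single step:

```python
import numpy as np

# Toy quadratic loss f(w) = 0.5 * w @ A @ w - b @ w,
# with gradient A @ w - b and constant Hessian A.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])   # symmetric positive definite
b = np.array([1.0, -2.0])

def grad(w):
    return A @ w - b

eta = 0.1                    # assumed fixed learning rate
w_sd = np.zeros(2)
for _ in range(100):         # steepest descent: many small first-order steps
    w_sd -= eta * grad(w_sd)

w_nt = np.zeros(2)
w_nt -= np.linalg.solve(A, grad(w_nt))  # Newton step: w <- w - H^{-1} g,
                                        # exact in one step on a quadratic

print("steepest descent:", w_sd)
print("Newton's method: ", w_nt)
print("minimizer A^-1 b:", np.linalg.solve(A, b))
```

The Newton update lands on the minimizer immediately, but storing the Hessian costs O(n^2) memory and solving with it O(n^3) time for n weights; that trade-off is what motivates the intermediate methods lying between the two extremes that the review examines.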
