-
- GECZY Peter
- Department of Information and Computer Sciences, Toyohashi University of Technology
-
- USUI Shiro
- Department of Information and Computer Sciences, Toyohashi University of Technology
Abstract
The problem of estimating the parameters of artificial neural networks is mostly treated as one of nonlinear parameter estimation. Several training algorithms for setting the parameters use steepest descent methods and their variants. Unfortunately, in many cases the training problem is underdetermined due to an insufficient number of learning samples. Analogously, an inappropriately selected training set may cause rank-deficiencies not only of the resulting mapping but also of its principal submappings. Due to such rank-deficiencies, the search direction of iterative steepest descent and conjugate gradient techniques is incomplete, and training algorithms exhibit slow convergence or, in the worst cases, fail to find minima. In this paper we show the existence of a minimum training set, T_min, such that the Jacobians of all the principal submappings are of maximum rank and the Jacobian of the resulting mapping, F, has full rank.
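The rank-deficiency issue described in the abstract can be illustrated numerically. The following sketch is not from the paper: it uses a hypothetical 1-2-1 sigmoid network and a finite-difference Jacobian to show that when the training set has fewer samples than the network has parameters, the Jacobian of the network outputs with respect to the parameters cannot have full column rank, so gradient-based search directions are confined to a low-dimensional subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

def net(params, x):
    # Hypothetical 1-2-1 network: 7 parameters laid out as
    # [w1 (2), b1 (2), w2 (2), b2 (1)].
    w1, b1, w2, b2 = params[0:2], params[2:4], params[4:6], params[6]
    h = 1.0 / (1.0 + np.exp(-(w1 * x + b1)))  # sigmoid hidden layer
    return float(w2 @ h + b2)                 # linear output unit

def jacobian(params, xs, eps=1e-6):
    # Central finite differences: one row per sample, one column per parameter.
    J = np.zeros((len(xs), len(params)))
    for j in range(len(params)):
        p_plus, p_minus = params.copy(), params.copy()
        p_plus[j] += eps
        p_minus[j] -= eps
        J[:, j] = [(net(p_plus, x) - net(p_minus, x)) / (2 * eps) for x in xs]
    return J

params = rng.normal(size=7)
xs_small = rng.normal(size=3)   # underdetermined: 3 samples vs. 7 parameters
xs_large = rng.normal(size=20)  # a larger, generically chosen training set

rank_small = np.linalg.matrix_rank(jacobian(params, xs_small))
rank_large = np.linalg.matrix_rank(jacobian(params, xs_large))

# rank_small is bounded by the 3 rows, so it is necessarily deficient
# relative to the 7-dimensional parameter space.
print(rank_small, rank_large)
```

With generic random parameters, enlarging the sample set lets the Jacobian recover rank; the paper's result concerns the minimal such training set T_min.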
Published in
-
- IEICE Technical Report. NC, Neurocomputing
-
IEICE Technical Report. NC, Neurocomputing, 95 (405), 7-14, 1995-12-09
The Institute of Electronics, Information and Communication Engineers (IEICE)
Details
-
- CRID
- 1570291227539413760
-
- NII Article ID
- 110003233065
-
- NII Bibliographic ID
- AN10091178
-
- Text language code
- en
-
- Data source type
-
- CiNii Articles