Enhancing the Generalization Ability of Backpropagation Algorithm through Controlling the Outputs of the Hidden Layers

  • WAN Weishui
    Intelligent Control Laboratory, Graduate School of Information Science and Electrical Engineering, Kyushu University
  • HIRASAWA Kotaro
    Intelligent Control Laboratory, Graduate School of Information Science and Electrical Engineering, Kyushu University
  • HU Jinglu
    Intelligent Control Laboratory, Graduate School of Information Science and Electrical Engineering, Kyushu University
  • MURATA Junichi
    Intelligent Control Laboratory, Graduate School of Information Science and Electrical Engineering, Kyushu University


Abstract

It is well known that the backpropagation algorithm is one of the most fundamental algorithms in neural networks (NNs); its role in the field cannot be overstated. In this paper we propose a new variant of the backpropagation algorithm that controls the outputs of the hidden layers, and which provides better generalization results than the basic backpropagation algorithm. The term added to the criterion function has the following properties: (1) small noises added to the inputs of the network do not have an evident effect on the outputs of the network; (2) small noises added to any weight matrix, except the one between the hidden layer and the output layer, do not have a large effect on the outputs of the network. In addition, simulation comparisons are made between the new algorithm and some conventional regularization methods, such as the Laplace and Gaussian regularizers. Simulation results on the two-spiral problem, a function approximation problem, and the Iris data classification problem confirm these assertions.
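To make the idea of augmenting the criterion function concrete, the sketch below computes a training loss that adds a penalty on the hidden-layer outputs to the usual mean squared error. The exact form of the added term is not given in this abstract, so the penalty used here (a scaled mean of squared hidden activations, weighted by a hypothetical coefficient `lam`) is only an assumed stand-in for illustration; the network sizes and data are likewise arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-4-1 MLP; the weights and sizes are illustrative only.
W1 = rng.normal(scale=0.5, size=(2, 4))  # input -> hidden
W2 = rng.normal(scale=0.5, size=(4, 1))  # hidden -> output

def forward(X):
    """Forward pass; returns hidden-layer outputs and network outputs."""
    H = np.tanh(X @ W1)  # hidden-layer outputs (the quantity being controlled)
    Y = H @ W2           # linear output layer
    return H, Y

def augmented_loss(X, T, lam=0.1):
    """MSE plus an assumed penalty on the hidden-layer outputs.

    NOTE: the penalty form (lam * mean of squared hidden activations) is
    a hypothetical choice standing in for the paper's added criterion
    term, whose exact definition the abstract does not state.
    """
    H, Y = forward(X)
    mse = np.mean((Y - T) ** 2)
    penalty = lam * np.mean(H ** 2)
    return mse, penalty, mse + penalty

# Example evaluation on random data.
X = rng.normal(size=(8, 2))
T = rng.normal(size=(8, 1))
mse, penalty, total = augmented_loss(X, T)
print(mse, penalty, total)
```

Intuitively, keeping the hidden activations small pushes the tanh units toward their near-linear region, where small perturbations of the inputs or of the input-side weights produce only small changes in the hidden outputs, in line with the two robustness properties stated above. This contrasts with the Laplace and Gaussian regularizers mentioned in the abstract, which penalize the weights directly rather than the hidden-layer outputs.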
