Variational Learning of Translation Invariance with Noisy-OR and Noisy-AND Gates

DOI Open Access

Bibliographic Information

Alternative Title
  • Variational Learning of Feature Pooling in a Bayesian Network with Noisy-OR and Noisy-AND Gates

Abstract

From the viewpoint of the Bayesian brain hypothesis, a Bayesian network model of the cerebral cortex is promising not only as a computational model of the brain, but also as a basis for efficient brain-like artificial intelligence. A notorious drawback of Bayesian networks, however, is that the number of parameters grows exponentially with the number of parent variables of a random variable. Restricting the model is one way to address this problem. Motivated by biological plausibility, we previously proposed a combination of noisy-OR and noisy-AND gates, whose numbers of parameters grow only linearly with the number of parent random variables. Although we showed that this model can acquire translation invariance in a small-scale setting, it was difficult to scale it up because of the hidden variables. In this study, we extend the previous attempt by employing a variational learning method to overcome the intractability of estimating the massive number of hidden variables. The model can then be scaled up to learn handwritten digit data.
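The linear parameter growth mentioned above comes from the independence-of-causes structure of the gates: each parent contributes one weight instead of one entry per joint parent configuration. The following is a minimal NumPy sketch of the standard noisy-OR and noisy-AND parameterizations (one weight per parent plus a leak term). The paper's exact gate definitions, network architecture, and variational objective are not given in this record, so the function names, leak handling, and parameterization here are assumptions for illustration only.

import numpy as np

def noisy_or(parents, weights, leak=0.0):
    # P(child = 1 | parents) under a standard noisy-OR gate.
    # parents : binary (0/1) array of parent states
    # weights : per-parent activation probabilities lambda_i in [0, 1]
    # leak    : probability the child turns on with no active parent (assumption)
    # The child stays 0 only if every active cause independently fails.
    fail_all = (1.0 - leak) * np.prod((1.0 - weights) ** parents)
    return 1.0 - fail_all

def noisy_and(parents, weights, base=1.0):
    # P(child = 1 | parents) under a noisy-AND gate (De Morgan dual of noisy-OR).
    # Each inactive parent independently inhibits the child, except with
    # probability weights[i]; the child is 1 only if no inhibition succeeds.
    return base * np.prod(weights ** (1 - parents))

# Tiny example: 3 parents, only the first two active.
x = np.array([1, 1, 0])
lam = np.array([0.9, 0.8, 0.7])
print(noisy_or(x, lam, leak=0.01))   # high: at least one strong cause is on
print(noisy_and(x, lam))             # reduced by the inactive third parent

Note that each gate needs only n weights (plus a leak/base term) for n parents, whereas a general conditional probability table would need on the order of 2^n entries; this is the restriction that makes the model tractable to scale.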

Published In

Related Projects


Details

  • CRID
    1390289398727081728
  • NII Article ID
    130008089125
  • DOI
    10.11517/jsaisigtwo.2020.agi-016_05
  • ISSN
    24365556
  • Text Language Code
    ja
  • Data Source Type
    • JaLC
    • CiNii Articles
    • KAKEN
  • Abstract License Flag
    Allowed

