Computational learning theory : EuroCOLT '93 : based on the proceedings of the First European Conference on Computational Learning Theory, organized by the Institute of Mathematics and Its Applications and held at Royal Holloway, University of London in December, 1993
Authors
Bibliographic Details
(The Institute of Mathematics and its Applications conference series, new ser.)
Clarendon Press; Oxford University Press, 1994
Held by 12 university libraries
Description and Table of Contents
Description
This volume contains 17 of the contributed papers presented at the First European Conference on Computational Learning Theory. Also included are invited presentations on the complexity of learning on neural nets, on new directions in computational learning theory, and on a neuroidal model for cognitive functions. The proceedings give an overview of current work in computational learning theory, ranging from results inspired by neural network research to those arising from more classical artificial intelligence approaches. The study of machine learning within the mathematical framework of complexity theory is a relatively recent development. The burgeoning interest in applying machine learning to a wide variety of problems, from control to financial market prediction, has fired a corresponding upsurge in mathematical research.
Table of Contents
- W. Maass: On the complexity of learning on neural nets
- M. Frazier and L. Pitt: Some new directions in computational learning theory
- L.G. Valiant: A neuroidal model for cognitive functions
- J. Kivinen, H. Mannila and E. Ukkonen: Learning rules with local exceptions
- M. Golea and M. Marchand: On learning simple deterministic and probabilistic neural concepts
- P. Fischer: Learning unions of convex polygons
- T. Hegedus: On training simple neural networks and small-weight neurons
- H.U. Simon: Bounds on the number of examples needed for learning functions
- M. Anthony and J. Shawe-Taylor: Valid generalization of functions from close approximations on a sample
- J. Kivinen and M.K. Warmuth: Using experts for predicting continuous outcomes
- K. Pillaipakkamnatt and V. Raghavan: Read-twice DNF formulas are properly learnable
- F. Ameur, P. Fischer, K.U. Hoeffgen and F. Meyer auf der Heide: Trial and error: a new approach to space-bounded learning
- S. Anoulova and S. Poelt: Using Kullback-Leibler divergence in learning theory
- A. Saoudi and T. Yokomori: Learning local and recognizable ω-languages and monadic logic programs
- R. Wiehagen, C.H. Smith and T. Zeugmann: Classification of predicates and languages
- H. Wiklicky: The neural network loading problem is undecidable
- R. Gavalda: On the power of equivalence
- On-line prediction and conversion strategies
- K. Yamanishi: Learning non-parametric smooth rules by stochastic rules with finite partitioning
- S. Poelt: Improved sample size bounds for PAB-decisions.
From "Nielsen BookData"