Computational learning theory : EuroCOLT '93 : based on the proceedings of the First European Conference on Computational Learning Theory, organized by the Institute of Mathematics and Its Applications and held at Royal Holloway, University of London in December, 1993
Bibliographic Information
(The Institute of Mathematics and its Applications conference series, new ser.)
Clarendon Press, Oxford University Press, 1994
Available at 12 libraries
Description and Table of Contents
Description
This volume contains 17 of the contributed papers presented at the First European Conference on Computational Learning Theory. Also included are invited presentations on the complexity of learning on neural nets, on new directions in computational learning theory, and on a neuroidal model for cognitive functions. The proceedings give an overview of current work in computational learning theory, ranging from results inspired by neural network research to those arising from more classical artificial intelligence approaches. The study of machine learning within the mathematical framework of complexity theory is a relatively recent development. Burgeoning interest in applying machine learning to a wide variety of problems, from control to financial market prediction, has fired a corresponding upsurge in mathematical research.
Table of Contents
- W. Maass: On the complexity of learning on neural nets
- M. Frazier and L. Pitt: Some new directions in computational learning theory
- L.G. Valiant: A neuroidal model for cognitive functions
- J. Kivinen, H. Mannila and E. Ukkonen: Learning rules with local exceptions
- M. Golea and M. Marchand: On learning simple deterministic and probabilistic neural concepts
- P. Fischer: Learning unions of convex polygons
- T. Hegedüs: On training simple neural networks and small-weight neurons
- H.U. Simon: Bounds on the number of examples needed for learning functions
- M. Anthony and J. Shawe-Taylor: Valid generalization of functions from close approximations on a sample
- J. Kivinen and M.K. Warmuth: Using experts for predicting continuous outcomes
- K. Pillaipakkamnatt and V. Raghavan: Read-twice DNF formulas are properly learnable
- F. Ameur, P. Fischer, K.-U. Höffgen and F. Meyer auf der Heide: Trial and error: a new approach to space-bounded learning
- S. Anoulova and S. Pölt: Using Kullback-Leibler divergence in learning theory
- A. Saoudi and T. Yokomori: Learning local and recognizable ω-languages and monadic logic programs
- R. Wiehagen, C.H. Smith and T. Zeugmann: Classification of predicates and languages
- H. Wiklicky: The neural network loading problem is undecidable
- R. Gavaldà: On the power of equivalence queries
- On-line prediction and conversion strategies
- K. Yamanishi: Learning non-parametric smooth rules by stochastic rules with finite partitioning
- S. Pölt: Improved sample size bounds for PAB-decisions.
by "Nielsen BookData"