Bibliographic Information

Constraints and prospects

edited by Stephen José Hanson, George A. Drastal, and Ronald L. Rivest

(Computational learning theory and natural learning systems / edited by Russell Greiner, Thomas Petsche, and Stephen José Hanson ; v. 1) (A Bradford book)

MIT Press, c1994

Note

Includes bibliographical references and index

Description and Table of Contents

Description

These original contributions converge on an exciting and fruitful intersection of three historically distinct areas of learning research: computational learning theory, neural networks, and symbolic machine learning. Bridging theory and practice, computer science and psychology, they consider general issues in learning systems that could provide constraints for theory and, at the same time, interpret theoretical results in the context of experiments with actual learning systems.

In all, nineteen chapters address questions such as: What is a natural system? How should learning systems gain from prior knowledge? If prior knowledge is important, how can we quantify how important? What makes a learning problem hard? How are neural network and symbolic machine learning approaches similar? Is there a fundamental difference between the kinds of tasks a neural network can easily solve and those a symbolic algorithm can easily solve?

Table of Contents

  • Part 1 Foundations: logic and learning, Daniel N. Osherson et al.
  • learning theoretical terms, Ranan B. Banerji
  • how loading complexity is affected by node function sets, Stephen Judd
  • defining the limits of analogical planning, Diane J. Cook
  • Part 2 Representation and bias: learning hard concepts through constructive induction - framework and rationale, Larry Rendell and Raj Seshu
  • learning disjunctive concepts using domain knowledge, Harish Ragavan and Larry Rendell
  • learning in an abstraction space, George Drastal
  • binary decision trees and an "average-case" model for concept learning - implications for feature construction and the study of bias, Raj Seshu
  • refining algorithms with knowledge-based neural networks - improving the Chou-Fasman algorithm for protein folding, Richard Maclin and Jude W. Shavlik
  • Part 3 Sampling problems: efficient distribution-free learning of probabilistic concepts, Michael J. Kearns and Robert E. Schapire
  • VC dimension and sampling complexity of learning sparse polynomials and rational functions, Marek Karpinski and Thorsten Werther
  • learning from data with bounded inconsistency - theoretical and experimental results, Haym Hirsh and William W. Cohen
  • how fast can a threshold gate learn?, Wolfgang Maass and György Turán
  • when are k-nearest neighbour and backpropagation accurate for feasible-sized sets of examples?, Eric B. Baum
  • Part 4 Experimental: comparing connectionist and symbolic learning methods, J.R. Quinlan
  • weight elimination and effective network size, Andreas S. Weigend and David E. Rumelhart
  • simulation results for a new two-armed bandit heuristic, Ronald L. Rivest and Yiqun Yin
  • hard questions about easy tasks - issues from learning to play games, Susan L. Epstein
  • experiments on the transfer of knowledge between neural networks, Lorien Y. Pratt

by "Nielsen BookData"
