An Introduction to Neural Networks

Bibliographic Information

An Introduction to Neural Networks

James A. Anderson

MIT Press, c1995

  • hbk
  • pbk

Available at 61 libraries

Note

"A Bradford book."

Includes bibliographical references and index

Description and Table of Contents

Volume: hbk (ISBN 9780262011440)

Description

An Introduction to Neural Networks falls into a new ecological niche for texts. Based on notes that have been class-tested for more than a decade, it is aimed at cognitive science and neuroscience students who need to understand brain function in terms of computational modeling, and at engineers who want to go beyond formal algorithms to applications and computing strategies. It is the only current text to approach networks from a broad neuroscience and cognitive science perspective, with an emphasis on the biology and psychology behind the assumptions of the models, as well as on what the models might be used for. It describes the mathematical and computational tools needed and provides an account of the author's own ideas.

Students learn how to teach arithmetic to a neural network and get a short course on linear associative memory and adaptive maps. They are introduced to the author's brain-state-in-a-box (BSB) model and are provided with some of the neurobiological background necessary for a firm grasp of the general subject.

The field now known as neural networks has split in recent years into two major groups, mirrored in the texts that are currently available: the engineers who are primarily interested in practical applications of the new adaptive, parallel computing technology, and the cognitive scientists and neuroscientists who are interested in scientific applications. As the gap between these two groups widens, Anderson notes that the academics have tended to drift off into irrelevant, often excessively abstract research, while the engineers have lost contact with the source of ideas in the field. Neuroscience, he points out, provides a rich and valuable source of ideas about data representation, and setting up the data representation is the major part of neural network programming. Both cognitive science and neuroscience give insights into how this can be done effectively: cognitive science suggests what to compute and neuroscience suggests how to compute it.

Table of Contents

  • Properties of single neurons
  • Synaptic integration and neuron models
  • Essential vector operations
  • Lateral inhibition and sensory processing
  • Simple matrix operations
  • The linear associator - background and foundations
  • The linear associator - simulations
  • Early network models - the perceptron
  • Gradient descent algorithms
  • Representation of information
  • Applications of simple associators - concept formation and object motion
  • Energy and neural networks - Hopfield networks and Boltzmann machines
  • Nearest neighbour models
  • Adaptive maps
  • The BSB model - a simple nonlinear autoassociative neural network
  • Associative computation
  • Teaching arithmetic to a neural network

Volume: pbk (ISBN 9780262510813)

by "Nielsen BookData"

Details

  • NCID
    BA24911458
  • ISBN
    • 0262011441 (hbk)
    • 0262510812 (pbk)
  • LCCN
    94030749
  • Country Code
    us
  • Title Language Code
    eng
  • Text Language Code
    eng
  • Place of Publication
    Cambridge, Mass.
  • Pages/Volumes
    xi, 650 p.
  • Size
    27 cm