Artificial neural networks : approximation and learning theory

Bibliographic Information

Halbert White with A.R. Gallant ... [et al.]

Blackwell, 1992

Note

Includes bibliographical references and index

Description and Table of Contents

Description

The recent re-emergence of network-based approaches to artificial intelligence has been accompanied by a virtual explosion of research. This research spans a range of disciplines (cognitive science, computer science, biology, neuroscience, electrical engineering, psychology, econometrics, philosophy, and others) that is perhaps wider than that of any other contemporary endeavour. Of all the contributing disciplines, the relatively universal language of mathematics provides some of the most powerful tools for answering fundamental questions about the capabilities and limitations of these 'artificial neural networks'. In this collection, Halbert White and his colleagues present a rigorous mathematical analysis of the approximation and learning capabilities of the leading class of single hidden layer feedforward networks. Drawing together work previously scattered in space and time, the book gives a unified view of network learning not available in any other single location, and forges fundamental links between network learning and modern mathematical statistics.

Table of Contents

  • Part 1 Approximation theory: there exists a neural network that does not make avoidable mistakes, A.R. Gallant and H. White
  • multilayer feedforward networks are universal approximators, K. Hornik, et al.
  • universal approximation using feedforward networks with non-sigmoid hidden layer activation functions, M. Stinchcombe and H. White
  • approximating and learning unknown mappings using multilayer feedforward networks with bounded weights, M. Stinchcombe and H. White
  • universal approximation of an unknown mapping and its derivatives, K. Hornik, et al.
  • Part 2 Learning and statistics: neural network learning and statistics, H. White
  • learning in artificial neural networks, H. White
  • some asymptotic results for learning in single hidden layer feedforward networks, H. White
  • connectionist nonparametric regression, H. White
  • nonparametric estimation of conditional quantiles using neural networks, H. White
  • on learning the derivatives of an unknown mapping with multilayer feedforward networks, A.R. Gallant and H. White
  • consequences and detection of misspecified nonlinear regression models, H. White
  • maximum likelihood estimation of misspecified models, H. White
  • some results for sieve estimation with dependent observations, H. White and J. Wooldridge.

by "Nielsen BookData"
