Memory and the computational brain : why cognitive science will transform neuroscience
Authors
C. R. Gallistel; Adam Philip King
Bibliographic Details
Memory and the computational brain : why cognitive science will transform neuroscience
(Blackwell/Maryland lectures in language and cognition)
Wiley-Blackwell, 2009
- : pbk
Held at 6 university libraries
Notes
Includes bibliographical references (p. [288]-298) and index
Description and Table of Contents
Description
Memory and the Computational Brain offers a provocative argument that goes to the heart of neuroscience, proposing that the field can and should benefit from recent advances in cognitive science and from the development of information theory over the last several decades.
- A provocative argument that cuts across linguistics, cognitive science, and neuroscience, suggesting new perspectives on learning mechanisms in the brain
- Proposes that neuroscience can and should benefit from recent advances in cognitive science and from the development of information theory
- Suggests that the architecture of the brain is structured precisely for learning and for memory, and integrates the concept of an addressable read/write memory mechanism into the foundations of neuroscience
- Based on the prestigious Blackwell/Maryland Lectures in Language and Cognition, significantly reworked and expanded to make the book ideal for students and faculty
Table of Contents
Preface viii
1 Information 1
Shannon's Theory of Communication 2
Measuring Information 7
Efficient Coding 16
Information and the Brain 20
Digital and Analog Signals 24
Appendix: The Information Content of Rare Versus Common Events and Signals 25
2 Bayesian Updating 27
Bayes' Theorem and Our Intuitions about Evidence 30
Using Bayes' Rule 32
Summary 41
3 Functions 43
Functions of One Argument 43
Composition and Decomposition of Functions 46
Functions of More than One Argument 48
The Limits to Functional Decomposition 49
Functions Can Map to Multi-Part Outputs 49
Mapping to Multiple-Element Outputs Does Not Increase Expressive Power 50
Defining Particular Functions 51
Summary: Physical/Neurobiological Implications of Facts about Functions 53
4 Representations 55
Some Simple Examples 56
Notation 59
The Algebraic Representation of Geometry 64
5 Symbols 72
Physical Properties of Good Symbols 72
Symbol Taxonomy 79
Summary 82
6 Procedures 85
Algorithms 85
Procedures, Computation, and Symbols 87
Coding and Procedures 89
Two Senses of Knowing 100
A Geometric Example 101
7 Computation 104
Formalizing Procedures 105
The Turing Machine 107
Turing Machine for the Successor Function 110
Turing Machines for f_is_even 111
Turing Machines for f_+ 115
Minimal Memory Structure 121
General Purpose Computer 122
Summary 124
8 Architectures 126
One-Dimensional Look-Up Tables (If-Then Implementation) 128
Adding State Memory: Finite-State Machines 131
Adding Register Memory 137
Summary 144
9 Data Structures 149
Finding Information in Memory 151
An Illustrative Example 160
Procedures and the Coding of Data Structures 165
The Structure of the Read-Only Biological Memory 167
10 Computing with Neurons 170
Transducers and Conductors 171
Synapses and the Logic Gates 172
The Slowness of It All 173
The Time-Scale Problem 174
Synaptic Plasticity 175
Recurrent Loops in Which Activity Reverberates 183
11 The Nature of Learning 187
Learning As Rewiring 187
Synaptic Plasticity and the Associative Theory of Learning 189
Why Associations Are Not Symbols 191
Distributed Coding 192
Learning As the Extraction and Preservation of Useful Information 196
Updating an Estimate of One's Location 198
12 Learning Time and Space 207
Computational Accessibility 207
Learning the Time of Day 208
Learning Durations 211
Episodic Memory 213
13 The Modularity of Learning 218
Example 1: Path Integration 219
Example 2: Learning the Solar Ephemeris 220
Example 3: "Associative" Learning 226
Summary 241
14 Dead Reckoning in a Neural Network 242
Reverberating Circuits as Read/Write Memory Mechanisms 245
Implementing Combinatorial Operations by Table-Look-Up 250
The Full Model 251
The Ontogeny of the Connections? 252
How Realistic Is the Model? 254
Lessons to Be Drawn 258
Summary 265
15 Neural Models of Interval Timing 266
Timing an Interval on First Encounter 266
Dworkin's Paradox 268
Neurally Inspired Models 269
The Deeper Problems 276
16 The Molecular Basis of Memory 278
The Need to Separate Theory of Memory from Theory of Learning 278
The Coding Question 279
A Cautionary Tale 281
Why Not Synaptic Conductance? 282
A Molecular or Sub-Molecular Mechanism? 283
Bringing the Data to the Computational Machinery 283
Is It Universal? 286
References 288
Glossary 299
Index 312
Description from "Nielsen BookData"