Information dynamics : foundations and applications
Author
Bibliographic Information
Information dynamics : foundations and applications
Springer, c2001
Held by 16 university libraries
Notes
Includes bibliographical references (p. 263-274) and index
Description and Table of Contents
Description
This book offers a new theoretical approach to information dynamics, i.e., information processing in complex dynamical systems. It establishes a consistent theoretical framework for the problem of discovering the knowledge behind empirical dynamical data and addresses applications in information processing and coding in dynamical systems. It will be an essential reference for those working in neural computing, information theory, nonlinear dynamics, and complex systems modeling.
Table of Contents
1 Introduction
2 Dynamical Systems: An Overview.- 2.1 Deterministic Dynamical Systems.- 2.1.1 Fundamental Concepts.- 2.1.2 Attractors.- 2.1.3 Strange Attractors: Chaotic Dynamics.- 2.1.4 Quantitative Description of Chaos.- 2.1.5 Chaotic Dynamical Systems.- 2.2 Stochastic Dynamical Systems.- 2.2.1 Gaussian White Noise.- 2.2.2 Markov Processes.- 2.2.3 Linear and Nonlinear Stochastic Dynamics.- 2.3 Statistical Time-Series Analysis.- 2.3.1 Nonstationarity: Slicing Windows.- 2.3.2 Linear Statistical Inference: Correlations and Power Spectrum.- 2.3.3 Linear Filter
3 Statistical Structure Extraction in Dynamical Systems: Parametric Formulation.- 3.1 Basic Concepts of Information Theory.- 3.2 Parametric Estimation: Maximum-Likelihood Principle.- 3.2.1 Bayesian Estimation.- 3.2.2 Maximum Likelihood.- 3.2.3 Maximum-Entropy Principle.- 3.2.4 Minimum Kullback-Leibler Entropy.- 3.3 Linear Models.- 3.4 Nonlinear Models.- 3.4.1 Feedforward Neural Networks.- 3.4.2 Recurrent Neural Networks.- 3.5 Density Estimation.- 3.6 Information-Theoretic Approach to Time-Series Modeling: Redundancy Extraction.- 3.6.1 Generalities.- 3.6.2 Unsupervised Learning: Independent Component Analysis for Univariate Time Series.- 3.6.3 Unsupervised Learning: Independent Component Analysis for Multivariate Time Series.- 3.6.4 Supervised Learning: Maximum-Likelihood
4 Applications: Parametric Characterization of Time Series.- 4.1 Feedforward Learning: Chaotic Dynamics.- 4.2 Recurrent Learning: Chaotic Dynamics.- 4.3 Dynamical Overtraining and Lyapunov Penalty Term.- 4.4 Feedforward and Recurrent Learning of Biomedical Data.- 4.5 Unsupervised Redundancy-Extraction-Based Modeling: Chaotic Dynamics.- 4.5.1 Univariate Time Series: Mackey-Glass.- 4.5.2 Multivariate Time Series: Taylor-Couette.- 4.6 Unsupervised Redundancy Extraction Modeling: Biomedical Data
5 Statistical Structure Extraction in Dynamical Systems: Nonparametric Formulation.- 5.1 Nonparametric Detection of Statistical Dependencies in Time Series.- 5.1.1 Introduction and Historical Perspective.- 5.1.2 Statistical Independence Measure.- 5.1.3 Statistical Test: The Surrogates Method.- 5.1.4 Nonstationarity.- 5.1.5 A Qualitative Test of Nonlinearity.- 5.2 Nonparametric Characterization of Dynamics: The Information Flow Concept.- 5.2.1 Introduction and Historical Perspective.- 5.2.2 Information Flow for Finite Partitions.- 5.2.3 Information Flow for Infinitesimal Partition.- 5.3 Information Flow and Coarse Graining.- 5.3.1 Generalized Correlation Functions.- 5.3.2 Distinguishing Different Dynamics
6 Applications: Nonparametric Characterization of Time Series.- 6.1 Detecting Nonlinear Correlations in Time Series.- 6.1.1 Test of Nonlinearity.- 6.1.2 Testing Predictability: Artificial Time Series.- 6.1.3 Testing Predictability: Real-World Time Series.- 6.1.4 Data Selection.- 6.1.5 Sensitivity Analysis.- 6.2 Nonparametric Analysis of Time Series: Optimal Delay Selection.- 6.2.1 Nonchaotic Deterministic.- 6.2.2 Linear Stochastic.- 6.2.3 Chaotic Deterministic.- 6.3 Determining the Information Flow of Dynamical Systems from Continuous Probability Distributions.- 6.4 Dynamical Characterization of Time Signals: The Integrated Information Flow.- 6.5 Information Flow and Coarse Graining: Numerical Experiments.- 6.5.1 The Logistic Map.- 6.5.2 White and Colored Noise.- 6.5.3 EEG Signals
7 Statistical Structure Extraction in Dynamical Systems: Semiparametric Formulation.- 7.1 Markovian Characterization of Univariate Time Series.- 7.1.1 Measures of Independence.- 7.1.2 Markovian Dynamics and Information Flow.- 7.2 Markovian Characterization of Multivariate Time Series.- 7.2.1 Multidimensional Cumulant-Based Measure of Information Flow.- 7.2.2 Nonlinear N-dimensional Markov Models as Approximations of the Original Time Series
8 Applications: Semiparametric Characterization of Time Series.- 8.1 Univariate Time Series: Artificial Data.- 8.1.1 Autoregressive Models: Linear Correlations.- 8.1.2 Nonlinear Dependencies: Non-Chaos, Chaos, and Noisy Chaos.- 8.2 Univariate Time Series: Real-World Data.- 8.2.1 Monthly Sunspot Numbers.- 8.2.2 The Hidden Dynamics of the Heart Rate Variability.- 8.3 Multivariate Time Series: Artificial Data.- 8.3.1 Autoregressive Time Series.- 8.3.2 Nonlinear Time Series.- 8.3.3 Chaotic Time Series: The Henon Map.- 8.4 Multivariate Time Series: Tumor Detection in EEG Time Series
9 Information Processing and Coding in Spatiotemporal Dynamical Systems: Spiking Networks.- 9.1 Spiking Neurons.- 9.1.1 Theoretical Models.- 9.1.2 Rate Coding versus Temporal Coding.- 9.2 Information Processing and Coding in Single Spiking Neurons.- 9.3 Information Processing and Coding in Networks of Spiking Neurons.- 9.4 The Processing and Coding of Dynamical Systems
10 Applications: Information Processing and Coding in Spatiotemporal Dynamical Systems.- 10.1 The Binding Problem.- 10.2 Discrimination of Stimulus by Spiking Neural Networks.- 10.2.1 The Task: Visual Stimulus Discrimination.- 10.2.2 The Neural Network: Cortical Architecture.- 10.3 Numerical Experiments
Epilogue
Appendix A Chain Rules, Inequalities and Other Useful Theorems in Information Theory.- A.1 Chain Rules.- A.2 Fundamental Inequalities of Information Theory
Appendix B Univariate and Multivariate Cumulants
Appendix C Information Flow of Chaotic Systems: Thermodynamical Formulation
Appendix D Generalized Discriminability by the Spike Response Model of a Single Spiking Neuron: Analytical Results
References
From "Nielsen BookData"