Nonlinear time series analysis
Author: Ruey S. Tsay, Rong Chen
Bibliographic Information
Nonlinear time series analysis
(Wiley series in probability and mathematical statistics)
John Wiley & Sons, 2019
- : cloth
Held by 19 university libraries
Notes
Includes bibliographical references and index
Description and Table of Contents
Description
A comprehensive resource that draws a balance between theory and applications of nonlinear time series analysis
Nonlinear Time Series Analysis offers an important guide to both parametric and nonparametric methods, nonlinear state-space models, and Bayesian as well as classical approaches to nonlinear time series analysis. The authors, noted experts in the field, explore the advantages and limitations of the nonlinear models and methods and review the improvements upon linear time series models.
The need for this book is based on the recent developments in nonlinear time series analysis, statistical learning, dynamic systems and advanced computational methods. Parametric and nonparametric methods and nonlinear and non-Gaussian state space models provide a much wider range of tools for time series analysis. In addition, advances in computing and data collection have made available large data sets and high-frequency data. These new data make it not only feasible, but also necessary to take into consideration the nonlinearity embedded in most real-world time series. This vital guide:
* Offers research developed by leading scholars of time series analysis
* Presents R commands making it possible to reproduce all the analyses included in the text
* Contains real-world examples throughout the book
* Recommends exercises to test understanding of material presented
* Includes an instructor solutions manual and companion website
Written for students, researchers, and practitioners interested in exploring nonlinearity in time series, Nonlinear Time Series Analysis offers a comprehensive text on the advantages and limitations of nonlinear models and methods and on the improvements they provide over linear time series models.
Table of Contents
Preface xiii
1 Why Should We Care About Nonlinearity? 1
1.1 Some Basic Concepts 2
1.2 Linear Time Series 3
1.3 Examples of Nonlinear Time Series 3
1.4 Nonlinearity Tests 20
1.4.1 Nonparametric Tests 21
1.4.2 Parametric Tests 31
1.5 Exercises 38
References 39
2 Univariate Parametric Nonlinear Models 41
2.1 A General Formulation 41
2.1.1 Probability Structure 42
2.2 Threshold Autoregressive Models 43
2.2.1 A Two-regime TAR Model 44
2.2.2 Properties of Two-regime TAR(1) Models 45
2.2.3 Multiple-regime TAR Models 48
2.2.4 Estimation of TAR Models 50
2.2.5 TAR Modeling 52
2.2.6 Examples 55
2.2.7 Predictions of TAR Models 62
2.3 Markov Switching Models 63
2.3.1 Properties of Markov Switching Models 66
2.3.2 Statistical Inference of the State Variable 66
2.3.3 Estimation of Markov Switching Models 69
2.3.4 Selecting the Number of States 75
2.3.5 Prediction of Markov Switching Models 75
2.3.6 Examples 76
2.4 Smooth Transition Autoregressive Models 92
2.5 Time-varying Coefficient Models 99
2.5.1 Functional Coefficient AR Models 99
2.5.2 Time-varying Coefficient AR Models 104
2.6 Appendix: Markov Chains 111
2.7 Exercises 114
References 116
3 Univariate Nonparametric Models 119
3.1 Kernel Smoothing 119
3.2 Local Conditional Mean 125
3.3 Local Polynomial Fitting 129
3.4 Splines 134
3.4.1 Cubic and B-Splines 138
3.4.2 Smoothing Splines 141
3.5 Wavelet Smoothing 145
3.5.1 Wavelets 145
3.5.2 The Wavelet Transform 147
3.5.3 Thresholding and Smoothing 150
3.6 Nonlinear Additive Models 158
3.7 Index Model and Sliced Inverse Regression 164
3.8 Exercises 169
References 170
4 Neural Networks, Deep Learning, and Tree-based Methods 173
4.1 Neural Networks 173
4.1.1 Estimation or Training of Neural Networks 176
4.1.2 An Example 179
4.2 Deep Learning 181
4.2.1 Deep Belief Nets 182
4.2.2 Demonstration 184
4.3 Tree-based Methods 195
4.3.1 Decision Trees 195
4.3.2 Random Forests 212
4.4 Exercises 214
References 215
5 Analysis of Non-Gaussian Time Series 217
5.1 Generalized Linear Time Series Models 218
5.1.1 Count Data and GLARMA Models 220
5.2 Autoregressive Conditional Mean Models 229
5.3 Martingalized GARMA Models 232
5.4 Volatility Models 234
5.5 Functional Time Series 245
5.5.1 Convolution FAR Models 248
5.5.2 Estimation of CFAR Models 251
5.5.3 Fitted Values and Approximate Residuals 253
5.5.4 Prediction 253
5.5.5 Asymptotic Properties 254
5.5.6 Application 254
Appendix: Discrete Distributions for Count Data 260
5.6 Exercises 261
References 263
6 State Space Models 265
6.1 A General Model and Statistical Inference 266
6.2 Selected Examples 269
6.2.1 Linear Time Series Models 269
6.2.2 Time Series with Observational Noises 271
6.2.3 Time-varying Coefficient Models 272
6.2.4 Target Tracking 273
6.2.5 Signal Processing in Communications 279
6.2.6 Dynamic Factor Models 283
6.2.7 Functional and Distributional Time Series 284
6.2.8 Markov Regime Switching Models 289
6.2.9 Stochastic Volatility Models 290
6.2.10 Non-Gaussian Time Series 291
6.2.11 Mixed Frequency Models 291
6.2.12 Other Applications 292
6.3 Linear Gaussian State Space Models 293
6.3.1 Filtering and the Kalman Filter 293
6.3.2 Evaluating the Likelihood Function 295
6.3.3 Smoothing 297
6.3.4 Prediction and Missing Data 299
6.3.5 Sequential Processing 300
6.3.6 Examples and R Demonstrations 300
6.4 Exercises 325
References 327
7 Nonlinear State Space Models 335
7.1 Linear and Gaussian Approximations 335
7.1.1 Kalman Filter for Linear Non-Gaussian Systems 336
7.1.2 Extended Kalman Filters for Nonlinear Systems 336
7.1.3 Gaussian Sum Filters 338
7.1.4 The Unscented Kalman Filter 339
7.1.5 Ensemble Kalman Filters 341
7.1.6 Examples and R Implementations 342
7.2 Hidden Markov Models 351
7.2.1 Filtering 351
7.2.2 Smoothing 352
7.2.3 The Most Likely State Path: the Viterbi Algorithm 355
7.2.4 Parameter Estimation: the Baum-Welch Algorithm 356
7.2.5 HMM Examples and R Implementation 358
7.3 Exercises 371
References 372
8 Sequential Monte Carlo 375
8.1 A Brief Overview of Monte Carlo Methods 376
8.1.1 General Methods of Generating Random Samples 378
8.1.2 Variance Reduction Methods 384
8.1.3 Importance Sampling 387
8.1.4 Markov Chain Monte Carlo 398
8.2 The SMC Framework 402
8.3 Design Issue I: Propagation 410
8.3.1 Proposal Distributions 411
8.3.2 Delay Strategy (Lookahead) 415
8.4 Design Issue II: Resampling 421
8.4.1 The Priority Score 422
8.4.2 Choice of Sampling Methods in Resampling 423
8.4.3 Resampling Schedule 425
8.4.4 Benefits of Resampling 426
8.5 Design Issue III: Inference 428
8.6 Design Issue IV: Marginalization and the Mixture Kalman Filter 429
8.6.1 Conditional Dynamic Linear Models 429
8.6.2 Mixture Kalman Filters 430
8.7 Smoothing with SMC 433
8.7.1 Simple Weighting Approach 433
8.7.2 Weight Marginalization Approach 434
8.7.3 Two-filter Sampling 436
8.8 Parameter Estimation with SMC 438
8.8.1 Maximum Likelihood Estimation 438
8.8.2 Bayesian Parameter Estimation 441
8.8.3 Varying Parameter Approach 441
8.9 Implementation Considerations 442
8.10 Examples and R Implementation 444
8.10.1 R Implementation of SMC: Generic SMC and Resampling Methods 444
8.10.2 Tracking in a Clutter Environment 449
8.10.3 Bearing-only Tracking with Passive Sonar 466
8.10.4 Stochastic Volatility Models 471
8.10.5 Fading Channels as Conditional Dynamic Linear Models 478
8.11 Exercises 486
References 487
Index 493
From "Nielsen BookData"