Elements of multivariate time series analysis

Bibliographic Information

Elements of multivariate time series analysis

Gregory C. Reinsel

(Springer series in statistics)

Springer, 2003, c1997

2nd ed

Note

"First softcover printing, 2003"--T.p. verso

Includes bibliographical references (p. [332]-344) and indexes

Description and Table of Contents

Description

Now available in paperback, this book introduces basic concepts and methods useful in the analysis and modeling of multivariate time series data. It concentrates on the time-domain analysis of multivariate time series and assumes a background in univariate time series analysis. Topics covered include stationary vector processes and their covariance matrix structure; vector AR, MA, and ARMA models; forecasting; least squares and maximum likelihood estimation for ARMA models; and the associated likelihood ratio testing procedures.
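The description only lists topics, so the following short sketch illustrates the kind of analysis the book covers: fitting a vector AR model by least squares, selecting the lag order with an information criterion, and computing minimum mean squared error forecasts. This example is not from the book; the use of Python with statsmodels, the simulated bivariate series, and the coefficient matrix are all assumptions made purely for illustration.

```python
# Illustrative sketch only (not from Reinsel's book): fit a vector AR model,
# select its order by AIC, and forecast, using statsmodels (an assumption).
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)

# Simulate a stationary bivariate VAR(1) process: y_t = Phi y_{t-1} + a_t,
# where a_t is Gaussian white noise (hypothetical coefficient matrix Phi).
phi = np.array([[0.6, 0.2],
                [0.1, 0.4]])
n_obs = 300
y = np.zeros((n_obs, 2))
for t in range(1, n_obs):
    y[t] = phi @ y[t - 1] + rng.standard_normal(2)

# Fit vector AR models up to lag 5 by least squares and pick the order by AIC.
results = VAR(y).fit(maxlags=5, ic="aic")
print("selected AR order:", results.k_ar)
print("estimated AR coefficient matrices:\n", results.coefs)
print("residual covariance matrix:\n", results.sigma_u)

# Minimum mean squared error forecasts, 10 steps ahead, from the last k_ar values.
print("forecasts:\n", results.forecast(y[-results.k_ar:], steps=10))
```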

Table of Contents

1. Vector Time Series and Model Representations.- 1.1 Stationary Multivariate Time Series and Their Properties.- 1.1.1 Covariance and Correlation Matrices for a Stationary Vector Process.- 1.1.2 Some Spectral Characteristics for a Stationary Vector Process.- 1.1.3 Some Relations for Linear Filtering of a Stationary Vector Process.- 1.2 Linear Model Representations for a Stationary Vector Process.- 1.2.1 Infinite Moving Average (Wold) Representation of a Stationary Vector Process.- 1.2.2 Vector Autoregressive Moving Average (ARMA) Model Representations.- A1 Appendix: Review of Multivariate Normal Distribution and Related Topics.- A1.1 Review of Some Basic Matrix Theory Results.- A1.2 Vec Operator and Kronecker Products of Matrices.- A1.3 Expected Values and Covariance Matrices of Random Vectors.- A1.4 The Multivariate Normal Distribution.- A1.5 Some Basic Results on Stochastic Convergence.

2. Vector ARMA Time Series Models and Forecasting.- 2.1 Vector Moving Average Models.- 2.1.1 Invertibility of the Vector Moving Average Model.- 2.1.2 Covariance Matrices of the Vector Moving Average Model.- 2.1.3 Features of the Vector MA(1) Model.- 2.1.4 Model Structure for Subset of Components in the Vector MA Model.- 2.2 Vector Autoregressive Models.- 2.2.1 Stationarity of the Vector Autoregressive Model.- 2.2.2 Yule-Walker Relations for Covariance Matrices of a Vector AR Process.- 2.2.3 Covariance Features of the Vector AR(1) Model.- 2.2.4 Univariate Model Structure Implied by Vector AR Model.- 2.3 Vector Mixed Autoregressive Moving Average Models.- 2.3.1 Stationarity and Invertibility of the Vector ARMA Model.- 2.3.2 Relations for the Covariance Matrices of the Vector ARMA Model.- 2.3.3 Some Features of the Vector ARMA(1,1) Model.- 2.3.4 Consideration of Parameter Identifiability for Vector ARMA Models.- 2.3.5 Further Aspects of Nonuniqueness of Vector ARMA Model Representations.- 2.4 Nonstationary Vector ARMA Models.- 2.4.1 Vector ARIMA Models for Nonstationary Processes.- 2.4.2 Cointegration in Nonstationary Vector Processes.- 2.4.3 The Vector IMA(1,1) Process or Exponential Smoothing Model.- 2.5 Prediction for Vector ARMA Models.- 2.5.1 Minimum Mean Squared Error Prediction.- 2.5.2 Forecasting for Vector ARMA Processes and Covariance Matrices of Forecast Errors.- 2.5.3 Computation of Forecasts for Vector ARMA Processes.- 2.5.4 Some Examples of Forecast Functions for Vector ARMA Models.- 2.6 State-Space Form of the Vector ARMA Model.- A2 Appendix: Methods for Obtaining Autoregressive and Moving Average Parameters from Covariance Matrices.- A2.1 Iterative Algorithm for Factorization of Moving Average Spectral Density Matrix in Terms of Covariance Matrices.- A2.2 Autoregressive and Moving Average Parameter Matrices in Terms of Covariance Matrices for the Vector ARMA Model.- A2.3 Evaluation of Covariance Matrices in Terms of the AR and MA Parameters for the Vector ARMA Model.

3. Canonical Structure of Vector ARMA Models.- 3.1 Consideration of Kronecker Structure for Vector ARMA Models.- 3.1.1 Kronecker Indices and McMillan Degree of Vector ARMA Process.- 3.1.2 Echelon Form Structure of Vector ARMA Model Implied by Kronecker Indices.- 3.1.3 Reduced-Rank Form of Vector ARMA Model Implied by Kronecker Indices.- 3.2 Canonical Correlation Structure for ARMA Time Series.- 3.2.1 Review of Canonical Correlations in Multivariate Analysis.- 3.2.2 Canonical Correlations for Vector ARMA Processes.- 3.2.3 Relation to Scalar Component Model Structure.- 3.3 Partial Autoregressive and Partial Correlation Matrices.- 3.3.1 Vector Autoregressive Model Approximations and Partial Autoregression Matrices.- 3.3.2 Recursive Fitting of Vector AR Model Approximations.- 3.3.3 Partial Cross-Correlation Matrices for a Stationary Vector Process.- 3.3.4 Partial Canonical Correlations for a Stationary Vector Process.

4. Initial Model Building and Least Squares Estimation for Vector AR Models.- 4.1 Sample Cross-Covariance and Correlation Matrices and Their Properties.- 4.1.1 Sample Estimates of Mean Vector and of Covariance and Correlation Matrices.- 4.1.2 Asymptotic Properties of Sample Correlations.- 4.2 Sample Partial AR and Partial Correlation Matrices and Their Properties.- 4.2.1 Test for Order of AR Model Based on Sample Partial Autoregression Matrices.- 4.2.2 Equivalent Test Statistics Based on Sample Partial Correlation Matrices.- 4.3 Conditional Least Squares Estimation of Vector AR Models.- 4.3.1 Least Squares Estimation for the Vector AR(1) Model.- 4.3.2 Least Squares Estimation for the Vector AR Model of General Order.- 4.3.3 Likelihood Ratio Testing for the Order of the AR Model.- 4.3.4 Derivation of the Wald Statistic for Testing the Order of the AR Model.- 4.4 Relation of LSE to Yule-Walker Estimate for Vector AR Models.- 4.5 Additional Techniques for Specification of Vector ARMA Models.- 4.5.1 Use of Order Selection Criteria for Model Specification.- 4.5.2 Sample Canonical Correlation Analysis Methods.- 4.5.3 Order Determination Using Linear LSE Methods for the Vector ARMA Model.- A4 Appendix: Review of the General Multivariate Linear Regression Model.- A4.1 Properties of the Maximum Likelihood Estimator of the Regression Matrix.- A4.2 Likelihood Ratio Test of Linear Hypothesis About Regression Coefficients.- A4.3 Asymptotically Equivalent Forms of the Test of Linear Hypothesis.- A4.4 Multivariate Linear Model with Reduced-Rank Structure.- A4.5 Generalization to Seemingly Unrelated Regressions Model.

5. Maximum Likelihood Estimation and Model Checking for Vector ARMA Models.- 5.1 Conditional Maximum Likelihood Estimation for Vector ARMA Models.- 5.1.1 Conditional Likelihood Function for the Vector ARMA Model.- 5.1.2 Likelihood Equations for Conditional ML Estimation.- 5.1.3 Iterative Computation of the Conditional MLE by GLS Estimation.- 5.1.4 Asymptotic Distribution for the MLE in the Vector ARMA Model.- 5.2 ML Estimation and LR Testing of ARMA Models Under Linear Restrictions.- 5.2.1 ML Estimation of Vector ARMA Models with Linear Constraints on the Parameters.- 5.2.2 LR Testing of the Hypothesis of the Linear Constraints.- 5.2.3 ML Estimation of Vector ARMA Models in the Echelon Canonical Form.- 5.3 Exact Likelihood Function for Vector ARMA Models.- 5.3.1 Expressions for the Exact Likelihood Function and Exact Backcasts.- 5.3.2 Special Cases of the Exact Likelihood Results.- 5.3.3 Finite Sample Forecast Results Based on the Exact Likelihood Approach.- 5.4 Innovations Form of the Exact Likelihood Function for ARMA Models.- 5.4.1 Use of Innovations Algorithm Approach for the Exact Likelihood.- 5.4.2 Prediction of Vector ARMA Processes Using the Innovations Approach.- 5.5 Overall Checking for Model Adequacy.- 5.5.1 Residual Correlation Matrices and Overall Goodness-of-Fit Test.- 5.5.2 Asymptotic Distribution of Residual Covariances and Goodness-of-Fit Statistic.- 5.5.3 Use of the Score Test Statistic for Model Diagnostic Checking.- 5.6 Effects of Parameter Estimation Errors on Prediction Properties.- 5.6.1 Effects of Parameter Estimation Errors on Forecasting in the Vector AR(p) Model.- 5.6.2 Prediction Through Approximation by Autoregressive Model Fitting.- 5.7 Motivation for AIC as Criterion for Model Selection, and Corrected Versions of AIC.- 5.8 Numerical Examples.

6. Reduced-Rank and Nonstationary Cointegrated Models.- 6.1 Nested Reduced-Rank AR Models and Partial Canonical Correlation Analysis.- 6.1.1 Specification of Ranks Through Partial Canonical Correlation Analysis.- 6.1.2 Canonical Form for the Reduced-Rank Model.- 6.1.3 Maximum Likelihood Estimation of Parameters in the Model.- 6.1.4 Relation of Reduced-Rank AR Model with Scalar Component Models and Kronecker Indices.- 6.2 Review of Estimation and Testing for Nonstationarity (Unit Roots) in Univariate ARIMA Models.- 6.2.1 Limiting Distribution Results in the AR(1) Model with a Unit Root.- 6.2.2 Unit-Root Distribution Results for General Order AR Models.- 6.3 Nonstationary (Unit-Root) Multivariate AR Models, Estimation, and Testing.- 6.3.1 Unit-Root Nonstationary Vector AR Model, the Error-Correction Form, and Cointegration.- 6.3.2 Asymptotic Properties of the Least Squares Estimator.- 6.3.3 Reduced-Rank Estimation of the Error-Correction Form of the Model.- 6.3.4 Likelihood Ratio Test for the Number of Unit Roots.- 6.3.5 Reduced-Rank Estimation Through Partial Canonical Correlation Analysis.- 6.3.6 Extension to Account for a Constant Term in the Estimation.- 6.3.7 Forecast Properties for the Cointegrated Model.- 6.3.8 Explicit Unit-Root Structure of the Nonstationary AR Model and Implications.- 6.3.9 Further Numerical Examples.- 6.4 A Canonical Analysis for Vector Autoregressive Time Series.- 6.4.1 Canonical Analysis Based on Measure of Predictability.- 6.4.2 Application to the Analysis of Nonstationary Series for Cointegration.- 6.5 Multiplicative Seasonal Vector ARMA Models.- 6.5.1 Some Special Seasonal ARMA Models for Vector Time Series.

7. State-Space Models, Kalman Filtering, and Related Topics.- 7.1 State-Variable Models and Kalman Filtering.- 7.1.1 The Kalman Filtering Relations.- 7.1.2 Smoothing Relations in the State-Variable Model.- 7.1.3 Innovations Form of State-Space Model and Steady State for Time-Invariant Models.- 7.1.4 Controllability, Observability, and Minimality for Time-Invariant Models.- 7.2 State-Variable Representations of the Vector ARMA Model.- 7.2.1 A State-Space Form Based on the Prediction Space of Future Values.- 7.2.2 Exact Likelihood Function Through the State-Variable Approach.- 7.2.3 Alternate State-Space Forms for the Vector ARMA Model.- 7.2.4 Minimal Dimension State-Variable Representation and Kronecker Indices.- 7.2.5 (Minimal Dimension) Echelon Canonical State-Space Representation.- 7.3 Exact Likelihood Estimation for Vector ARMA Processes with Missing Values.- 7.3.1 State-Space Model and Kalman Filtering with Missing Values.- 7.3.2 Estimation of Missing Values in ARMA Models.- 7.3.3 Initialization for Kalman Filtering, Smoothing, and Likelihood Evaluation in Nonstationary Models.- 7.4 Classical Approach to Smoothing and Filtering of Time Series.- 7.4.1 Smoothing for Univariate Time Series.- 7.4.2 Smoothing Relations for the Signal Plus Noise or Structural Components Model.- 7.4.3 A Simple Vector Structural Component Model for Trend.

8. Linear Models with Exogenous Variables.- 8.1 Representations of Linear Models with Exogenous Variables.- 8.2 Forecasting in ARMAX Models.- 8.2.1 Forecasts When Future Exogenous Variables Must Be Forecasted.- 8.2.2 MSE Matrix of Optimal Forecasts.- 8.2.3 Forecasting When Future Exogenous Variables Are Specified.- 8.3 Optimal Feedback Control in ARMAX Models.- 8.4 Model Specification, ML Estimation, and Model Checking for ARMAX Models.- 8.4.1 Some Comments on Specification and Checking of ARMAX Models.- 8.4.2 ML Estimation for ARMAX Models.- 8.4.3 Asymptotic Distribution Theory of Estimators in ARMAX Models.- 8.5 Numerical Example.

Appendix: Time Series Data Sets.- Exercises and Problems.- References.- Author Index.

by "Nielsen BookData"
