Fundamental statistical inference : a computational approach
Author
Bibliographic Details
Fundamental statistical inference : a computational approach
(Wiley series in probability and mathematical statistics)
Wiley, 2018
- : hardback
University library holdings: 13 libraries in total
Notes
Includes bibliographical references (p. 537-560) and index
Description and Table of Contents
Description
A hands-on approach to statistical inference that addresses the latest developments in this ever-growing field
This clear and accessible book for beginning graduate students offers a practical and detailed approach to statistical inference, providing complete derivations of results, discussions, and MATLAB programs for computation. It emphasizes the relevance of the material and builds intuition, with a view towards modern statistical inference. In addition to classic subjects associated with mathematical statistics, topics include an intuitive presentation of the (single and double) bootstrap for confidence interval calculations, shrinkage estimation, tail (maximal moment) estimation, and a variety of methods of point estimation besides maximum likelihood, including the use of characteristic functions and indirect inference. Practical examples of all methods are given. Estimation issues associated with discrete mixtures of normal distributions, and their solutions, are developed in detail. Much emphasis throughout is on non-Gaussian distributions, including details on working with the stable Paretian distribution and fast calculation of the noncentral Student's t. An entire chapter is dedicated to optimization, covering Hessian-based methods as well as heuristic/genetic algorithms that do not require continuity, with MATLAB code provided.
The book includes both theory and nontechnical discussions, along with substantial references to the literature, emphasizing alternative, more modern approaches. The recent literature on the misuse of hypothesis testing and p-values for model selection is discussed, and emphasis is given to alternative model selection methods, although hypothesis testing of distributional assumptions, notably for the normal distribution, is covered in detail.
Presented in three parts (Essential Concepts in Statistics; Further Fundamental Concepts in Statistics; and Additional Topics), Fundamental Statistical Inference: A Computational Approach offers comprehensive chapters on Introducing Point and Interval Estimation; Goodness of Fit and Hypothesis Testing; Likelihood; Numerical Optimization; Methods of Point Estimation; Q-Q Plots and Distribution Testing; Unbiased Point Estimation and Bias Reduction; Analytic Interval Estimation; Inference in a Heavy-Tailed Context; and The Method of Indirect Inference. An appendix, A Review of Fundamental Concepts in Probability Theory, keeps the book self-contained and covers advanced subjects such as saddlepoint approximations, expected shortfall in finance, calculation with the stable Paretian distribution, and convergence theorems and proofs.
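As an illustration of the kind of computation the description highlights (interval estimation via the bootstrap, cf. Sections 1.2 and 1.3), the following minimal sketch shows a nonparametric bootstrap percentile confidence interval for a Bernoulli success probability. It is not taken from the book, which supplies MATLAB programs; this is a generic Python sketch, and the sample size, resample count, and 90% level are arbitrary choices for the example.

```python
# Minimal sketch: nonparametric bootstrap percentile CI for a Bernoulli mean.
# Not from the book (which uses MATLAB); sample size, B, and level are arbitrary.
import numpy as np

rng = np.random.default_rng(1)
x = rng.binomial(1, 0.3, size=50)        # observed Bernoulli(0.3) sample
B = 2000                                  # number of bootstrap resamples

# Resample with replacement and recompute the estimator on each resample.
boot_means = np.array([
    rng.choice(x, size=x.size, replace=True).mean() for _ in range(B)
])

# Percentile interval: empirical quantiles of the bootstrap distribution.
lo, hi = np.quantile(boot_means, [0.05, 0.95])
print(f"point estimate {x.mean():.3f}, 90% bootstrap CI ({lo:.3f}, {hi:.3f})")
```

The parametric variant discussed in the book's Chapter 1 differs only in drawing the resamples from the fitted model rather than from the observed data.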
Table of Contents
Preface xi
PART I ESSENTIAL CONCEPTS IN STATISTICS
1 Introducing Point and Interval Estimation 3
1.1 Point Estimation / 4
1.1.1 Bernoulli Model / 4
1.1.2 Geometric Model / 6
1.1.3 Some Remarks on Bias and Consistency / 11
1.2 Interval Estimation via Simulation / 12
1.3 Interval Estimation via the Bootstrap / 18
1.3.1 Computation and Comparison with Parametric Bootstrap / 18
1.3.2 Application to Bernoulli Model and Modification / 20
1.3.3 Double Bootstrap / 24
1.3.4 Double Bootstrap with Analytic Inner Loop / 26
1.4 Bootstrap Confidence Intervals in the Geometric Model / 31
1.5 Problems / 35
2 Goodness of Fit and Hypothesis Testing 37
2.1 Empirical Cumulative Distribution Function / 38
2.1.1 The Glivenko-Cantelli Theorem / 38
2.1.2 Proofs of the Glivenko-Cantelli Theorem / 41
2.1.3 Example with Continuous Data and Approximate Confidence Intervals / 45
2.1.4 Example with Discrete Data and Approximate Confidence Intervals / 49
2.2 Comparing Parametric and Nonparametric Methods / 52
2.3 Kolmogorov-Smirnov Distance and Hypothesis Testing / 57
2.3.1 The Kolmogorov-Smirnov and Anderson-Darling Statistics / 57
2.3.2 Significance and Hypothesis Testing / 59
2.3.3 Small-Sample Correction / 63
2.4 Testing Normality with KD and AD / 65
2.5 Testing Normality with W² and U² / 68
2.6 Testing the Stable Paretian Distributional Assumption: First Attempt / 69
2.7 Two-Sample Kolmogorov Test / 73
2.8 More on (Moron?) Hypothesis Testing / 74
2.8.1 Explanation / 75
2.8.2 Misuse of Hypothesis Testing / 77
2.8.3 Use and Misuse of p-Values / 79
2.9 Problems / 82
3 Likelihood 85
3.1 Introduction / 85
3.1.1 Scalar Parameter Case / 87
3.1.2 Vector Parameter Case / 92
3.1.3 Robustness and the MCD Estimator / 100
3.1.4 Asymptotic Properties of the Maximum Likelihood Estimator / 102
3.2 Cramer-Rao Lower Bound / 107
3.2.1 Univariate Case / 108
3.2.2 Multivariate Case / 111
3.3 Model Selection / 114
3.3.1 Model Misspecification / 114
3.3.2 The Likelihood Ratio Statistic / 117
3.3.3 Use of Information Criteria / 119
3.4 Problems / 120
4 Numerical Optimization 123
4.1 Root Finding / 123
4.1.1 One Parameter / 124
4.1.2 Several Parameters / 131
4.2 Approximating the Distribution of the Maximum Likelihood Estimator / 135
4.3 General Numerical Likelihood Maximization / 136
4.3.1 Newton-Raphson and Quasi-Newton Methods / 137
4.3.2 Imposing Parameter Restrictions / 140
4.4 Evolutionary Algorithms / 145
4.4.1 Differential Evolution / 146
4.4.2 Covariance Matrix Adaption Evolutionary Strategy / 149
4.5 Problems / 155
5 Methods of Point Estimation 157
5.1 Univariate Mixed Normal Distribution / 157
5.1.1 Introduction / 157
5.1.2 Simulation of Univariate Mixtures / 160
5.1.3 Direct Likelihood Maximization / 161
5.1.4 Use of the EM Algorithm / 169
5.1.5 Shrinkage-Type Estimation / 174
5.1.6 Quasi-Bayesian Estimation / 176
5.1.7 Confidence Intervals / 178
5.2 Alternative Point Estimation Methodologies / 184
5.2.1 Method of Moments Estimator / 185
5.2.2 Use of Goodness-of-Fit Measures / 190
5.2.3 Quantile Least Squares / 191
5.2.4 Pearson Minimum Chi-Square / 193
5.2.5 Empirical Moment Generating Function Estimator / 195
5.2.6 Empirical Characteristic Function Estimator / 198
5.3 Comparison of Methods / 199
5.4 A Primer on Shrinkage Estimation / 200
5.5 Problems / 202
PART II FURTHER FUNDAMENTAL CONCEPTS IN STATISTICS
6 Q-Q Plots and Distribution Testing 209
6.1 P-P Plots and Q-Q Plots / 209
6.2 Null Bands / 211
6.2.1 Definition and Motivation / 211
6.2.2 Pointwise Null Bands via Simulation / 212
6.2.3 Asymptotic Approximation of Pointwise Null Bands / 213
6.2.4 Mapping Pointwise and Simultaneous Significance Levels / 215
6.3 Q-Q Test / 217
6.4 Further P-P and Q-Q Type Plots / 219
6.4.1 (Horizontal) Stabilized P-P Plots / 219
6.4.2 Modified S-P Plots / 220
6.4.3 MSP Test for Normality / 224
6.4.4 Modified Percentile (Fowlkes-MP) Plots / 228
6.5 Further Tests for Composite Normality / 231
6.5.1 Motivation / 232
6.5.2 Jarque-Bera Test / 234
6.5.3 Three Powerful (and More Recent) Normality Tests / 237
6.5.4 Testing Goodness of Fit via Binning: Pearson's X_P² Test / 240
6.6 Combining Tests and Power Envelopes / 247
6.6.1 Combining Tests / 248
6.6.2 Power Comparisons for Testing Composite Normality / 252
6.6.3 Most Powerful Tests and Power Envelopes / 252
6.7 Details of a Failed Attempt / 255
6.8 Problems / 260
7 Unbiased Point Estimation and Bias Reduction 269
7.1 Sufficiency / 269
7.1.1 Introduction / 269
7.1.2 Factorization / 272
7.1.3 Minimal Sufficiency / 276
7.1.4 The Rao-Blackwell Theorem / 283
7.2 Completeness and the Uniformly Minimum Variance Unbiased Estimator / 286
7.3 An Example with i.i.d. Geometric Data / 289
7.4 Methods of Bias Reduction / 293
7.4.1 The Bias-Function Approach / 293
7.4.2 Median-Unbiased Estimation / 296
7.4.3 Mode-Adjusted Estimator / 297
7.4.4 The Jackknife / 302
7.5 Problems / 305
8 Analytic Interval Estimation 313
8.1 Definitions / 313
8.2 Pivotal Method / 315
8.2.1 Exact Pivots / 315
8.2.2 Asymptotic Pivots / 318
8.3 Intervals Associated with Normal Samples / 319
8.3.1 Single Sample / 319
8.3.2 Paired Sample / 320
8.3.3 Two Independent Samples / 322
8.3.4 Welch's Method for μ₁ − μ₂ when σ₁² ≠ σ₂² / 323
8.3.5 Satterthwaite's Approximation / 324
8.4 Cumulative Distribution Function Inversion / 326
8.4.1 Continuous Case / 326
8.4.2 Discrete Case / 330
8.5 Application of the Nonparametric Bootstrap / 334
8.6 Problems / 337
PART III ADDITIONAL TOPICS
9 Inference in a Heavy-Tailed Context 341
9.1 Estimating the Maximally Existing Moment / 342
9.2 A Primer on Tail Estimation / 346
9.2.1 Introduction / 346
9.2.2 The Hill Estimator / 346
9.2.3 Use with Stable Paretian Data / 349
9.3 Noncentral Student's t Estimation / 351
9.3.1 Introduction / 351
9.3.2 Direct Density Approximation / 352
9.3.3 Quantile-Based Table Lookup Estimation / 353
9.3.4 Comparison of NCT Estimators / 354
9.4 Asymmetric Stable Paretian Estimation / 358
9.4.1 Introduction / 358
9.4.2 The Hint Estimator / 359
9.4.3 Maximum Likelihood Estimation / 360
9.4.4 The McCulloch Estimator / 361
9.4.5 The Empirical Characteristic Function Estimator / 364
9.4.6 Testing for Symmetry in the Stable Model / 366
9.5 Testing the Stable Paretian Distribution / 368
9.5.1 Test Based on the Empirical Characteristic Function / 368
9.5.2 Summability Test and Modification / 371
9.5.3 ALHADI: The 𝛼-Hat Discrepancy Test / 375
9.5.4 Joint Test Procedure / 383
9.5.5 Likelihood Ratio Tests / 384
9.5.6 Size and Power of the Symmetric Stable Tests / 385
9.5.7 Extension to Testing the Asymmetric Stable Paretian Case / 395
10 The Method of Indirect Inference 401
10.1 Introduction / 401
10.2 Application to the Laplace Distribution / 403
10.3 Application to Randomized Response / 403
10.3.1 Introduction / 403
10.3.2 Estimation via Indirect Inference / 406
10.4 Application to the Stable Paretian Distribution / 409
10.5 Problems / 416
A Review of Fundamental Concepts in Probability Theory 419
A.1 Combinatorics and Special Functions / 420
A.2 Basic Probability and Conditioning / 423
A.3 Univariate Random Variables / 424
A.4 Multivariate Random Variables / 427
A.5 Continuous Univariate Random Variables / 430
A.6 Conditional Random Variables / 432
A.7 Generating Functions and Inversion Formulas / 434
A.8 Value at Risk and Expected Shortfall / 437
A.9 Jacobian Transformations / 451
A.10 Sums and Other Functions / 453
A.11 Saddlepoint Approximations / 456
A.12 Order Statistics / 460
A.13 The Multivariate Normal Distribution / 462
A.14 Noncentral Distributions / 465
A.15 Inequalities and Convergence / 467
A.15.1 Inequalities for Random Variables / 467
A.15.2 Convergence of Sequences of Sets / 469
A.15.3 Convergence of Sequences of Random Variables / 473
A.16 The Stable Paretian Distribution / 483
A.17 Problems / 492
A.18 Solutions / 509
References 537
Index 561
From "Nielsen BookData"