Regression analysis by example using R

Bibliographic Information

Regression analysis by example using R

Ali S. Hadi, Samprit Chatterjee

(Wiley series in probability and mathematical statistics)

Wiley, c2024

6th ed.

  • hbk (hardcover)

Held by 10 university libraries

Notes

Includes bibliographical references and index

Description and Table of Contents

Description

A straightforward and concise discussion of the essentials of regression analysis.

In the newly revised sixth edition of Regression Analysis By Example Using R, distinguished statistician Dr. Ali S. Hadi delivers an expanded and thoroughly updated discussion of exploratory data analysis using regression analysis in R. The book provides in-depth treatments of regression diagnostics, transformation, multicollinearity, logistic regression, and robust regression. The author clearly demonstrates effective methods of regression analysis with examples that contain the types of data irregularities commonly encountered in the real world. This newest edition also offers a brand-new, easy-to-read chapter on the freely available statistical software package R.

Readers will also find:

  • Reorganized, expanded, and upgraded exercises at the end of each chapter, with an emphasis on data analysis
  • Updated data sets and examples throughout the book
  • Complimentary access to a companion website that provides data sets in xlsx, csv, and txt formats

Perfect for upper-level undergraduate or beginning graduate students in statistics, mathematics, biostatistics, and computer science programs, Regression Analysis By Example Using R will also benefit readers who need a reference for quick updates on regression methods and applications.

Table of Contents

Preface xiv

1 Introduction 1
  1.1 What Is Regression Analysis? 1
  1.2 Publicly Available Data Sets 2
  1.3 Selected Applications of Regression Analysis 3
    1.3.1 Agricultural Sciences 3
    1.3.2 Industrial and Labor Relations 4
    1.3.3 Government 5
    1.3.4 History 5
    1.3.5 Environmental Sciences 6
    1.3.6 Industrial Production 6
    1.3.7 The Space Shuttle Challenger 7
    1.3.8 Cost of Health Care 7
  1.4 Steps in Regression Analysis 7
    1.4.1 Statement of the Problem 9
    1.4.2 Selection of Potentially Relevant Variables 9
    1.4.3 Data Collection 9
    1.4.4 Model Specification 10
    1.4.5 Method of Fitting 12
    1.4.6 Model Fitting 13
    1.4.7 Model Criticism and Selection 14
    1.4.8 Objectives of Regression Analysis 15
  1.5 Scope and Organization of the Book 16

2 A Brief Introduction to R 19
  2.1 What Is R and RStudio? 19
  2.2 Installing R and RStudio 20
  2.3 Getting Started With R 21
    2.3.1 Command Level Prompt 21
    2.3.2 Calculations Using R 22
    2.3.3 Editing Your R Code 24
    2.3.4 Best Practice: Object Names in R 25
  2.4 Data Values and Objects in R 25
    2.4.1 Types of Data Values in R 25
    2.4.2 Types (Structures) of Objects in R 28
    2.4.3 Object Attributes 34
    2.4.4 Testing (Checking) Object Type 34
    2.4.5 Changing Object Type 34
  2.5 R Packages (Libraries) 35
    2.5.1 Installing R Packages 35
    2.5.2 Name Spaces 36
    2.5.3 Updating R 37
    2.5.4 Datasets in R Packages 37
  2.6 Importing (Reading) Data into R Workspace 37
    2.6.1 Best Practice: Working Directory 38
    2.6.2 Reading ASCII (Text) Files 38
    2.6.3 Reading CSV Files 40
    2.6.4 Reading Excel Files 40
    2.6.5 Reading Files from the Internet 41
  2.7 Writing (Exporting) Data to Files 42
    2.7.1 Diverting Normal R Output to a File 42
    2.7.2 Saving Graphs in Files 42
    2.7.3 Exporting Data to Files 43
  2.8 Some Arithmetic and Other Operators 43
    2.8.1 Vectors 43
    2.8.2 Matrix Computations 45
  2.9 Programming in R 50
    2.9.1 Best Practice: Script Files 50
    2.9.2 Some Useful Commands or Functions 50
    2.9.3 Conditional Execution 51
    2.9.4 Loops 53
    2.9.5 Functions and Functionals 54
    2.9.6 User Defined Functions 55
  2.10 Bibliographic Notes 60

3 Simple Linear Regression 65
  3.1 Introduction 65
  3.2 Covariance and Correlation Coefficient 65
  3.3 Example: Computer Repair Data 69
  3.4 The Simple Linear Regression Model 72
  3.5 Parameter Estimation 73
  3.6 Tests of Hypotheses 77
  3.7 Confidence Intervals 82
  3.8 Predictions 83
  3.9 Measuring the Quality of Fit 84
  3.10 Regression Line Through the Origin 88
  3.11 Trivial Regression Models 89
  3.12 Bibliographic Notes 90

4 Multiple Linear Regression 97
  4.1 Introduction 97
  4.2 Description of the Data and Model 97
  4.3 Example: Supervisor Performance Data 98
  4.4 Parameter Estimation 100
  4.5 Interpretations of Regression Coefficients 101
  4.6 Centering and Scaling 104
    4.6.1 Centering and Scaling in Intercept Models 104
    4.6.2 Scaling in No-Intercept Models 105
  4.7 Properties of the Least Squares Estimators 106
  4.8 Multiple Correlation Coefficient 107
  4.9 Inference for Individual Regression Coefficients 108
  4.10 Tests of Hypotheses in a Linear Model 111
    4.10.1 Testing All Regression Coefficients Equal to Zero
    4.10.2 Testing a Subset of Regression Coefficients Equal to Zero 113
    4.10.3 Testing the Equality of Regression Coefficients
    4.10.4 Estimating and Testing of Regression Parameters 118
  4.11 Predictions 121
  4.12 Summary 122

5 Regression Diagnostics: Detection of Model Violations 131
  5.1 Introduction 131
  5.2 The Standard Regression Assumptions 132
  5.3 Various Types of Residuals 134
  5.4 Graphical Methods 136
  5.5 Graphs Before Fitting a Model 139
    5.5.1 One-Dimensional Graphs 139
    5.5.2 Two-Dimensional Graphs 140
    5.5.3 Rotating Plots 142
    5.5.4 Dynamic Graphs 142
  5.6 Graphs After Fitting a Model 143
  5.7 Checking Linearity and Normality Assumptions 143
  5.8 Leverage, Influence, and Outliers 144
    5.8.1 Outliers in the Response Variable 146
    5.8.2 Outliers in the Predictors 146
    5.8.3 Masking and Swamping Problems 147
  5.9 Measures of Influence 148
    5.9.1 Cook's Distance 150
    5.9.2 Welsch and Kuh Measure 151
    5.9.3 Hadi's Influence Measure 151
  5.10 The Potential-Residual Plot 152
  5.11 Regression Diagnostics in R 154
  5.12 What to Do with the Outliers? 155
  5.13 Role of Variables in a Regression Equation 156
    5.13.1 Added-Variable Plot 156
    5.13.2 Residual Plus Component Plot 157
  5.14 Effects of an Additional Predictor 159
  5.15 Robust Regression 161

6 Qualitative Variables as Predictors 167
  6.1 Introduction 167
  6.2 Salary Survey Data 168
  6.3 Interaction Variables 171
  6.4 Systems of Regression Equations 175
    6.4.1 Models with Different Slopes and Different Intercepts 176
    6.4.2 Models with Same Slope and Different Intercepts 183
    6.4.3 Models with Same Intercept and Different Slopes 184
  6.5 Other Applications of Indicator Variables 185
  6.6 Seasonality 186
  6.7 Stability of Regression Parameters Over Time 187

7 Transformation of Variables 195
  7.1 Introduction 195
  7.2 Transformations to Achieve Linearity 197
  7.3 Bacteria Deaths Due to X-Ray Radiation 199
    7.3.1 Inadequacy of a Linear Model 200
    7.3.2 Logarithmic Transformation for Achieving Linearity 201
  7.4 Transformations to Stabilize Variance 203
  7.5 Detection of Heteroscedastic Errors 208
  7.6 Removal of Heteroscedasticity 210
  7.7 Weighted Least Squares 211
  7.8 Logarithmic Transformation of Data 212
  7.9 Power Transformation 213
  7.10 Summary 216

8 Weighted Least Squares 223
  8.1 Introduction 223
  8.2 Heteroscedastic Models 224
    8.2.1 Supervisors Data 224
    8.2.2 College Expense Data 226
  8.3 Two-Stage Estimation 227
  8.4 Education Expenditure Data 229
  8.5 Fitting a Dose-Response Relationship Curve 237

9 The Problem of Correlated Errors 241
  9.1 Introduction: Autocorrelation 241
  9.2 Consumer Expenditure and Money Stock 242
  9.3 Durbin-Watson Statistic 245
  9.4 Removal of Autocorrelation by Transformation 246
  9.5 Iterative Estimation with Autocorrelated Errors 249
  9.6 Autocorrelation and Missing Variables 250
  9.7 Analysis of Housing Starts 251
  9.8 Limitations of the Durbin-Watson Statistic 253
  9.9 Indicator Variables to Remove Seasonality 255
  9.10 Regressing Two Time Series 257

10 Analysis of Collinear Data 261
  10.1 Introduction 261
  10.2 Effects of Collinearity on Inference 262
  10.3 Effects of Collinearity on Forecasting 267
  10.4 Detection of Collinearity 271
    10.4.1 Simple Signs of Collinearity 271
    10.4.2 Variance Inflation Factors 274
    10.4.3 The Condition Indices 276

11 Working With Collinear Data 283
  11.1 Introduction 283
  11.2 Principal Components 283
  11.3 Computations Using Principal Components 287
  11.4 Imposing Constraints 289
  11.5 Searching for Linear Functions of the β's 292
  11.6 Biased Estimation of Regression Coefficients 295
  11.7 Principal Components Regression 296
  11.8 Reduction of Collinearity in the Estimation Data 298
  11.9 Constraints on the Regression Coefficients 300
  11.10 Principal Components Regression: A Caution 301
  11.11 Ridge Regression 303
  11.12 Estimation by the Ridge Method 305
  11.13 Ridge Regression: Some Remarks 308
  11.14 Summary 311
  11.15 Bibliographic Notes 311

12 Variable Selection Procedures 321
  12.1 Introduction 321
  12.2 Formulation of the Problem 322
  12.3 Consequences of Variables Deletion 322
  12.4 Uses of Regression Equations 324
    12.4.1 Description and Model Building 324
    12.4.2 Estimation and Prediction 324
    12.4.3 Control 324
  12.5 Criteria for Evaluating Equations 325
    12.5.1 Residual Mean Square 325
    12.5.2 Mallows Cp 326
    12.5.3 Information Criteria 327
  12.6 Collinearity and Variable Selection 328
  12.7 Evaluating All Possible Equations 328
  12.8 Variable Selection Procedures 329
    12.8.1 Forward Selection Procedure 329
    12.8.2 Backward Elimination Procedure 330
    12.8.3 Stepwise Method 330
  12.9 General Remarks on Variable Selection Methods 331
  12.10 A Study of Supervisor Performance 332
  12.11 Variable Selection with Collinear Data 336
  12.12 The Homicide Data 336
  12.13 Variable Selection Using Ridge Regression 339
  12.14 Selection of Variables in an Air Pollution Study 339
  12.15 A Possible Strategy for Fitting Regression Models 345
  12.16 Bibliographic Notes 347

13 Logistic Regression 353
  13.1 Introduction 353
  13.2 Modeling Qualitative Data 354
  13.3 The Logit Model 354
  13.4 Example: Estimating Probability of Bankruptcies 356
  13.5 Logistic Regression Diagnostics 358
  13.6 Determination of Variables to Retain 359
  13.7 Judging the Fit of a Logistic Regression 362
  13.8 The Multinomial Logit Model 364
    13.8.1 Multinomial Logistic Regression 364
    13.8.2 Example: Determining Chemical Diabetes 365
    13.8.3 Ordinal Logistic Regression 368
    13.8.4 Example: Determining Chemical Diabetes Revisited 368
  13.9 Classification Problem: Another Approach 370

14 Further Topics 375
  14.1 Introduction 375
  14.2 Generalized Linear Model 375
  14.3 Poisson Regression Model 376
  14.4 Introduction of New Drugs 377
  14.5 Robust Regression 378
  14.6 Fitting a Quadratic Model 379
  14.7 Distribution of PCB in U.S. Bays 381

Exercises 384
References 385
Index

Source: Nielsen BookData

Related literature: 1 item
