Performance evaluation : proven approaches for improving program and organizational performance
Author(s)
Ingrid Guerra-Lopez
Bibliographic Information
Jossey-Bass, c2008
1st ed
pbk
Access to Electronic Resource: 1 item
Available at 5 libraries
Note
Includes bibliographical references and index
Description and Table of Contents
Description
Performance Evaluation is a hands-on text for practitioners, researchers, educators, and students on how to use scientifically based evaluations that are both rigorous and flexible. Author Ingrid Guerra-Lopez, an internationally known evaluation expert, introduces the foundations of evaluation and presents the models most applicable to the performance improvement field. Her book offers a wide variety of tools and techniques that have proven successful, and it is organized to illustrate evaluation in the context of continual performance improvement.
Table of Contents
Acknowledgments xi
Preface xiii
The Author xv
Part One: Introduction to Evaluation
One: Foundations of Evaluation 3
A Brief Overview of Evaluation History 4
Evaluation: Purpose and Definition 5
Performance Improvement: A Conceptual Framework 8
Making Evaluation Happen: Ensuring Stakeholders' Buy-In 9
The Evaluator: A Job or a Role? 10
The Relationship to Other Investigative Processes 11
When Does Evaluation Occur? 15
General Evaluation Orientations 18
Challenges That Evaluators Face 20
Ensuring Commitment 23
Benefits of Evaluation 24
Basic Definitions 25
Two: Principles of Performance-Based Evaluation 27
Principle 1: Evaluation Is Based on Asking the Right Questions 28
Principle 2: Evaluation of Process Is a Function of Obtained Results 32
Principle 3: Goals and Objectives of Organizations Should Be Based on Valid Needs 33
Principle 4: Derive Valid Needs Using a Top-Down Approach 34
Principle 5: Every Organization Should Aim for the Best That Society Can Attain 34
Principle 6: The Set of Evaluation Questions Drives the Evaluation Study 35
Part Two: Models of Evaluation
Three: Overview of Existing Evaluation Models 39
Overview of Classic Evaluation Models 40
Selected Evaluation Models 42
Selecting a Model 43
Conceptualizing a Useful Evaluation That Fits the Situation 44
Four: Kirkpatrick's Four Levels of Evaluation 47
Kirkpatrick's Levels 49
Comments on the Model 54
Strengths and Limitations 55
Application Example: Wagner (1995) 56
Five: Phillips's Return-on-Investment Methodology 61
Phillips's ROI Process 63
Comments on the Model 67
Strengths and Limitations 70
Application Example: Blake (1999) 70
Six: Brinkerhoff's Success Case Method 75
The SCM Process 77
Strengths and Weaknesses 78
Application Example: Brinkerhoff (2005) 79
Seven: The Impact Evaluation Process 81
The Elements of the Process 83
Comments on the Model 96
Strengths and Limitations 97
Application Example 97
Eight: The CIPP Model 107
Stufflebeam's Four Types of Evaluation 108
Articulating Core Values of Programs and Solutions 111
Methods Used in CIPP Evaluations 112
Strengths and Limitations 113
Application Example: Filella-Guiu and Blanch-Pana (2002) 113
Nine: Evaluating Evaluations 117
Evaluation Standards 119
The American Evaluation Association Principles for Evaluators 120
Application Example: Lynch et al. (2003) 122
Part Three: Tools and Techniques of Evaluation
Ten: Data 133
Characteristics of Data 135
Scales of Measurement 137
Defining Required Data from Performance Objectives 139
Deriving Measurable Indicators 141
Finding Data Sources 152
Follow-Up Questions and Data 155
Eleven: Data Collection 159
Observation Methodology and the Purpose of Measurement 160
Designing the Experiment 186
Problems with Classic Experimental Studies in Applied Settings 188
Time-Series Studies 188
Simulations and Games 189
Document-Centered Methods 191
Conclusion 192
Twelve: Analysis of Evaluation Data: Tools and Techniques 195
Analysis of Models and Patterns 196
Analysis Using Structured Discussion 197
Methods of Quantitative Analysis 199
Statistics 200
Graphical Representations of Data 210
Measures of Relationship 212
Inferential Statistics: Parametric and Nonparametric 214
Interpretation 217
Thirteen: Communicating the Findings 221
Recommendations 222
Considerations for Implementing Recommendations 225
Developing the Report 226
The Evaluator's Role After the Report 235
Part Four: Continual Improvement
Fourteen: Common Errors in Evaluation 239
Errors of System Mapping 240
Errors of Logic 242
Errors of Procedure 244
Conclusion 246
Fifteen: Continual Improvement 249
What Is Continual Improvement? 250
Monitoring Performance 250
Adjusting Performance 253
The Role of Leadership 254
Sixteen: Contracting for Evaluation Services 257
The Contract 258
Contracting Controls 260
Ethics and Professionalism 262
Sample Statement of Work 262
Seventeen: Intelligence Gathering for Decision Making 271
Performance Measurement Systems 273
Issues in Performance Measurement Systems 275
Conclusion 277
Eighteen: The Future of Evaluation in Performance Improvement 279
Evaluation and Measurement in Performance Improvement Today 281
What Does the Future Hold? 282
Conclusion 283
References and Related Readings 285
Index 295
by "Nielsen BookData"