Evaluation essentials from A to Z

Bibliographic Details

Evaluation essentials from A to Z

Marvin C. Alkin

Guilford Press, c2011

  • pbk. : alk. paper
  • hardcover : alk. paper

Held by 5 university libraries


Notes

Includes bibliographical references and index

Description and Table of Contents

Description

Written in a refreshing conversational style, this text thoroughly prepares students, program administrators, and new evaluators to conduct evaluations or to use them in their work. The book's question-driven focus and clear discussions about the importance of fostering evaluation use by building collaborative relationships with stakeholders set it apart from other available texts. In 26 concise sections, Marvin C. Alkin explores how to articulate answerable evaluation questions, collect and analyze data using both quantitative and qualitative methods, and deal with contingencies that might alter the traditional sequence of an evaluation. Student-friendly features include handy bulleted recaps of each section, "Thinking Ahead" and "Next Steps" pointers, cautionary notes, annotated suggestions for further reading, and an in-depth case study that provides the basis for end-of-chapter exercises.

Table of Contents

1. Overview. 2. Structure.
A. What Is Evaluation? 3. Professional Program Evaluation. 4. Evaluation and Research. 5. Evaluation Definition. 6. A Confusion of Terms. 7. Evaluation Purposes.
B. Why Do Evaluations? 8. Making Decisions. 9. Issues for Professional Evaluation.
Time-Out: The RUPAS Case. 10. The Rural Parents' Support Program (RUPAS): A Community Well-Being Case Study, Nicole Eisenberg.
C. Who Does Evaluations? 11. Evaluator Settings. 12. Multiple Orientations to Doing Evaluation. 13. My View.
D. Who Are the Stakeholders for an Evaluation? 14. Stakeholders-Not Audience. 15. Who Are the Stakeholders? 16. Focus on Primary Stakeholders. 17. Differences in Stakeholder Participation.
E. How Are Positive Stakeholder Relationships Maintained? 18. Gaining RTC (Respect, Trust, Credibility). 19. Concluding Note.
F. What Is the Organizational, Social, and Political Context? 20. Organizational Context. 21. Social Context. 22. Political Context. 23. My Advice. 24. Thinking Ahead.
G. How Do You Describe the Program? 25. Program Components. 26. Program Size and Organizational Location. 27. Program Differences. 28. What Should We Know about Programs. 29. Learning about the Program.
H. How Do You "Understand" the Program? 30. Theory of Action. 31. Logic Models. 32. Why Is This Important? 33. What Does a Logic Model Look Like? 34. A Partial Logic Model. 35. Getting Started.
I. What Are the Questions/Issues to Be Addressed? 36. Kinds of Evaluation Questions. 37. Getting Started on Defining Questions. 38. Some Next Steps.
J. Who Provides Data? 39. Again, Be Clear on the Questions. 40. Focus of the Data. 41. Selecting Individuals.
K. What Are Instruments for Collecting Quantitative Data? 42. Instruments for Attaining Quantitative Data. 43. Acquisition of Data. 44. Existing Data. 45. Finding Existing Instruments. 46. Developing New Instruments. 47. Questionnaire Construction. 48. Measuring Achievement. 49. Achievement Test Construction. 50. Observation Protocols.
L. What Are Instruments for Collecting Qualitative Data? 51. Developing New Instruments. 52. Observations. 53. Interviews and Focus Groups. 54. Surveys and Questionnaires.
M. What Are the Logistics of Data Collection? 55. Gaining Data Access. 56. Collecting Data. 57. Quality of Data. 58. Understanding the Organization's Viewpoints. 59. My Advice.
N. Are the Questions Evaluable (Able to Be Evaluated)? 60. Stage of the Program. 61. Resources. 62. Nature of the Question. 63. Establishing Standards. 64. Technical Issues. 65. Ethical Issues. 66. Political Feasibility. 67. My Advice.
O. What Is the Evaluation Plan (Process Measures)? 68. The Evaluation Design. 69. Process Measures. 70. Program Elements. 71. Program Mechanisms. 72. Question.
P. What Is the Evaluation Plan (Outcome Measures)? 73. An Exercise to Assist Us. 74. Toward Stronger Causal Models. 75. Descriptive Designs. 76. Mixed Methods. 77. Summary. 78. My Advice.
Q. What Is the Evaluation Plan (Procedures and Agreements)? 79. Evaluation Activities: Past, Present, and Upcoming. 80. The Written Evaluation Plan. 81. The Contract. 82. My Advice.
R. How Are Quantitative Data Analyzed? 84. Types of Data. 85. A First Acquaintance with the Data. 86. Measures of Central Tendency. 87. Measures of Variability. 89. Getting Further Acquainted with the Data. 90. Descriptive and Inferential Statistics. 91. Are the Results Significant? 92. Appropriate Statistical Techniques. 93. My Warning.
S. How Are Qualitative Data Analyzed? 94. Refining the Data. 95. Testing the Validity of the Analysis.
T. How Do Analyzed Data Answer Questions? 96. Difficulties in Valuing. 97. Valuing in a Formative Context. 98. "Valuing" Redefined. 99. A Final Note.
U. How Are Evaluation Results Reported? 100. Communication. 101. Reporting. 102. The Final Written Report. 103. Nature and Quality of Writing.
V. What Is the Evaluator's Role in Helping Evaluations to Be Used? 104. A Word about "Use." 105. What Is Use? 106. What Can You Do? 107. Guard against Misuse.
W. How Are Evaluations Managed? 108. Acquiring the Evaluation. 109. Contract/Agreement. 110. Budget. 111. Operational Management.
X. What Are the Evaluation Standards and Codes of Behavior? 112. Judging an Evaluation. 113. The Program Evaluation Standards. 114. American Evaluation Association Guiding Principles.
Y. How Are Costs Analyzed? 115. Cost-Effectiveness Analysis. 116. Cost-Benefit Analysis. 117. Cost-Utility Analysis. 118. And Now to Costs. 119. How to Determine Cost.
Z. How Can You Embark on a Program to Learn More about Evaluation? 120. Getting Feedback on Evaluation. 121. Taking Full Advantage of This Volume. 122. Gaining Evaluation Expertise Beyond This Volume.
Appendix. 123. An Evaluation Lesson, "Unknown Student."

Source: "Nielsen BookData"

