Understanding and investigating response processes in validation research
Author(s)
Zumbo, Bruno D.; Hubley, Anita M. (eds.)
Bibliographic Information
Understanding and investigating response processes in validation research
(Social indicators research series, v. 69)
Springer, c2017
Available at 2 libraries
Note
Includes bibliographical references
Description and Table of Contents
Description
This volume addresses an urgent need across multiple disciplines to broaden our understanding and use of response processes evidence of test validity. It builds on the themes and findings of the volume Validity and Validation in Social, Behavioral, and Health Sciences (Zumbo & Chan, 2014), with a focus on measurement validity evidence based on response processes. Approximately 1000 studies are published each year examining the validity of inferences made from tests and measures in the social, behavioural, and health sciences. The widely accepted Standards for Educational and Psychological Testing (1999, 2014) present five sources of evidence for validity: content-related, response processes, internal structure, relationships with other variables, and consequences of testing. Many studies focus on the internal structure and relationships-with-other-variables sources of evidence, which have a long history in validation research, known methodologies, and numerous exemplars in the literature. Far less is understood by test users and researchers conducting validation work about how to think about and apply new and emerging sources of validity evidence. This groundbreaking volume is the first to present conceptual models of response processes, methodological issues that arise in gathering response processes evidence, and applications and exemplars for providing response processes evidence in validation work.
Table of Contents
Chapter 1. Response Processes in the Context of Validity: Setting the Stage (Anita M. Hubley and Bruno D. Zumbo)
Chapter 2. Response Processes and Measurement Validity in Health Psychology (Mark R. Beauchamp and Desmond McEwan)
Chapter 3. Contributions of Response Process Analysis to the Validation of an Assessment of Higher Education Students' Competence in Business and Economics (Sebastian Bruckner and James W. Pellegrino)
Chapter 4. Ecological Framework of Item Responding as Validity Evidence: An Application of Multilevel DIF Modeling Using PISA Data (Michelle Y. Chen and Bruno D. Zumbo)
Chapter 5. Putting Flesh on the Psychometric Bone: Making Sense of IRT Parameters in Non-cognitive Measures by Investigating the Social-cognitive Aspects of the Items (Anita M. Hubley, Amery D. Wu, Yan Liu, and Bruno D. Zumbo)
Chapter 6. Some Observations on Response Processes Research and Its Future Theoretical and Methodological Directions (Mihaela Launeanu and Anita M. Hubley)
Chapter 7. A Model Building Approach to Examining Response Processes as a Source of Validity Evidence for Self-Report Items and Measures (Mihaela Launeanu and Anita M. Hubley)
Chapter 8. Response Processes and Validity Evidence: Controlling for Emotions in Think Aloud Interviews (Jacqueline P. Leighton, Wei Tang, and Qi Guo)
Chapter 9. Response Time Data as Validity Evidence: Has It Lived Up to Its Promise, and If Not, What Would It Take to Do So (Zhi Li, Jayanti Banerjee, and Bruno D. Zumbo)
Chapter 10. Observing Testing Situations: Validation as Jazz (Bryan Maddox and Bruno D. Zumbo)
Chapter 11. A Rationale for and Demonstration of the Use of DIF and Mixed Methods (Jose-Luis Padilla and Isabel Benitez)
Chapter 12. Cognitive Interviewing and Think Aloud Methods (Jose-Luis Padilla and Jacqueline P. Leighton)
Chapter 13. Some Thoughts on Gathering Response Processes Validity Evidence in the Context of Online Measurement and the Digital Revolution (Lara B. Russell and Anita M. Hubley)
Chapter 14. Longitudinal Change in Response Processes: A Response Shift Perspective (Richard Sawatzky, Tolulope T. Sajobi, Ronak Brahmbhatt, Eric K. H. Chan, Lisa M. Lix, and Bruno D. Zumbo)
Chapter 15. Validating a Distractor-Driven Geometry Test Using a Generalized Diagnostic Classification Model (Benjamin R. Shear and Louis A. Roussos)
Chapter 16. Understanding Test-taking Strategies for a Reading Comprehension Test via Latent Variable Regression with Pratt's Importance Measures (Amery D. Wu and Bruno D. Zumbo)
Chapter 17. An Investigation of Writing Processes Employed in Scenario-Based Assessment (Mo Zhang, Danjie Zou, Amery D. Wu, Paul Deane, and Chen Li)
Chapter 18. National and International Educational Achievement Testing: A Case of Multi-Level Validation Framed by the Ecological Model of Item Responding (Bruno D. Zumbo, Yan Liu, Amery D. Wu, Barry Forer, and Benjamin R. Shear)
Chapter 19. On Models and Modeling in Measurement and Validation Studies (Bruno D. Zumbo)
by "Nielsen BookData"