ISBN-13: 9781119263623 / English / Paperback / 2019 / 816 pp.
List of Contributors xvii
Preface xxiii

Part I Assessing the Current Methodology for Questionnaire Design, Development, Testing, and Evaluation 1

1 Questionnaire Design, Development, Evaluation, and Testing: Where are We, and Where are We Headed? 3
Gordon B. Willis
1.1 Current State of the Art and Science of QDET 3
1.2 Relevance of QDET in the Evolving World of Surveys 11
1.3 Looking Ahead: Further Developments in QDET 16
1.4 Conclusion 19
References 20

2 Asking the Right Questions in the Right Way: Six Needed Changes in Questionnaire Evaluation and Testing Methods 25
Don A. Dillman
2.1 Personal Experiences with Cognitive Interviews and Focus Groups 25
2.2 My 2002 Experience at QDET 29
2.3 Six Changes in Survey Research that Require New Perspectives on Questionnaire Evaluation and Testing 33
2.4 Conclusion 42
References 43

3 A Framework for Making Decisions about Question Evaluation Methods 47
Roger Tourangeau, Aaron Maitland, Darby Steiger, and Ting Yan
3.1 Introduction 47
3.2 Expert Reviews 48
3.3 Laboratory Methods 51
3.4 Field Methods 55
3.5 Statistical Modeling for Data Quality 59
3.6 Comparing Different Methods 63
3.7 Recommendations 67
References 69

4 A Comparison of Five Question Evaluation Methods in Predicting the Validity of Respondent Answers to Factual Items 75
Aaron Maitland and Stanley Presser
4.1 Introduction 75
4.2 Methods 76
4.3 Results 79
4.4 Discussion 84
References 85

5 Combining Multiple Question Evaluation Methods: What Does It Mean When the Data Appear to Conflict? 91
Jo d'Ardenne and Debbie Collins
5.1 Introduction 91
5.2 Questionnaire Development Stages 92
5.3 Selection of Case Studies 93
5.4 Case Study 1: Conflicting Findings Between Focus Groups and Cognitive Interviews 95
5.5 Case Study 2: Conflicting Findings Between Eye-Tracking, Respondent Debriefing Questions, and Interviewer Feedback 97
5.6 Case Study 3: Complementary Findings Between Cognitive Interviews and Interviewer Feedback 100
5.7 Case Study 4: Combining Qualitative and Quantitative Data to Assess Changes to a Travel Diary 104
5.8 Framework of QT Methods 110
5.9 Summary and Discussion 110
References 114

Part II Question Characteristics, Response Burden, and Data Quality 117

6 The Role of Question Characteristics in Designing and Evaluating Survey Questions 119
Jennifer Dykema, Nora Cate Schaeffer, Dana Garbarski, and Michael Hout
6.1 Introduction 119
6.2 Overview of Some of the Approaches Used to Conceptualize, Measure, and Code Question Characteristics 120
6.3 Taxonomy of Question Characteristics 127
6.4 Case Studies 132
6.5 Discussion 141
Acknowledgments 147
References 148

7 Exploring the Associations Between Question Characteristics, Respondent Characteristics, Interviewer Performance Measures, and Survey Data Quality 153
James M. Dahlhamer, Aaron Maitland, Heather Ridolfo, Antuane Allen, and Dynesha Brooks
7.1 Introduction 153
7.2 Methods 157
7.3 Results 174
7.4 Discussion 182
Disclaimer 191
References 191

8 Response Burden: What Is It and What Predicts It? 193
Ting Yan, Scott Fricker, and Shirley Tsai
8.1 Introduction 193
8.2 Methods 197
8.3 Results 202
8.4 Conclusions and Discussion 206
Acknowledgments 210
References 210

9 The Salience of Survey Burden and Its Effect on Response Behavior to Skip Questions: Experimental Results from Telephone and Web Surveys 213
Frauke Kreuter, Stephanie Eckman, and Roger Tourangeau
9.1 Introduction 213
9.2 Study Designs and Methods 216
9.3 Manipulating the Interleafed Format 219
9.4 Discussion and Conclusion 224
Acknowledgments 226
References 227

10 A Comparison of Fully Labeled and Top-Labeled Grid Question Formats 229
Jolene D. Smyth and Kristen Olson
10.1 Introduction 229
10.2 Data and Methods 236
10.3 Findings 243
10.4 Discussion and Conclusions 253
Acknowledgments 254
References 255

11 The Effects of Task Difficulty and Conversational Cueing on Answer Formatting Problems in Surveys 259
Yfke Ongena and Sanne Unger
11.1 Introduction 259
11.2 Factors Contributing to Respondents' Formatting Problems 262
11.3 Hypotheses 267
11.4 Method and Data 268
11.5 Results 275
11.6 Discussion and Conclusion 278
11.7 Further Expansion of the Current Study 281
11.8 Conclusions 282
References 283

Part III Improving Questionnaires on the Web and Mobile Devices 287

12 A Compendium of Web and Mobile Survey Pretesting Methods 289
Emily Geisen and Joe Murphy
12.1 Introduction 289
12.2 Review of Traditional Pretesting Methods 290
12.3 Emerging Pretesting Methods 294
References 308

13 Usability Testing Online Questionnaires: Experiences at the U.S. Census Bureau 315
Elizabeth Nichols, Erica Olmsted-Hawala, Temika Holland, and Amy Anderson Riemer
13.1 Introduction 315
13.2 History of Usability Testing Self-Administered Surveys at the US Census Bureau 316
13.3 Current Usability Practices at the Census Bureau 317
13.4 Participants: "Real Users, Not User Stories" 320
13.5 Building Usability Testing into the Development Life Cycle 323
13.6 Measuring Accuracy 327
13.7 Measuring Efficiency 331
13.8 Measuring Satisfaction 335
13.9 Retrospective Probing and Debriefing 337
13.10 Communicating Findings with the Development Team 339
13.11 Assessing Whether Usability Test Recommendations Worked 340
13.12 Conclusions 341
References 341

14 How Mobile Device Screen Size Affects Data Collected in Web Surveys 349
Daniele Toninelli and Melanie Revilla
14.1 Introduction 349
14.2 Literature Review 350
14.3 Our Contribution and Hypotheses 352
14.4 Data Collection and Method 355
14.5 Main Results 361
14.6 Discussion 368
Acknowledgments 369
References 370

15 Optimizing Grid Questions for Smartphones: A Comparison of Optimized and Non-Optimized Designs and Effects on Data Quality on Different Devices 375
Trine Dale and Heidi Walsoe
15.1 Introduction 375
15.2 The Need for Change in Questionnaire Design Practices 376
15.3 Contribution and Research Questions 378
15.4 Data Collection and Methodology 380
15.5 Main Results 386
15.6 Discussion 392
Acknowledgments 397
References 397

16 Learning from Mouse Movements: Improving Questionnaires and Respondents' User Experience Through Passive Data Collection 403
Rachel Horwitz, Sarah Brockhaus, Felix Henninger, Pascal J. Kieslich, Malte Schierholz, Florian Keusch, and Frauke Kreuter
16.1 Introduction 403
16.2 Background 404
16.3 Data 409
16.4 Methodology 410
16.5 Results 415
16.6 Discussion 420
References 423

17 Using Targeted Embedded Probes to Quantify Cognitive Interviewing Findings 427
Paul Scanlon
17.1 Introduction 427
17.2 The NCHS Research and Development Survey 431
17.3 Findings 433
17.4 Discussion 445
References 448

18 The Practice of Cognitive Interviewing Through Web Probing 451
Stephanie Fowler and Gordon B. Willis
18.1 Introduction 451
18.2 Methodological Issues in the Use of Web Probing for Pretesting 452
18.3 Testing the Effect of Probe Placement 453
18.4 Analyses of Responses to Web Probes 455
18.5 Qualitative Analysis of Responses to Probes 459
18.6 Qualitative Coding of Responses 459
18.7 Current State of the Use of Web Probes 462
18.8 Limitations 465
18.9 Recommendations for the Application and Further Evaluation of Web Probes 466
18.10 Conclusion 468
Acknowledgments 468
References 468

Part IV Cross-Cultural and Cross-National Questionnaire Design and Evaluation 471

19 Optimizing Questionnaire Design in Cross-National and Cross-Cultural Surveys 473
Tom W. Smith
19.1 Introduction 473
19.2 The Total Survey Error Paradigm and Comparison Error 474
19.3 Cross-Cultural Survey Guidelines and Resources 477
19.4 Translation 478
19.5 Developing Comparative Scales 480
19.6 Focus Groups and Pretesting in Cross-National/Cultural Surveys 483
19.7 Tools for Developing and Managing Cross-National Surveys 484
19.8 Resources for Developing and Testing Cross-National Measures 485
19.9 Pre- and Post-Harmonization 486
19.10 Conclusion 488
References 488

20 A Model for Cross-National Questionnaire Design and Pretesting 493
Rory Fitzgerald and Diana Zavala-Rojas
20.1 Introduction 493
20.2 Background 493
20.3 The European Social Survey 495
20.4 ESS Questionnaire Design Approach 496
20.5 Critique of the Seven-Stage Approach 497
20.6 A Model for Cross-National Questionnaire Design and Pretesting 497
20.7 Evaluation of the Model for Cross-National Questionnaire Design and Pretesting Using the Logical Framework Matrix (LFM) 501
20.8 Conclusions 512
References 514

21 Cross-National Web Probing: An Overview of Its Methodology and Its Use in Cross-National Studies 521
Dorothée Behr, Katharina Meitinger, Michael Braun, and Lars Kaczmirek
21.1 Introduction 521
21.2 Cross-National Web Probing - Its Goal, Strengths, and Weaknesses 523
21.3 Access to Respondents Across Countries: The Example of Online Access Panels and Probability-Based Panels 526
21.4 Implementation of Standardized Probes 527
21.5 Translation and Coding Answers to Cross-Cultural Probes 532
21.6 Substantive Results 533
21.7 Cross-National Web Probing and Its Application Throughout the Survey Life Cycle 536
21.8 Conclusions and Outlook 538
Acknowledgments 539
References 539

22 Measuring Disability Equality in Europe: Design and Development of the European Health and Social Integration Survey Questionnaire 545
Amanda Wilmot
22.1 Introduction 545
22.2 Background 546
22.3 Questionnaire Design 548
22.4 Questionnaire Development and Testing 553
22.5 Survey Implementation 560
22.6 Lessons Learned 563
22.7 Final Reflections 566
Acknowledgments 567
References 567

Part V Extensions and Applications 571

23 Regression-Based Response Probing for Assessing the Validity of Survey Questions 573
Patrick Sturgis, Ian Brunton-Smith, and Jonathan Jackson
23.1 Introduction 573
23.2 Cognitive Methods for Assessing Question Validity 574
23.3 Regression-Based Response Probing 577
23.4 Example 1: Generalized Trust 579
23.5 Example 2: Fear of Crime 580
23.6 Data 581
23.7 Discussion 586
References 588

24 The Interplay Between Survey Research and Psychometrics, with a Focus on Validity Theory 593
Bruno D. Zumbo and José-Luis Padilla
24.1 Introduction 593
24.2 An Over-the-Shoulder Look Back at Validity Theory and Validation Practices with an Eye toward Describing Contemporary Validity Theories 595
24.3 An Approach to Validity that Bridges Psychometrics and Survey Design 602
24.4 Closing Remarks 606
References 608

25 Quality-Driven Approaches for Managing Complex Cognitive Testing Projects 613
Martha Stapleton, Darby Steiger, and Mary C. Davis
25.1 Introduction 613
25.2 Characteristics of the Four Cognitive Testing Projects 614
25.3 Identifying Detailed, Quality-Driven Management Approaches for Qualitative Research 615
25.4 Identifying Principles for Developing Quality-Driven Management Approaches 616
25.5 Applying the Concepts of Transparency and Consistency 617
25.6 The 13 Quality-Driven Management Approaches 618
25.7 Discussion and Conclusion 632
References 634

26 Using Iterative, Small-Scale Quantitative and Qualitative Studies: A Review of 15 Years of Research to Redesign a Major US Federal Government Survey 639
Joanne Pascale
26.1 Introduction 639
26.2 Measurement Issues in Health Insurance 641
26.3 Methods and Results 645
26.4 Discussion 660
26.5 Final Reflections 663
References 664

27 Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey 671
Robin L. Kaplan, Brandon Kopp, and Polly Phipps
27.1 Introduction 671
27.2 The Sleep Gap 672
27.3 The Present Research 674
27.4 Study 1: Behavior Coding 675
27.5 Study 2: Cognitive Interviews 678
27.6 Study 3: Quantitative Study 682
27.7 Study 4: Validation Study 686
27.8 General Discussion 689
27.9 Implications and Future Directions 692
References 692

28 Questionnaire Design Issues in Mail Surveys of All Adults in a Household 697
Douglas Williams, J. Michael Brick, W. Sherman Edwards, and Pamela Giambo
28.1 Introduction 697
28.2 Background 698
28.3 The NCVS and Mail Survey Design Challenges 699
28.4 Field Test Methods and Design 704
28.5 Outcome Measures 706
28.6 Findings 708
28.7 Summary 716
28.8 Discussion 716
28.9 Conclusion 719
References 720

29 Planning Your Multimethod Questionnaire Testing Bento Box: Complementary Methods for a Well-Balanced Test 723
Jaki S. McCarthy
29.1 Introduction 723
29.2 A Questionnaire Testing Bento Box 725
29.3 Examples from the Census of Agriculture Questionnaire Testing Bento Box 733
29.4 Conclusion 743
References 744

30 Flexible Pretesting on a Tight Budget: Using Multiple Dependent Methods to Maximize Effort-Return Trade-Offs 749
Matt Jans, Jody L. Herman, Joseph Viana, David Grant, Royce Park, Bianca D.M. Wilson, Jane Tom, Nicole Lordi, and Sue Holtby
30.1 Introduction 749
30.2 Evolution of a Dependent Pretesting Approach for Gender Identity Measurement 752
30.3 Analyzing and Synthesizing Results 759
30.4 Discussion 764
Acknowledgments 766
References 766

Index 769
PAUL C. BEATTY is Chief of the Center for Behavioral Science Methods at the U.S. Census Bureau.
DEBBIE COLLINS is a Senior Research Director at the National Centre for Social Research, UK.
LYN KAYE is a consultant in survey research methods and previously a Senior Researcher at Statistics New Zealand.
JOSÉ-LUIS PADILLA is Professor of Methodology of Behavioral Sciences at the University of Granada, Spain.
GORDON B. WILLIS is a Cognitive Psychologist at the National Cancer Institute, National Institutes of Health, USA.
AMANDA WILMOT is a Senior Study Director at Westat, USA.