Advances in Questionnaire Design, Development, Evaluation and Testing

Edited by Paul C. Beatty, Debbie Collins, Lyn Kaye, Jose-Luis Padilla, Gordon B. Willis, and Amanda Wilmot


Description

A new and updated definitive resource for survey questionnaire testing and evaluation


Building on the success of the first Questionnaire Development, Evaluation, and Testing (QDET) conference in 2002, this book brings together leading papers from the Second International Conference on Questionnaire Design, Development, Evaluation, and Testing (QDET2) held in 2016. The volume assesses the current state of the art and science of QDET; examines the importance of methodological attention to the questionnaire in the present world of information collection; and considers how the QDET field can anticipate new trends and directions as information needs and data collection methods continue to evolve.


Featuring contributions from international experts in survey methodology, Advances in Questionnaire Design, Development, Evaluation and Testing includes the latest insights on question characteristics, usability testing, web probing, and other pretesting approaches, as well as:

  • Recent developments in the design and evaluation of digital and self-administered surveys
  • Strategies for comparing and combining questionnaire evaluation methods
  • Approaches for cross-cultural and cross-national questionnaire development
  • New data sources and methodological innovations during the last 15 years
  • Case studies and practical applications



Advances in Questionnaire Design, Development, Evaluation and Testing serves as a forum to prepare researchers to meet the next generation of challenges, making it an excellent resource for researchers and practitioners in government, academia, and the private sector.

Product details

  • Paperback | 816 pages
  • 162 x 224 x 36mm | 1,312g
  • Wiley-Blackwell
  • Hoboken, United States
  • English
  • 1st edition
  • ISBN-10: 111926362X
  • ISBN-13: 9781119263623


Table of contents

List of Contributors xvii


Preface xxiii


Part I Assessing the Current Methodology for Questionnaire Design, Development, Testing, and Evaluation 1


1 Questionnaire Design, Development, Evaluation, and Testing: Where are We, and Where are We Headed? 3
Gordon B. Willis


1.1 Current State of the Art and Science of QDET 3


1.2 Relevance of QDET in the Evolving World of Surveys 11


1.3 Looking Ahead: Further Developments in QDET 16


1.4 Conclusion 19


References 20


2 Asking the Right Questions in the Right Way: Six Needed Changes in Questionnaire Evaluation and Testing Methods 25
Don A. Dillman


2.1 Personal Experiences with Cognitive Interviews and Focus Groups 25


2.2 My 2002 Experience at QDET 29


2.3 Six Changes in Survey Research that Require New Perspectives on Questionnaire Evaluation and Testing 33


2.4 Conclusion 42


References 43


3 A Framework for Making Decisions about Question Evaluation Methods 47
Roger Tourangeau, Aaron Maitland, Darby Steiger, and Ting Yan


3.1 Introduction 47


3.2 Expert Reviews 48


3.3 Laboratory Methods 51


3.4 Field Methods 55


3.5 Statistical Modeling for Data Quality 59


3.6 Comparing Different Methods 63


3.7 Recommendations 67


References 69


4 A Comparison of Five Question Evaluation Methods in Predicting the Validity of Respondent Answers to Factual Items 75
Aaron Maitland and Stanley Presser


4.1 Introduction 75


4.2 Methods 76


4.3 Results 79


4.4 Discussion 84


References 85


5 Combining Multiple Question Evaluation Methods: What Does It Mean When the Data Appear to Conflict? 91
Jo d'Ardenne and Debbie Collins


5.1 Introduction 91


5.2 Questionnaire Development Stages 92


5.3 Selection of Case Studies 93


5.4 Case Study 1: Conflicting Findings Between Focus Groups and Cognitive Interviews 95


5.5 Case Study 2: Conflicting Findings Between Eye-Tracking, Respondent Debriefing Questions, and Interviewer Feedback 97


5.6 Case Study 3: Complementary Findings Between Cognitive Interviews and Interviewer Feedback 100


5.7 Case Study 4: Combining Qualitative and Quantitative Data to Assess Changes to a Travel Diary 104


5.8 Framework of QT Methods 110


5.9 Summary and Discussion 110


References 114


Part II Question Characteristics, Response Burden, and Data Quality 117


6 The Role of Question Characteristics in Designing and Evaluating Survey Questions 119
Jennifer Dykema, Nora Cate Schaeffer, Dana Garbarski, and Michael Hout


6.1 Introduction 119


6.2 Overview of Some of the Approaches Used to Conceptualize, Measure, and Code Question Characteristics 120


6.3 Taxonomy of Question Characteristics 127


6.4 Case Studies 132


6.5 Discussion 141


Acknowledgments 147


References 148


7 Exploring the Associations Between Question Characteristics, Respondent Characteristics, Interviewer Performance Measures, and Survey Data Quality 153
James M. Dahlhamer, Aaron Maitland, Heather Ridolfo, Antuane Allen, and Dynesha Brooks


7.1 Introduction 153


7.2 Methods 157


7.3 Results 174


7.4 Discussion 182


Disclaimer 191


References 191


8 Response Burden: What is it and What Predicts It? 193
Ting Yan, Scott Fricker, and Shirley Tsai


8.1 Introduction 193


8.2 Methods 197


8.3 Results 202


8.4 Conclusions and Discussion 206


Acknowledgments 210


References 210


9 The Salience of Survey Burden and Its Effect on Response Behavior to Skip Questions: Experimental Results from Telephone and Web Surveys 213
Frauke Kreuter, Stephanie Eckman, and Roger Tourangeau


9.1 Introduction 213


9.2 Study Designs and Methods 216


9.3 Manipulating the Interleafed Format 219


9.4 Discussion and Conclusion 224


Acknowledgments 226


References 227


10 A Comparison of Fully Labeled and Top-Labeled Grid Question Formats 229
Jolene D. Smyth and Kristen Olson


10.1 Introduction 229


10.2 Data and Methods 236


10.3 Findings 243


10.4 Discussion and Conclusions 253


Acknowledgments 254


References 255


11 The Effects of Task Difficulty and Conversational Cueing on Answer Formatting Problems in Surveys 259
Yfke Ongena and Sanne Unger


11.1 Introduction 259


11.2 Factors Contributing to Respondents' Formatting Problems 262


11.3 Hypotheses 267


11.4 Method and Data 268


11.5 Results 275


11.6 Discussion and Conclusion 278


11.7 Further Expansion of the Current Study 281


11.8 Conclusions 282


References 283


Part III Improving Questionnaires on the Web and Mobile Devices 287


12 A Compendium of Web and Mobile Survey Pretesting Methods 289
Emily Geisen and Joe Murphy


12.1 Introduction 289


12.2 Review of Traditional Pretesting Methods 290


12.3 Emerging Pretesting Methods 294


References 308


13 Usability Testing Online Questionnaires: Experiences at the U.S. Census Bureau 315
Elizabeth Nichols, Erica Olmsted-Hawala, Temika Holland, and Amy Anderson Riemer


13.1 Introduction 315


13.2 History of Usability Testing Self-Administered Surveys at the US Census Bureau 316


13.3 Current Usability Practices at the Census Bureau 317


13.4 Participants: "Real Users, Not User Stories" 320


13.5 Building Usability Testing into the Development Life Cycle 323


13.6 Measuring Accuracy 327


13.7 Measuring Efficiency 331


13.8 Measuring Satisfaction 335


13.9 Retrospective Probing and Debriefing 337


13.10 Communicating Findings with the Development Team 339


13.11 Assessing Whether Usability Test Recommendations Worked 340


13.12 Conclusions 341


References 341


14 How Mobile Device Screen Size Affects Data Collected in Web Surveys 349
Daniele Toninelli and Melanie Revilla


14.1 Introduction 349


14.2 Literature Review 350


14.3 Our Contribution and Hypotheses 352


14.4 Data Collection and Method 355


14.5 Main Results 361


14.6 Discussion 368


Acknowledgments 369


References 370


15 Optimizing Grid Questions for Smartphones: A Comparison of Optimized and Non-Optimized Designs and Effects on Data Quality on Different Devices 375
Trine Dale and Heidi Walsoe


15.1 Introduction 375


15.2 The Need for Change in Questionnaire Design Practices 376


15.3 Contribution and Research Questions 378


15.4 Data Collection and Methodology 380


15.5 Main Results 386


15.6 Discussion 392


Acknowledgments 397


References 397


16 Learning from Mouse Movements: Improving Questionnaires and Respondents' User Experience Through Passive Data Collection 403
Rachel Horwitz, Sarah Brockhaus, Felix Henninger, Pascal J. Kieslich, Malte Schierholz, Florian Keusch, and Frauke Kreuter


16.1 Introduction 403


16.2 Background 404


16.3 Data 409


16.4 Methodology 410


16.5 Results 415


16.6 Discussion 420


References 423


17 Using Targeted Embedded Probes to Quantify Cognitive Interviewing Findings 427
Paul Scanlon


17.1 Introduction 427


17.2 The NCHS Research and Development Survey 431


17.3 Findings 433


17.4 Discussion 445


References 448


18 The Practice of Cognitive Interviewing Through Web Probing 451
Stephanie Fowler and Gordon B. Willis


18.1 Introduction 451


18.2 Methodological Issues in the Use of Web Probing for Pretesting 452


18.3 Testing the Effect of Probe Placement 453


18.4 Analyses of Responses to Web Probes 455


18.5 Qualitative Analysis of Responses to Probes 459


18.6 Qualitative Coding of Responses 459


18.7 Current State of the Use of Web Probes 462


18.8 Limitations 465


18.9 Recommendations for the Application and Further Evaluation of Web Probes 466


18.10 Conclusion 468


Acknowledgments 468


References 468


Part IV Cross-Cultural and Cross-National Questionnaire Design and Evaluation 471


19 Optimizing Questionnaire Design in Cross-National and Cross-Cultural Surveys 473
Tom W. Smith


19.1 Introduction 473


19.2 The Total Survey Error Paradigm and Comparison Error 474


19.3 Cross-Cultural Survey Guidelines and Resources 477


19.4 Translation 478


19.5 Developing Comparative Scales 480


19.6 Focus Groups and Pretesting in Cross-National/Cultural Surveys 483


19.7 Tools for Developing and Managing Cross-National Surveys 484


19.8 Resources for Developing and Testing Cross-National Measures 485


19.9 Pre- and Post-Harmonization 486


19.10 Conclusion 488


References 488


20 A Model for Cross-National Questionnaire Design and Pretesting 493
Rory Fitzgerald and Diana Zavala-Rojas


20.1 Introduction 493


20.2 Background 493


20.3 The European Social Survey 495


20.4 ESS Questionnaire Design Approach 496


20.5 Critique of the Seven-Stage Approach 497


20.6 A Model for Cross-National Questionnaire Design and Pretesting 497


20.7 Evaluation of the Model for Cross-National Questionnaire Design and Pretesting Using the Logical Framework Matrix (LFM) 501


20.8 Conclusions 512


References 514


21 Cross-National Web Probing: An Overview of Its Methodology and Its Use in Cross-National Studies 521
Dorothee Behr, Katharina Meitinger, Michael Braun, and Lars Kaczmirek


21.1 Introduction 521


21.2 Cross-National Web Probing - Its Goal, Strengths, and Weaknesses 523


21.3 Access to Respondents Across Countries: The Example of Online Access Panels and Probability-Based Panels 526


21.4 Implementation of Standardized Probes 527


21.5 Translation and Coding Answers to Cross-Cultural Probes 532


21.6 Substantive Results 533


21.7 Cross-National Web Probing and Its Application Throughout the Survey Life Cycle 536


21.8 Conclusions and Outlook 538


Acknowledgments 539


References 539


22 Measuring Disability Equality in Europe: Design and Development of the European Health and Social Integration Survey Questionnaire 545
Amanda Wilmot


22.1 Introduction 545


22.2 Background 546


22.3 Questionnaire Design 548


22.4 Questionnaire Development and Testing 553


22.5 Survey Implementation 560


22.6 Lessons Learned 563


22.7 Final Reflections 566


Acknowledgments 567


References 567


Part V Extensions and Applications 571


23 Regression-Based Response Probing for Assessing the Validity of Survey Questions 573
Patrick Sturgis, Ian Brunton-Smith, and Jonathan Jackson


23.1 Introduction 573


23.2 Cognitive Methods for Assessing Question Validity 574


23.3 Regression-Based Response Probing 577


23.4 Example 1: Generalized Trust 579


23.5 Example 2: Fear of Crime 580


23.6 Data 581


23.7 Discussion 586


References 588


24 The Interplay Between Survey Research and Psychometrics, with a Focus on Validity Theory 593
Bruno D. Zumbo and Jose-Luis Padilla


24.1 Introduction 593


24.2 An Over-the-Shoulder Look Back at Validity Theory and Validation Practices with an Eye toward Describing Contemporary Validity Theories 595


24.3 An Approach to Validity that Bridges Psychometrics and Survey Design 602


24.4 Closing Remarks 606


References 608


25 Quality-Driven Approaches for Managing Complex Cognitive Testing Projects 613
Martha Stapleton, Darby Steiger, and Mary C. Davis


25.1 Introduction 613


25.2 Characteristics of the Four Cognitive Testing Projects 614


25.3 Identifying Detailed, Quality-Driven Management Approaches for Qualitative Research 615


25.4 Identifying Principles for Developing Quality-Driven Management Approaches 616


25.5 Applying the Concepts of Transparency and Consistency 617


25.6 The 13 Quality-Driven Management Approaches 618


25.7 Discussion and Conclusion 632


References 634


26 Using Iterative, Small-Scale Quantitative and Qualitative Studies: A Review of 15 Years of Research to Redesign a Major US Federal Government Survey 639
Joanne Pascale


26.1 Introduction 639


26.2 Measurement Issues in Health Insurance 641


26.3 Methods and Results 645


26.4 Discussion 660


26.5 Final Reflections 663


References 664


27 Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey 671
Robin L. Kaplan, Brandon Kopp, and Polly Phipps


27.1 Introduction 671


27.2 The Sleep Gap 672


27.3 The Present Research 674


27.4 Study 1: Behavior Coding 675


27.5 Study 2: Cognitive Interviews 678


27.6 Study 3: Quantitative Study 682


27.7 Study 4: Validation Study 686


27.8 General Discussion 689


27.9 Implications and Future Directions 692


References 692


28 Questionnaire Design Issues in Mail Surveys of All Adults in a Household 697
Douglas Williams, J. Michael Brick, W. Sherman Edwards, and Pamela Giambo


28.1 Introduction 697


28.2 Background 698


28.3 The NCVS and Mail Survey Design Challenges 699


28.4 Field Test Methods and Design 704


28.5 Outcome Measures 706


28.6 Findings 708


28.7 Summary 716


28.8 Discussion 716


28.9 Conclusion 719


References 720


29 Planning Your Multimethod Questionnaire Testing Bento Box: Complementary Methods for a Well-Balanced Test 723
Jaki S. McCarthy


29.1 Introduction 723


29.2 A Questionnaire Testing Bento Box 725


29.3 Examples from the Census of Agriculture Questionnaire Testing Bento Box 733


29.4 Conclusion 743


References 744


30 Flexible Pretesting on a Tight Budget: Using Multiple Dependent Methods to Maximize Effort-Return Trade-Offs 749
Matt Jans, Jody L. Herman, Joseph Viana, David Grant, Royce Park, Bianca D.M. Wilson, Jane Tom, Nicole Lordi, and Sue Holtby


30.1 Introduction 749


30.2 Evolution of a Dependent Pretesting Approach for Gender Identity Measurement 752


30.3 Analyzing and Synthesizing Results 759


30.4 Discussion 764


Acknowledgments 766


References 766


Index 769

About the editors

PAUL C. BEATTY is Chief of the Center for Behavioral Science Methods at the U.S. Census Bureau.
DEBBIE COLLINS is a Senior Research Director at the National Centre for Social Research, UK.
LYN KAYE is a consultant in Survey Research Methods and was previously Senior Researcher at Statistics New Zealand.
JOSE-LUIS PADILLA is Professor of Methodology of Behavioral Sciences at the University of Granada, Spain.
GORDON B. WILLIS is a Cognitive Psychologist at the National Cancer Institute, National Institutes of Health, USA.
AMANDA WILMOT is a Senior Study Director at Westat, USA.