Design Effects
Ethan Noel
WEC
Survey Process
• Operational design
• Two primary issues affect the choice of the method of data collection:
  • What is the most appropriate method for a particular research question?
  • What is the impact of a particular method of data collection on survey errors and cost?
[Survey lifecycle diagram: define research objectives -> choose mode of collection -> choose sampling frame -> construct and pretest questionnaire -> design and select sample -> design and implement data collection -> code and edit data -> make postsurvey adjustments -> perform analysis]
History and Future of Survey Research
• Groves (2011) – Three major eras
  • 1930–1960: The Era of Invention
  • 1960–1990: The Era of Innovation
  • 1990–present: “Designed Data” supplemented by “Organic Data”
• Smith (2013) – New techniques for data collection and how they compare to survey research
  • Data mining
  • Internet
  • Social media
  • Administrative data
  • Issues?
  • Solutions?
Data Collection
• The term can be slightly misleading
  • Survey data are usually produced/created at the time of the interview or the completion of a questionnaire.
• Historically there were three basic data collection methods:
  • Mailing paper questionnaires
  • Telephone interviews
  • Face-to-face interviews
• Computers altered the traditional methods and added new ones
• Combining modes allows researchers to minimize cost and error
The Evolution of Survey Technology
[Diagram: paper-based methods and their computer-assisted successors for each mode. Mail/SAQ: OCR/ICR, FAX, disk by mail, e-mail, Web. Telephone: CATI, TDE, IVR/TACASI. Face-to-face: CAPI, AudioCASI, VideoCASI.]
Experimental Integration
• Sniderman and Grob (1996) – Changing emphasis for experimental design
  • Split-ballot -> vignette -> CATI
• Non-directive designs – randomized assignment of respondents to question forms without an intent to sway, influence, or control the direction of their response (a minimal assignment sketch follows this list)
• Directive designs – experimental interventions are active and deliberate
  • Postdecisional
  • Predecisional
  • Issue framing
  • Context of choice
  • Characteristics of chooser
  • Mode
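A minimal sketch of the non-directive idea above, assuming a simple two-form split-ballot: hypothetical respondent IDs are randomly allocated to question forms, with no attempt to influence how anyone answers.

```python
import random

def assign_question_forms(respondent_ids, forms=("A", "B"), seed=42):
    """Randomly assign respondents to question forms in roughly equal numbers."""
    rng = random.Random(seed)          # fixed seed keeps the assignment reproducible
    ids = list(respondent_ids)
    rng.shuffle(ids)                   # random order, unrelated to any respondent trait
    # Deal respondents out to the forms in turn, so group sizes differ by at most one
    return {rid: forms[i % len(forms)] for i, rid in enumerate(ids)}

# Illustrative use with six hypothetical respondent IDs
print(assign_question_forms(["r1", "r2", "r3", "r4", "r5", "r6"]))
```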
Mode Selection Considerations
• Interviewer involvement
  • Face-to-face vs. mail survey
  • SAQ vs. ACASI
• Interaction between interviewer and respondent
  • More interaction yields more data -> more control over the measurement process
• Privacy
  • Different levels of privacy can influence answers
  • Interviewer presence
  • Modes to improve reporting?
• Channels of communication
• Technology use
Choosing a Mode
• No single mode is ideal for all survey applications
• Alternative approaches
  • Face-to-face and telephone surveys
  • Mail surveys and web surveys
  • Proximate modes as reasonable alternatives
• Considerations:
  • Sources of error
  • Logic/feasibility
  • Coverage and sampling frame
  • Appropriateness of the topic to the method
  • Cost constraints
  • Value of timely results
Mode Application
• Bassili (1993) – Comparison of two measures of attitude strength, accessibility and certainty, for the purpose of predicting discrepancies between voting intentions and voting behavior
  • Accessibility – response latency to a question of intent (a timing sketch follows this list)
  • Certainty – a question about the finality of the respondent’s intention
• CATI – computer-assisted telephone interviewing
  • 27% initial response rate -> 19% final response rate
• What were the benefits of using a CATI mode? Disadvantages?
• What other modes might have sufficed?
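A minimal sketch of how response latency (the accessibility measure above) could be captured in a computer-assisted interview, assuming latency is simply the time between presenting the question and logging the answer; the question text and console input are illustrative stand-ins for real CATI software.

```python
import time

def ask_with_latency(question):
    """Present a question, then time how long the recorded answer takes."""
    start = time.monotonic()            # clock starts when the question is delivered
    answer = input(question + " ")      # stand-in for the interviewer keying the response
    latency = time.monotonic() - start  # seconds elapsed = response latency
    return answer, latency

if __name__ == "__main__":
    answer, latency = ask_with_latency("Do you intend to vote in the next election (yes/no)?")
    print(f"answer={answer!r}, latency={latency:.2f}s")
```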
Mode Effects
• Aquilino (1994) – The effect of survey mode on respondents’ willingness to reveal illicit or undesirable behavior
  • Three randomly assigned modes:
    • SAQ – most likely to admit illicit drug use
    • Face-to-face – slightly less likely than with SAQ
    • Telephone – least likely to admit
• Privacy concerns are much greater when dealing with sensitive information
  • Social distance
  • Response anonymity increases willingness to reveal sensitive behavior (a comparison sketch follows this list)
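A minimal sketch of how admission rates across the three modes could be compared statistically, assuming hypothetical counts (not Aquilino’s actual data) and a standard chi-square test of independence from SciPy.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of respondents admitting / not admitting a sensitive behavior
# in each mode; the numbers are illustrative only, not Aquilino's (1994) results.
#          admit  not admit
counts = [
    [120, 380],   # SAQ
    [100, 400],   # face-to-face
    [70, 430],    # telephone
]

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
```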
Designing a Questionnaire
• Dillman – Social exchange theory: questionnaire recipients are most likely to respond if they expect the perceived benefits of responding to outweigh the perceived costs
• Respondent burden
• Channels of communication
  • Mail questionnaires
  • Telephone questionnaires
Considerations
• Sampling
  • Available sampling frames (mail, web, telephone surveys)
  • Method choice often has indirect sample design implications
  • Cost/efficiency
• Coverage
  • Respondent access to phones/Internet
  • Researcher access to mailing lists
  • Considerations for mode choice: speed, importance of precision, existing inferences (e.g., telephone ownership correlated with voting participation)
Considerations
• Response rates
  • Face-to-face > telephone > mail
  • No evidence that technology affects response rates in interviewer-administered surveys
  • Self-administered surveys see higher response rates with paper-based methods than with electronic equivalents
• Measurement quality
  • Completeness of the data, social desirability bias, and response effects (wording, ordering)
  • Open-ended questions vs. closed questions
• Sources of error