Can Survey Respondents with Visual Deficits Complete My Web Survey?
DC-AAPOR
Web Survey Workshop
September 10, 2009
Lawrence A. Malakhoff
U.S. Census Bureau
Overview
• Defining Accessibility
• Testing Methodology
• Usage of Color
• Visual Focus
• Reading Order
• Techniques to Reduce Memory Burden
• Navigation Instructions
• Types of Web Surveys
An Unfamiliar Requirement
• An RFP from a Federal agency for a Web survey requires the software to conform to Section 508.
• A Google search for “Section 508” returns 13.8 million hits, with references to standards and checklists.
• The Web survey designer must understand this requirement before design begins.
Defining Accessibility
• “Section 508 requires that when Federal agencies develop, procure, maintain, or use electronic and information technology, Federal employees with disabilities have access to and use of information and data that is comparable to the access and use by Federal employees who are not individuals with disabilities, unless an undue burden would be imposed on the agency.” (1194.1)
• Applies to Federal Internet and Intranet Web sites, forms, Web surveys, and desktop applications since 6/2001.
What is an Accessible Web Survey?
• Usable
• Conforms to Section 508
• Enables the screen-reader user to experience the same visual sequence of questions, answer choices, skip patterns, and instructions
• Single accessible version, versus separate versions
• The design process includes accessibility
Automated Tools
• Automated tools are available: Cynthia Says (free), AccVerify, and InFocus
• Tools do not interpret results within the context of the page.
• Content can be accessible, but not usable, if it is unstructured (see the sketch below).
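For illustration, here is a minimal TypeScript/DOM sketch (the question wording and ids are invented). Both fragments could pass an automated checker, but only the second groups the response options under the question, so a screen reader announces them in context:

  // Unstructured: the options are not programmatically tied to the question.
  const flat = document.createElement("div");
  flat.innerHTML = `
    <p>Do you rent or own your residence?</p>
    <input type="radio" name="tenure" id="rent"> <label for="rent">Rent</label>
    <input type="radio" name="tenure" id="own"> <label for="own">Own</label>
  `;

  // Structured: fieldset/legend tie the options to the question text.
  const grouped = document.createElement("fieldset");
  grouped.innerHTML = `
    <legend>Do you rent or own your residence?</legend>
    <input type="radio" name="tenure2" id="rent2"> <label for="rent2">Rent</label>
    <input type="radio" name="tenure2" id="own2"> <label for="own2">Own</label>
  `;

  document.body.append(flat, grouped);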
Testing Methodology
• Focused on persons with visual impairments.
• Used the InFocus automated testing tool.
• Used the Job Access With Speech (JAWS) screen reader to verify accessibility findings.
• Tested related elements (HELP, FAQs, etc.) with JAWS and Adobe Acrobat.
Usage of Color
• “Color coding shall not be used as the only means of conveying information, indicating an action, prompting a response, or distinguishing a visual element.” (1194.21(i))
• Users with a color deficit see in shades of gray.
• Should the instruction say “click on the green button” or “click on the Go button”?
[Slide shows a green button labeled “GO”.]
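A minimal TypeScript/DOM sketch of the point above (the click handler is a placeholder): the button's purpose is carried by its text label, so it stays clear even when the green styling reads as gray:

  const goButton = document.createElement("button");
  goButton.type = "button";
  goButton.textContent = "Go";              // the text label conveys the action
  goButton.style.backgroundColor = "green"; // color is reinforcement, not the only cue
  goButton.addEventListener("click", () => {
    console.log("Next page requested");     // placeholder for real paging logic
  });
  document.body.appendChild(goButton);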
Visual Focus
• “A well-defined on-screen indication of the current focus shall be provided that moves among interactive interface elements as the input focus changes. The focus shall be programmatically exposed so that assistive technology can track focus and focus changes.” (1194.21(c))
• Visual focus is shown as a dotted rectangle around the current button or link and changes when the user presses the Tab key.
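A minimal sketch using standard CSS and DOM focus events, one way this provision can be met; the selector list and logging are illustrative only:

  // Keep a well-defined focus indicator on interactive elements.
  const style = document.createElement("style");
  style.textContent = `
    a:focus, button:focus, input:focus, select:focus {
      outline: 2px dotted #000;  /* dotted rectangle around the focused control */
      outline-offset: 2px;
    }
  `;
  document.head.appendChild(style);

  // Focus changes raised by tabbing surface as ordinary DOM events,
  // which assistive technology can track programmatically.
  document.addEventListener("focusin", (event) => {
    const target = event.target as HTMLElement;
    console.log("Focus moved to:", target.tagName, target.id || "(no id)");
  });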
[Figure: visual focus is shown as the user tabs through the page.]
Reading Order
[Figure: reading starts at the question text (1), moves to the response options (2), and ends at the navigation buttons (3).]
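A minimal TypeScript/DOM sketch (question wording and ids are invented): the source order matches the intended reading order of question text, then response field, then navigation, because a screen reader follows source order rather than visual position:

  const form = document.createElement("form");

  const question = document.createElement("p");
  question.id = "q1-text";
  question.textContent = "How many rooms are in your residence?"; // read first

  const answer = document.createElement("input");
  answer.type = "number";
  answer.setAttribute("aria-labelledby", "q1-text");              // read second

  const nextButton = document.createElement("button");
  nextButton.type = "submit";
  nextButton.textContent = "Next";                                // read last

  form.append(question, answer, nextButton);
  document.body.appendChild(form);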
Placement of Instructions and Memory Burden
• Lengthy instructions can interfere with recall of the original question.
• Better to list topic, instructions, question, then response data-entry field or options, in that order (sketched below).
[Figure: two layouts compared. Not preferred: Question, then Instructions 1–3, then the response field. Preferred: Topic, then Instructions 1–3, then the Question, then the response field.]
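A minimal TypeScript/DOM sketch of the preferred order (the topic, instructions, and question wording are invented): the question sits immediately above the response field, so it is still fresh when the respondent reaches the input:

  const section = document.createElement("section");
  section.innerHTML = `
    <h2>Topic: Household utilities</h2>
    <ul>
      <li>Instruction 1.</li>
      <li>Instruction 2.</li>
      <li>Instruction 3.</li>
    </ul>
    <p id="q2-text">Question: What was last month's electric bill?</p>
    <label for="q2-response">Response:</label>
    <input id="q2-response" type="text" aria-describedby="q2-text">
  `;
  document.body.appendChild(section);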
Inferences and Memory Burden
[Figure: an earlier question reads “Please check all that apply to your residence:” with checkboxes for Condition 1, Condition 2, and Condition 3. Annotation: if condition 2 is checked, conditions 1 and 3 do not need to be present in later questions. A later question, “Please tell us the number of rooms,” lists instructions “If condition 1, …”, “If condition 2, …”, and “If condition 3, …” above a number-entry field.]
Inferences and Memory Burden (2)
• Use of “his”, “hers”, “he”, “she” is more engaging to the respondent than “this person.”
• Personal pronouns are preferable to a first name because they are less likely to be mispronounced by the screen reader.
[Figure: “How many children did this person have?” versus the preferred wording, “How many children did she have?”, each followed by a number-entry field.]
Stem and Leaf Questionnaire Structure
• Stem contains the first part of the question
• Two or more conditions (leaves) follow
• The second and later leaves pose a memory burden for screen-reader users
• Backward navigation may be necessary if stem text cannot be recalled
• Technically accessible, but poor usability
[Figure: example stem-and-leaf question. Stem: “These questions deal with your usage of some accessibility features of MS-Windows just during the last month. How much of the time last month did you use:” Leaves: Mouse Keys, Sticky Keys, and Filter Keys, each rated Always, Often, Sometimes, Seldom, or Never.]
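One way to ease this burden, sketched here in TypeScript/DOM with invented labels, is to repeat the stem in each leaf's accessible name, so a screen-reader user who lands on “Sticky Keys” does not have to navigate backward to recall the stem:

  const stem = "How much of the time last month did you use";
  const leaves = ["Mouse Keys", "Sticky Keys", "Filter Keys"];
  const scale = ["Always", "Often", "Sometimes", "Seldom", "Never"];

  const questionGroup = document.createElement("fieldset");
  for (const leaf of leaves) {
    const group = document.createElement("div");
    group.setAttribute("role", "radiogroup");
    group.setAttribute("aria-label", `${stem} ${leaf}?`); // stem repeated per leaf
    for (const option of scale) {
      const label = document.createElement("label");
      const radio = document.createElement("input");
      radio.type = "radio";
      radio.name = leaf;
      radio.value = option;
      label.append(radio, ` ${option}`);
      group.appendChild(label);
    }
    questionGroup.appendChild(group);
  }
  document.body.appendChild(questionGroup);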
Navigation Instructions
• Design for the differences between linear and random access
• Instructions to choose a link to the left or right are problematic
• Screen-reader users must guess at a navigation strategy
• “Click the link below” implies forward navigation (see the sketch below)
[Figure: a page of placeholder text containing the instructions “Click the button on the left” and “Click the button on the right,” a button labeled “right,” and a link to www.census.gov introduced by “For more information, click the link below.”]
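A minimal TypeScript/DOM sketch (the labels are invented): controls are named by what they do rather than by where they sit, so the instruction still makes sense in a linear screen-reader pass:

  const moreInfo = document.createElement("a");
  moreInfo.href = "https://www.census.gov";
  moreInfo.textContent = "More information about this survey"; // not "the link below"
  document.body.appendChild(moreInfo);

  const nextPage = document.createElement("button");
  nextPage.type = "button";
  nextPage.textContent = "Next question"; // not "the button on the right"
  document.body.appendChild(nextPage);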
Types of Web Surveys
• Screen-based
• Scrolling/paging
• A screen-based form offers a key advantage over a scrolling Web survey: reduction of memory burden.
Recommendations for Accessible Web Surveys
• Make it usable
• Make it accessible
• Ensure correct reading order
• Create a single accessible version
• Use questionnaire structures that reduce user memory burden
• Use a screen-based Web survey
References
• Cynthia Says: http://www.contentquality.com/
• AccVerify: http://www.hisoftware.com/products/accverify.html
• InFocus: https://www.ssbbartgroup.com/amp/infocus.php
• Moss, Trenton (2007). The Problem With Automated Accessibility Testing Tools. Retrieved 4/10/2009 from http://www.webcredible.co.uk/user-friendly-resources/web-accessibility/automated-tools.shtml
• Window-Eyes: http://www.gwmicro.com/
• JAWS: http://www.freedomscientific.com/
References (2)
• Peterson, L. R., & Peterson, M. J. (1959). Short-term retention of individual verbal items. Journal of Experimental Psychology, 58, 193-198.
• Usability Basics: http://www.usability.gov/basics
• Accessible forms: http://www.jimthatcher.com/webcourse8.htm
• WebAIM: http://www.webaim.org/intro/
• Section 508: http://www.section508.gov/
Contact
• [email protected]
• 301-763-3688
• Survey Practice: http://surveypractice.org/2009/06/29/508-guidelines/
• Questions?