Card-Sort Usability Testing: Final Report
May 2004
Katherine Dexter Willis
Yinon Bentor
For the Communication, Publications, and Marketing Web Redesign Subgroup
Executive Summary
A card-sort usability test involving the organization of 58 terms (related to general library
information and services) was conducted with the purpose of determining user preferences for
the grouping and labeling of information on the NCSU Libraries web site. There were six
participants.
There were very few suggestions for alternative terminology, and context played an important
role in the understanding of many terms. A participant’s understanding of a term was not related
to the relative importance of the term once its meaning was clarified.
The most confusing terms were found to be: Citation Tools, A-Z, Forms, Tutorials, Instruction, and Library Publications. Recommended categories include: Library Information, Library Services, Computing, Directories, My Borrowing Record, Research Assistance, Course Reserves, Accessibility, Employment, News and Events, Help, and Site Index.
Background
The Communication, Publications, and Marketing (CPM) web redesign subgroup worked with
the Usability testing subgroup to conduct a series of usability tests based on the card-sorting
protocol. The purpose of this usability testing was to examine user preferences for grouping,
labeling, and placement of links and other items on the NCSU Libraries web site, with a focus on
services and information related to communications, publications, and marketing.
Specific test objectives were as follows:
- To determine how users organize the major features of the NCSU Libraries web site.
- To determine the words users employ to describe various features, services, and types of information available on the web site.
- To identify which services and types of information should, from the user's perspective, have prominence on the web site.
Methodology
Eight participants were recruited for this usability test. Two were no-shows, so the final data set consists of six participants. The tests were arranged on an individual basis so the test monitors could focus
on the individual comments, questions, and actions of the participants. Katherine Dexter Willis
(Usability subgroup member) and Yinon Bentor (student usability testing assistant) served as test
monitors for this series of tests. The tests were held either in the DML Collaboratory or the DLI
Conference Room.
The basic methodology of the card-sorting test is to develop a list of terms, put them on
individual cards, and then ask participants to arrange the provided cards into categories while
using the think-aloud protocol. For more information about the methodology that served as a
basis for this usability testing, see
http://www.usability.gov/methods/data_collection.html#sorting .
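
As a supplementary illustration (not part of the original test materials), the outcome of a single card-sort session can be recorded as a mapping from the participant's own category names to the cards placed in each category. The following Python sketch uses hypothetical category assignments, not recorded data:

# Hypothetical record of one card-sort session: the participant's own
# category names map to the cards (terms) placed in each category.
session = {
    "participant_id": "P1",
    "categories": {
        "Getting Help": ["Ask A Librarian", "FAQ'S", "Tutorials"],
        "Using Computers": ["Wireless Computing", "Computers & Laptops"],
        "About the Library": ["Hours", "Parking", "Directions"],
    },
}

# Invert the mapping so each term's placement can be compared across
# participants during analysis.
term_to_category = {
    term: category
    for category, terms in session["categories"].items()
    for term in terms
}

print(term_to_category["Tutorials"])  # -> Getting Help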
For this test, the terms were developed in conjunction with input from the CPM subgroup. Terms
were derived from the existing NCSU Libraries web site, as well as from informed suggestions
from CPM subgroup members. A total of 58 terms were used, with an emphasis on general
library information and services, particularly as they related to the area of communication,
publication, and marketing. Less emphasis was placed on terms that related to searching and the
research process, since this would have made the scope of the testing unwieldy, and it was an
area that was already being addressed by other subgroups.
During the testing, one of the test monitors took notes on the comments made by the participants.
There was a focus on noting terms that were confusing, terms that were of particular interest and
importance, and any suggestions for alternative terminology or approaches to category
organizations. When testing was completed, the test monitors recorded the category names and
the arrangement of terms within each category.
The list of terms used for the testing is included as Appendix A. The specific task list that
participants and test monitors followed during the testing is included as Appendix B.
Results
Six people participated in this testing. We attempted to recruit a combination of undergraduate and graduate students, but this was not successful. The six participants consisted of five graduate students and one Friends of the Library member. The majority were international students enrolled in an
engineering/computer science program. All were fairly familiar with the Libraries.
Key results from the testing are related to: terms found to be particularly confusing, frequently
identified categories, the relative importance of identified categories, terms frequently grouped
together, and alternative terminology suggestions.
Confusing Terms
Based on participant comments, several of the terms used in the testing were identified as being
particularly confusing. These terms were as follows:
Term                   Number of Participants Indicating Confusion
Citation Tools         3 (50%)
A-Z                    4 (66%)
Forms                  4 (66%)
Tutorials              5 (83%)
Instruction            5 (83%)
Library Publications   5 (83%)
Categorization
The table below describes categories frequently established by participants and their relative
importance or priority. The names of the categories were normalized to reflect the most common
name selected for that particular category of information. Two of the participants did not provide
priority ranking of their categories, but the other four participants ranked category importance
and priority, and this data was averaged.
Category Name         Percentage of Participants Who    Relative Priority Ranking of
                      Established the Category          Category Importance
Library Information   67% (4 participants)              HIGH
Library Services      67% (4)                           MEDIUM
Help                  67% (4)                           MEDIUM LOW
Computing             50% (3)                           MEDIUM HIGH
Directories           50% (3)                           MEDIUM LOW
My Borrowing Record   50% (3)                           HIGH
Research Assistance   50% (3)                           MEDIUM LOW
Course Reserves       50% (3)                           MEDIUM
Employment            67% (4)                           Not assessed
News and Events       67% (4)                           Not assessed
Accessibility         50% (3)                           Not assessed
Site Index            50% (3)                           Not assessed
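
The report states that the priority data from the four ranking participants was averaged, but not how. Purely as an illustrative sketch, one plausible approach is to map the rank labels onto a numeric scale, average across the participants who provided rankings, and map the mean back to the nearest label; the 4-point scale below is an assumption, not the documented method:

# Assumed 4-point scale for illustration only; the report does not
# specify how rank labels were converted for averaging.
SCALE = {"HIGH": 4, "MEDIUM HIGH": 3, "MEDIUM": 2, "MEDIUM LOW": 1}
LABELS = {value: label for label, value in SCALE.items()}

def average_priority(rankings):
    # Average the numeric ranks, then map back to the closest label.
    mean = sum(SCALE[r] for r in rankings) / len(rankings)
    return LABELS[round(mean)]

# Hypothetical example: four participants rank one category.
print(average_priority(["HIGH", "HIGH", "MEDIUM HIGH", "HIGH"]))  # HIGH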
Several categories were not assessed for importance because participants treated them separately. Participants felt that these terms were not standard categories, but still represented information that should be easily available, and therefore should be listed separately on the web site.
Frequently Grouped Terms – Category Content
Several terms were frequently grouped together in the same categories by multiple participants.
Based on a 0.70 threshold (meaning 4 or more of the 6 participants), the following terms were most frequently grouped together in the following categories:
Library Information: Giving, Library Publications, Policies, Parking, Directions, Virtual Tour, Visitors, Branch Libraries, Hours, Employment, Hill of Beans Coffee Shop, Friends of the Library, News & Events, Copyright Issues

Directories: Staff, Phone Numbers, Contact Us, Departments

Computing: Distance Learning Services, Computers & Laptops, Nomadic Computing, Connect From Off-Campus, Wireless Computing

Library Services: Borrowing, Citation Tools, Recommend A Purchase, Renewal Services, Photocopying, My Borrowing Record, Interlibrary Loan, Theses/Dissertations, Forms, Course Reserves

Help: Where Is?, Site Index, How Do I?, Research Assistance, Accessibility, Lost & Found, FAQ’s, Ask A Librarian, Tutorials
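
The report does not describe how the 0.70 threshold was applied. As a hypothetical sketch of one way such pairwise grouping frequencies can be computed from the individual sorts (using toy data, not the actual test data):

from collections import Counter
from itertools import combinations

# Toy data: one dict per participant, mapping each term to the category
# that participant placed it in.
sorts = [
    {"Hours": "Library Info", "Parking": "Library Info", "Staff": "Directories"},
    {"Hours": "About", "Parking": "About", "Staff": "People"},
    {"Hours": "Visiting", "Parking": "Visiting", "Staff": "People"},
]

# Count how often each pair of terms landed in the same category.
pair_counts = Counter()
for sort in sorts:
    for a, b in combinations(sorted(sort), 2):
        if sort[a] == sort[b]:
            pair_counts[(a, b)] += 1

# Keep pairs grouped together by at least 70% of participants.
threshold = 0.70 * len(sorts)
frequent = {pair: n for pair, n in pair_counts.items() if n >= threshold}
print(frequent)  # {('Hours', 'Parking'): 3}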
Infrequently Grouped Terms – Category Disagreement
There were several terms that participants categorized very differently. In particular, there were 3
terms that all 6 participants put in completely different categories: Reference, Course Reserves,
and Contact Us. There were another 3 terms where at least 5 of the 6 participants disagreed on
appropriate categorization: Copyright Issues, Directions, and Virtual Tour.
Alternative Terminology Suggestions
Participants made very few suggestions for alternative terminology. Terms were renamed a total of only 12 times out of 348 opportunities (6 participants each working with 58 terms), an alternative terminology suggestion rate of only 3.4%. This is too small a sample from which to draw any useful conclusions about specific alternative terminology.
Findings
Variety in Approach
The participants seemed to find the task enjoyable. Each had a very different approach to the
task, with some talking more and making many modifications throughout the process and others
taking a more deliberate approach to completing the task before providing extended discussion
of their thought process. Half of the participants created a fairly simple categorization scheme,
with only 3 or 4 major categories. The other half created more detailed and complex categories, with some sub-categorization and cross-linking. This may have been due in part to varying levels of comfort with the terms, the library, and with hierarchical structure.
Culture, Authority, and the Lack of Alternative Terminology Suggestions
Language barriers and cultural differences likely affected some of the results, as indicated by the
frequent requests for clarification of terminology by some participants. Few participants suggested alternative terminology – most seemed willing to work with the existing terms once they understood their meaning. The impression of the test monitors was that this reluctance may in part have been due to cultural differences and a culturally based unwillingness to
question an established and respected authority such as a library. As an example, one participant
initially thought that the term TripSaver referred to recycling, and another thought it referred
specifically to on-campus document delivery. Neither, however, renamed the term once it was
explained and the meaning clarified. In another example, one participant thought the term
Friends of the Library referred to TRLN members – but again, no effort was made to rename the
term once it was clarified.
The Importance of Context
Based on participant comments, the test monitors believe that context – or lack thereof – played
an important role in how participants viewed the terms. It is also the impression of the test
monitors that in some cases, terms were categorized without a full understanding of their
meaning, or without an understanding of the subtle similarities and differences between some of
the terms used.
The card-sort protocol inherently presents terms in a vacuum, with little information about how the terms might appear in relation to each other. This is both the benefit and the drawback of the
approach. Participants can create categories as they deem appropriate and without constraint, but
that can be a challenge if the participants do not fully understand the particular terms they are
categorizing. In many cases, once a particular term was explained, the participant then proceeded
to categorize it without questioning the validity of the term itself. One example of this is that
several participants grouped the terms Nomadic Computing and Wireless Computing together,
without questioning the distinction between the two – or indeed whether there is a distinction
worth preserving when providing this information. Most also did not rename either term. The
simple context of “this information is about computer resources” was enough in the setting of the
card-sort environment. When searching for this information in a web-based environment,
however, it is highly unlikely that this context would be sufficient. If both terms were listed on a
web site, it is very likely that users would be confused by the inclusion of both terms and the
implied (but likely unclear) distinction between the two.
Understanding and Importance: Two Separate Issues Affected by Context
The results were highlighted by one key conundrum: some of the same terms that were
considered most important to the participants were also considered the most confusing.
Examples include the terms ‘Instruction,’ ‘Citation Tools,’ and ‘Forms.’ Once explained,
participants clearly understood the significance and importance of these terms – but this only
occurred when they were provided with additional information about the term. For example, in and of itself the term ‘Forms’ was very confusing (forms for what?) – but once it was explained that this referred to making requests for specific services online (e.g., Interlibrary Loan and Document Delivery), the term made sense. The test monitors believe that, if carefully addressed, this confusion can be mitigated by the inherent context that will be provided by the design of the new web site.
Lack of Category Content Agreement: Implications for Cross-Linking
Participants sometimes had strong opinions on certain topics but were uncertain about the best
implementation of those ideas. For example, most commented that the hours for the Hill of
Beans were important to them and should be easily available, but they were not sure of the best
location for that information (as a stand-alone item, or included with other hours information?).
Also, there was some division about how best to present services and resources for students and
visitors – to highlight them specifically on the homepage (e.g., a link to “Student Jobs” or “Visitor Information”) or to integrate the information into more general categories that lead to
information about the services and resources of the Libraries. Another example of a complex
term is “Instruction.” This tended to be viewed as both a service (that faculty might want to
specifically request) and as a tool for research assistance (through training and individual
consultations). In cases such as this, it may be helpful to list the term in multiple locations.
The results also indicated several other terms that were categorized very differently by
participants. The terms Reference, Course Reserves, and Contact Us were categorized differently
by all 6 participants. Copyright Issues, Directions, and Virtual Tour were categorized differently
by 5 out of the 6 participants. Part of this may be due to confusion about the actual meaning of
some terms, as discussed above (e.g., what information would be found under the term Copyright
Issues?). Another contributing factor might again be context or lack thereof (Who specifically
would be reached by clicking on Contact Us?). Finally, it may also be that some terms are just so
important that they need to be listed in multiple places on the web site.
Concluding Comments
It is unfortunate that we were unable to successfully recruit undergraduates to participate in this
testing. This in part may have been due to the timing of the study (at the end of the spring
semester). It would have been helpful to have some data from the undergraduate perspective, but
at the same time the test monitors postulate that the results of the study would not have been
vastly different had there been undergraduate participation. The basic categories suggested by
the majority of the test participants are logical and none are particularly surprising. In addition,
the results correlate with other data gathered during the web redesign process.
Recommendations
Analysis of the data leads to the following recommendations for major categories or links on the NCSU Libraries homepage, along with selected examples of the types of information recommended for inclusion within each category (including cross-linking between categories).
Recommended Categories
Library Information
Computing
My Borrowing Record
Course Reserves
Employment
Help
Library Services
Directories
Research Assistance
Accessibility
News and Events
Site Index
Recommended Information in Selected Categories
Terms listed under more than one category indicate cross-linking.

Library Services: Distance Learning Services, Printing, InterLibrary Loan, Instruction, [Online Request] Forms, Photocopying, Lost and Found, Course Reserves, Research Assistance

Library Information: Hours, Parking, Directions, Directories, Giving/Friends of the Library, Branch Libraries, Virtual Tour, Policies, Hill of Beans

Directories: Phone Numbers, Staff, Departments, Branch Libraries

Help: Ask a Librarian, FAQ, Where Is…, How Do I…, Lost & Found, Tutorials

Research Assistance: Citation Tools, Instruction, Theses & Dissertations

Computing: Connect From Off-Campus, Nomadic Computing, Wireless Computing, Distance Learning Services, Computers and Laptops
Appendix A: Term List
CITATION TOOLS
REFERENCE
PRINTING
PARKING
LIBRARY PUBLICATIONS
COURSE RESERVES
LOST & FOUND
DIRECTORIES
STAFF
LOCATIONS
A-Z
HELP
ABOUT THE LIBRARY
ACCESSIBILITY
ASK A LIBRARIAN
BORROWING
BRANCH LIBRARIES
CALL NUMBERS
COMPUTERS & LAPTOPS
CONNECT FROM OFF-CAMPUS
CONTACT US
COPYRIGHT ISSUES
DEPARTMENTS
DIRECTIONS
DISTANCE LEARNING SERVICES
EMPLOYMENT
E-RESERVES
FAQ’S
FORMS
FRIENDS OF THE LIBRARY
GIVING
HILL OF BEANS COFFEE SHOP
HOURS
HOW DO I…?
INSTRUCTION
INTERLIBRARY LOAN
JOBS
LIBRARY INFORMATION
LIBRARY SERVICES
MY BORROWING RECORD
RECOMMEND A PURCHASE
NEWS & EVENTS
NOMADIC COMPUTING
PHONE NUMBERS
PHOTOCOPYING
POLICIES
RENEWAL SERVICES
RESEARCH ASSISTANCE
RESERVE ITEMS
SERVICES TO USERS WITH DISABILITIES
SITE INDEX
THESES/DISSERTATIONS
TRIPSAVER
VIRTUAL TOUR
VISITORS
TUTORIALS
WHERE IS…?
WIRELESS COMPUTING
Appendix B: Task List
1. Explain to the user the purpose of this usability test – to help identify what categories of
information should be on the site's home page and what those categories should be called.
Explain that we want to see what groupings of cards make sense to the user and that when the
user has grouped the cards, he/she will be asked to name the groups.
2. Ask the user to talk out loud while working, so as to better understand the user's thoughts and
rationale.
3. Have the user arrange the cards into groups. Let the user add cards, or put cards aside to
indicate topics the user would not want on the site. Minimize interruptions, but encourage the
user to think aloud.
4. Give the user a different colored card for each group and ask the user to name the group. What
words would the user expect to see on the home page or second-level page that would lead the
user to that particular group of cards?
5. Finally, ask the user to create hierarchies of the groups based on which groups they consider to
be critical, important, or unimportant.