MAPS ‘n’ Me
(Memory Aiding Prompting System)
Stefan Carmien 3.30.05
L3D presentation
Overview
• Part 1 - Rehearsal of CHI presentation
• Part 2 - Recent work
• Part 3 - Left to do
End User Programming and Context Responsiveness in Handheld Prompting Systems for Persons with Cognitive Disabilities and Caregivers
Stefan Carmien
Center for LifeLong Learning and Design
CHI 05
April 2005
Introduction to MAPS: design problem
• Designing for persons with cognitive disabilities
• Assistive Technology and support for Activities of Daily Life (ADL)
• Abandonment Problem
• (Re)configuration
Related Research
• Traditional prompting support
• Distributed Cognition
• Scaffolding & Situated Action
• Existing Computer-Based Prompting Tools
Why other applications did not thrive
• Universe of One
• Reconfiguration Issues
• Change & Safety
Unique Requirements
• Metadesign
• Low entry threshold
• Dual user interface
• Need to allow tailoring of script annotation for errors
MAPS design
• Script editor target user (the caregiver)
• Design by composition & modification
• Error trapping / script annotation
Handheld Prompter
Script Editor
System Evaluation
• Proof-of-concept testing with the glider script
– why did it succeed?
• Script editor testing
– Protocol
– Iterative design / changes
Implications
• Adopting a dual user interface for complex assistive technology devices → mitigates some of the causes of device abandonment.
• A dynamic bridge can be made between plans and events, which holds much promise for mobile and ubiquitous computing applications.
PLANS FOR FURTHER STUDY
• realistic environments with dyads
• testing assumptions about error trapping and extracting real errors
Thanks
This work is supported by:
– Coleman Institute for Cognitive Disabilities
– The RERC on Advancing Cognitive Technologies, funded by the National Institute on Disability and Rehabilitation Research (NIDRR), U.S. Department of Education, under Grant #H133E040019.
– National Science Foundation SGER: “Designing and developing mobile computing infrastructures and architectures to support people with cognitive disabilities and caregivers in authentic everyday tasks”, National Science Foundation Special Grant for Exploratory Research (#IIS-0456043)
– Imagine!
Part 2
• Recent work
– Script annotation extensions to the caregiver's script editor
– Image experiment
– Papers & Presentations
Script annotation extension
• Iterative design (with Anja)
– Dump the task segment / typical task model
– Start over by making an initial list of all reasonable:
• Trappable conditions (and tests to capture them)
• Corrective actions
• Listing these allowed me to create structural classes of error tests
Script annotation continued
• These classes of tests were implemented via a data-driven interface
– allowing addition of new tests without changing the code of the interface
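
To make the data-driven idea concrete, here is a minimal sketch of what such a test catalog could look like; the class, field names, and example conditions are illustrative assumptions, not the actual MAPS implementation.

# Hypothetical sketch of a data-driven test catalog: new trappable
# conditions are added as data records, not as new interface code.
from dataclasses import dataclass

@dataclass
class AnnotationTest:
    name: str               # label the caregiver sees in the script editor
    condition: str          # condition the handheld prompter evaluates
    corrective_action: str  # prompt or script step triggered when the test fires

# Adding a new trappable condition means adding a record here; the editor
# interface simply renders whatever is in the catalog.
TEST_CATALOG = [
    AnnotationTest("left without purse",
                   "item_not_present('purse')",
                   "play_prompt('Go back and get your purse')"),
    AnnotationTest("bus arriving",
                   "within_feet('bus', 'bus_stop', 30)",
                   "advance_to_next_prompt"),
]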
Script annotation continued
• Further examination led to dividing these tests into:
– Error trapping / error correction
• e.g. if you left the house without your purse, then….
– Tests as structural elements of the script
• e.g. when your bus is within 30 feet of the bus stop, display the next prompt (“here is your bus, get ready to get on”)
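
A small illustrative sketch of that distinction, using assumed names rather than the actual MAPS code: an error trap interrupts the script with a corrective prompt, while a structural test gates normal progression to the next prompt.

from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class Step:
    # (condition, corrective_prompt) pairs for error trapping
    error_traps: list = field(default_factory=list)
    # condition that must hold before the script advances
    structural_test: Optional[Callable[[dict], bool]] = None

def run_step(step: Step, sensors: dict, play_prompt, advance):
    # Error trapping / correction, e.g. "left the house without your purse"
    for condition, corrective_prompt in step.error_traps:
        if condition(sensors):
            play_prompt(corrective_prompt)
            return
    # Structural element of the script, e.g. "bus within 30 feet of the stop"
    if step.structural_test is None or step.structural_test(sensors):
        advance()  # show the next prompt ("here is your bus, get ready to get on")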
Demo here
Image recognition study
• MAPS uses prompting for task support
• Prompts are made up of images and verbal instructions
• Images - but what kind?
Image Experiment
• Finished pilot studies last summer
• Preliminary analysis supports the proposition that icons are not as useful (for this population) for image recognition as photos
Papers
• Pac Rim
• CHI
• HCII (the eternal tools paper)
Part 3
• Left to do:
– Image experiment
– Merging script annotations to the Lifeline service
– Realistic studies
– Working with Melissa's data
– Oh, and writing a dissertation…..
Image Experiment follow up
• I now have 4 ‘typical’ subjects and 4 young adults with cognitive disabilities lined up
• Goal is 15 of each
• Results will be submitted in a paper to one of the cognitive science / psychology journals
Merging MAPS annotations into Lifeline service
• MAPS DB structure is ready
• MAPS and Lifeline designers need to agree on:
– Syntax (a hypothetical sketch follows this slide)
– Where to ‘slice smartness’
– How to support dynamic script generation (on the prompter)
• Short Comment on AIMS
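
Purely as a strawman for that discussion, one hypothetical shape the agreed-on annotation syntax could take; the field names and the split of ‘smartness’ between the prompter and the Lifeline service are assumptions, not a settled design.

# Hypothetical annotation record for the MAPS-Lifeline handoff (strawman only).
annotation = {
    "script_id": "grocery_trip_v2",
    "step": 4,
    "test": "within_feet('bus', 'bus_stop', 30)",   # evaluated on the prompter
    "on_true": "advance_prompt",                    # 'smartness' kept on the device
    "on_timeout": "notify_lifeline('caregiver')",   # 'smartness' delegated to the service
}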
Realistic studies
• Projected format is several pairs of persons with cognitive disabilities and their caregivers
• Drawn from BSVD and Imagine! populations
• Use the system for typical tasks for several weeks
Realistic studies continued
• I will work out the tasks with the pairs and provide very minimal assistance in:
– Segmenting tasks
– Image collection & voice recording (actually quite a bit of assistance for this part)
– Assembling scripts
– Loading scripts onto the prompter
– Using scripts to guide task completion
Realistic Studies Concluded
• Shadowing users
• Categorizing errors
• Enumerating possible corrections
• Open question: whether this will be a test that includes Lifeline functionality; a possible approach is a stub for the wireless link, i.e. a handheld interface to the sensor simulator, used in Wizard of Oz studies of MAPS/Lifeline by a URA or DLCRA
Working with Melissa's data
• We have obtained a copy of the HyperResearch ethnographic tool
• Melissa’s research last summer (interviews with caregivers of young adults with cognitive disabilities about their existing use of AT) produced 20 transcripts of 3-22 pages each
• I am using her transcripts and previous codes to re-code the data from my perspective and will share the results
• HyperResearch also provides some interesting data mining and hypothesis testing tools that I will be testing with this data and the data from my realistic studies
That’s All!
• Oh yes, and write the dissertation……
Thank You