Transcript
Component 15/Unit 7 – Audio Transcript
SLIDE 1: Usability and Human Factors
No narration
SLIDE 2: Outline
Today we will discuss an important topic in Usability and Human Factors:
Decision Support Systems (DSS), a Human Factors Approach. Decision support
systems have been used in a variety of industries, including finance,
transportation, and public works, since the 1970s. DSS had its beginnings in
health care in the late 1950s but became a more active area of research and
development in the mid-1970s. With the growing penetration of clinical information
systems, DSS is an effective vehicle for providing real-time guidance to
clinicians. DSS includes lower-technology solutions such as paper-based
guidelines. However, we will focus on computer-based clinical decision support
systems (CDSS) and Computerized Provider Order Entry (CPOE) systems.
The first part of the lecture will focus on the basics of human decision-making.
Then, we will explore CDSS and consider its potential and pitfalls. We will
explore some of the barriers to productive use and consider some options for
improving design.
SLIDE 3: Patient Safety
Patient safety has increasingly become a focal concern of all stakeholders in the
health care enterprise. You have been introduced to the landmark Institute of
Medicine report, which detailed the extraordinary number of preventable deaths
that are attributable to human error. There are significant complexities in
medication management, which pose substantial risk for hospitalized patients
and a non-negligible risk for ambulatory care patients.
We can characterize several phases of the medication delivery process:
prescribing, dispensing, administration, and monitoring. Each of these phases
provides opportunities for confusion, miscommunication and error.
SLIDE 4: Human Factors Approach
You may have been introduced to the concept of human factors in several
lectures. The focus is on human beings and their interactions with
products/equipment, tasks and environments. The broad goal is to design
systems and system components to match the capabilities and limitations of
humans who use them. The process we are focally concerned with is decision
making.
Component 15/Unit 7
Health IT Workforce Curriculum
Version 2.0/Spring 2011
1
This material was developed by Columbia University, funded by the Department of Health and Human Services, Office of
the National Coordinator for Health Information Technology under Award Number 1U24OC000003.
SLIDE 5: Understanding Decisions
Making decisions is something that every human being routinely does. Some
decisions have greater importance than others. The study of decision-making
has been a focal concern of cognitive psychology research for more than 60
years. We can characterize 3 components of a decision process:
1. Choice options and courses of actions.
2. Beliefs about objective states, processes and events in the world including
outcomes states and means to achieve them. This reflects our
understanding of the state of affairs and the means to change them.
3. Desires, values or utilities that describe the consequences associated with
the outcomes of each action-event combination. Some outcomes are
more satisfactory than others and some risks are more consequential than
others.
In simple terms, good decisions are those that effectively choose means that are
available in a given situation to achieve the individual's goals. As we will discuss,
CDSS can shape decisions by transforming the means available to achieve an
individual’s goals.
SLIDE 6: Medical Decision Making Research
There are two focal areas of work in medical decision-making research. The first
involves an understanding of how decisions are made. The second involves the
means to facilitate and improve the decision making process in order to improve
patient outcomes. In the next few slides, we will focus on objective one:
understanding how decisions are made. Following that, we will concentrate
largely on issues related to the second objective, namely how to facilitate and
improve the decision making process.
SLIDE 7: Heuristics and Biases
It is not surprising that humans are fallible reasoners and flawed decision
makers. We routinely witness how society’s “professional” decision makers like
judges, politicians and coaches make poor decisions. The Heuristics and Biases
approach was pioneered by Tversky and Kahneman who for many years
endeavored to understand the nature of human decision-making. Heuristics are
rules of thumb that we routinely use for making decisions. They can be adaptive
or maladaptive. Biases are systematic deviations from normative standards.
They can significantly impact the process of decision-making and have been well
documented in the context of health-related decisions by clinicians as well as
patients. Hindsight and confirmation are two of the well-documented biases.
SLIDE 8: Hindsight Bias
Decisions, especially those that were mistakes, seem very transparent after the
fact. When one reviews the errors of a family member, friend or colleague, the
situation seems so obvious. In fact, when errors are analyzed after the fact,
blaming a culprit seems to be the clear thing to do. This kind of thinking is a
product of hindsight bias. This occurs when decision makers inflate the
probability of a prior judgment (e.g., a diagnosis) on the basis of
subsequently available information.
There have been numerous studies documenting hindsight bias. One such study
provided clinicians with a set of cases on paper and asked them to rate the
probability of a given diagnosis. When they were told the actual diagnosis in
advance of the case, their probability judgments were significantly higher. The
important point is that the probabilities should always be the same given a set of
facts such as patient’s symptoms and laboratory findings. The inflated
probabilities are a product of hindsight bias.
SLIDE 9: Hindsight Bias: So What?
So, what does all this mean? Well, biases have consequences for present and
future decision-making. If physicians assume they would have predicted a clinical
outcome, they may fail to learn from a case. Unusual or noteworthy cases may
seem more mundane because they can appear less unusual after the facts are
known. We also talked about error attribution. If the answer seems obvious, then
it is easy to overlook a host of mitigating factors that may resurface and lead to
more serious errors.
SLIDE 10: Confirmation Bias
Confirmation bias is something that we are all guilty of from time to time. If we
have already made up our minds, we are less receptive to the possibility that our
decision choice is wrong. We may fail to attend to the evidence. Overconfidence
in one’s judgment causes a decision maker to favor one hypothesis over another.
We may selectively attend to data and not give adequate weight to alternative
possibilities.
SLIDE 11: The Cost of Confirmation Bias
One finding in medical decision-making is that clinicians are prone to order
laboratory tests that may yield no new information about the patient’s state. They
may merely serve to confirm one’s prior hypothesis.
SLIDE 12: Classic DM Problem (Eddy, 1982)
There is ample evidence to suggest that clinicians experience difficulty engaging
in probabilistic reasoning. In particular, the application of Bayes’ Theorem has
been problematic. Eddy’s classic study is an interesting case in point. He
presented clinicians with the following problem:
Estimate the probability that a woman has breast cancer given that she has a
positive mammogram on the basis of the following information:
The probability that a patient has breast cancer is 1%. (This provides the prior
probability).
If the patient has breast cancer, the probability that the radiologist will correctly
diagnose it is 80%. (This provides the sensitivity or hit rate).
If the patient has a benign lesion (no breast cancer), the probability that the
radiologist will misdiagnose it is 9.6%. (This provides the false positive rate).
According to Bayes’ rule, the probability that this patient has breast cancer is
about 8%. Eddy found that 95 out of 100 physicians estimated the probability
of breast cancer after a positive mammogram to be around 75%, a highly inflated
value. The test result and its sensitivity seem to be the most salient features, and
the base rate is largely ignored. The deviation between individuals’ responses
and the normative response indicated by Bayes’ Theorem is explained by a
bias in which the clinician selectively attends to certain variables that are salient
and ignores others, such as base rates. Biases are, more generally,
predispositions to reason in ways that are not consistent with probability theory.
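To make the arithmetic concrete, the posterior probability in Eddy’s problem can be checked with a few lines of Python, using only the three numbers given on the slide:

```python
# Bayes' rule applied to Eddy's mammography problem.
prior = 0.01        # P(cancer): 1% of patients have breast cancer
sensitivity = 0.80  # P(positive | cancer): the hit rate
false_pos = 0.096   # P(positive | no cancer): the false positive rate

# P(positive result) via the law of total probability
p_positive = prior * sensitivity + (1 - prior) * false_pos

# P(cancer | positive result) by Bayes' rule
posterior = prior * sensitivity / p_positive
print(round(posterior, 3))  # 0.078, i.e., roughly 8%
```

The base rate of 1% dominates the calculation, which is exactly the variable the physicians in Eddy’s study tended to ignore.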
SLIDE 13: Framing Effect
The framing effect refers to the fact that alternative representations of a problem
can give rise to different judgments and preferences. Preference for a particular
course of action is different when a problem is posed in terms of potential gain
rather than potential loss even though the underlying situation is identical. On the
next slide, we will provide an example.
SLIDE 14: Survival vs. Mortality
For example, when a cancer patient is given a prognosis in terms of survival
years, the perception is different from probability of mortality. The difference is
just in the presentation, not in the substance. McNeil et al. (1986) presented a
hypothetical lung cancer decision scenario to physicians and patients. One
treatment option was radiation therapy, which had an immediate higher survival
(lower mortality) rate, but a lower 5-year survival rate. The other treatment option
was surgery. There were two frames for explaining the problem.
 Frame 1: treatments were described in terms of survival rates
 Frame 2: treatments were described in terms of mortality rates
McNeil found that subjects in the survival frame expressed a clear preference for
surgery. In the mortality frame, the two choices were preferred almost equally.
One possible explanation is that the positive framing leads to more risk averse
choices, while the negative framing increases risk-seeking decision making.
SLIDE 15: DM in Naturalistic Settings
The previously described research on decision-making has largely been done in
laboratory contexts and not in real world settings. It is important to keep in mind
that decision making in naturalistic settings can take on a distinct character.
Decisions are embedded in a broader social context and involve multiple players.
Decisions
are best thought of as decision-action cycles rather than discrete decisions. In a
given setting, there may be substantial stress and time pressure.
SLIDE 16: Decision Support Systems
Increasingly humans have access to a wide range of tools that assist us in
making a range of decisions from buying a car to maintaining our diet. Decision
Support Systems are interactive computer-based systems that help individuals
use communications, data, and knowledge to solve problems and make
decisions. DSS may also include paper-based guidelines and decision charts.
But for the present purposes, we will restrict ourselves to computer-based DSS.
It is important to note that DSS are intended to assist and guide human
decisions, not render decisions in an automated fashion.
Decision Support Systems have been used in a wide range of settings and fields
of work including banks, insurance companies and hospitals.
SLIDE 17: Clinical Decision Support Systems
Clinical Decision Support Systems (CDSS) are tools that provide clinicians, staff
and patients with knowledge and person-specific information, presented at
appropriate times, to enhance health and health care. They do not provide
generic medical advice, but consider patient-specific data to assist clinicians at
the point of care. They have been designed to be used for a wide range of
medical decisions including decisions for prevention, screening, diagnosis,
treatment, drug dosing, test ordering, and/or chronic disease management.
SLIDE 18: Star Trek Tricorder: The Ultimate Clinical Decision Support Tool
A tricorder is a multifunction handheld device used for sensor scanning, data
analysis, and recording data. In a medical context, it is used by doctors to help
diagnose diseases and collect bodily information about a patient. It is the ultimate
decision support tool. Moreover, it is safe to say that few contemporary CDSS
tools can measure up to the tricorder. Of course, the tricorder is the stuff of
science fiction and it’s not likely that we will have any such device any time soon.
SLIDE 19: Forms of CDSS Advice
CDSS advice comes in several different forms. Alerts and reminders are the
most common ones used in the context of clinical information systems. Alerts
inform clinicians of potentially negative consequences from following a particular
course of action. For example, a system may prompt a clinician that a patient has
a particular allergy and that prescribing a particular drug of choice may have
adverse effects. A reminder would be used to let a clinician know that it may be
time for a patient with diabetes to have examinations of their feet and eyes given
that a certain period of time has elapsed since their last examination.
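As an illustration of how such a reminder might be triggered, here is a minimal sketch in Python. The field names and the one-year interval are assumptions made for the example, not taken from any particular system:

```python
from datetime import date, timedelta

# Hypothetical reminder rule: flag diabetic patients whose last foot/eye
# exam was more than a year ago. The interval and the dictionary field
# names are illustrative, not drawn from a real CDSS.
EXAM_INTERVAL = timedelta(days=365)

def needs_exam_reminder(patient, today=None):
    today = today or date.today()
    return (patient["has_diabetes"]
            and today - patient["last_foot_eye_exam"] > EXAM_INTERVAL)

patient = {"has_diabetes": True, "last_foot_eye_exam": date(2009, 1, 15)}
print(needs_exam_reminder(patient, today=date(2010, 6, 1)))  # True
```

A real system would, of course, draw these fields from the electronic record and route the reminder into the clinician’s workflow rather than printing it.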
SLIDE 20: The Case for Clinical Decision Support
Although Clinical Decision Support Systems have been the subject of
considerable controversy in medicine, there is much to recommend them.
One of the central premises in the use of CDS systems is to make better use of
the ever-increasing available body of medical knowledge. There has been
prodigious growth in our understanding of health and disease in recent years, yet
this knowledge is slow to penetrate clinical practice. CDS tools offer great
possibilities for making knowledge available to the practicing clinician at the point
of care.
The following list includes some of the possible uses for CDS:
 Reduced medication errors that cause adverse medical events. I think this
speaks for itself.
 Improved management of specific acute and chronic conditions.
 Improved personalization of care for patients. The idea is that CDS tools can
offer considerable precision in tailoring therapeutic choices to patients.
 Best clinical practices consistent with medical evidence. This is a primary goal
of CDSS in general.
 Cost-effective and appropriate prescription medication use. Cost-effectiveness
of CDSS has been somewhat difficult to demonstrate. However, there is no
question that better treatment, fewer medical errors and better patient
monitoring and management could reduce costs over time.
SLIDE 21: Degrees of CDSS Computerization
How much guidance should a Clinical Decision Support System offer? This has
been a much debated issue for several decades. Early expert systems, although
they were rarely used in clinical practice, were oriented to providing answers
rather than assists to clinicians. At the other end of the continuum are systems
that offer no guidance whatsoever. In fact, some folks argue that any such
guidance removes precious autonomy from clinicians. In between the two
extremes, there are a wide range of options. For example, the system may offer
a complete set of alternatives for a therapeutic regimen or it may narrow it down
to 2 or 3 alternatives. It may go ahead and automatically execute the action (e.g.,
order investigative tests).
SLIDE 22: Degrees of Computerization Continued (6-10)
It may also allow a human to veto the suggested course of action. Some
advocate the choice on the previous slide, which would allow the clinician to
approve a suggested course of action. Most of the
choices from 6 to 10 are hypothetical and would not be allowed in practice.
However, it provides one with a sense of the continuum and its effect on
clinicians’ decision-making.
SLIDE 23: Computerized Provider Order Entry Systems (CPOE)
As you know by now, Computerized Provider Order Entry or CPOE Systems
support electronic entry of clinical orders for the treatment of patients. They are
unarguably one of the flagship applications of clinical informatics. They have
increasingly become essential instruments of patient care. These systems enable
the partial automation of the medication ordering process. Decision support tools
are an integral part of a CPOE system. In fact, much of decision support
research has been conducted in the context of CPOE systems. E-Prescribing
systems are outpatient systems that support a subset of the functions of CPOE
systems.
SLIDE 24: Promise of Order-Entry Systems
Although CPOE systems remain a controversial technology for reasons that will
be discussed later, they offer significant promise. They have the potential to
significantly reduce medication errors. Most adverse events in patients occur at
the stage of drug ordering. CPOE can result in improvements in response time,
efficiency of dispensing and delivery of medication.
SLIDE 25: Some Advantages of CPOE Systems
The following list enumerates some of the advantages of CPOE systems:
Orders can arrive at the pharmacy in less time than they would otherwise. They
can be easily integrated into medical records and decision-support systems.
They can also be easily linked to drug-drug interaction warnings, a very
important feature for reducing the possibility of medical error, and so forth.
In addition, there are claims that suggest that CPOE systems can generate
significant financial savings, although the truth of the matter is that the promise of
costs savings has not yet come to fruition.
SLIDE 26: An example Drug-Drug Interaction Scenario
This scenario was developed by Gil Kuperman and colleagues. It was published
as an online data supplement in the Journal of the American Medical Informatics
Association. It illustrates how a drug-drug interaction decision support scenario
may work in the context of a CPOE system. I will read the scenario out loud:
“When ordering a new medication, a prescriber may not be aware that two drugs
interact, or may not be keeping in mind the other medications that the patient is
taking. As an example, consider the case of a hospitalized patient who is being
treated with venlafaxine (Effexor) for chronic depression and develops an
infection with a drug resistant bacterium requiring treatment with linezolid, a new
antimicrobial agent. The interaction between linezolid and venlafaxine (serotonin
syndrome -- altered mental status, including agitation, confusion and coma,
neuromuscular hyperactivity, and autonomic dysfunction) is very severe but may
not be known to the practitioner. While writing the order for linezolid, an alert
screen can warn the practitioner that these two drugs should not be used
together. The alert screen may offer the prescriber the opportunity to cancel the
order, to discontinue the existing medication that interacts with the newly ordered
medication, or to order a test that could detect the interaction or monitor therapy.
The alert screen may prompt the physician to have a conversation with the
patient regarding potential side effects of the medications. Any of these
consequences of the decision support software could be beneficial.”
In this scenario, the CDSS flags a potentially serious problem by alerting the
clinician and offering a set of alternative actions including cancelling the order
and offering constructive suggestions to raise the awareness of potential side
effects with the patient. This example illustrates the nature of an interaction with
a system and a positive outcome. As we discuss later, some interactions are less
successful.
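A drug-drug interaction check of the kind described in this scenario boils down to screening each new order against the patient’s active medication list. Here is a minimal sketch in Python; the interaction table contains only the pair from the scenario and is purely illustrative, whereas real systems rely on large curated knowledge bases:

```python
# Hypothetical interaction table: unordered drug pairs that should not be
# combined, mapped to a severity description. Illustrative only.
INTERACTIONS = {
    frozenset({"linezolid", "venlafaxine"}): "severe (serotonin syndrome)",
}

def check_new_order(new_drug, active_meds):
    """Return alert strings for interactions between the newly ordered
    drug and each of the patient's active medications."""
    alerts = []
    for med in active_meds:
        severity = INTERACTIONS.get(frozenset({new_drug, med}))
        if severity:
            alerts.append(f"{new_drug} + {med}: {severity}")
    return alerts

print(check_new_order("linezolid", ["venlafaxine", "aspirin"]))
```

Using `frozenset` makes the lookup order-independent, so the alert fires whether linezolid or venlafaxine is ordered first.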
SLIDE 27: Challenges with Order Entry
There are well known challenges associated with the implementation and use of
CPOE systems. The systems are not easy to learn. They impose their own
workflow on clinical care. This may be disruptive and even interfere with normal
channels of communication. For example, a clinician might have previously had a
conversation with a pharmacist prior to deciding on a medication regimen for a
complicated patient problem. The CPOE system may serve to reduce the
likelihood of such a conversation. In fact, pharmacists who may once have had
considerable influence on therapeutic decision-making may be reduced to
dispensing medication. These systems often take more time than paper
ordering, at least until clinicians develop substantial facility with the new
system. The most important potential problem is the possibility of new types of
error.
SLIDE 28: CPOE Paradox
This slide recaps the essential dilemma of CPOE systems at this point in time.
There is presently a substantial body of literature on CPOE systems and the use
of CDS systems. CPOE has been characterized as an important intervention to
reduce prescribing errors, and yet the evidence base for their effectiveness is
limited thus far. On one hand, some studies have shown electronic prescribing
with CPOE significantly increases prescribing quality in hospital inpatients.
However, they have also been shown to introduce new types of errors in the
decision making process.
SLIDE 29: Cognitive Evaluation of Interaction with a CDSS
We are now going to discuss a couple of studies that document problems
associated with the use of CPOE systems and decision support. This first one is
by Jan Horsky and colleagues.
He studied the interaction with a particular CPOE system with a view to
characterizing:
 The effectiveness of the interaction
 Changes to ordering behavior
 Opportunities for error attributable to the interaction process
The example described in the publication involves decision support for heparin
dosing, a medication that is used to reduce blood clotting (an anticoagulant). The
dose is based on a calculation of the patient’s weight.
SLIDE 30: Weight-Based Heparin Ordering
Horsky used a set of artificial scenarios designed to elicit decision support
responses. The CPOE triggers a decision-support alert and calculates the dose
automatically.
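The calculation behind such an alert is simple. A minimal sketch in Python follows; the 80 units/kg coefficient is a commonly cited weight-based protocol value and the 10,000-unit cap is an assumed safety limit, since the study itself does not specify the exact formula:

```python
# Illustrative weight-based heparin bolus calculation. The coefficient and
# cap are assumptions for this example, not taken from Horsky's study;
# actual protocols vary by institution.
BOLUS_UNITS_PER_KG = 80
MAX_BOLUS = 10000  # assumed safety cap, in units

def heparin_bolus(weight_kg):
    dose = weight_kg * BOLUS_UNITS_PER_KG
    return min(round(dose, -2), MAX_BOLUS)  # round to nearest 100 units

print(heparin_bolus(100))  # 8000 units
```

The point of automating even this trivial arithmetic is to remove a manual calculation step; as the study’s results show, the benefit is lost if the computed dose is buried in a cluttered alert.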
SLIDE 31: Methods
The study employed a cognitive walkthrough analysis conducted by expert
analysts. They also conducted a usability testing experiment in which physicians
were asked to enter appropriate orders according to a clinical scenario using a
CPOE system. The scenario required a drug dose adjustment for an ongoing
anticoagulation therapy. Subjects were instructed to verbalize their thoughts (or
think aloud) while completing the task. Their interaction with the system was
captured on video.
SLIDE 32: CPOE Screen
This is the CPOE screen seen by the clinicians. In the scenario, they were in the
midst of ordering and an alert screen popped up.
SLIDE 33: Weight-based IV Heparin Protocols
Let’s take a closer look at the alert screen. Alerts often consist of
remarkably complex messages. The critical information is that the appropriate
dose should be a bolus of 8000 units. But the key information is embedded in an
extended text and not readily visible amidst the busy message.
SLIDE 34: Results - Presentation Salience
Not surprisingly, the clinicians experienced difficulty drawing the correct inference
from the alert. This resulted in considerable confusion and extended periods of
time to process the alert.
SLIDE 35: Results - User Behavior
The alert did not fit typical workflow. In one case, a subject
performed all the calculations by hand, and 6 others used the values only as a
reference. Subjects also experienced some difficulty understanding how the
system performed the computation and were skeptical about its accuracy.
SLIDE 36: Summary
The format of presentation of the alert was clearly suboptimal. In general, the
benefits of decision support were not realized. Part of the problem is that the
CDSS process was not consistent with typical workflow patterns. The authors
suggest that a different representational form would enable a quick perceptual
judgment and could reduce the extra cognitive effort. For example, the alert
could include only the calculated dose in the frame, with a clear description of
how the result was computed. It would also be helpful to identify suboptimal
configurations on the screen and replace them with displays based on sound
usability principles. Admittedly, this is harder than it seems.
SLIDE 37: Role of CPOE Systems in Facilitating Medical Errors
The study by Horsky and colleagues identified some problems associated with
decision support and CPOE workflow. In 2005, Koppel and colleagues published
a very influential study examining the ways in which a CPOE system facilitated
medical errors. The study which was published in JAMA (Journal of the American
Medical Association) used a series of methods including interviews with
clinicians, observations and a survey to document the range of errors. According
to the authors, the system facilitated 22 types of medication error and many of
them occurred with some frequency. The errors were classified into 2 broad
categories: 1) Information errors generated by fragmentation of data and failure
to integrate the hospital’s information systems and 2) human-machine interface
flaws reflecting machine rules that do not correspond to work organization or
usual behaviors.
SLIDE 38: Information Errors: Fragmentation and Systems Integration
Failure 1
We are going to illustrate the first class of errors with 2 subcategory examples. It
is a well-known phenomenon that users come to rely on technology and often
treat it as an authoritative source that can be implicitly trusted. In this case,
clinicians relied on CPOE displays to determine the minimum effective dose or a
routine dose for a particular kind of patient. However, there was a discrepancy
between their expectations and the dose listing. The dosages listed on the
display were based on the pharmacy’s warehousing and not on clinical
guidelines. For example, although normal dosages are 20 or 30 mg, the
pharmacy might stock only 10-mg doses, so 10-mg units are displayed on the
CPOE screen. Clinicians mistakenly assumed that this was the minimal dose.
SLIDE 39: Information Errors: Fragmentation and Systems Integration
Failure 2
Medication discontinuation failures are a commonly documented problem with
CPOE systems. The system expects a clinician to 1) order new medications and
2) cancel existing orders that are no longer operative. Frequently, clinicians fail to
cancel the existing orders, leading to duplicative medication orders and thereby
increasing the possibility of medication errors. Perhaps a reminder that prior
orders exist and may need to be canceled would serve to mitigate this problem.
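Such a reminder could be driven by a simple check for active orders of the same medication whenever a new order is entered. A hedged sketch in Python, with invented field names for the example:

```python
# Illustrative duplicate-order check: when a new order is entered, surface
# any active orders for the same medication so the clinician can decide
# whether to cancel them. Field names ('med', 'order_id') are invented.
def duplicate_order_reminder(new_med, active_orders):
    dupes = [o for o in active_orders if o["med"] == new_med]
    if dupes:
        ids = ", ".join(str(o["order_id"]) for o in dupes)
        return f"Active order(s) for {new_med} already exist (ids: {ids})."
    return None

orders = [{"med": "warfarin", "order_id": 17}]
print(duplicate_order_reminder("warfarin", orders))
```

A production system would also need to handle brand/generic synonyms and combination products, which is where much of the real complexity lies.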
SLIDE 40: Human-Machine Interface Flaws
As is the case with other clinical information systems, CPOE systems suffer from
a range of usability problems. The slide describes 3 kinds of problems. When
selecting a patient, it is relatively easy to select the wrong patient because
names and drugs are close together, the font is small and patients’ names do not
appear on all screens. On a similar note, physicians can order medications at
computer terminals not yet "logged out" by the previous physician. This can
result in either unintended patients receiving medication or patients not receiving
the intended medication. When patients undergo surgery, the CPOE system
cancels their previous medications. Physicians must reenter CPOE and
reactivate each previously ordered medication. Once again, a reminder to do so
may serve to reduce the frequency of such mistakes.
The Koppel study is well worth reading. It has received considerable acclaim and
notoriety. It is worth pointing out that the CPOE system employed in that study
used an old style interface and was not consistent with modern graphical user
interfaces. Nevertheless, this system was widely used and several of the
problems noted in this study have also been observed in the use of other
systems.
SLIDE 41: Automation Bias
Earlier in the lecture, we were introduced to decision biases. Biases reflect
systematic deviations from normative standards and lead to skewed or
imbalanced decisions. Automation bias is a tendency to lean
on technology and, in this case, to follow the directions of a decision support system
even if it does not jibe with one’s intuition or training. This leads to 2 classes of
errors: 1) Errors of omission, as in the case where a clinician is less vigilant in
checking drug orders because they assume the computer will have already done
the work and 2) Errors of commission in which clinicians tacitly assume that if
they were not warned about a potentially dangerous drug order by the computer,
then there is nothing to worry about.
SLIDE 42: Anti-Automation Bias
The flip side of automation bias is an anti-automation bias characterized by a
distrust or disdain for alerts and reminders. This often leads to advice being
ignored. Clinicians routinely disable or ignore the alarms or alerts on clinical
monitoring devices. Sometimes the reasons are legitimate. For example, some
systems exhibit high false alarm rates for their alerts. Other times, clinicians
simply do not want to be disrupted by intrusive alerts and readily dismiss them
without giving the advice its proper due. Overriding of alerts is a pervasive
problem in the use of CPOE systems. Some studies have documented override
rates of over 90%, which is extraordinary in my opinion.
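As an illustration (not drawn from any particular study), an override rate like this can be computed directly from a CPOE audit log. The log format and the "action" field names below are hypothetical:

```python
# Illustrative sketch: computing an alert override rate from a CPOE audit log.
# The log structure and action labels are hypothetical.
alert_log = [
    {"alert_id": 1, "action": "override"},
    {"alert_id": 2, "action": "override"},
    {"alert_id": 3, "action": "accept"},
    {"alert_id": 4, "action": "override"},
]

# Count how many alerts the clinician dismissed rather than acted on.
overrides = sum(1 for entry in alert_log if entry["action"] == "override")
override_rate = overrides / len(alert_log)
print(f"Override rate: {override_rate:.0%}")
```

In a real system the log would come from the CPOE database, and the rate would typically be broken down by alert type, since a high overall rate can mask a mix of well-founded and reflexive overrides.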
SLIDE 43: Barriers to Prescriber Decision-Making and Clinical Workflow
Medication alerts should not only warn prescribers about potential problems, but
also provide enough information so they can be appropriately resolved. In a
recent study, Russ and colleagues (2009) observed medication prescribing
during routine patient care and identified 15 barriers associated with medication
alerts.
SLIDE 44: 15 Barriers to Prescriber Decision-Making
Some of the barriers have to do with interface issues such as problems
associated with the display of the alert. An alert may not provide adequate
information as to why it was triggered. Decisions often involve tradeoffs and
some alerts fail to specify the relative risk of patient harm. Alerts can be repeated
over the course of an interaction and clinicians invariably become annoyed by
such system behavior.
Slide 45: Barriers Continued
If alerts appear too frequently, they can lead to information overload and
prescriber desensitization, thereby increasing the potential for missing key
alerts. One complaint about alerting systems is the failure to distinguish between
serious problems and less harmful ones. For example, the difference between a
genuine allergy and drug sensitivity is often clinically important in making
medication choices. Allergies can be very serious, whereas sensitivities are not
typically serious. Decision support systems may seem non-intuitive and even
perplexing to the user. Alert systems may not adequately reveal their
capabilities and limitations to the prescriber, so the full functionality of the alert
system becomes ambiguous. This can increase the level of distrust and perhaps
lead to anti-automation bias.
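One way a designer might address the severity problem, sketched below with hypothetical severity levels and a hypothetical threshold, is to tier alerts so that only high-severity warnings (such as a documented allergy) interrupt the prescriber, while lower-severity ones (such as a sensitivity) are shown passively:

```python
# Illustrative sketch of severity-tiered alerting. The severity levels and
# threshold are invented for this example, not taken from any actual CDSS.
SEVERITY = {"allergy": 3, "interaction": 2, "sensitivity": 1}
INTERRUPT_THRESHOLD = 3  # only the most serious alerts interrupt the workflow

def route_alert(alert_type: str) -> str:
    """Decide whether an alert interrupts the prescriber or is shown passively."""
    if SEVERITY.get(alert_type, 0) >= INTERRUPT_THRESHOLD:
        return "interruptive pop-up"
    return "passive sidebar notice"

print(route_alert("allergy"))      # serious: interrupts the prescriber
print(route_alert("sensitivity"))  # less serious: shown without interrupting
```

The design point is not the particular numbers but the separation itself: reserving interruptive alerts for genuinely serious problems helps prevent the desensitization described above.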
SLIDE 46: Barriers Elaborated
In the system evaluated by Russ and colleagues, poor screen display took
several forms: poorly presented alert text, the haphazard grouping of multiple
alerts in a single pop-up window, and the need to scroll to see a series of
alerts. Inadequate alert specification occurred when alerts did not show
all clinically relevant information needed for decision-making. We already talked
about the unclear level of risk for a given alert and whether it overrides other
factors that were part of the original decision strategy. For example, should a
clinician discontinue an effective medication because the patient shows evidence
of drug sensitivity? That is clearly a difficult decision to make.
SLIDE 47: Human Factors and Information Management
We will now briefly consider 2 important constructs used in human factors and
decision making research: situation awareness and mental workload. CDS
systems variably support situation awareness and impose varying levels of
mental workload.
It has been well established in human factors research that poor situation
awareness and high mental workload can impair memory, problem identification,
and decision-making.
SLIDE 48: Situation Awareness
Situation awareness is a construct used in contemporary decision research to
characterize an awareness of what is happening around you with an
understanding of what that means to you now and in the future. It can be
characterized according to 3 levels:
1. Perception of elements in the environment (e.g., cues/stimuli from patient
pulse, color, weight change, chart, EHR, nurse),
2. Comprehension of the meaning of those elements (by integrating the
disparate pieces of information and determining what is salient),
3. Projection of future status so that decisions can be made.
Situation awareness is a product of expertise and experience. Poor situation
awareness can lead to impaired decision making.
SLIDE 49: Mental Workload
As information management problems increase, mental workload (MWL) increases.
Time pressure makes it more important that CDS automation be easy to use and
useful:
 When you are under time pressure, you have less time and patience to
navigate through poorly designed technology.
 Under time pressure, humans can adapt and still perform reasonably well by
exerting more mental effort or by concentrating harder.
 However, under more significant mental workload, individuals can no longer
adapt or compensate in order to maintain cognitive performance. Every human
has a threshold where decision-making becomes impaired because of
impossibly high mental workload conditions.
 Demands imposed by the system (e.g., clinician needing to remember the
important facts of the most recent patient visit while starting the next patient’s
visit) can readily exceed the attentional resources or mental capacity of the
person. This condition is ripe for medical error and compromises patient safety.
SLIDE 50: CPOE/CDSS Design Recommendations
We know ways to build a better machine and to make it usable by clinicians.
Khajouei and Jaspers offer some design recommendations.
Interfaces must explicitly map to the workflow patterns of clinicians. CDS
systems must support, rather than impede, clinical workflows through speedy, available,
and usable algorithms that provide parsimonious, clear, concise, and actionable
warnings and advice. There should be clues or cues in the interface to optimally
support users in medication ordering. They recommend that a clinician should
not have to negotiate more than 3 screens to respond to an alert. They also
characterize the ways in which the display can be better organized and the alert
displayed more prominently on the screen.
SLIDE 51: Designing for Better Workflow
Karsh also proposes a set of general guidelines to improve clinical information
systems and the use of CDSS by designing for better workflow. Clinical systems
should help clinicians see the right amount of the right type of data wherever and
whenever needed. It is often tricky to determine the “right amount”. If too much
information is available on a given screen or display, it leads to clutter and
confusion in finding important information. On the other hand, the alternative is to
have the user negotiate additional layers of screens to find the information they
need. Clinical information should be accessible in the shortest possible amount
of time. A common problem is that the information may be spread across several
clinical information systems making it difficult to find the right information. Data
from disparate sources should be aggregated for completeness so that clinicians
are not forced to go to multiple different systems to obtain important information.
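The aggregation described here can be sketched as a simple merge across source systems. The system names, patient identifier, and record shapes below are hypothetical:

```python
# Illustrative sketch: aggregating one patient's data from multiple
# hypothetical clinical systems into a single view, so the clinician does
# not have to consult each system separately.
lab_system = {"p001": {"potassium": 4.1}}
pharmacy_system = {"p001": {"meds": ["lisinopril"]}}
ehr_system = {"p001": {"allergies": ["penicillin"]}}

def aggregate(patient_id, *systems):
    """Merge the patient's records from each source into one dictionary."""
    view = {}
    for system in systems:
        view.update(system.get(patient_id, {}))
    return view

summary = aggregate("p001", lab_system, pharmacy_system, ehr_system)
print(summary)
```

A production system would of course face harder problems this sketch ignores, such as matching patient identities across systems and reconciling conflicting values, but the goal is the same: one complete view in one place.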
SLIDE 52: Concluding Thoughts
Computer-based decision support systems offer great promise for the reduction
of errors in medicine and facilitation of superior patient care. However, at this
point in time, results on CDSS efficacy have been equivocal. CDSS can lead to
new types of errors, which present something of a daunting, but not
insurmountable problem. Adherence to usability/human factors principles can
lead to superior design and enhanced performance.
SLIDE 53: References
Component 15/Unit 7
Health IT Workforce Curriculum
Version 2.0/Spring 2011
14
This material was developed by Columbia University, funded by the Department of Health and Human Services, Office of
the National Coordinator for Health Information Technology under Award Number 1U24OC000003.