Component 15/Unit 4b – Audio Transcript

SLIDE 1: Human Factors and Health Care

[NO NARRATION]

SLIDE 2: Patient Safety

We are going to switch gears and focus on patient safety and human error. Patient safety has been an issue of concern for a couple of decades, but the community was galvanized by the Institute of Medicine report of 1999. This report communicated the very surprising finding that as many as 98,000 preventable deaths every single year are attributable to human error, which would make it the 8th leading cause of death in this country.

SLIDE 3: Harvard Medical Practice Study

The Harvard Medical Practice Study was published several years before the Institute of Medicine (IOM) report and was a landmark study at the time. Based on an extensive review of patient charts in New York State, the investigators determined that an adverse event occurred in almost 4% of cases. An adverse event is any unfavorable change in health or side effect that occurs in a patient who is receiving treatment. They further determined that almost 70% of these adverse events were caused by errors and that more than a quarter were due to negligence.

SLIDE 4: Why Do Errors Happen?

We have established that errors are a matter of serious concern. Let's take a step back and analyze the nature of errors. According to James Reason, one of the pioneers in this field, an error is the failure of a planned sequence of mental or physical activities to achieve its intended outcome, when these failures cannot be attributed to chance. Too often the term "human error" connotes blame and a search for guilty culprits, suggesting some sort of human deficiency or irresponsible behavior. Often we cannot isolate a single cause, and human factors researchers therefore emphasize the need for a systems-centered approach. Reason introduced an important distinction between latent and active failures.

SLIDE 5: Active Failure

Active failure represents the face of error; its effects are felt immediately.
In health care, active errors are committed by providers such as nurses, physicians, or pharmacists who are actively responding to patient needs.

SLIDE 6: Latent Conditions (Reason, 1997)

Latent conditions are less visible but equally important. They are enduring systemic problems that may not be evident for some time, but they combine with other system problems to weaken the system's defenses and make errors possible. The list of potential latent conditions is lengthy: poor interface design of important technologies, communication breakdowns between key actors, gaps in supervision, inadequate training, and the absence of a safety culture in the workplace, that is, a culture that emphasizes safe practices and the reporting of any potentially dangerous conditions.

Component 15/Unit 4b, Health IT Workforce Curriculum, Version 2.0/Spring 2011. This material was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC000003.

SLIDE 7: Hindsight Bias

We can only analyze errors after they happen, and errors often seem to be glaring blunders after the fact. This leads to blame assignment or the search for a single cause. However, it is exceedingly difficult to recreate the situational context, stress, shifting attention demands, and competing goals that characterized the situation before the error occurred. Retrospective analysis is subject to hindsight bias, which masks the dilemmas, uncertainties, demands, and other latent conditions that were operative prior to the mishap. Things simply look very different when you are looking back at a situation with 20-20 hindsight, so a more nuanced, systems-centered approach is warranted.
SLIDE 8: Space Shuttle Challenger Disaster

The space shuttle Challenger disaster, along with the Three Mile Island nuclear reactor accident, represents a watershed event in the history of human factors analysis. In both cases, careful scrutiny of the events revealed multiple faults. In the Challenger disaster, the proximal cause of the accident was the failure of an O-ring seal, which caused a booster rocket to explode at takeoff. Closer study, however, revealed a litany of latent conditions, including poor communication and a lack of preparation for cold-weather conditions, that enabled such an event to occur.

SLIDE 9: Deepwater Horizon Explosion

The Deepwater Horizon explosion, which led to the disastrous oil spill in the Gulf of Mexico, is a very recent event. We probably won't fully understand the causes of that mishap for quite some time, but already we are hearing of lapses in safety measures and a lack of communication between BP and the operators of the oil rig. Almost certainly, a host of latent conditions rendered such an accident possible. It is unlikely that we will find any one cause for the disaster; more likely, a multiplicity of factors created synergistic conditions for such an event to occur.

SLIDE 10: Reason's "Swiss Cheese" Model of Error

Reason employs a Swiss cheese metaphor to explain the sequence of events or circumstances that lead to errors. The far end, which is most visible to us, is the active failure, which may result from someone having committed an unsafe act. Behind it, however, is a set of latent failures that render an organization such as a hospital more susceptible to a mishap or adverse event. There are a series of holes that, when they line up, are just large enough to weaken the defenses and make the circumstances ripe for human error.

SLIDE 11: Human Errors

I'd like to introduce a couple of important distinctions that will help us understand the nature of human error. A slip occurs when the actor selected the appropriate course of action but executed it inappropriately. A mistake involves an inappropriate course of action reflecting an erroneous judgment or inference (e.g., a wrong diagnosis or a misread x-ray).

SLIDE 12: Mistakes

Mistakes may be knowledge-based, owing to factors such as incorrect knowledge, or rule-based, in which case the correct knowledge was available but there was a problem in applying the rules or guidelines. Let us consider a couple of examples.

SLIDE 13: Example: Error One

Mr. B is a 45-year-old male being treated for dehydration secondary to nausea, vomiting, and diarrhea. He has been in the intensive care unit for 4 days receiving intravenous fluids via an IV catheter in his right forearm. As Mr. B stabilizes, the physician orders to start P.O. fluids (fluids by mouth) and discontinue the IV fluids. Note that the order is to discontinue the IV fluids, not the IV. Typically, the RN will stop the IV fluid and convert the IV to a saline lock that may be used for intermittent infusions as necessary. Unfortunately, the nurse removed the entire IV catheter when it should have been converted to a saline lock.

SLIDE 14: Example: Error One (cont.)

This capture error can be classified as a slip: the automatic activation of a well-learned routine overrode the current intended activity. The nurse intended to convert the IV to a saline lock; however, she discontinued all fluids unintentionally. Now let's assume a nurse changed the settings of an infusion pump to administer medications at a rate 4 times the default value.
A second nurse, taking over responsibility for the first nurse's patient, assumes that the pump is set to the default value and administers a certain medication, resulting in an overdose. This is a mistake known as a mode error. A mode error arises when we perform an action appropriate for one mode while we are mistakenly in another mode and don't realize it.

SLIDE 15: Example: Error Two

Let's consider another case. Mr. Jones is assigned to a team of nurses for the day shift. One nurse is responsible for giving all of the medications to the patients on the team; the other nurse is responsible for all assessments and treatments. Mr. Jones complains of pain to the treatment nurse. Rather than delay the administration of the pain medication while waiting for the medication nurse, the treatment nurse obtains the narcotic and administers it to Mr. Jones. The treatment nurse forgets to document on the medication record that she gave Mr. Jones some Demerol for pain. When making her rounds, the medication nurse asks Mr. Jones if he is in pain, and he again replies yes. The medication nurse reviews the medication record, notes that there is no documentation of pain medication given, and therefore medicates Mr. Jones with Demerol again. Within one hour, Mr. Jones is lethargic and has respiratory depression. He has to be transferred to the Intensive Care Unit (ICU) for closer monitoring due to a Demerol overdose.

SLIDE 16: Example: Error Two (cont.)

This error is a case of a repetition-of-action slip, which refers to the repetition of a correctly performed action.
Each nurse medicated the patient according to the physician's orders; however, because of the documentation error, the patient received a repeated dose of Demerol. Unfortunately, errors owing to documentation problems are all too common.

SLIDE 17: Interdependence of the Health Care System

Let's come back to a systems-centered point of view. The following quote by the authors of To Err is Human nicely captures the interdependence of the health care system: "Health care is composed of a large set of interacting systems: paramedic, emergency, ambulatory, inpatient care, and home health care; testing and imaging laboratories; pharmacies that are connected in loosely coupled but intricate networks of individuals, teams, procedures, regulations, communications, equipment, and devices that function with diffused management in a variable and uncertain environment." This variable and uncertain environment is typical of health care and rather atypical of almost any other domain. Nuclear power plants and airplane cockpits are best construed as tightly coupled systems in which things normally proceed in an orderly and routine fashion; it is the breaks from routine that lead to trouble. The practice of medicine, on the other hand, is more complex and varied, with less certain outcomes.

SLIDE 18: Systems Approach to Adverse Events in Health Care

Henriksen characterizes a systems approach to adverse events. Errors result from a pattern similar to the Swiss cheese model; however, the different contributing factors are spelled out in greater detail, beginning with the external environment and continuing on to different dimensions of the environment, such as the human-system interface.
SLIDE 19: Systems Approach to Adverse Events Continued

This figure continues the previous slide, introducing sequences of factors that have the potential to contribute to error and adverse events. The individual is always the last link in the chain of defense and the one who will invariably assume some of the blame for any mishap.

SLIDE 20: Time Course of Medical Error

Medical errors can be characterized as a progression of events. There is a period of time when everything operates smoothly. Then some unsafe practice unfolds, resulting in an error, but not necessarily leading to an adverse event. If a system of checks and balances is part of routine practice, or if a systematic supervisory process is in place, the vast majority of errors will be trapped and defused in this middle zone. If these measures or practices are not in place, an error can propagate and cross the boundary to become an adverse event; at this point, the patient has been harmed. In addition, if an individual is subject to a heavy workload or intense time pressure, the potential for an error to result in an adverse event increases.
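The layered-defense idea from Slides 10 and 20 can be sketched as a small simulation. This is an illustrative toy model, not anything from the transcript: the layer names and "hole" probabilities below are invented for the example, each defense layer is assumed to fail independently, and an error becomes an adverse event only when it slips through every layer.

```python
import random

def error_reaches_patient(hole_probabilities, rng):
    """Return True if a single error slips through every defense layer.

    Each entry in hole_probabilities is the chance that one layer
    (e.g. supervision, double-checks, documentation review) fails
    to trap the error -- a "hole" in that slice of cheese.
    """
    return all(rng.random() < p for p in hole_probabilities)

def simulate(hole_probabilities, n_errors=100_000, seed=42):
    """Estimate the fraction of errors that become adverse events."""
    rng = random.Random(seed)
    breaches = sum(error_reaches_patient(hole_probabilities, rng)
                   for _ in range(n_errors))
    return breaches / n_errors

if __name__ == "__main__":
    # Hypothetical layers: individually leaky, jointly protective.
    layers = [0.10, 0.05, 0.20]
    print(simulate(layers))  # close to 0.10 * 0.05 * 0.20 = 0.001
```

Under these independence assumptions, adding or tightening any single layer multiplies down the breach rate, which is one way to read the transcript's point that removing checks and balances lets errors propagate into adverse events.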