C4ISR-Med
Battlefield Medical Demonstrations and Experiments
Lockheed Martin ATL
January 2012
PoC: Susan Harkness Regli
[email protected]
Overview
Lockheed Martin (LM) has built a demonstration prototype and human-in-the-loop simulation
infrastructure, called C4ISR-Med, to address information flow challenges in combat casualty
care. We have conducted human factors research at multiple sites that has shown the criticality of
incident and treatment data capture at the point of injury (POI). In addition, triage of a casualty
situation requires fast decision-making under high stress, with very little intelligence about patient
status. In current operations, significant information exists that could 1) improve medical
Situational Awareness (SA) and aid medics in assessing situations and 2) enable safe,
nonintrusive electronic reporting of casualty care data to enhance patient outcomes and medical
health records.
Battlefield trauma care is the first level of in-theater medical care. Field medics are responsible for triaging and treating wounded personnel. Currently, the decision about who is treated first is made quickly, under high stress, and with little information. Vital signs can be difficult to measure in the battlefield environment, leaving field medics to make treatment decisions with little data. At the POI, there is a significant lack of consistent documentation of medically relevant information; field medics resort to writing the details of treatment and medications on bandages or on medical tape on the patient's skin. As casualties move from the POI to combat support hospitals, each transfer point introduces a new risk of information breakdown.

C4ISR-Med addresses these battlefield medical information challenges by leveraging tactical
intelligence technologies and LM system expertise, including a tactical intelligence collection
tool that uses spoken language input to create digital versions of standard reports.1 We use proven
human factors techniques to incorporate subject matter expertise into the design of medic-specific
tools and usage scenarios. We have a human-in-the-loop simulation and experimentation
infrastructure at the LM Center for Innovation to evaluate the feasibility and effects of increased
medical “intel” on the battlefield. We will continue to conduct iterative demonstrations and
experiments with C4ISR-Med to engage the user community and evaluate new solutions for
transition readiness.
The three key outcome goals of the C4ISR-Med effort are:
• A seamless user interaction for medic triage and casualty reporting
• Medical intel to all levels of care for patient and tactical benefit
• A flexible simulation infrastructure to plug-and-play new solutions
1 The tactical intel research and development for this collection tool (I2W, or Interface to the Warfighter) was funded by the Office of Naval Research as well as SOCOM SORDAC S&T; the software runs on small devices running the Android OS due to guidance from SOCOM.

© 2013 Lockheed Martin Corporation. All Rights Reserved.

To guide our work towards these goals, we have depicted an overall C4ISR-Med vision (Figure
1). In the next section we describe each segment of the vision in detail.
Figure 1. C4ISR-Med Vision
C4ISR-Med Prototype and Simulation
The C4ISR-Med vision encompasses the battlefield situation from prior to injury, through
casualty incident(s) Level 1 care, and on to Level 2 care at the field hospital. The C4ISR-Med
project has created prototype software as well as simulation capabilities at each key point to
enable demonstration and provide a testbed for experimentation with best-of-breed solutions.
The C4ISR-Med software runs on a wide variety of Android OS-based hardware, providing
multiple form factors to fit different contexts of use. The hardware used in the simulation includes
small form-factor Android devices (complete with GPS and wifi capability), small and large
phones, and tablets. For our simulation, the role-based allocation of equipment is as follows:
• All personnel wear physiological sensors or have simulated sensor data.
• Individual squad members wear small devices or carry phones for sensor data collection.
• The medic wears an Android device to receive alerts and to activate spoken language
processing. The medic carries a phone for review of vitals and for processing speech into
combat casualty reports. Speech can be captured using Bluetooth microphone, wired
microphone, or by speaking directly into the phone.
• The squad leader carries a phone for lightweight blue force tracking and status review, as
well as digital entry of the 9-line report.
• The transport medic can have a phone and/or tablet for review of vitals and reports, as
well as for creating additional reports.
• The field hospital has a tablet for review of incoming data.
Using Figure 1 as a guide, we will step through the usage of these tools, as well as the simulation
elements that constitute our demonstration environment. The demonstration takes place in three
staged areas: Triage (Point of Injury), Medevac, and Field Hospital.
Step 1: Pre-mission baseline vitals & assess readiness
An underlying premise of the C4ISR-Med system is that warfighters will be wearing small,
unintrusive physiological sensors to monitor important vital signs both pre- and during a mission.
LM is investigating sensors available now as well as developing our own sensors that can be
worn on the body or incorporated into existing clothing/equipment to avoid adding weight.
For demonstration, the C4ISR-Med system can either simulate sensor data readings or integrate
data from real sensors on a person or medical training mannequin. The integration infrastructure
makes it easy to incorporate new sensors for experimentation. In the CONOPs, the data from
sensors would be collected wirelessly by Android devices for each warfighter. Prior to
deployment, baseline measurements for each individual would be taken in varying operational
conditions to determine vitals thresholds that might indicate injury.
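The per-individual thresholding described above could be sketched as follows. This is a minimal illustration, not the fielded algorithm: the function name and the mean-plus-k-sigma banding rule are assumptions for the sketch.

```python
from statistics import mean, stdev

def baseline_thresholds(samples, k=3.0):
    """Derive per-individual alert thresholds from pre-deployment
    vitals samples taken under varying operational conditions.

    samples: dict mapping a vital-sign name (e.g. "pulse") to a list
    of readings; returns a (low, high) band per vital sign.
    """
    thresholds = {}
    for vital, readings in samples.items():
        m, s = mean(readings), stdev(readings)
        thresholds[vital] = (m - k * s, m + k * s)
    return thresholds

# Example: one warfighter's pulse readings at rest and on the move.
profile = baseline_thresholds({"pulse": [62, 70, 95, 110, 88, 74]})
```

Sampling across operational conditions matters here: a band fit only to resting data would flag normal exertion as injury.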
Step 2: Injury incident triggers alert from body-worn sensors and small Android devices
When an incident occurs that produces casualties, the medic receives an audio alert that indicates
he should look at his device to review who is injured (names in red) and where they are located
relative to his position (Figure 2, left).
For demonstration, the algorithms that determine what vitals thresholds indicate injuries are
rudimentary, although they are based on interviews with subject matter experts that gave insight
into what is likely to happen when a particular type of injury occurs. The development of more
accurate and sensitive algorithms is an area of potential future research and experimentation.
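The rudimentary alerting logic amounts to checking live readings against an individual's baseline band; a sketch under that assumption (the function and field names are illustrative):

```python
def check_vitals(reading, thresholds):
    """Return the vitals that fall outside an individual's baseline
    band; a non-empty result would trigger the medic's audio alert.

    reading: dict of current sensor values;
    thresholds: dict of (low, high) bands from pre-deployment baselining.
    """
    out_of_band = []
    for vital, value in reading.items():
        low, high = thresholds.get(vital, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            out_of_band.append(vital)
    return out_of_band

# A pulse of 165 is outside an assumed (45, 140) band; respiration is not.
alerts = check_vitals({"pulse": 165, "resp_rate": 14},
                      {"pulse": (45, 140), "resp_rate": (8, 30)})
# -> ["pulse"]
```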
Figure 2. (left) The medic reviews injured personnel identity and position. (right) The medic can use a larger interface to choose a name and review the vitals for that warfighter.

Step 3: Medic can review vitals info and blue force position
If the situation is rapidly evolving and the medic can get to the injured immediately, there is no
need to review any other information before starting treatment. If, however, the medic has time
while an area is being cleared or while in transit, he can take out his phone to review the vitals of
the patients (Figure 2, right) or see where everyone is on a map (Figure 3). The map interface was
developed as a tool for tactical intel2 and would more likely be used by a squad leader, but the
information is also available to the medic.
Figure 3. The medic or squad leader can review blue force position and status.
Step 4: Medic captures injury and treatment info at site via voice
When the medic begins treatment, he can begin or continue documentation at any point during the
casualty care. To begin documentation, he taps twice anywhere on his device screen to start the
speech processing on the phone. Note that he does not need to take the phone out of his pocket or
interact with it in any way to turn the speech processing on or off; this is a deliberate technology
advancement to enable the reporting to be done using the hands as little as possible so they can
remain free to treat the patient.
Information known about the warfighter from a pre-entered profile (e.g., unit, allergies) will be
automatically populated in the report, along with the time of the report. If vital signs are
available from sensors, they are pre-populated to reduce the amount of information that
needs to be entered by the medic. This is an example set of utterances to create a report:
“Report for P-F-C Smith.”
“Injuries due to IED blast. Lower left leg is amputated. AVPU is unconscious.”
“Applied TQ. BP is 145 over 95, pulse is 165, respiratory rate is 14.”
“Inserted a nasal pharyngeal, started IV of hextend 500 milliliters.”
“Save report.”
The information can be entered in shorter statements or in a different order. The system parses the
speech into a report format based on the Tactical Combat Casualty Care card (Figure 4, left).
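The parse-into-tagged-fields step can be illustrated with a simple slot-filler over the example utterances above. The real system uses a spoken-language pipeline, so this keyword-pattern sketch and its field names are assumptions for illustration only:

```python
import re

# Illustrative patterns mapped to TCCC card fields.
PATTERNS = {
    "name":      re.compile(r"report for (.+?)\.", re.I),
    "mechanism": re.compile(r"injuries due to (.+?)\.", re.I),
    "avpu":      re.compile(r"avpu is (\w+)", re.I),
    "pulse":     re.compile(r"pulse is (\d+)", re.I),
    "resp_rate": re.compile(r"respiratory rate is (\d+)", re.I),
    "bp":        re.compile(r"bp is (\d+) over (\d+)", re.I),
}

def parse_utterances(utterances):
    """Fill report fields from utterances given in any order."""
    report = {}
    text = " ".join(utterances)
    for field, pattern in PATTERNS.items():
        m = pattern.search(text)
        if m:
            report[field] = m.group(1) if field != "bp" \
                else f"{m.group(1)}/{m.group(2)}"
    return report

report = parse_utterances([
    "Report for P-F-C Smith.",
    "Injuries due to IED blast. Lower left leg is amputated. "
    "AVPU is unconscious.",
    "Applied TQ. BP is 145 over 95, pulse is 165, "
    "respiratory rate is 14.",
])
```

Because each pattern searches the whole transcript, the medic's statements can arrive in shorter fragments or in a different order, as the text notes.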
Step 5: Digital Medevac request sent
The squad leader uses manual text entry to populate a digital 9-line report, including information
requested from the medic about the wounded and the equipment needed on the evacuation
vehicle. The 9-line could accept speech input as well; however, for the demonstration we show
text entry to highlight that there are multiple ways to interact with the system based on
operational constraints for silent (using text) vs. hands-free (using speech) data entry.
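A digital 9-line form is essentially one field per line of the standard MEDEVAC request. A minimal sketch of such a structure (the attribute names and the example values are illustrative assumptions, not the demonstration's wire format):

```python
from dataclasses import dataclass

@dataclass
class NineLine:
    """Digital 9-line MEDEVAC request, one attribute per line."""
    location: str      # Line 1: pickup site location
    radio: str         # Line 2: radio frequency and call sign
    precedence: str    # Line 3: number of patients by precedence
    equipment: str     # Line 4: special equipment required
    patient_type: str  # Line 5: patients by type (litter/ambulatory)
    security: str      # Line 6: security at pickup site
    marking: str       # Line 7: method of marking pickup site
    nationality: str   # Line 8: patient nationality and status
    nbc: str           # Line 9: NBC contamination / terrain

# Example request assembled from medic input (values are illustrative).
request = NineLine(
    location="grid from blue force tracker",
    radio="squad net, squad leader call sign",
    precedence="1 urgent",
    equipment="none",
    patient_type="1 litter",
    security="no enemy troops in area",
    marking="smoke",
    nationality="US military",
    nbc="none",
)
```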
2 The map interface also leverages capabilities from Interface to the Warfighter (I2W).

Figure 4. (left) Report created using spoken language utterances. (right) Field Hospital review of reports on incoming wounded.
Step 6: Medevac adds reports and transmits all reports to TOC and hospital
When the Medevac arrives, all reports are transferred to the C4ISR-Med tablet on the Medevac
vehicle. For the demonstration the transfer occurs over a wireless network, but the simulation
infrastructure is designed to allow for testing over varying network conditions to evaluate
performance in degraded environments. Reports are transferred to the field hospital and the TOC
opportunistically so preparations for wounded and tactical responses can begin.
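Opportunistic delivery over a degraded network amounts to store-and-forward behavior: reports queue locally and flush whenever a link is available. A sketch of that behavior (class and callback names are assumptions, not the fielded code):

```python
import collections

class ReportQueue:
    """Store-and-forward queue for casualty reports: a degraded
    network delays delivery but never drops data."""
    def __init__(self, send):
        self.pending = collections.deque()
        self.send = send  # callable returning True once delivered

    def submit(self, report):
        self.pending.append(report)
        self.flush()

    def flush(self):
        # Deliver in order; stop at the first failure and retry later.
        while self.pending and self.send(self.pending[0]):
            self.pending.popleft()

# Usage: a link that is down at first, then restored.
link_up = [False]
delivered = []
def send(report):
    if link_up[0]:
        delivered.append(report)
        return True
    return False

queue = ReportQueue(send)
queue.submit("TCCC report, PFC Smith")  # link down: stays queued
link_up[0] = True
queue.flush()                           # link restored: report goes out
```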
While on the Medevac, the transport medic can review the reports, review current vitals, and
create new reports if changes in patient status occur and/or additional treatment needs to be
documented. For demonstration, the Medevac is simulated as a vehicle in an octagon-shaped
simulation environment that provides an immersive experience of driving a vehicle through a
variety of environments (e.g., an Afghan countryside or town). The simulation provides options
to test how equipment might be used in transit through loud environments including gunfire.
Step 7: Field Hospital receives advance casualty data
As the reports come in, the field hospital can review all of the received reports (Figure 4, right)
and begin to prepare resources and personnel to treat the incoming wounded upon arrival. For the
demonstration we created a report review screen to be used on the Medevac and at the field
hospital, but the report data could also be integrated into existing systems and electronic health
records to enhance long-term record keeping with data about treatment at the POI or in transit.
The field hospital is simulated by a life-size “virtual wall” 3D-display of avatars operating in a
hospital environment. The transport medic can deliver the patient (person or mannequin) to the
field hospital and interact with the avatar to play out conversations that might occur upon arrival.
This simulation provides flexibility of multiple scenarios for experimentation or training.
Step 8: Incident added to EHR
Finally, the incident data can be added to the patient’s electronic health record. While the
demonstration does not reach as far as long-term electronic health records, it is important to note
that data about POI incidents, treatments, and patient outcomes will be available in digital format.
The information that is captured will not only be larger in volume, but will also be parsed into
records with tagged fields indicating what was entered for injury type, treatment, and medication.
These records will be valuable not only for the long-term care of individual patients, but also for
data analysis of treatments and medicines versus patient outcomes after a battlefield injury.
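A tagged-field record, unlike notes on tape or bandages, supports exactly this kind of outcome analysis. A minimal sketch, with field names assumed for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class PoiRecord:
    """Tagged-field POI incident record suitable for EHR ingestion
    and later treatment-versus-outcome analysis."""
    patient_id: str
    timestamp: str
    injury_type: str
    treatments: list = field(default_factory=list)
    medications: list = field(default_factory=list)
    outcome: str = ""

def outcomes_by_treatment(records):
    """Group patient outcomes by treatment tag -- the kind of query
    that tagged fields make possible and free text does not."""
    table = {}
    for r in records:
        for t in r.treatments:
            table.setdefault(t, []).append(r.outcome)
    return table

records = [
    PoiRecord("smith", "t0", "IED blast",
              treatments=["TQ", "IV"], outcome="survived"),
    PoiRecord("jones", "t1", "GSW",
              treatments=["TQ"], outcome="survived"),
]
table = outcomes_by_treatment(records)
```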
Demonstration Controller
A single demonstration controller running on an Android device controls all the actions
happening on the C4ISR-Med devices. This provides the simulation with a large degree of
flexibility because we can move simulated blue force positions (or use actual GPS), change
simulated sensor readings, and create injury events during the scenario enactment.

We will continue to make design and demonstration enhancements to support quantitative assessment with human-in-the-loop experiments in 2013.
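The controller's role of pushing simulated positions, sensor readings, and injury events to the C4ISR-Med devices can be sketched as a simple event dispatcher (the class names and event shape are illustrative assumptions, not the Android implementation):

```python
class DemoController:
    """Central demonstration controller: broadcasts simulated
    scenario events to every registered C4ISR-Med device."""
    def __init__(self):
        self.devices = []

    def register(self, device):
        self.devices.append(device)

    def inject(self, event):
        # e.g. a simulated injury event created mid-scenario
        for device in self.devices:
            device.on_event(event)

# Usage: a stand-in device that records what the controller pushes.
class RecordingDevice:
    def __init__(self):
        self.events = []
    def on_event(self, event):
        self.events.append(event)

controller = DemoController()
medic_device = RecordingDevice()
controller.register(medic_device)
controller.inject({"type": "injury", "who": "PFC Smith", "pulse": 165})
```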