Developing a Cyber Multi-Attribute
Task Battery and Cognitive Model for
Human Performance Evaluation in
Cyber Operations
(F4FGA05076J003)
PI:
Co-PIs:
Brett J. Borghetti, PhD (AFIT)
Christina F. Rusnock, PhD (AFIT)
Greg J. Funke, PhD (711th HPW)
AFOSR Program Review:
Trust And Influence Program
(Jun 13-17, 2016, Arlington, VA)
DISTRIBUTION A. Approved for public release: distribution unlimited.
Brief Background on Cyber Security
• Crucial to the success and security of modern
commercial, industrial, and governmental organizations
– Cyber threats are expected to proliferate exponentially, with an
estimated 1000% increase in unique malicious
software by 2025 (Maybury, 2015)
Brief Background on Cyber Security
• “State of the art” is a system within which human
analysts work collaboratively with computer systems to
identify and respond to cyber threats
– Intrusion detection: “the process of monitoring the
events occurring in a computer system or network
and analyzing them for signs of possible incidents,
which are violations or imminent threats of violation of
computer security policies, acceptable use policies, or
standard security practices” (Scarfone & Mell, 2007)
Brief Background on Cyber Security
• Modern cyber defense systems inspect traffic moving
within and through a network
1. Intrusion Detection System (IDS) performs initial
inspection of network data
• Compares network events to a database of
“signatures,” i.e., profiles of known malicious
network activity
• Network activity that is similar to a known
signature generates an alert
• Signature matching has a strong liberal bias;
consequently, most alerts are false alarms
(D’Amico & Whitley, 2007)
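The signature-matching process above can be sketched in a few lines. This is a minimal, hypothetical illustration (not IDS or CIAT source code); the names `Signature` and `match_events`, the substring-style matching, and all sample values are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Signature:
    sig_id: int
    pattern: str   # profile of known malicious network activity
    severity: str

def match_events(events, signatures):
    """Return an alert for every event resembling a known signature."""
    alerts = []
    for event in events:
        for sig in signatures:
            if sig.pattern in event:   # deliberately liberal matching
                alerts.append((event, sig.sig_id, sig.severity))
    return alerts

sigs = [Signature(2003, "cmd.exe", "high"), Signature(1201, "admin login", "low")]
events = ["GET /scripts/cmd.exe HTTP/1.1", "GET /index.html", "admin login ok"]
print(match_events(events, sigs))
```

The liberal matching rule here mirrors the bias described above: any superficial resemblance fires an alert, so most alerts will be false alarms that a human analyst must then triage.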
Brief Background on Cyber Security
2. Alerts must then be investigated by a human computer
network defense analyst (“network analyst” for brevity)
– Triage analysts (first-line network analysts)
• Perform rapid interrogation of network sensor
data to determine the credibility of alerts
– Escalation analysts (second-line network analysts)
• Consult multiple sources of data to determine
whether an intrusion truly occurred
• Broadly follow a standard pattern of activity (Dye,
in press)
Brief Background on Cyber Security
• Network defenders report high levels of chronic, job-related stress (Chapelle et al., 2013)
• Cyber defense-like tasks elicit acute, task-related stress
and workload (Mancuso et al., 2015; Sawyer et al., 2014,
Greenlee et al., in press), which may reduce situation
awareness and performance (Champion et al., 2012)
• Assured success in cyber requires better understanding
and performance augmentation of network analysts
(Maybury, 2015)
Motivation / Scope
• Synthetic task environments (STEs) provide cognitively
similar tasks without the training required for the real system
• STEs play a vital role in understanding real-world
environments
– For Pilots / Operators, MATB is well known & used for decades
– No STE for triage and escalation analysts
• Goal: Develop and validate STE for triage and
escalation analysts
Project Goals
1. Conduct Task Analysis with Network Analyst SMEs (AFIT)
2. Select cognitive tasks to replicate in Cyber-MATB (AFIT)
3. Build interface prototype for experiments (AFIT)
4. Build task model in IMPRINT (AFIT)
5. Calibrate Cyber-MATB (AFIT)
6. Build Cyber-STE for initial human participant experiments (711 HPW)
7. Conduct First Cyber Experiment Proof of Concept (711 HPW)
8. Proof of Concept Results Analysis / Publication (711 HPW)
9. Develop/conduct additional experiments (711 HPW)
10. Distribution & Marketing (711 HPW)
Progress Towards Goals
1. Conduct Task Analysis with Cyber Line Analyst SMEs (AFIT)
2. Select cognitive tasks to replicate in Cyber-MATB (AFIT)
3. Build interface prototype for experiments (AFIT)
4. Build task model in IMPRINT (AFIT)
5. Calibrate Cyber-MATB (AFIT)
6. Build Cyber-STE for initial human participant experiments (711 HPW)
7. Conduct First Cyber Experiment Proof of Concept (711 HPW)
8. Proof of Concept Results Analysis / Publication (711 HPW)
9. Develop/conduct additional experiments (711 HPW)
10. Distribution & Marketing (711 HPW)
Cyber Task Analysis
• Primary Source: 33rd Network Warfare Squadron
• Focus group: Line Analysts (triage) & Lead Analyst
– details about operational process & procedures
– alert severity distribution
• Operational Environment: HP ArcSight (network traffic
monitoring aggregator)
Initial Prototype STE
• Includes Alerts on main screen and detailed packet capture
information on second screen
• Participants select whether an alert is a threat or not
• Configurable independent variables (IVs): alerts per minute; severity distribution
• Collected data: mouse & keyboard activity; operator selections
• Example derivable dependent variables (DVs): alerts handled per minute; accuracy
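The example DVs above can be derived directly from the logged operator selections. The sketch below is a hypothetical illustration, not STE code; the function name `derive_dvs`, the dictionary layout, and the sample session values are all assumptions.

```python
def derive_dvs(selections, ground_truth, session_minutes):
    """Derive example DVs from a session log.

    selections / ground_truth map alert_id -> 'threat' or 'not a threat'.
    """
    handled = len(selections)
    correct = sum(1 for alert_id, resp in selections.items()
                  if ground_truth.get(alert_id) == resp)
    return {"alerts_per_minute": handled / session_minutes,
            "accuracy": correct / handled if handled else 0.0}

# Hypothetical four-alert session lasting two minutes
truth = {1: "threat", 2: "not a threat", 3: "not a threat", 4: "threat"}
resp  = {1: "threat", 2: "threat", 3: "not a threat", 4: "threat"}
print(derive_dvs(resp, truth, session_minutes=2.0))
```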
Attentional Model
• Created Attentional Task Model
– Operator’s attention focus at each moment in time
• Developed Improved Performance Research Integration
Tool (IMPRINT) model of the initial prototype
– Simulates operator behavior and decision making
– Based on SME input
– Generates simulated performance on prototype task
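The idea of generating simulated performance from SME-supplied inputs can be sketched as a small Monte Carlo run. This is not the IMPRINT model itself, only a minimal illustration; the exponential handling-time distribution, the function name `simulate_session`, and every parameter value here are assumptions.

```python
import random

def simulate_session(n_alerts, mean_handle_s, p_correct, seed=0):
    """Simulate one operator session; return total time and accuracy."""
    rng = random.Random(seed)
    total_time = 0.0
    correct = 0
    for _ in range(n_alerts):
        total_time += rng.expovariate(1.0 / mean_handle_s)  # handling time draw
        if rng.random() < p_correct:                        # decision outcome
            correct += 1
    return total_time, correct / n_alerts

t, acc = simulate_session(n_alerts=45, mean_handle_s=30.0, p_correct=0.85)
print(f"{t:.0f} s total, accuracy {acc:.2f}")
```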
C-MATB Calibration Study
• Compared human performance with IMPRINT predictions
• After peak performance, as workload goes up, performance goes down

Condition | Mean IMPRINT VACP Workload Score | Mean Part. TLX Workload Score | Mean IMPRINT Recall | Mean Part. Recall
A1        | 17.60 | 40.84 | .72 | .78
A2        | 16.90 | 35.00 | .72 | 1.00
Amean     | 17.25 | 37.92 | .72 | .89
B1        | 17.10 | 55.00 | .41 | .61
B2        | 18.00 | 54.72 | .40 | .78
Bmean     | 17.55 | 54.86 | .40 | .70

Note. Part. = participant.
711th Cyber Intruder Alert Testbed (CIAT)
• IDS Alerts Tab
An example of the Alerts tab. Features of this tab include the IDS alert window
(left), the signature title window, which presents the title of the alert and the
associated signature number (center), and the response buttons (i.e., “Not a
Threat,” “Threat,” right). In the image, the alerts are color coded based on their
potential harm to the network (the top alert is the alert currently under
investigation).
711th Cyber Intruder Alert Testbed (CIAT)
• Packet Capture (PCap) Tab
An example of the PCap tab. Packets in this list are arranged in the serial order
they arrive on the network. Each packet in this tab has seven data fields,
corresponding to: 1) the number of the packet, which is determined by its serial
position in the packet list; 2) the time the packet arrived on the network; 3) the
source internet protocol (IP) address of the packet; 4) the destination IP
address of the packet; 5) the protocol type; 6) the packet length (i.e., its size);
and 7) any additional information associated with the packet.
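The seven PCap fields described above map naturally onto a record type. The sketch below is illustrative only (CIAT itself is written in C#); the class name `Packet` and the sample values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    number: int     # 1) serial position in the packet list
    time: float     # 2) arrival time on the network
    src_ip: str     # 3) source IP address
    dst_ip: str     # 4) destination IP address
    protocol: str   # 5) protocol type (e.g., TCP, UDP)
    length: int     # 6) packet size
    info: str       # 7) any additional information

p = Packet(1, 0.000123, "192.168.1.5", "10.0.0.7", "TCP", 60, "SYN")
print(p.protocol, p.length)
```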
711th Cyber Intruder Alert Testbed (CIAT)
• Query Tab
An example of the Query tab. Features of this tab include the query window
(top left), the “Search” and “Clear” buttons (top right), and the response window
(bottom). The query window is not limited to signature numbers, however – it
will search the experimenter-developed database for any specified terms and
display the associated information.
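The Query tab's behavior, searching an experimenter-developed database for any specified term rather than only signature numbers, can be sketched as below. This is a hypothetical illustration; the database layout and the function name `query` are assumptions.

```python
def query(database, term):
    """Return every entry whose key or text mentions the search term."""
    term = term.lower()
    return [text for key, text in database.items()
            if term in str(key).lower() or term in text.lower()]

# Hypothetical experimenter-developed database
db = {
    2003: "Signature 2003: WWW IIS cmd.exe access attempt",
    1201: "Signature 1201: repeated admin login failures",
}
print(query(db, "2003"))   # search by signature number
print(query(db, "login"))  # search by free text
```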
711th Cyber Intruder Alert Testbed (CIAT)
• Network Tab
An example of the Network tab. Entries on the list include two pieces of
information about each computer: 1) the IP address of the computer, and 2)
any additional, potentially relevant information about that computer, such as
membership in a subnetwork or potential vulnerabilities (due to missing
patches, etc.).
711th CIAT Architecture
• Programmed in C#
• Uses a Microsoft Access database
• Microsoft Access file includes sample alerts
– Alerts modified from real Cisco signatures
(Baumrucker et al., 2003)
711th Experiment
• Coordinated vs. Uncoordinated displays
– 45 Alerts (10 actual threats), no time limit
– 46 inexperienced participants (AF + local community)
– Operator marks alerts as “threat” or “not a threat”
• Hypothesis: Coordinated displays ↑ efficiency; ≈ efficacy
• Efficiency hypothesis supported:
– Coordinated condition completed the task 50% faster (p < .001)
• Efficacy also improved for detection
– Coordinated condition correctly identified 95% of threats vs.
85% in uncoordinated condition (p=0.01)
• Impact: Supports software change requested by
operational community
CIAT Availability
• CIAT program
– Open source
– Available for download via ResearchGate
• Includes executable, source code, and
help/installation file
– Intended to increase relevant research in cyber operations
contexts
CIAT Availability
• CIAT program currently shared with:
– Victor Finomore (United States Air Force Academy)
– Dr. Christopher Wickens & Alex Vieane (Colorado
State University)
– Dr. Robert Gutzwiller (SPAWAR)
– Dr. Jeremiah Still (Old Dominion University)
Future Work (Year 2)
• Continue validation with cyber (USAFA) SMEs
• Build/validate additional simulated capabilities
– e.g., Network firewall log
• Change from “tabbed” to “windowed” environment
• New experiment
– Effects of task interruptions on cyber defenders
• Primary: CIAT + secondary email task
– Collecting performance & physiological data (eye
tracking, heart rate, center of pressure in seat pan)
• Data collection July-Dec 2016
List of Publications Attributed to the Grant
• Dye, G. (in press). Using IMPRINT to Guide Experimental Design of
Simulated Task Environments. Technical Report AFIT-ENG-MS-15J-052,
Air Force Institute of Technology.
www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA626785
• Funke, G., Dye, G., Borghetti, B., Mancuso, V., Greenlee, E., Miller,
B., Menke, L., Brown, R., & Vieane, A. (in press). Development and
validation of the Air Force Cyber Intruder Alert Testbed (CIAT).
Proceedings of the 7th International Conference on Applied Human
Factors and Ergonomics.
• Vieane, A., Funke, G., Mancuso, V., Greenlee, E., Dye, G.,
Borghetti, B., Miller, B., Menke, L., & Brown, R. (in press). Coordinated
displays to assist cyber defenders. Proceedings of the Human
Factors and Ergonomics Society Annual Meeting.