Intrusion Detection Testing and
Benchmarking Methodologies
Nicholas Athanasiades, Randal Abler, John Levine,
Henry Owen, and George Riley
School of Electrical and Computer Engineering
Georgia Institute of Technology
Information Networking Security and Assurance Lab
National Chung Cheng University
1. Introduction
 Beginnings of intrusion detection evaluation
DARPA evaluations (1998-1999)
LARIAT, the Lincoln Adaptable Real-time Information Assurance Test-bed (2000-2001)
 Most common methodologies
 Traffic generation is one of the most difficult problems
Synthetic traffic does not represent the realities of an actual network
SmartBits
Scripting tools
2. Existing Tools and Testing Methodologies
 A. DARPA Environment
 B. LARIAT Environment
 C. Nidsbench
 D. IDSwakeup
 E. Flame Thrower
 F. WebAvalanche/WebReflector
 G. Tcpreplay
 H. Fragrouter
 I. Hping2
 J. Iperf
A. DARPA Environment
Approach
An off-line evaluation (to tune and optimize) and an on-line evaluation (actual testing) were executed
Tcpreplay was used
Protocol/traffic activity: HTTP, X Window, SQL, SMTP, DNS, FTP, POP3, Finger, Telnet, IRC, SNMP, and Time (a rough scripted-traffic sketch follows)
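The evaluation's background traffic spans many application protocols. As a rough illustration of scripted background traffic only (not the actual DARPA tooling; the hosts, names, and timings below are hypothetical), a Python script might interleave simple HTTP and DNS activity:

    import random
    import socket
    import time
    import urllib.request

    # Hypothetical targets inside a closed test network.
    HTTP_URLS = ["http://testbed-web.example/index.html"]
    DNS_NAMES = ["testbed-mail.example", "testbed-ftp.example"]

    def background_session() -> None:
        """Emit one small burst of benign HTTP and DNS traffic."""
        try:
            urllib.request.urlopen(random.choice(HTTP_URLS), timeout=5).read()
        except OSError:
            pass  # unreachable hosts are acceptable in a sketch
        try:
            socket.gethostbyname(random.choice(DNS_NAMES))
        except OSError:
            pass

    if __name__ == "__main__":
        for _ in range(100):
            background_session()
            time.sleep(random.expovariate(1.0))  # about one session per second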
Denial of Service (11 types, 43 instances)
  Solaris: Back, Neptune, Ping of death, Smurf, syslog, Land, apache2, Mailbomb, Process table, UDP storm
  SunOS: Back, Neptune, Ping of death, Smurf, Land, apache2, Mailbomb, Process table, UDP storm
  Linux: Back, Neptune, Ping of death, Smurf, teardrop, Land, apache2, Mailbomb, Process table, UDP storm

Remote to Local (14 types, 17 instances)
  Solaris: Dictionary, ftp-write, guest, phf, http tunnel, xlock, xsnoop
  SunOS: Dictionary, ftp-write, guest, phf, http tunnel, xlock, xsnoop
  Linux: Dictionary, ftp-write, guest, imap, phf, named, http tunnel, sendmail, xlock, xsnoop

User to Root (7 types, 38 instances)
  Solaris: Eject, ffbconfig, Fdformat, ps
  SunOS: Loadmodule, ps
  Linux: Perl, xterm

Surveillance/Probe (6 types, 22 instances)
  Solaris: Eject, nmap, Port sweep, Satan, mscan, saint
  SunOS: Eject, nmap, Port sweep, Satan, mscan, saint
  Linux: Eject, nmap, Port sweep, Satan, mscan, saint

Figure 1: Attacks in the 1998 DARPA evaluation, by attack category and victim platform (Solaris, SunOS, Linux)
1999: the goals shifted to testing complete systems
Changes and additions
 A Windows NT victim was added
 New stealthy attacks were added
 Two new types of analysis were performed
• An analysis of misses and high-scoring false alarms
• Participants were allowed to submit information aiding in the identification of attacks and their appropriate response
 Detection of novel attacks without prior training
B. LARIAT Environment
LARIAT “emulates the network traffic from a small organization connected to the Internet”
Operates in several phases
 A network discovery phase
 Then the network is initialized and the hosts are configured
 Finally, the test's conditions are set up
Traffic generation is done through defined service models (a rough sketch follows this list)
 A modified Linux kernel allows their software to generate background traffic
Part of a government project and not publicly available
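LARIAT's actual service models are not public. Purely as a loose illustration of the idea (all states, transition probabilities, and rates below are invented), a service model can be read as a stochastic recipe for user behavior, here a small Markov chain with exponential think times:

    import random

    # Hypothetical per-user service model; these states and probabilities are
    # invented for illustration, not LARIAT's (non-public) models.
    TRANSITIONS = {
        "idle":   [("browse", 0.6), ("mail", 0.3), ("idle", 0.1)],
        "browse": [("browse", 0.5), ("idle", 0.5)],
        "mail":   [("idle", 1.0)],
    }

    def next_state(state: str) -> str:
        r, acc = random.random(), 0.0
        for nxt, p in TRANSITIONS[state]:
            acc += p
            if r < acc:
                return nxt
        return state

    def simulate_user(steps: int = 10) -> None:
        state, t = "idle", 0.0
        for _ in range(steps):
            t += random.expovariate(0.5)  # exponential think time, mean 2 s
            state = next_state(state)
            print(f"t={t:6.1f}s  action={state}")

    if __name__ == "__main__":
        simulate_user()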
C. Nidsbench
 A NIDS test suite released in 1999
 Made up of the components tcpreplay, idtest, and fragrouter
D. IDSwakeup
 Like Nidsbench
 A false-positive test utility: it generates false attacks (a sketch of the idea follows this list)
 Consists of IDSwakeup itself and utilizes hping and iwu
E. Flame Thrower
 A commercial load-stress tool used to identify network infrastructure weaknesses
 Produces transactions in order to test network infrastructure and applications
 Supports HTTP/HTTPS 1.0, 1.1, and SSL
 It can emulate over two million IP addresses
 FirewallStressor measures throughput under attack conditions
 Flame Thrower is intended for testing firewalls
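IDSwakeup's exact packet set is not reproduced here. As a hedged sketch of the same idea only, one can hand-craft a packet whose payload merely looks like a known attack (the phf CGI probe from the DARPA attack list in Figure 1), so that a signature-based NIDS should alert even though nothing real is attacked. This assumes scapy and root privileges; all addresses are placeholders:

    from scapy.all import IP, TCP, Raw, send  # requires root privileges

    # Placeholder addresses on a closed lab network.
    decoy = (
        IP(src="10.0.0.99", dst="10.0.0.1")
        / TCP(sport=40000, dport=80, flags="PA")
        / Raw(load=b"GET /cgi-bin/phf?Qalias=x HTTP/1.0\r\n\r\n")
    )

    # No TCP handshake is performed and no server is harmed: the payload
    # matches a phf signature, so any resulting alert is a false positive.
    send(decoy, verbose=False)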
F. WebAvalanche/WebReflector
Commercial network appliances used in the testing of IDSs
WebAvalanche is a stress-testing appliance
WebReflector emulates the behavior of large Web, application, and data server environments
Supports protocols such as HTTP 1.0/1.1, SSL, RTSP/RTP, and FTP
Measures the percentage of dropped packets, latencies, the maximum number of users, and new-user arrival rates
G. Tcpreplay
Allows traffic captured with tcpdump or snoop to be played back on a network at different speeds (a sketch follows)
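As a minimal sketch (assuming tcpreplay is installed, eth0 is the replay interface, and the capture file name is a placeholder), a small Python wrapper can replay a capture at a fixed rate and again at a multiple of its original timing:

    import subprocess

    CAPTURE = "background.pcap"  # placeholder capture taken with tcpdump or snoop

    # Replay at a fixed 10 Mbps; --intf1, --mbps, and --multiplier are standard
    # tcpreplay options, but verify them against your installed version.
    subprocess.run(["tcpreplay", "--intf1=eth0", "--mbps=10", CAPTURE], check=True)

    # Replay at twice the speed recorded in the capture.
    subprocess.run(["tcpreplay", "--intf1=eth0", "--multiplier=2.0", CAPTURE], check=True)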
H. Fragrouter
 An attack-generation tool
 For testing anti-evasion techniques and fragmentation queues
I. Hping2
 A command-line packet assembler and analyzer
 Allows one to create and transmit custom ICMP, UDP, and TCP packets (a scapy-based sketch of the same idea follows this list)
 Can fingerprint remote operating systems
J. Iperf
 Measures bandwidth, delay jitter, and datagram loss
 Used as a background traffic source
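hping2 and fragrouter are the actual tools described above; purely as a hedged illustration of what "custom packets" and "fragmentation evasion" mean, the scapy sketch below builds a raw TCP SYN by hand and then splits a payload-carrying packet into small IP fragments (addresses are placeholders; root privileges are required):

    from scapy.all import IP, TCP, Raw, fragment, send  # requires root

    # hping2-style custom packet: a bare TCP SYN with hand-chosen fields.
    syn = IP(dst="10.0.0.1", ttl=64) / TCP(sport=12345, dport=80, flags="S")
    send(syn, verbose=False)

    # fragrouter-style evasion: split one packet into 8-byte IP fragments,
    # forcing the NIDS to reassemble before it can match the payload.
    pkt = IP(dst="10.0.0.1") / TCP(dport=80) / Raw(load=b"A" * 64)
    for frag in fragment(pkt, fragsize=8):
        send(frag, verbose=False)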
4. Examples of Intrusion Detection Evaluation Environments
DARPA-Like Environment
Custom Software
Advanced Security Audit Trail Analysis on Unix
Vendor-Independent Testing Lab
Trade Magazine Evaluation
DARPA-Like Environment
 Five components, including
Traffic generation
The victim was "an anonymous FTP server running on a Sun UltraSparc-1 using a Solaris 2.5 OS"
Attack injection programs
The in-house reference programs counted the number of hung connections at the victim server as a measure of attack effectiveness. They used a metric called virulence, which described the intensity of an attack situation (one hedged reading is sketched below).
The evaluation method was to use 10, 15, 30, 40, and 60 attacking hosts, each attacking at varying rates of attacks per second.
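The slide does not reproduce the paper's exact definition of virulence. Purely as an assumption for illustration (not the authors' formula), one plausible reading is total attack intensity: attacking hosts times per-host attack rate.

    # Assumption, not the paper's formula: treat "virulence" as total attack
    # intensity = attacking hosts x per-host attack rate (attacks/second).
    def virulence(num_hosts: int, rate_per_host: float) -> float:
        return num_hosts * rate_per_host

    # Host counts are from the slide; the per-host rates are invented examples.
    for hosts in (10, 15, 30, 40, 60):
        for rate in (0.5, 1.0, 2.0):
            print(f"{hosts:2d} hosts @ {rate:3.1f}/s -> intensity {virulence(hosts, rate):5.1f}")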
Custom Software
A software platform that simulates intrusions and tests IDS effectiveness
Criteria used included
Broad detection range
Economy in resource usage
Resilience to stress
The benchmark platform was based on Expect and the Tool Command Language Distributed Programming (TCL-DP) package (a rough analogue is sketched below)
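The original platform was built on Expect and TCL-DP; as a loose Python analogue only (pexpect standing in for Expect, with an invented host and wordlist), a scripted interactive session can simulate one classic intrusion, a dictionary login attempt:

    import pexpect  # Python analogue of the Expect tool the platform used

    HOST = "10.0.0.1"                             # invented lab host
    GUESSES = ["letmein", "password", "root123"]  # invented wordlist

    for guess in GUESSES:
        child = pexpect.spawn(f"telnet {HOST}", timeout=10)
        child.expect("login:")
        child.sendline("root")
        child.expect("Password:")
        child.sendline(guess)
        outcome = child.expect(["Login incorrect", r"\$ ", pexpect.TIMEOUT])
        child.close()
        if outcome == 1:            # a shell prompt means the guess worked
            print(f"logged in with {guess!r}")
            break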
Advanced Security audit trail Analysis on uniX (ASAX)
The test consisted of the following scenarios
Trojan horse
Attempted break-ins
Masquerading
Suspicious connections
Blacklisted addresses
Nosing: numerous moves through directories (a toy audit-trail rule for this scenario is sketched after this list)
Privilege abuse
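As a toy example of audit-trail analysis (ASAX's own rule language is not shown; the event format and thresholds here are invented), a rule for the nosing scenario might flag any user who changes directory unusually often within a short window:

    from collections import defaultdict

    # Invented audit-trail records: (timestamp_seconds, user, syscall).
    EVENTS = [
        (1.0, "alice", "chdir"), (1.5, "alice", "chdir"), (2.0, "alice", "chdir"),
        (2.2, "alice", "chdir"), (2.8, "alice", "chdir"), (9.0, "bob", "chdir"),
    ]

    WINDOW, THRESHOLD = 5.0, 4  # invented tuning: 4+ chdirs within 5 seconds

    def nosing_alerts(events):
        recent = defaultdict(list)
        for ts, user, call in events:
            if call != "chdir":
                continue
            recent[user] = [t for t in recent[user] if ts - t < WINDOW] + [ts]
            if len(recent[user]) >= THRESHOLD:
                yield ts, user

    for ts, user in nosing_alerts(EVENTS):
        print(f"t={ts}: possible nosing by {user}")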
Vendor-Independent Testing Lab
NSS tests a broad range of IDS features
Convenience: ease of installation, deployment, and management
UI: reporting and alert delivery
Attack signatures
Accuracy
Peripheral issues such as licensing, documentation, and log management
NSS's test-bed
P3 1 GHz with 768 MB RAM running Windows 2000 SP2, FreeBSD 4.4, or Red Hat 6.2/7.1
Ghost images
100 Mbps Ethernet with CAT-5 cabling, Intel NetStructure 40T routing switches, and Intel auto-sensing 10/100 network cards
The IDS was installed on a dual-homed PC on each subnet
No firewall was used
NSS's five types of tests
Attack recognition
Based on the SANS Top 20 and/or ICAT Top 10 vulnerability lists
Performance under load
Back Orifice ping
64-byte and 1514-byte packets at 25, 50, 75, and 100 percent of network load (worked packet rates are sketched below)
Adtech AX/4000 Broadband Test System and SmartBits SMB6000
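As a worked example of what those load levels mean on 100 Mbps Ethernet (this arithmetic is mine, not NSS's published figures; it counts the standard 8-byte preamble and 12-byte inter-frame gap per frame):

    LINK_BPS = 100_000_000   # 100 Mbps Ethernet
    OVERHEAD = 8 + 12        # preamble + inter-frame gap, in bytes per frame

    def packets_per_second(frame_bytes: int, load_fraction: float) -> float:
        wire_bits = (frame_bytes + OVERHEAD) * 8
        return LINK_BPS * load_fraction / wire_bits

    for size in (64, 1514):
        for load in (0.25, 0.50, 0.75, 1.00):
            pps = packets_per_second(size, load)
            print(f"{size:4d}-byte frames @ {int(load * 100):3d}% load -> {pps:9.0f} pps")

At full load this works out to roughly 148,800 packets per second for 64-byte frames and roughly 8,100 for 1514-byte frames, which is why small-packet tests stress an IDS far harder than large-packet tests at the same bit rate.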
IDS evasion techniques
Tools: Fragrouter and whisker
Stateful operation test
Tools: stick and snot, used to generate false alerts
Host performance
Network load, CPU, and memory utilization were monitored
Trade Magazine Evaluation
An interesting approach: the IDSs ran in the production network of an ISP
Four machines were deployed
The metrics were accuracy, ease of use, and uptime
Conclusion