The Challenges of Repeatable Experiment Archiving –
Lessons from DETER
Stephen Schwab
SPARTA, Inc. d.b.a. Cobham Analytic Solutions
May 25, 2010
Overview of DETER
DETER Highlights
- 3 distributed clusters, ~500 nodes
- Combination of DETER-developed software and legacy Emulab
DETER Capabilities
- Federation
- Security Experimentation Environment (SEER)
- Templates
[Diagram: DETER federation architecture. Labels: DETER, Emulab, WAIL, CEDL, Federator, plug-ins to configure federants, GMPLS/DRAGON provisioned connectivity, credentials, USERS, Internet]
DDoS Experiment on DETER (circa 2005)
Background Traffic: REPLAY | NTCG | HARPOON (high-fidelity traffic)
Topology: AS-11357 building blocks | Juniper router core (realistic connectivity and scale-down)
Attack Traffic: DETER-integrated attack scripting (automation of the variety of scenarios under study)
Instrumentation: packet and host statistics capture | spectral analysis | metrics calculation | integrated visualization
SEER: toolbox for rigorous investigation of results
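As a rough illustration of the spectral-analysis step listed under Instrumentation, the sketch below computes the power spectrum of a binned packet-count series to expose a periodic (pulsing) attack. It is not SEER's actual code; the bin width, input format, and the synthetic traffic are assumptions made purely for illustration.

import numpy as np

def packet_count_spectrum(counts, bin_seconds=0.01):
    """Return (frequencies in Hz, power) for a per-bin packet-count series."""
    counts = np.asarray(counts, dtype=float)
    counts = counts - counts.mean()              # drop the DC component
    spectrum = np.fft.rfft(counts)               # one-sided FFT of the series
    power = np.abs(spectrum) ** 2
    freqs = np.fft.rfftfreq(len(counts), d=bin_seconds)
    return freqs, power

# Synthetic check: Poisson background traffic plus a 1 Hz pulsing attack.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.01)                       # 10 s of 10 ms bins
background = rng.poisson(50, size=t.size)
attack = 200 * (np.sin(2 * np.pi * 1.0 * t) > 0) # on/off pulses at 1 Hz
freqs, power = packet_count_spectrum(background + attack)
print("dominant frequency: %.2f Hz" % freqs[1:][np.argmax(power[1:])])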
Security Experiment Methodology & Tools (circa 2005)
DETER -- integrated workbench & tools for experimenters…
PALETTES
Experimenters select from a palette of predefined elements: Topology, Background and Attack Traffic, and Data Capture and Instrumentation
METHODOLOGY & GUIDANCE
Our methodology frames standard, systematic questions that guide an experimenter in selecting and combining the right elements
EXPERIMENT AUTOMATION
Experiment automation increases repeatability and efficiency by integrating the process within the DETER testbed environment
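As a rough sketch of what the palette-plus-automation workflow looks like in practice, the snippet below sweeps a set of scenario combinations through one scripted procedure, so every run follows the same steps. The element names and the run_scenario stand-in are hypothetical, not SEER's or DETER's actual interfaces.

import itertools, json

PALETTE = {
    "attack":     ["syn-flood", "udp-flood", "pulsing-udp"],
    "background": ["replay", "harpoon"],
}

def run_scenario(attack, background):
    """Stand-in for: allocate nodes, start generators, launch attack, collect data."""
    return {"attack": attack, "background": background, "status": "completed"}

results = [run_scenario(a, b)
           for a, b in itertools.product(PALETTE["attack"], PALETTE["background"])]
print(json.dumps(results, indent=2))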
… but this level of abstraction leads to major drawbacks
Worm/Botnet Experiment (2009)
• 831 Virtual Nodes on 63 Physical PCs
Experiment Specification
Large and complex experiments are more suitably constructed
• by combining abstract elements
• that model different aspects (topology, traffic, networking devices, etc.)
• with constraints on behavior
The experiment on the previous slide is an example
• a hand-crafted (e.g. hand-compiled) experiment built from abstract elements
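A hedged sketch of what such a specification could look like as data. The field names, model names, and constraint wording below are invented for illustration; only the 831-node / 63-PC scale comes from the previous slide.

worm_experiment = {
    "elements": {
        # abstract models of each aspect, not explicit per-node descriptions
        "topology": {"model": "scaled-down-AS-graph", "virtual_nodes": 831,
                     "physical_pcs": 63},
        "traffic":  {"model": "enterprise-background", "load": "moderate"},
        "malware":  {"model": "scanning-worm", "scan_rate_per_s": 10},
    },
    "constraints": [
        # behavior the experiment depends on, stated explicitly
        "per-PC CPU utilization stays below saturation",
        "worm propagation is unaffected by how virtual nodes are packed onto PCs",
    ],
}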
Initial Approach: Archiving it All
Intuition drawn from analogy with physical (discovery) sciences…
Record all aspects of the experiment to ensure (ideal) reproducibility
• Software
• Artifacts being investigated (often the researcher’s new system!)
• Operating Environment (OS, standard software on clients, servers in the experiment scenario, network routers, firewalls, etc.)
• Experiment & Test Infrastructure (initialization, control, data collection, data reduction, data analysis, data visualization, …)
• Hardware
• All end-systems and routers/switches
• All firmware
• All chips/chipset variants (Tulip 21140As are not Tulip 21140Es!)
• Procedures
• All scripts and manual interactions required to run the experiment
• … networked systems require large and growing (unbounded) detail to describe precisely… which ideal reproducibility would seem to demand
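For concreteness, here is a minimal sketch of the "record everything" approach applied to just the software layer: hash every artifact in an experiment directory and capture a few environment facts. The directory layout and fields are assumptions; the point is that the hardware, firmware, and procedural detail listed above has no equally simple equivalent of a file hash, which is where the description grows without bound.

import hashlib, json, os, platform

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(experiment_dir):
    """Hash every file under experiment_dir and record basic environment facts."""
    artifacts = {}
    for root, _, files in os.walk(experiment_dir):
        for name in files:
            path = os.path.join(root, name)
            artifacts[os.path.relpath(path, experiment_dir)] = sha256_of(path)
    return {
        "software": artifacts,                        # scripts, binaries, configs
        "environment": {"os": platform.platform(),
                        "python": platform.python_version()},
        # hardware, firmware/chipset variants, and manual procedures still have
        # to be recorded separately -- exactly the open-ended part noted above
    }

print(json.dumps(build_manifest("."), indent=2)[:400])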
Challenges to Ideal Archiving
Separating Invariants from Contingencies
• An experiment requires certain properties; these are the essence of the experiment
• But every configuration detail must be specified; these are contingencies: merely choices (perhaps important to record)
• Repeatability should be defined primarily with respect to explicit invariants (see the sketch after this slide)
Experiment Internals
• Publications do not capture full details because the increasing complexity of software, hardware and networking technologies results in (exponential?) growth in the description of these aspects
• The peer-review process does not provide incentives to capture full details (noted in other position papers)
• Funding agencies do not provide sufficient funding to do so (how much detail can, will and should be demanded? Where is the limit on returns for dollars invested?)
Granularity of Reuse
• Individual researchers are interested in examining, studying and re-running different elements of any given experiment
• Experiments that archive everything do not clearly delineate the various pieces
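The separation argued for above can be made concrete in a small sketch: invariants are explicit, checkable properties that define whether a rerun counts as a repetition, while contingencies are merely recorded. All names and thresholds below are hypothetical, not DETER's mechanism.

from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class Invariant:
    description: str
    check: Callable[[Dict[str, Any]], bool]   # evaluated against run measurements

invariants = [
    Invariant("attack saturates the bottleneck link",
              lambda m: m["bottleneck_util"] > 0.95),
    Invariant("background traffic within 10% of target rate",
              lambda m: abs(m["bg_rate"] - m["bg_target"]) <= 0.1 * m["bg_target"]),
]

contingencies = {          # recorded for the archive, but a rerun may differ here
    "kernel": "2.6.x",
    "nic_chipset": "Tulip 21140A",
    "node_mapping": {"client-3": "pc112"},
}

def repeated(measurements):
    """Treat a rerun as a repetition if every explicit invariant still holds."""
    return all(inv.check(measurements) for inv in invariants)

print(repeated({"bottleneck_util": 0.97, "bg_rate": 98.0, "bg_target": 100.0}))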
Future DETER Capability & Vision
DETER is developing the capability to
• Specify Experiments Declaratively
• Reason about the software (or hardware) alternatives that may be available to realize each element in a testbed
• Select implementations that are sufficient to perform the experiment correctly
• … and ensure detection of fidelity-loss through the use of monitored invariants
• Resolve global conflicts among local element-to-implementation mappings
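A toy sketch of the capability listed above (not the planned DETER implementation): each abstract element offers alternative realizations, and a brute-force search keeps only globally consistent assignments. The element names, costs, and the single CPU constraint are invented for illustration.

from itertools import product

# candidate realizations for each abstract element, with a rough resource cost
elements = {
    "traffic": [{"impl": "replayed-trace", "cpu_per_node": 0.2},
                {"impl": "full-client-stack", "cpu_per_node": 0.6}],
    "routing": [{"impl": "software-router", "cpu_per_node": 0.5},
                {"impl": "hardware-router", "cpu_per_node": 0.0}],
    "malware": [{"impl": "real-binary-in-vm", "cpu_per_node": 0.4},
                {"impl": "behavior-model", "cpu_per_node": 0.1}],
}

def feasible_mappings(cpu_budget_per_node=1.0):
    """Yield element-to-implementation mappings whose combined load fits a node."""
    names = list(elements)
    for combo in product(*(elements[n] for n in names)):
        if sum(c["cpu_per_node"] for c in combo) <= cpu_budget_per_node:
            yield {n: c["impl"] for n, c in zip(names, combo)}

for mapping in feasible_mappings():
    print(mapping)
# At run time, a monitored invariant such as "per-node CPU below saturation"
# would flag fidelity loss if the chosen realizations prove too heavyweight.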
The DETER vision is to foster
• Reuse through sharing of tools, technology, results & ideas among researchers
• … and to promote this vision by providing abstractions, models and elements that are supported by our experiment life-cycle framework and tools
• Facilitate individuals and researchers focused on specific topics in creating their own abstractions, models and elements