ON THE OPTIMAL ALLOCATION OF ADVERSARIAL RESOURCES
Stylianos Gisdakis, Panos Papadimitratos
Adviser: Frank, Yeong-Sung Lin
Presented by Chris Chang

AGENDA
Introduction
System and adversarial model
Adversarial tactics
Analysis
Conclusion and future work

INTRODUCTION
Wireless sensor networks cover a broad range of mission-critical applications. The nature of these applications, often operating in hostile and adverse environments, makes security indispensable. A wide gamut of security schemes exists for wireless sensor networks, for example:
1. managing cryptographic keys
2. securing communication
3. detecting faulty data aggregation
4. detecting sybil attacks

INTRODUCTION
An adversary may compromise multiple sensor nodes, that is, control their operation and extract their private/secret cryptographic material. Such an adversary can replicate the compromised cryptographic keys and insert her own misbehaving nodes at will. Clearly, the more numerous the cryptographic keys and the nodes under the control of the adversary, the greater her strength.

INTRODUCTION
Assume that security mechanisms are in place, and consider the above-mentioned strong adversary, capable of taking over a significant fraction of the system nodes. Where and how should the adversary hit? Equivalently, how should the adversary allocate her resources in order to distort the most of the data collected by the victim network?

INTRODUCTION
The most important case in this paper is when the entire network, or a large part of it, is of interest to an adversary that does not have overwhelming power. We model the mission-critical network as a set of parts where the adversary can attack.
The better the choice of attack points, that is, the network parts where the attack is mounted, the higher the impact and thus the "gain" of the adversary.

INTRODUCTION
In this paper, we do not venture to reveal vulnerabilities of this WSN. Rather, we analyze the tactics of the adversary, exactly to shed light on how vulnerable a mission-critical sensor network can be as a function of the adversarial strength. We show that the problem of identifying an optimal attack is computationally hard. Thus, we develop an efficient heuristic approach to determine a close-to-optimal attack tactic.

SYSTEM AND ADVERSARIAL MODEL
1. System Model
2. Adversarial Model
3. Problem Statement

SYSTEM AND ADVERSARIAL MODEL (SYSTEM MODEL)
We model a wireless sensor network (WSN) as a set S of N clusters, S = {C_1, C_2, ..., C_N}. We do not dwell on the cluster formation, e.g., the communication topology formation (clusters are formed according to the requirements of the supported application). The number of benign nodes within a cluster C_i is given by a function F_ben(C_i): S → N.

SYSTEM AND ADVERSARIAL MODEL (SYSTEM MODEL)
The valuations of all the clusters of the network are encoded as a vector V = {V_1, V_2, ..., V_N}. These values are either proportional to the number of benign nodes within the specific cluster or context specific. We term U_total the utility gained by the adversary by controlling the whole network, so that U_total = Σ_{i=1}^{N} V_i.

SYSTEM AND ADVERSARIAL MODEL (SYSTEM MODEL)
Nodes are equipped with a set of cryptographic keys used to ensure the confidentiality, the integrity, and the authenticity of the communications among nodes and with the sink.

SYSTEM AND ADVERSARIAL MODEL (ADVERSARIAL MODEL)
We assume that adversarial resources fall into two categories:
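The system model above can be sketched in a few lines of code. The following is a minimal illustration only: the cluster sizes are made-up example numbers, and the names `f_ben`, `V`, and `u_total` are hypothetical, chosen here to mirror the paper's notation.

```python
# Minimal sketch of the system model: a WSN as N clusters, each with a
# benign-node count F_ben(C_i) and a valuation V_i.

# Example: N = 4 clusters with made-up benign-node counts.
f_ben = {1: 10, 2: 25, 3: 7, 4: 18}   # F_ben(C_i): benign nodes per cluster

# Valuations proportional to the number of benign nodes
# (one of the two valuation options described above).
V = {i: float(n) for i, n in f_ben.items()}

# U_total: utility the adversary would gain by controlling the whole network.
u_total = sum(V.values())

print(u_total)  # 60.0
```

With context-specific valuations, only the `V` dictionary would change; the rest of the model stays the same.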
1. physical devices (R_Phy)
2. cryptographic keys (R_Crypto)

R_Phy is the number of sensor nodes the adversary controls, either by having introduced them to the system or by having compromised formerly deployed benign nodes. R_Phy << N: the adversary controls only a small fraction of the total number of sensor nodes.

SYSTEM AND ADVERSARIAL MODEL (ADVERSARIAL MODEL)
R_Crypto is the number of cryptographic keys the attacker possesses, obtained from compromised benign nodes. R_Phy ≤ R_Crypto, which means that a compromised node can hold more than one cryptographic key. We require that a single key cannot be used by more than one node simultaneously, in order not to trigger sybil detection schemes.

SYSTEM AND ADVERSARIAL MODEL (ADVERSARIAL MODEL)
In our model, the adversary is aware of the allocation of benign nodes within each cluster (i.e., she knows F_ben(C_i)). The function F_val: [0, n] → R+ maps cluster C_i to its value. The attacker is aware of F_val and, as a result, can quantify the utility gained by controlling each cluster.

SYSTEM AND ADVERSARIAL MODEL (ADVERSARIAL MODEL)
We consider data manipulation, so that the view of the data the WSN user gets is not the actual one but the one the adversary wishes for it. This attack can be launched against each and any of the clusters. If the manipulation takes place, arbitrarily or within a level wished by the adversary, we say the adversary won over the cluster. We term the utility of the adversary U_mal.

SYSTEM AND ADVERSARIAL MODEL (ADVERSARIAL MODEL)
We consider two generic types of attack to control a cluster and manipulate the produced measurements:
1. Local Majority Attack
2. Stealthy Data Attack

Local Majority Attack: the adversary controls the majority of the nodes in the cluster, so she can impair or affect any data collection process. With a smaller fraction of nodes an attacker can still affect data collection; deviations can be products of false measurements injected by the malicious nodes.
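The resource constraints above (R_Phy ≤ R_Crypto, every deployed node holding at least one key, and no key used by two nodes at once) can be illustrated with a small allocation helper. The function name and the near-even split policy are illustrative assumptions, not from the paper.

```python
def allocate_keys(r_phy: int, r_crypto: int) -> list[int]:
    """Distribute r_crypto distinct keys over r_phy adversarial nodes so
    that every node gets at least one key and no key is shared between
    nodes (avoiding sybil detection). Requires r_phy <= r_crypto."""
    if r_phy > r_crypto:
        raise ValueError("need one distinct key per node: R_Phy <= R_Crypto")
    # Near-even split: every node gets `base` keys; the first `extra`
    # nodes get one additional key each.
    base, extra = divmod(r_crypto, r_phy)
    return [base + (1 if i < extra else 0) for i in range(r_phy)]

print(allocate_keys(3, 8))  # [3, 3, 2]
```

Any other split that keeps every entry ≥ 1 and the total equal to R_Crypto would satisfy the model equally well; the optimal split is exactly what the tactics below compute.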
SYSTEM AND ADVERSARIAL MODEL (ADVERSARIAL MODEL)
Stealthy Data Attack: to remain undetected in case misbehavior detection mechanisms are in place, adversary-controlled nodes report data that differ by no more than δ from the measurements reported by benign cluster members.

SYSTEM AND ADVERSARIAL MODEL (PROBLEM STATEMENT)
The adversary can choose which clusters to attack; in our model, this means choosing in which clusters she will deploy adversarial nodes (out of the available R_Phy). Then, for each of the allocated nodes, the adversary chooses how many keys to equip it with (out of the R_Crypto). Our goal is to identify the allocation of resources that maximizes her U_mal, given the deployment of the benign nodes of the WSN.

ADVERSARIAL TACTICS
First, the adversary decides on the subset M of clusters which, when controlled, will maximize U_mal. In order to attack a cluster she must allocate at least one physical device to each of the clusters of M, so |M| ≤ R_Phy.

ADVERSARIAL TACTICS
Since each cluster has a value, the problem becomes defining the subset of clusters that yields a U_mal as close to U_total as possible without exceeding a predefined value, termed the target. This is equivalent to the Subset Sum problem S(V, U_total), an NP-Complete combinatorial optimization problem.

ADVERSARIAL TACTICS
To calculate U_mal for a given subset M, the attacker must determine the optimal distribution of R_Crypto among the clusters in the subset. Moreover, R_Crypto may not allow the attacker to control all of the clusters of a subset M; this problem is equivalent to the 0-1 Knapsack Problem of combinatorial optimization.

ADVERSARIAL TACTICS
Adversarial tactics:
1. Cluster Selection
2. Resource Allocation per Selected Clusters

ADVERSARIAL TACTICS (CLUSTER SELECTION)
In this paper, we use Genetic Algorithms (GAs), whose main algorithmic structure is termed a chromosome, to solve the subset sum problem. A chromosome is a candidate solution to the optimization problem. Two basic operators of genetic algorithms are:
1. Mutation
2. Cross-Over

ADVERSARIAL TACTICS (CLUSTER SELECTION)
We apply this genetic algorithm as a heuristic for the Cluster Selection problem, with chromosomes whose number of genes is equal to R_Phy (the adversary is physically constrained to R_Phy).

ADVERSARIAL TACTICS (CLUSTER SELECTION)
Each gene holds an integer value that represents the index (an identifier) of some cluster. For example, consider the chromosome {C_1, C_5, C_8, C_20}: this candidate solution describes a scenario with an adversary constrained to R_Phy = 4, attacking clusters C_1, C_5, C_8, and C_20.

ADVERSARIAL TACTICS (CLUSTER SELECTION)
In each evolution of the genetic algorithm, chromosomes are evaluated based on the maximum utility they can achieve. After a defined number of iterations (evolutions of the algorithm), the GA converges to an optimal (or, in the case of large-scale networks, near-optimal) resource allocation of both R_Crypto and R_Phy.

ADVERSARIAL TACTICS (RESOURCE ALLOCATION PER SELECTED CLUSTERS)
F_cost: [0, Max] → N is the function that assigns a cost to each cluster.
1. To launch a majority attack against a cluster C_i, the cost function is F_cost(C_i) = F_ben(C_i) + 1.
2. To launch attacks against data aggregation, the cost function is F_cost(C_i, Δ, δ) → N: the number of malicious nodes required to produce a deviation from the average aggregate equal to Δ, by reporting values that are a percentage δ of the average value produced by the rest of the nodes in the cluster.
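The cluster-selection heuristic above can be sketched as follows. This is a hand-rolled toy, not the paper's JGAP-based implementation: the cluster values and key costs are made-up numbers, and the fitness function brute-forces the inner resource-allocation step over each (small) chromosome, whereas the paper solves that step with 0-1 knapsack dynamic programming.

```python
import random
from itertools import combinations

# Toy instance (made-up numbers): cluster id -> (value, cost in keys).
clusters = {1: (10, 4), 2: (25, 12), 3: (7, 3), 4: (18, 9), 5: (12, 5), 6: (20, 11)}
R_PHY, R_CRYPTO = 3, 16   # physical devices and cryptographic keys available

def fitness(chromosome):
    """Best utility reachable from the chromosome's clusters: try every
    sub-subset whose total key cost fits within R_CRYPTO."""
    best = 0
    for r in range(len(chromosome) + 1):
        for sub in combinations(chromosome, r):
            if sum(clusters[c][1] for c in sub) <= R_CRYPTO:
                best = max(best, sum(clusters[c][0] for c in sub))
    return best

def mutate(chrom):
    """Replace one gene with a random cluster not already in the chromosome."""
    out = list(chrom)
    out[random.randrange(len(out))] = random.choice(
        [c for c in clusters if c not in out])
    return out

def crossover(a, b):
    """One-point crossover, repairing duplicates with genes from parent b."""
    cut = random.randrange(1, len(a))
    child = a[:cut] + [g for g in b if g not in a[:cut]]
    return child[:len(a)]

random.seed(0)
pop = [random.sample(list(clusters), R_PHY) for _ in range(20)]
for _ in range(30):                     # evolutions of the algorithm
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(crossover(*random.sample(pop[:10], 2)))
                      for _ in range(10)]
best = max(pop, key=fitness)
print(sorted(best), fitness(best))
```

Each chromosome has exactly R_Phy genes (distinct cluster ids), matching the physical constraint described above; elitist selection keeps the fittest chromosomes across evolutions.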
ADVERSARIAL TACTICS (RESOURCE ALLOCATION PER SELECTED CLUSTERS)
For a chromosome with N genes, the problem of the optimal allocation of R_Crypto is formulated as follows:

maximize Σ_{i=1}^{N} x_i · F_val(C_i)
subject to Σ_{i=1}^{N} x_i · F_cost(C_i) ≤ R_Crypto, with x_i ∈ {0, 1}

ADVERSARIAL TACTICS (RESOURCE ALLOCATION PER SELECTED CLUSTERS)
This maximization program is in fact the formal definition of the 0-1 Knapsack Problem. In the paper, we use Dynamic Programming to solve the 0-1 Knapsack Problem. The output of this algorithm is a vector which defines which clusters of the chromosome under evaluation should be selected in order to achieve maximum utility within the resource constraints set by R_Crypto.

ANALYSIS
1. Simulation Setup
2. Results

ANALYSIS (SIMULATION SETUP)
We implemented our model using the JGAP [5] genetic algorithm package. For the dynamic programming part of the model we implemented a basic dynamic programming algorithm. The setup of the experiments included configurations with clusters assigned a number of benign sensor nodes. In every simulation, the attacker was provided with a number of compromised nodes and a number of cryptographic resources.

ANALYSIS (RESULTS)
[The simulation setup and results were presented in figures on the original slides.]

CONCLUSION AND FUTURE WORK
We consider resilience in the presence of strong and intelligent adversaries which cannot, in principle, have overwhelming power. The attacker needs to know where and how to attack the victim network to maximize the impact of her exploit.

CONCLUSION AND FUTURE WORK
We develop an efficient and effective heuristic that can guide the adversary, notably the allocation of the adversary's resources, to solve this computationally hard problem, and we also find that our approximate solution is near-optimal.
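The dynamic-programming step described above can be sketched with a standard 0-1 knapsack solver. The cluster values and key costs below are illustrative numbers, and the function name is hypothetical; the paper's own implementation alongside JGAP may differ in detail.

```python
def knapsack_select(values, costs, capacity):
    """0-1 knapsack via dynamic programming. Returns (best_utility, x),
    where x is the 0/1 selection vector over the chromosome's clusters:
    x[i] = 1 means cluster i of the chromosome receives adversarial keys."""
    n = len(values)
    # dp[i][c] = best utility using the first i clusters with key budget c.
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(capacity + 1):
            dp[i][c] = dp[i - 1][c]          # skip cluster i-1
            if costs[i - 1] <= c:            # or take it, spending its key cost
                dp[i][c] = max(dp[i][c],
                               dp[i - 1][c - costs[i - 1]] + values[i - 1])
    # Backtrack to recover the selection vector.
    x, c = [0] * n, capacity
    for i in range(n, 0, -1):
        if dp[i][c] != dp[i - 1][c]:
            x[i - 1] = 1
            c -= costs[i - 1]
    return dp[n][capacity], x

# A 4-gene chromosome with illustrative cluster values and key costs,
# evaluated under a key budget of R_Crypto = 16:
best, x = knapsack_select(values=[10, 25, 7, 18], costs=[4, 12, 3, 9], capacity=16)
print(best, x)  # 35 [1, 1, 0, 0]
```

The selection vector `x` is exactly the output described above: within the R_Crypto budget of 16 keys, taking the first two clusters (cost 4 + 12 = 16) yields the maximum utility of 35.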
CONCLUSION AND FUTURE WORK
We will expand our investigation to make it two-sided, covering both the attacker and the system security designer. The latter is in fact our ultimate target.

Thank you for listening!