lec02 - Indiana University Computer Science Department

B 504 / I 538: Introduction to Cryptography
Spring 2017 • Lecture 2
Ryan Henry
Assignment 0 is due on Tuesday!
Discrete probability 101
Q: Why discrete probability?
A: I’m the prof, and I said so!
A: Security definitions rely heavily on probability theory
Probability distributions
Defⁿ: A (discrete) probability distribution over a finite set S is a function Pr: S→[0,1] such that ∑_{x∈S} Pr(x) = 1.
• The set S is called the sample space of Pr
• The elements of S are called outcomes
Q: What kind of nonsensical “definition” was that?
A: Actually, it was perfectly sensible. Let’s unpack it…
Probability distributions, unpacked
• S lists every conceivable outcome of a random process
• The function Pr associates a likelihood to each outcome
– That is, Pr(x) describes “how likely” x is to happen
• The range [0,1] ensures each outcome is associated with a probability that “makes sense”
– Pr(x)=0 means: “x NEVER happens”
– Pr(x)=1 means: “x ALWAYS happens”
– 0<Pr(x)<1 means: “x SOMETIMES—but NOT ALWAYS—happens”
• ∑_{x∈S} Pr(x) = 1 ensures exactly one outcome happens (in any given “trial” of the random process)
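To make the definition concrete, here is a minimal Python sketch (not from the slides) that represents a distribution as a dictionary mapping outcomes to probabilities and checks both conditions of the definition:

```python
from fractions import Fraction

# The example distribution on S = {0,1}^2 from a later slide; exact
# rationals avoid floating-point rounding issues.
Pr = {"00": Fraction(1, 4), "01": Fraction(1, 8),
      "10": Fraction(1, 2), "11": Fraction(1, 8)}

def is_distribution(pr):
    """Check that pr is a function S -> [0,1] whose values sum to 1."""
    in_range = all(0 <= p <= 1 for p in pr.values())
    sums_to_one = sum(pr.values()) == 1
    return in_range and sums_to_one

assert is_distribution(Pr)
```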
Q: When should I “unpack” mathematical definitions like you just did?
A: EACH AND EVERY TIME YOU ENCOUNTER ONE!!
Probability distributions
• Eg.: S = {0,1}² = {00, 01, 10, 11}
Pr: 00 ↦ 1/4, 01 ↦ 1/8, 10 ↦ 1/2, 11 ↦ 1/8
(1/4 + 1/8 + 1/2 + 1/8 = 1)
• Common/important distributions:
1. Uniform distribution: ∀x∈S, Pr(x) = 1/|S|
– Eg.: Pr: 00 ↦ 1/4, 01 ↦ 1/4, 10 ↦ 1/4, 11 ↦ 1/4
2. Point distribution at x₀: Pr(x₀)=1 ∧ ∀x≠x₀, Pr(x)=0
– Eg. (point distribution at 00): Pr: 00 ↦ 1, 01 ↦ 0, 10 ↦ 0, 11 ↦ 0
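A minimal sketch of the two distributions just named, reusing the dictionary representation from the earlier sketch (the representation is an assumption of these sketches, not the slides’):

```python
from fractions import Fraction

S = ["00", "01", "10", "11"]

# Uniform distribution: every outcome gets probability 1/|S|.
uniform = {x: Fraction(1, len(S)) for x in S}

# Point distribution at x0 = "00": all probability mass on one outcome.
x0 = "00"
point = {x: Fraction(1 if x == x0 else 0) for x in S}
```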
Events
Defⁿ: If Pr:S→[0,1] is a probability distribution, then any subset E⊆S is called an event. The probability of E is Pr[E] ≔ ∑_{x∈E} Pr(x).
• Convention:
– Square brackets ⇒ probability of event
– Parentheses ⇒ probability of outcome
Q: How many events are there in S?
A: 2^|S|, including the two “trivial” events:
1. The universal event; i.e., E=S
2. The empty event (or null event); i.e., E=Ø
Also included are the “elementary events”; i.e., the singleton set {x} for each x∈S
Note: Each outcome is part of many events!
• Eg.: S={0,1}⁸, E={x∈S | lsb₂(x)=11}
Q: What is Pr[E]?
A: Pr(00000011)+Pr(00000111)+⋯+Pr(11111111)
– To be more precise, we need to fix a distribution Pr!
Q: Suppose Pr:S→[0,1] is the uniform distribution. Now compute Pr[E].
A: Pr[E]=1/4 (how did we compute this?)
Counting theorem
Thm (Counting theorem): If Pr:S→[0,1] is the uniform distribution and E⊆S is an event, then
Pr[E] = |E|/|S|
Proof:
Pr[E] = ∑_{x∈E} Pr(x)  (defⁿ of Pr[E])
= ∑_{x∈E} 1/|S|  (defⁿ of uniform distribution)
= 1/|S| + ⋯ + 1/|S|  (|E| times)
= |E|/|S| ☐
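A quick sanity check of the counting theorem on the preceding example; in this sketch, lsb₂(x) is taken to mean the two low-order bits of x:

```python
from fractions import Fraction
from itertools import product

# Sample space S = {0,1}^8 and the event E = {x in S : lsb2(x) = 11}.
S = ["".join(bits) for bits in product("01", repeat=8)]
E = [x for x in S if x.endswith("11")]  # low-order bits come last

# Counting theorem: under the uniform distribution, Pr[E] = |E|/|S|.
assert Fraction(len(E), len(S)) == Fraction(1, 4)
```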
Complementary events
That’s a mighty fine probability you’ve got there!
Defⁿ: If Pr:S→[0,1] is a probability distribution and E⊆S is an event, then the complement of E is
S∖E ≔ {x∈S | x∉E}
• Intuitively, Ē is the event “E does not occur”
• Notation: The complement of E is often denoted Ē (note the “bar” over E), which is read “E bar”
Complement rule
Thm (Complement rule): If Pr:S→[0,1] is a probability distribution and E⊆S is an event, then
Pr[Ē] = 1 − Pr[E]
Proof:
Pr[Ē] = ∑_{x∈Ē} Pr(x)  (defⁿ of Pr[Ē])
= ∑_{x∈S∖E} Pr(x)  (defⁿ of Ē)
= ∑_{x∈S} Pr(x) − ∑_{x∈E} Pr(x)  (rearranging)
= 1 − ∑_{x∈E} Pr(x)  (defⁿ of Pr)
= 1 − Pr[E]  (defⁿ of Pr[E]) ☐
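A self-contained numeric check of the complement rule on the example distribution (a sketch; the event E is an arbitrary choice):

```python
from fractions import Fraction

Pr = {"00": Fraction(1, 4), "01": Fraction(1, 8),
      "10": Fraction(1, 2), "11": Fraction(1, 8)}

E = {"00", "01"}        # an arbitrary event E ⊆ S
E_bar = set(Pr) - E     # its complement S ∖ E

# Complement rule: Pr[E-bar] = 1 − Pr[E].
assert sum(Pr[x] for x in E_bar) == 1 - sum(Pr[x] for x in E)
```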
Union bound
Thm (Union bound): If Pr:S→[0,1] is a probability distribution and E,F⊆S are events, then
Pr[E∪F] ≤ Pr[E] + Pr[F]
Proof:
Pr[E∪F] = ∑_{x∈E∪F} Pr(x)  (defⁿ of Pr[E∪F])
= ∑_{x∈E} Pr(x) + ∑_{x∈F} Pr(x) − ∑_{x∈E∩F} Pr(x)  (inclusion-exclusion)
= Pr[E] + Pr[F] − Pr[E∩F]  (defⁿ of Pr[E] & Pr[F])
≤ Pr[E] + Pr[F]  (defⁿ of Pr) ☐
• Corollary: If E∩F=Ø, then Pr[E∪F]=Pr[E]+Pr[F]
– If E∩F=Ø, we call E and F mutually exclusive events
Q: Is the converse of the above corollary true?
A: No! We might have E∩F≠Ø but Pr[E∩F]=0!
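A small sketch illustrating why the converse fails: two events that overlap, but only on an outcome of probability 0 (the distribution below is contrived for exactly this purpose):

```python
from fractions import Fraction

# Outcome "01" has probability 0, so events may overlap only on it.
Pr = {"00": Fraction(1, 2), "01": Fraction(0),
      "10": Fraction(1, 4), "11": Fraction(1, 4)}

def pr(event):
    return sum(Pr[x] for x in event)

E, F = {"00", "01"}, {"01", "10"}
assert E & F != set()              # E and F are NOT mutually exclusive...
assert pr(E & F) == 0              # ...yet Pr[E∩F] = 0,
assert pr(E | F) == pr(E) + pr(F)  # so additivity still holds
```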
Random variables
Defⁿ: Let Pr:S→[0,1] be a probability distribution and let V be an arbitrary finite set. A random variable is a function X:S→V.
• Sample space S is a “list” of all possible outcomes
• Distribution Pr says “how likely” each outcome is
• Random variable X assigns a “meaning” or “interpretation” to each outcome
• Eg.: Let S={0,1}ⁿ and let X:S→{0,1} such that X(y) ≔ lsb(y)
• If Pr is the uniform distribution, then Pr[X=0]=Pr[X=1]=1/2
• A random variable X “induces” a probability distribution on its range V via:
Pr[X=v] ≔ Pr[X⁻¹(v)]
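The induced distribution can be computed exhaustively; a sketch for the lsb example with n=3 (the choice of n is arbitrary):

```python
from fractions import Fraction
from itertools import product

n = 3
S = ["".join(b) for b in product("01", repeat=n)]
Pr = {x: Fraction(1, len(S)) for x in S}   # uniform distribution on S

def lsb(y):        # the random variable X(y) := lsb(y)
    return y[-1]

# Induced distribution on V = {0,1}: Pr[X=v] := Pr[X^{-1}(v)].
induced = {v: sum(Pr[x] for x in S if lsb(x) == v) for v in "01"}

assert induced["0"] == induced["1"] == Fraction(1, 2)
```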
The uniform random variable
Defⁿ: Let Pr:S→[0,1] be the uniform distribution. The identity function X:S→S is called the uniform random variable on S.
• Notation: We write x∊S to denote that x is output by the uniform random variable on S
• Other common notations: x←S, or x←S with an “R” or a “$” written above the arrow
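In code, x∊S is just a uniform random choice; a minimal sketch using Python’s secrets module (the cryptographically appropriate source of randomness):

```python
import secrets

S = ["00", "01", "10", "11"]
x = secrets.choice(S)   # x <- S: one outcome, uniformly at random
print(x)
```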
Independent events
Defⁿ: If Pr:S→[0,1] is a probability distribution, then two events E,F⊆S are independent if
Pr[E∩F] = Pr[E]·Pr[F]
• Intuitively, E and F are independent events if E occurs with the same probability whether or not F occurs, and vice versa
Independent random variables
Defⁿ: Two random variables X:S→V and Y:S→V are independent if ∀a,b∈V,
Pr[X=a ∧ Y=b] = Pr[X=a]·Pr[Y=b]
• Intuitively, X and Y are independent random variables if the event X=a occurs with the same probability whether or not Y=b occurs, and vice versa, for every possible choice of a,b
• Eg.: Pr:{0,1}ⁿ→[0,1] is the uniform distribution
– X:{0,1}ⁿ→{0,1} and Y:{0,1}ⁿ→{0,1} are random variables such that X(r)≔lsb(r) and Y(r)≔msb(r)
Then, for any b₀,b₁∈{0,1},
Pr[X=b₀ ∧ Y=b₁] = Pr[r=b₁b₀] = 1/4
and
Pr[X=b₀]·Pr[Y=b₁] = (1/2)·(1/2) = 1/4
Hence, X and Y are independent random variables.
Hold on! What if n=1!? Then X(r)=Y(r)!
Good observation! Lesson: always check for implicit/hidden assumptions!
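A sketch that checks the independence claim exhaustively and confirms the n=1 caveat:

```python
from fractions import Fraction
from itertools import product

def lsb_msb_independent(n):
    """Are X(r)=lsb(r) and Y(r)=msb(r) independent over uniform {0,1}^n?"""
    S = ["".join(b) for b in product("01", repeat=n)]
    p = Fraction(1, len(S))
    for b0, b1 in product("01", repeat=2):
        joint = sum(p for r in S if r[-1] == b0 and r[0] == b1)
        marginals = (sum(p for r in S if r[-1] == b0) *
                     sum(p for r in S if r[0] == b1))
        if joint != marginals:
            return False
    return True

assert lsb_msb_independent(2)       # independent for n >= 2
assert not lsb_msb_independent(1)   # but X(r) = Y(r) when n = 1
```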
Exclusive OR
• The exclusive-OR of two strings is their bitwise addition modulo 2

x y | x⊕y
0 0 |  0
0 1 |  1
1 0 |  1
1 1 |  0

  0 1 1 0 0 1 0 1
⊕ 1 0 1 1 0 1 0 0
= 1 1 0 1 0 0 0 1
An important property of XOR
Thm (XOR preserves uniformity): If X:{0,1}ⁿ→{0,1}ⁿ is a uniform random variable and Y:{0,1}ⁿ→{0,1}ⁿ is an arbitrary random variable that is independent of X, then Z≔X⊕Y is a uniform random variable.
Proof: Left as an exercise (see Assignment 1). ☐
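A numeric sketch of the theorem for n=2: XOR a uniform X with an independent Y of arbitrary distribution and check that Z=X⊕Y comes out uniform (Y’s distribution below is an arbitrary choice):

```python
from fractions import Fraction
from itertools import product

n = 2
S = ["".join(b) for b in product("01", repeat=n)]

def xor(a, b):
    return "".join(str(int(u) ^ int(v)) for u, v in zip(a, b))

pr_X = {x: Fraction(1, len(S)) for x in S}            # uniform X
pr_Y = {"00": Fraction(1, 2), "01": Fraction(1, 4),   # arbitrary Y,
        "10": Fraction(1, 4), "11": Fraction(0)}      # independent of X

# Distribution of Z = X ⊕ Y under independence.
pr_Z = {z: Fraction(0) for z in S}
for x in S:
    for y in S:
        pr_Z[xor(x, y)] += pr_X[x] * pr_Y[y]

assert all(p == Fraction(1, len(S)) for p in pr_Z.values())
```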
Conditional probability
Defⁿ: If Pr:S→[0,1] is a probability distribution and E,F⊆S with Pr[F]≠0, the conditional probability of E given F is
Pr[E|F] ≔ Pr[E∩F] ⁄ Pr[F]
• Pr[E|F] is the probability that E occurs given that F also occurs
• Alt. defⁿ: Pr[E∩F] ≔ Pr[E|F]·Pr[F]
Law of Total Probability
Thm (Law of Total Probability): If Pr:S→[0,1] is a probability distribution and E,F⊆S with Pr[F]≠0, then
Pr[E] = Pr[E|F]·Pr[F] + Pr[E|F̄]·Pr[F̄]
Proof:
Pr[E] = ∑_{x∈E} Pr(x)
= ∑_{x∈E∩F} Pr(x) + ∑_{x∈E∩F̄} Pr(x)
= Pr[E∩F] + Pr[E∩F̄]
= Pr[E|F]·Pr[F] + Pr[E|F̄]·Pr[F̄] ☐
Bayes’ Theorem
Thm (Bayes’ Theorem): If Pr:S→[0,1] is a probability distribution and E,F⊆S with Pr[E]≠0 and Pr[F]≠0, then
Pr[E|F] = Pr[F|E]·Pr[E] ⁄ Pr[F]
Proof:
Pr[E|F] = Pr[E∩F]/Pr[F]  (defⁿ of Pr[E|F])
= Pr[F∩E]/Pr[F]  (commutativity of ∩)
= Pr[F|E]·Pr[E]/Pr[F]  (alt. defⁿ of cond. prob.) ☐
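A quick numeric check of Bayes’ Theorem on the running example distribution (the events are arbitrary choices):

```python
from fractions import Fraction

Pr = {"00": Fraction(1, 4), "01": Fraction(1, 8),
      "10": Fraction(1, 2), "11": Fraction(1, 8)}

def pr(event):
    return sum(Pr[x] for x in event)

E, F = {"00", "10"}, {"10", "11"}

# Bayes: Pr[E|F] = Pr[F|E] · Pr[E] / Pr[F].
lhs = pr(E & F) / pr(F)
rhs = (pr(F & E) / pr(E)) * pr(E) / pr(F)
assert lhs == rhs
```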
Expectation
Defⁿ: Suppose V⊆ℝ. If X:S→V is a random variable, then the expected value of X is
Exp[X] ≔ ∑_{v∈V} Pr[X=v]·v
Fact: Expectation is linear; that is, Exp[X+Y] = Exp[X] + Exp[Y]
Fact: If X and Y are independent random variables, then Exp[X·Y] = Exp[X]·Exp[Y]
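A sketch computing an expectation directly; here X counts the 1 bits of an outcome drawn from the running example distribution (summing Pr(x)·X(x) over outcomes is equivalent to summing Pr[X=v]·v over values):

```python
from fractions import Fraction

Pr = {"00": Fraction(1, 4), "01": Fraction(1, 8),
      "10": Fraction(1, 2), "11": Fraction(1, 8)}

def X(x):                 # X(x) := number of 1 bits in x
    return x.count("1")

exp_X = sum(Pr[x] * X(x) for x in Pr)
assert exp_X == Fraction(7, 8)
```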
Markov’s Inequality
Thm (Markov’s Inequality): Suppose V⊆ℝ. If X:S→V is a non-negative random variable and v>0, then
Pr[X≥v] ≤ Exp[X]⁄v
Proof:
Exp[X] = ∑_{x∈V} Pr[X=x]·x  (defⁿ of Exp[X])
= ∑_{x<v} Pr[X=x]·x + ∑_{x≥v} Pr[X=x]·x  (regrouping)
≥ ∑_{x<v} Pr[X=x]·0 + ∑_{x≥v} Pr[X=x]·v = Pr[X≥v]·v ☐
Variance
Defⁿ: Suppose V⊆ℝ. If X:S→V is a random variable, then the variance of X is
Var[X] ≔ Exp[(X−Exp[X])²]
• Intuitively, Var[X] indicates how far we expect X to deviate from Exp[X]
• Fact: Var[X] = Exp[X²] − (Exp[X])²
• Fact: Var[aX+b] = a²·Var[X]
Chebyshev’s inequality
Thm (Chebyshev’s Inequality): Suppose V⊆ℝ. If X:S→V is a random variable and δ>0, then
Pr[|X−Exp[X]|≥δ] ≤ Var[X]⁄δ²
Proof:
Pr[|X−Exp[X]|≥δ] = Pr[|X−Exp[X]|² ≥ δ²]
≤ Exp[(X−Exp[X])²]⁄δ²  (Markov’s)
= Var[X]⁄δ²  (defⁿ of Var[X]) ☐
Chernoff’s bound
Thm (Chernoff’s bound): Fix ε>0 and b∈{0,1}, and let X₁,…,X_N be independent random variables on {0,1} such that Pr[Xᵢ=b] = 1/2+ε for each i=1,…,N.
The probability that the “majority value” of the Xᵢ is not b is at most e^(−ε²N⁄2).
Markov v. Chebyshev v. Chernoff
• Suppose we have a biased coin that comes up heads with probability 0.9 and tails with probability 0.1
• Consider the random variable X that counts the number of tails after N=100 tosses. What is the probability that X is greater than or equal to 50?
• Markov: Pr[X≥50] ≤ Exp[X]⁄50 = 10⁄50 = 0.2
• Chebyshev: Pr[|X−Exp[X]|≥40] ≤ Var[X]⁄40² = 0.005625…
• Chernoff: Pr[X≥50] ≤ e^(−0.4²·100⁄2) = 0.000335…
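A sketch computing all three bounds for this coin example, with the exact binomial tail for comparison:

```python
import math

N, p_tails, cutoff = 100, 0.1, 50
exp_x = N * p_tails                  # Exp[X] = 10
var_x = N * p_tails * (1 - p_tails)  # Var[X] = 9
eps = 0.9 - 0.5                      # heads probability is 1/2 + 0.4

markov = exp_x / cutoff                       # 0.2
chebyshev = var_x / (cutoff - exp_x) ** 2     # 9/1600 = 0.005625
chernoff = math.exp(-(eps ** 2) * N / 2)      # e^{-8} ≈ 0.000335

# Exact tail Pr[X >= 50], for reference.
exact = sum(math.comb(N, k) * p_tails**k * (1 - p_tails)**(N - k)
            for k in range(cutoff, N + 1))
print(markov, chebyshev, chernoff, exact)
```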
That’s all for today, folks!