2011.10.24
Theory of Computational Complexity
Ao Zhu
Iwama & Ito Lab
Graduate School of Informatics, Kyoto University
Textbook: Probability and Computing: Randomized Algorithms and Probabilistic
Analysis, Sections 2.4–3.3
2.4 The Geometric Distribution
Definition 2.8:
A geometric random variable X with parameter p is given by the
following probability distribution on n = 1, 2, ... :
Pr(X = n) = (1 - p)^{n-1} p
Lemma 2.8:
For a geometric random variable X with parameter p and for n > 0,
Pr(X = n + k | X > k) = Pr(X = n).
Lemma 2.9:
Let X be a discrete random variable that takes on only nonnegative
integer values. Then
E[X] = ∑_{i=1}^{∞} Pr(X ≥ i)
The expectation of a geometric random variable with parameter p is 1/p.
The expected number of coupons drawn in the coupon collector's problem is n ln n + Θ(n).
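The 1/p expectation follows directly from Lemma 2.9: since Pr(X ≥ i) = (1 - p)^{i-1} for a geometric random variable with parameter p,
E[X] = ∑_{i=1}^{∞} Pr(X ≥ i) = ∑_{i=1}^{∞} (1 - p)^{i-1} = 1/p.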
2.5 The Expected Run-Time of Quicksort
Input: A list S = {x1, ..., xn} of n distinct elements over a totally
ordered universe.
Output: The elements of S in sorted order.
1. If S has one or zero elements, return S. Otherwise continue.
2. Choose an element of S as a pivot; call it x.
3. Compare every other element of S to x in order to divide the
other elements into two sublists:
(a) S1 has all the elements of S that are less than x;
(b) S2 has all those that are greater than x.
4. Use Quicksort to sort S1 and S2.
5. Return the list S1, x, S2.
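As a concrete illustration, here is a minimal Python sketch of Random Quicksort as described above; the function name and list-based partitioning are our choices, made for clarity rather than in-place efficiency.

import random

def random_quicksort(s):
    # Steps 1-2: a list with zero or one elements is already sorted;
    # otherwise choose the pivot independently and uniformly at random.
    if len(s) <= 1:
        return s
    x = random.choice(s)
    # Step 3: compare every other element to the pivot x
    # (the elements are assumed distinct, as in the input specification).
    s1 = [y for y in s if y < x]   # elements less than x
    s2 = [y for y in s if y > x]   # elements greater than x
    # Steps 4-5: sort the sublists recursively and return S1, x, S2.
    return random_quicksort(s1) + [x] + random_quicksort(s2)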
Theorem 2.11:
Suppose that, whenever a pivot is chosen for Random Quicksort, it is
chosen independently and uniformly at random from all possibilities.
Then, for any input, the expected number of comparisons made by
Random Quicksort is 2n ln n + O(n).
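A sketch of the argument: for i < j, the i-th and j-th smallest input elements are compared exactly when one of the two is the first pivot chosen among the j - i + 1 elements between them in sorted order (inclusive), which under uniform random pivoting happens with probability 2/(j - i + 1). Summing this probability over all pairs and bounding the resulting harmonic sums gives 2n ln n + O(n).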
3.1 Markov’s Inequality
Theorem 3.1:
Let X be a random variable that assumes only nonnegative values. Then,
for all a > 0,
Pr(X ≥ a) ≤ E[X] / a
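As a quick example of how weak but general the bound is: if X is the number of heads in n flips of a fair coin, then E[X] = n/2 and Markov's inequality gives Pr(X ≥ 3n/4) ≤ (n/2)/(3n/4) = 2/3, even though the true probability is exponentially small in n.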
3.2 Variance and Moments of a Random Variable
Definition 3.1:
The kth moment of a random variable X is E[X^k].
Definition 3.2:
The variance of a random variable X is defined as
Var[X] = E[(X - E[X])²] = E[X²] - (E[X])²
The standard deviation of a random variable X is
σ[X] = √Var[X]
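As a small worked example: for a Bernoulli random variable X with parameter p (X = 1 with probability p and 0 otherwise), X² = X, so E[X²] = E[X] = p and Var[X] = p - p² = p(1 - p). This reappears below in the variance of a binomial random variable.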
Definition 3.3:
The covariance of two random variables X and Y is
Cov(X, Y) = E[(X - E[X])(Y - E[Y])].
Theorem 3.2:
For any two random variables X and Y,
Var[X + Y] = Var[X] + Var[Y] + 2 Cov(X, Y).
Theorem 3.3:
If X and Y are two independent random variables, then
E[X · Y] = E[X] · E[Y].
Corollary 3.4:
If X and Y are independent random variables, then
Cov(X, Y) = 0
Var[X + Y] = Var[X]+ Var[Y].
Theorem 3.5:
Let X1, X2, ... , Xn be mutually independent random variables. Then
Var[∑_{i=1}^{n} X_i] = ∑_{i=1}^{n} Var[X_i]
The variance of a binomial random variable with parameters n and p is
np(1 - p).
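This follows from Theorem 3.5: a binomial random variable with parameters n and p is the sum of n mutually independent Bernoulli random variables with parameter p, each of variance p(1 - p) as computed above, so the variances add up to np(1 - p).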
3.3 Chebyshev’s Inequality
Theorem 3.6:
For any random variable X and any a > 0,
Pr(|X - E[X]| ≥ a) ≤ Var[X] / a²
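Chebyshev's inequality is Markov's inequality applied to the nonnegative random variable (X - E[X])²:
Pr(|X - E[X]| ≥ a) = Pr((X - E[X])² ≥ a²) ≤ E[(X - E[X])²] / a² = Var[X] / a².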
Corollary 3.7:
For any t > 1,
Pr(|X - E[X]| ≥ t · σ[X]) ≤ 1/t²
and
Pr(|X - E[X]| ≥ t · E[X]) ≤ Var[X] / (t² (E[X])²).
Lemma 3.8:
The variance of a geometric random variable with parameter p is
(1 - p)/p².
Applying this to the coupon collector's problem: let X be the number of coupons drawn until all n types are collected, and let H = ∑_{i=1}^{n} 1/i = ln n + O(1), so that E[X] = nH. Writing X as a sum of n independent geometric random variables and bounding each variance with Lemma 3.8 gives Var[X] ≤ n²π²/6, and Chebyshev's inequality then yields
Pr(|X - nH| ≥ nH) ≤ (n²π²/6) / (nH)² = O(1/ln² n).
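To make the bound concrete, here is a minimal Python sketch that estimates the deviation probability empirically and compares it with the Chebyshev bound above; the function name, n = 200, and the trial count are our arbitrary choices, not from the text.

import math
import random

def coupons_needed(n):
    # Draw coupons uniformly at random until all n types have been seen.
    seen = set()
    draws = 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        draws += 1
    return draws

n = 200
h = sum(1.0 / i for i in range(1, n + 1))  # H = 1 + 1/2 + ... + 1/n
trials = 1000
# Count trials in which X deviates from its expectation nH by at least nH.
far = sum(abs(coupons_needed(n) - n * h) >= n * h for _ in range(trials))
print(f"empirical deviation probability: {far / trials:.4f}")
print(f"Chebyshev bound (n² π²/6)/(nH)²: {(n * n * math.pi ** 2 / 6) / (n * h) ** 2:.4f}")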