An introduction to free probability
Christian Stump
April 29, 2015
Our path to free probability
[Diagram: a map of the fields meeting in free probability (combinatorics, probability theory, random matrix theory, quantum mechanics, integrable systems, and operator theory with its C*-algebras, von Neumann algebras, and the representation theory of S_n); free probability was introduced by Voiculescu in 1987 in work on the free group factors isomorphism problem.]
Classical probability
A (general) framework for probability theory is given by
- a sample space Ω of possible states,
- a σ-algebra B of events E ⊆ Ω,
- a (countably additive) probability measure P(E) ∈ [0, 1] with P(Ω) = 1,
- random variables, given by measurable functions X : Ω → R,
  pushing P forward to a measure µ on R,
- an expectation E(X) = \int_{-\infty}^{\infty} x \mu(x)\,dx of such a random variable X,
  assuming certain integrability conditions.
Moments of random variables
The moment sequence (m_n(X))_{n≥0} of a random variable X : Ω → R with measure µ is given by
  m_n(X) = E(X^n) = \int_{-\infty}^{\infty} x^n \mu(x)\,dx.
We always assume that all moments are finite.
Recall:
- m_0(X) = 1
- the mean or expectation is E(X) = m_1(X)
- the variance is V(X) = m_2(X) − m_1(X)^2
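A quick numerical sanity check (a minimal sketch, not part of the slides): estimate the first few moments of a standard Gaussian by Monte Carlo and compare with the exact values computed on the next slide.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)  # samples of a standard Gaussian X

# Estimate m_n(X) = E(X^n) by the sample average of x**n.
for n in range(6):
    print(f"m_{n} ≈ {np.mean(x ** n):+.3f}")
# Expected: 1, 0, 1, 0, 3, 0 (double factorials for even n, 0 for odd n)
```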
Examples of moments
1. The constant random variable c has moments m_n = c^n.
2. The standard Gaussian distribution with density µ(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} has moments
  m_n = \begin{cases} (n-1)(n-3) \cdots 3 \cdot 1 & n \text{ even} \\ 0 & n \text{ odd} \end{cases}
We have
  m_n = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} x^n e^{-x^2/2}\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} x^{n-1} \cdot x e^{-x^2/2}\,dx.
Integration by parts yields m_n = (n − 1) m_{n−2}. The result follows with m_0 = 1 and m_1 = 0.
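The double-factorial formula is easy to verify directly; a minimal sketch using numerical integration (assumes scipy):

```python
import numpy as np
from scipy.integrate import quad

def gaussian_moment(n):
    """m_n = (1/sqrt(2*pi)) * integral of x^n * exp(-x^2/2) over the real line."""
    value, _ = quad(lambda x: x**n * np.exp(-x**2 / 2) / np.sqrt(2 * np.pi),
                    -np.inf, np.inf)
    return value

def double_factorial(n):
    """(n-1)(n-3)...3*1 for even n, and 0 for odd n."""
    return 0 if n % 2 else int(np.prod(np.arange(1, n, 2)))

for n in range(8):
    print(n, round(gaussian_moment(n), 6), double_factorial(n))
```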
Moments of sums of independent random variables
Let X, Y : Ω → R be random variables.
- X, Y are independent if P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b) for all a, b.
- If X, Y are independent, then E(XY) = E(X)E(Y), implying E(X^a Y^b) = E(X^a)E(Y^b) and
  m_n(X + Y) = \sum_{k=0}^{n} \binom{n}{k} m_k(X)\, m_{n-k}(Y).
... we next switch to cumulants to linearize this formula.
Note: It is actually enough to assume subindependence: for all polynomials p, q,
  E(p(X) q(Y)) = E(p(X))\, E(q(Y)).
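A minimal numerical check of the binomial formula (a sketch; the choice of a standard Gaussian X and a centered exponential Y is only for illustration):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)
X = rng.standard_normal(2_000_000)
Y = rng.exponential(1.0, 2_000_000) - 1.0  # independent of X

m = lambda z, n: np.mean(z ** n)  # empirical n-th moment

n = 4
lhs = m(X + Y, n)
rhs = sum(comb(n, k) * m(X, k) * m(Y, n - k) for k in range(n + 1))
print(lhs, rhs)  # agree up to Monte Carlo error
```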
Moments and cumulants
Definition (Moment-cumulant formula, Thiele 1889)
The cumulant sequence (c_n(X))_{n≥1} of a random variable X : Ω → R with finite moments m_n(X) is defined by the recursive formula
  m_n(X) = \sum_{k=0}^{n-1} \binom{n-1}{k} m_k(X)\, c_{n-k}(X) = \sum_{\pi \in \mathrm{Part}(n)} \prod_{B \in \pi} c_{|B|}(X),
where Part(n) is the set of all set partitions of {1, . . . , n}.
Let X, Y be independent variables. Then m_n(X + Y) = \sum_{k=0}^{n} \binom{n}{k} m_k(X) m_{n-k}(Y) and
  c_n(X + Y) = c_n(X) + c_n(Y),   c_n(λX) = λ^n c_n(X) for λ ∈ R.
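Since the k = 0 term of the recursion is just c_n(X), the formula can be solved for c_n order by order. A minimal sketch, tested on the Exponential(1) distribution (m_n = n!, whose cumulants c_n = (n − 1)! are classical):

```python
from math import comb, factorial

def cumulants_from_moments(m):
    """Given m = [m_0, m_1, ..., m_N] with m_0 = 1, return [c_1, ..., c_N]
    by solving m_n = sum_{k=0}^{n-1} C(n-1, k) * m_k * c_{n-k} for c_n."""
    c = [0]  # index placeholder so that c[j] is the j-th cumulant
    for n in range(1, len(m)):
        c.append(m[n] - sum(comb(n - 1, k) * m[k] * c[n - k] for k in range(1, n)))
    return c[1:]

moments = [factorial(n) for n in range(7)]   # Exponential(1): m_n = n!
print(cumulants_from_moments(moments))       # [1, 1, 2, 6, 24, 120]
```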
Moments and cumulants
The first few cumulants have special names:
- Mean: c_1(X) = m_1(X)
- Variance: c_2(X) = m_2(X) − c_1(X)^2 = m_2(X) − m_1(X)^2
- Skewness: c_3(X) = m_3(X) − 3c_2(X)c_1(X) − c_1(X)^3 = m_3(X) − 3m_2(X)m_1(X) + 2m_1(X)^3
Examples of cumulants
1. The constant random variable c has cumulants (c_n)_{n≥1} = (c, 0, 0, . . .).
2. The standard Gaussian distribution with density µ(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} has cumulants
  c_n = (0, 1, 0, 0, . . .).
Proof: later & easy
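The combinatorial proof comes later, but the claim is easy to check with the recursion from the previous slide (a quick sketch):

```python
from math import comb

def cumulants_from_moments(m):
    c = [0]
    for n in range(1, len(m)):
        c.append(m[n] - sum(comb(n - 1, k) * m[k] * c[n - k] for k in range(1, n)))
    return c[1:]

gauss_moments = [1, 0, 1, 0, 3, 0, 15, 0]     # double factorials, from above
print(cumulants_from_moments(gauss_moments))  # [0, 1, 0, 0, 0, 0, 0]
```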
Central limit theorem using cumulants
Theorem (A classical central limit theorem)
Let X_1, X_2, . . . : Ω → R be independent, identically distributed (or iid) random variables with mean 0, variance 1 and finite moments of all orders, and let
  S_N = \frac{X_1 + \cdots + X_N}{\sqrt{N}}.
Then S_N converges as N → ∞ to the standard Gaussian distribution X.
It is enough to show lim m_n(S_N) = m_n(X), or equivalently lim c_n(S_N) = c_n(X) = (0, 1, 0, 0, . . .):
  c_n(S_N) = c_n(N^{-1/2}(X_1 + \cdots + X_N)) = N^{-n/2}(c_n(X_1) + \cdots + c_n(X_N)) = N^{1-n/2} c_n(X_1).
We thus have
- c_1(S_N) = N^{1/2} c_1(X_1) = 0
- c_2(S_N) = c_2(X_1) = 1
- c_n(S_N) = N^{(2-n)/2} c_n(X_1) → 0 for n > 2
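A simulation sketch of this argument (assumes scipy, whose scipy.stats.kstat estimates the n-th cumulant from samples for n ≤ 4); the summands are centered exponentials, chosen only because their higher cumulants are nonzero:

```python
import numpy as np
from scipy.stats import kstat

rng = np.random.default_rng(2)

def sample_S_N(N, size):
    """Draw `size` samples of S_N = (X_1 + ... + X_N) / sqrt(N),
    where X_i ~ Exponential(1) - 1 has mean 0 and variance 1."""
    X = rng.exponential(1.0, (size, N)) - 1.0
    return X.sum(axis=1) / np.sqrt(N)

for N in (1, 10, 100):
    s = sample_S_N(N, 100_000)
    print(N, [round(kstat(s, n), 3) for n in (1, 2, 3, 4)])
# c_1 ≈ 0 and c_2 ≈ 1 throughout, while c_3 ~ 2/sqrt(N) and c_4 ~ 6/N vanish.
```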
Mixed moments and cumulants
Definition
Let X_1, X_2, . . . : Ω → R be random variables. Define the mixed moments as
  m_n(X_1, . . . , X_n) = E(X_1 · · · X_n)
and the mixed cumulants c_n(X_1, . . . , X_n) by
  m_n(X_1, \ldots, X_n) = \sum_{\pi \in \mathrm{Part}(n)} \prod_{B \in \pi} c_{|B|}(X_i : i \in B).
In particular, m_n(X) = m_n(X, X, . . . , X) and c_n(X) = c_n(X, X, . . . , X).
Examples
- m_1(X_1) = c_1(X_1)
- m_2(X_1, X_2) = c_2(X_1, X_2) + c_1(X_1)c_1(X_2)
- c_2(X_1, X_2) = m_2(X_1, X_2) − m_1(X_1)m_1(X_2), the covariance of X_1 and X_2.
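A sketch of the partition sum itself: enumerate Part(n) and recover the moments from the cumulants (shown in the one-variable case; for mixed cumulants one would additionally record which X_i fall into each block):

```python
from math import factorial, prod

def set_partitions(n):
    """Yield all set partitions of {1, ..., n} as lists of blocks."""
    if n == 0:
        yield []
        return
    for partial in set_partitions(n - 1):
        for i in range(len(partial)):          # n joins an existing block
            yield partial[:i] + [partial[i] + [n]] + partial[i + 1:]
        yield partial + [[n]]                  # or n starts a new block

c = lambda k: factorial(k - 1)  # cumulants of Exponential(1): c_k = (k-1)!

for n in range(1, 6):
    m_n = sum(prod(c(len(B)) for B in pi) for pi in set_partitions(n))
    print(n, m_n)  # prints n!, the moments of Exponential(1)
```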
Mixed moments and cumulants
Theorem (Rota 1964)
Let X, Y : Ω → R be random variables. Then
  X, Y subindependent ⇔ all properly mixed cumulants of X and Y vanish:
  0 = c_2(X, Y)
  0 = c_3(X, Y, Y) = c_3(X, X, Y)
  0 = c_4(X, Y, Y, Y) = c_4(X, X, Y, Y) = c_4(X, X, X, Y)
  ...
G.-C. Rota, On the foundations of combinatorial theory I, Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete.
Graphs vs. connected graphs
Let a_n = 2^{\binom{n}{2}} be the number of labelled graphs G = (V, E) with V = [n] = {1, . . . , n} and E ⊆ \binom{[n]}{2}, and let b_n denote the number of connected labelled graphs.
Observation
The quantities a_n and b_n are related by the moment-cumulant formula
  a_n = \sum_{k=0}^{n-1} \binom{n-1}{k} a_k\, b_{n-k} = \sum_{\pi \in \mathrm{Part}(n)} \prod_{B \in \pi} b_{|B|}.
This is an instance of a general phenomenon: whenever a_n counts the number of "structures" on [n] and b_n counts the number of "connected structures", the two sequences are related by a "moment-cumulant type formula".
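The recursion extracts b_n from a_n exactly as cumulants are extracted from moments; a minimal sketch (the values 1, 1, 4, 38, 728, . . . count connected labelled graphs):

```python
from math import comb

a = [2 ** comb(n, 2) for n in range(8)]  # a_n = number of labelled graphs on [n]

# Solve a_n = sum_{k=0}^{n-1} C(n-1, k) * a_k * b_{n-k} for b_n.
b = [0]
for n in range(1, len(a)):
    b.append(a[n] - sum(comb(n - 1, k) * a[k] * b[n - k] for k in range(1, n)))

print(b[1:])  # [1, 1, 4, 38, 728, 26704, 1866256]
```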
Graphs vs. connected graphs
Corollary
The cumulants c_n of the standard Gaussian distribution are (0, 1, 0, 0, . . .).
We have
  m_n = \begin{cases} (n-1)(n-3) \cdots 3 \cdot 1 & n \text{ even} \\ 0 & n \text{ odd} \end{cases}
This counts perfect matchings of {1, . . . , n} (why?). c_n thus counts connected perfect matchings, and a perfect matching is connected as a graph only for n = 2, giving c_n = (0, 1, 0, 0, . . .).
[Figure: the three perfect matchings of {1, 2, 3, 4}.]
Graphs vs. geometrically connected graphs
A graph G = (V, E) with V = [n] = {1, . . . , n} and E ⊆ \binom{[n]}{2} is geometrically connected if the union of its edges in the geometric representation around a circle is connected.
[Figure: example graphs with vertices 1, . . . , n placed around a circle and edges drawn as chords, on 4 and on 10 vertices, illustrating which graphs are geometrically connected.]
Let b̃_n denote the number of geometrically connected graphs.
Observation
The quantities a_n and b̃_n are related by the noncrossing moment-cumulant formula
  a_n = \sum_{\pi \in NC(n)} \prod_{B \in \pi} \tilde{b}_{|B|},
where NC(n) is the set of all noncrossing set partitions of {1, . . . , n}.
Noncrossing moment-cumulant formula → free probability theory!
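A sketch that makes the observation computable: enumerate NC(n) by filtering set partitions for crossings, then solve the noncrossing formula for b̃_n order by order. (For n = 4 this yields 39 rather than b_4 = 38: the crossing matching {13}{24} is geometrically connected but disconnected as a graph.)

```python
from math import comb, prod

def set_partitions(n):
    """Yield all set partitions of {1, ..., n} as lists of blocks."""
    if n == 0:
        yield []
        return
    for partial in set_partitions(n - 1):
        for i in range(len(partial)):
            yield partial[:i] + [partial[i] + [n]] + partial[i + 1:]
        yield partial + [[n]]

def is_noncrossing(pi):
    """A crossing is a < b < c < d with a, c in one block and b, d in another."""
    for A in pi:
        for B in pi:
            if A is not B and any(a < b < c < d
                                  for a in A for c in A for b in B for d in B):
                return False
    return True

a = [2 ** comb(n, 2) for n in range(7)]  # all labelled graphs on [n]
bt = {}                                  # geometrically connected graphs on [n]
for n in range(1, 7):
    rest = sum(prod(bt[len(B)] for B in pi)
               for pi in set_partitions(n)
               if is_noncrossing(pi) and len(pi) > 1)
    bt[n] = a[n] - rest

print(bt)  # {1: 1, 2: 1, 3: 4, 4: 39, ...}
```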
Moments and noncrossing cumulants
Definition (Noncrossing moment-cumulant formula)
The noncrossing cumulants κ_n(X) of X : Ω → R are defined by
  m_n(X) = \sum_{\pi \in NC(n)} \prod_{B \in \pi} \kappa_{|B|}(X),
and the mixed noncrossing cumulants κ_n(X_1, . . . , X_n) of X_1, X_2, . . . : Ω → R by
  m_n(X_1, \ldots, X_n) = \sum_{\pi \in NC(n)} \prod_{B \in \pi} \kappa_{|B|}(X_i : i ∈ B).
Immediate questions:
- Why is this a sensible definition?
- Does the analogue of the standard Gaussian distribution exist? What is it?
Moments and noncrossing cumulants
Theorem (Noncrossing analogue of the Gaussian distribution)
The real random variable X given by the Wigner semicircle distribution
  µ_X(t) = \frac{1}{2\pi} \sqrt{4 - t^2}
has support [−2, 2], even moments m_{2n}(X) = Cat(n) = \frac{1}{n+1}\binom{2n}{n}, and thus noncrossing cumulants κ_n(X) = (0, 1, 0, 0, . . .).
Nontrivial proof!
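The Catalan moments are easy to confirm numerically; a minimal sketch (assumes scipy):

```python
import numpy as np
from math import comb
from scipy.integrate import quad

def semicircle_moment(n):
    """m_n = integral over [-2, 2] of t^n * sqrt(4 - t^2) / (2*pi)."""
    value, _ = quad(lambda t: t**n * np.sqrt(4 - t**2) / (2 * np.pi), -2, 2)
    return value

for n in range(5):
    catalan = comb(2 * n, n) // (n + 1)
    print(n, round(semicircle_moment(2 * n), 6), catalan)
# the even moments 1, 1, 2, 5, 14 match the Catalan numbers
```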
A nc central limit theorem via random matrix theory
Definition (Wigner's semicircle law, Wigner 1950s)
Let Y_ij : Ω → R with 1 ≤ i ≤ j be iid random variables and
  X_N = \frac{1}{\sqrt{N}} \begin{pmatrix} Y_{11} & Y_{12} & \cdots & Y_{1N} \\ Y_{12} & Y_{22} & \cdots & Y_{2N} \\ \vdots & \vdots & \ddots & \vdots \\ Y_{1N} & Y_{2N} & \cdots & Y_{NN} \end{pmatrix}
be a symmetric random matrix with (real) eigenvalues λ_1(X_N) ≤ . . . ≤ λ_N(X_N).
The empirical spectral distribution of X_N is given by the discrete measure
  µ_{X_N}(x) = \frac{1}{N} \#\{1 ≤ i ≤ N : λ_i(X_N) = x\}.
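A simulation sketch of the definition (matrix size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 2000

# Symmetric Wigner matrix: iid entries on and above the diagonal, scaled by 1/sqrt(N).
Y = rng.standard_normal((N, N))
X = (np.triu(Y) + np.triu(Y, 1).T) / np.sqrt(N)

eigenvalues = np.linalg.eigvalsh(X)
print(eigenvalues.min(), eigenvalues.max())  # ≈ -2 and 2 for large N

# With matplotlib installed, the eigenvalue histogram traces the semicircle:
#   import matplotlib.pyplot as plt
#   plt.hist(eigenvalues, bins=60, density=True)
#   t = np.linspace(-2, 2, 200)
#   plt.plot(t, np.sqrt(4 - t**2) / (2 * np.pi))
#   plt.show()
```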
A nc central limit theorem via random matrix theory
Theorem (Wigner's semicircle law, Wigner 1950s)
Under natural conditions on the mean and variance, the empirical spectral distribution µ_{X_N} converges almost surely to the Wigner semicircle distribution. In particular, λ_1(X_N) → −2 and λ_N(X_N) → 2.
Free probability from classical probability via abstraction
- Measure theory: focus on the sample space (measurable sets and functions);
  derived concepts: events, random variables.
- Probability theory: focus on events and their probabilities;
  derived concepts: random variables and expectations.
- Free probability theory: focus on the algebra of random variables and their expectations.
Free probability from classical probability via abstraction
Definition
A noncommutative probability space is a pair (A, τ) where
- A is a C-algebra with 1, and
- τ : A → C is a linear functional such that τ(1) = 1.
A free random variable is an element X ∈ A. The moment sequence of X is m_n(X) = τ(X^n).
(Noncommutative here means not necessarily commutative.)
Classical probability: A = L^∞(Ω, B, P) and τ = E.
Originally considered: C*-algebras and von Neumann algebras.
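A concrete sketch, not from the slides: the N×N matrices with the normalized trace form a noncommutative probability space, and the moments of the Wigner matrix above approximate the Catalan numbers.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 1500
Y = rng.standard_normal((N, N))
X = (np.triu(Y) + np.triu(Y, 1).T) / np.sqrt(N)  # element of the matrix algebra A

tau = lambda A: np.trace(A) / N  # normalized trace: linear, with tau(identity) = 1

print(round(tau(np.eye(N)), 6))  # 1.0
for n in (2, 4, 6):
    print(n, round(tau(np.linalg.matrix_power(X, n)), 3))
# ≈ 1, 2, 5 = Cat(1), Cat(2), Cat(3) for large N
```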
Free probability and the nc moment-cumulant formula
Let (A, τ) be a noncommutative probability space.
Definition (Voiculescu 1987)
Two random variables X, Y ∈ A are freely independent if
  τ(f_1(X) g_1(Y) · · · f_k(X) g_k(Y)) = 0
for all polynomials f_1, g_1, . . . , f_k, g_k such that τ(f_i(X)) = τ(g_i(Y)) = 0.
- Freeness arose in the study of the still open problem whether two different free groups have isomorphic von Neumann group algebras.
- It has been used to solve previously intractable problems in operator theory.
Theorem (Speicher 1997)
  X, Y freely independent ⇔ all properly mixed noncrossing cumulants of X and Y vanish.
Our path to free probability
[The overview diagram from the first slide is shown again: combinatorics, probability theory, random matrix theory, quantum mechanics, integrable systems, and operator theory (C*-algebras, von Neumann algebras, representation theory of S_n) all meeting in free probability, introduced by Voiculescu in 1987 via the free group factors isomorphism problem.]
References and further reading
- Alexandru Nica, Roland Speicher, Lectures on the combinatorics of free probability, LMS Lecture Note Series 335, 2006.
- Jonathan Novak, Three lectures on free probability, MSRI Publications (in press).
- Terence Tao's blog post on free probability, terrytao.wordpress.com.
- Todd Kemp, lecture notes Introduction to random matrix theory, Nov 2013.
- Jonathan Novak, Piotr Śniady, What is ... a free cumulant?, Notices of the AMS 58(2), 2011.
- Philippe Biane, Free probability and combinatorics, Proceedings of the International Congress of Mathematicians II, 2002.