Expected numbers at hitting times
Colin McDiarmid,
Department of Statistics,
University of Oxford.
Abstract
We determine exactly the expected number of hamilton cycles in
the random graph obtained by starting with n isolated vertices and
adding edges at random until each vertex degree is at least two. This
complements recent work of Cooper and Frieze. There are similar
results concerning expected numbers for example of perfect matchings,
spanning trees, hamilton paths and directed hamilton cycles.
1 Introduction
Let Kn denote the complete graph on the vertex set Vn = {1, . . . , n} with set E of N = binom(n, 2) edges. Given a permutation or linear order π on E the corresponding graph process G̃n = G̃n(π) is the sequence G_0, G_1, . . . , G_N of graphs on Vn where G_i contains the first i edges in E under π. We assume that each of the N! permutations on E is equally likely, so that we obtain a random graph process G̃n on Vn (see Bollobás [3], page 39).
Let P be any increasing property of graphs such that Kn ∈ P. The hitting time τ(G̃n, P) is the least m such that G_m ∈ P. For i = 1, 2 let D^{(i)} be the property of having each vertex degree at least i. We define G_n^{(i)} to be the graph G_m in the random graph process G̃n where m = τ(G̃n, D^{(i)}). Thus G_n^{(i)} is the random graph obtained by adding edges at random until each vertex degree is at least i.
Let H be the property of having a hamilton cycle. Clearly τ(G̃n, D^{(2)}) ≤ τ(G̃n, H) always. One of the most attractive results in the theory of random graphs is that

(1.1)   P(τ(G̃n, D^{(2)}) = τ(G̃n, H)) → 1 as n → ∞.

This result appeared in Komlós and Szemerédi [6] and was proved by Bollobás [2] (see also Ajtai, Komlós and Szemerédi [1]). Another way of stating (1.1) is

P(G_n^{(2)} ∈ H) → 1 as n → ∞.
Cooper and Frieze [4] extended this result as follows. For any graph G let hc(G) be the number of hamilton cycles in G. Thus hc(Kn) = (1/2)(n − 1)! = n^{n+o(n)}. In [4] it is shown that, for any ε > 0,

(1.2)   P(hc(G_n^{(2)}) ≥ (log n)^{(1−ε)n}) → 1 as n → ∞.
Does this result give the right order of magnitude for hc(G_n^{(2)})? It is well known (see Bollobás [3] page 61) that if m_i = m_i(n) = (1/2)n{log n + (i − 1) loglog n + w(n)} where w(n) → ∞ as n → ∞, then

(1.3)   P(τ(G̃n, D^{(i)}) ≤ m_i) → 1 as n → ∞.
(Here log means natural logarithm.) Also, if a graph G has n vertices and m edges then of course hc(G) ≤ binom(m, n) ≤ (me/n)^n. In fact we shall see (assuming that m ≥ n/2) that

(1.4)   hc(G) ≤ (2m/n − 1)^n.

It follows immediately from (1.3) and (1.4) that

P(hc(G_n^{(2)}) ≤ (log n + 2 loglog n)^n) → 1 as n → ∞.
Thus indeed the inequality (1.2) of Cooper and Frieze is about best possible.
But now let us consider expected numbers of hamilton cycles. Suppose that we look at the random graph G_{n,m_2}, with vertex set Vn and m_2 edges, where all binom(N, m_2) such graphs are equally likely; or at the random graph G_{n,p} with vertex set Vn where the N possible edges appear independently with probability p = m_2/N. Note that if C is any fixed hamilton cycle in Kn then

P(C in G_{n,m_2}) = Π_{i=0}^{n−1} (m_2 − i)/(N − i) ≤ (m_2/N)^n = p^n.

Hence

E[hc(G_{n,m_2})] ≤ E[hc(G_{n,p})] = hc(Kn) p^n = (log n)^n e^{−n+o(n)},

as suggested in [4], following the theorem.
However our interest is in the expected number of hamilton cycles in G_n^{(2)}, and here the story is quite different.

(1.5)   Theorem

E[hc(G_n^{(2)})] = hc(Kn) n! { 2(n−3)!/(2n−3)! − (2n−6)!/(3n−6)! }
                 ∼ hc(Kn) 16 (πn)^{1/2} 4^{−n}.
We prove this result in the next section. It was of course trivial to determine E[hc(G_{n,p})]: we shall see that it is surprisingly easy to determine E[hc(G_n^{(2)})]. The same approach allows us to obtain similar results concerning expected numbers of perfect matchings in G_n^{(1)}, of spanning trees and hamilton paths in G_n^{(1)} and in G_n^{(C)}, and of directed hamilton cycles in certain random digraphs. Here G_n^{(C)} denotes the random graph obtained by adding edges at random until the graph is connected. These further results are presented in sections 3 to 6 below.
Further possible interesting investigations of a similar nature may occur
to the reader, for example concerning directed hamilton paths.
2 Hamilton cycles in G_n^{(2)}

Bounding hc(G)
We must prove the inequality (1.4) concerning the number hc(G) of
hamilton cycles in a graph G.
(2.1)   Proposition
Suppose that the graph G = (V, E) has n vertices and m edges, and each vertex degree d_v > 0. Then

hc(G) ≤ Π_{v∈V} (d_v − 1) ≤ (2m/n − 1)^n.

Proof
For each vertex u ∈ V let hp_u(G) be the number of hamilton paths starting at u. Let us prove first that

(2.2)   hp_u(G) ≤ d_u Π{(d_v − 1) : v ∈ V\{u}}.
Suppose that G is a counterexample with |V| minimal. Clearly each d_v ≥ 2. Let u ∈ V and let W be the set of neighbours of u. The inequality (2.2) must hold for the graph G′ obtained from G by deleting vertex u. Hence

hp_u(G) = Σ_{w∈W} hp_w(G′)
        ≤ Σ_{w∈W} d′_w Π{(d′_v − 1) : v ∈ V\{u, w}}
        ≤ Σ_{w∈W} (d_w − 1) Π{(d_v − 1) : v ∈ V\{u, w}}
        = d_u Π{(d_v − 1) : v ∈ V\{u}}.
We have now proved (2.2). It follows that

(2.3)   hc(G) ≤ Π_{v∈V} (d_v − 1).

For to establish (2.3) we may assume that each d_v ≥ 2. Fix some vertex u ∈ V and let W and G′ be as above. Then since d_u ≥ 2,

hc(G) ≤ (1/2) Σ_{w∈W} hp_w(G′) ≤ Π_{v∈V} (d_v − 1).

Finally to complete the proof of proposition (2.1), we may use the inequality that 'geometric mean ≤ arithmetic mean'. □
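The bound in proposition (2.1) is easy to test by brute force on small graphs. The Python sketch below is ours, not part of the paper (the function names are our own choices); it counts hamilton cycles directly and checks both inequalities on random graphs with all degrees positive.

```python
import itertools, math, random

def hamilton_cycles(n, edges):
    """Count the hamilton cycles of a graph on vertices 0..n-1 (each once)."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    count = 0
    for perm in itertools.permutations(range(1, n)):
        if perm[0] > perm[-1]:
            continue  # skip the reversed copy of each cycle
        cycle = (0,) + perm
        if all(cycle[(i + 1) % n] in adj[cycle[i]] for i in range(n)):
            count += 1
    return count

rng = random.Random(1)
for _ in range(50):
    n = rng.randint(4, 7)
    all_edges = list(itertools.combinations(range(n), 2))
    m = rng.randint(n, len(all_edges))
    edges = rng.sample(all_edges, m)
    deg = [sum(1 for e in edges if v in e) for v in range(n)]
    if min(deg) == 0:
        continue  # the proposition assumes every d_v > 0
    bound = math.prod(d - 1 for d in deg)
    assert hamilton_cycles(n, edges) <= bound        # hc(G) <= prod (d_v - 1)
    assert bound <= (2 * m / n - 1) ** n + 1e-6      # the AM-GM step
```

For instance K_4 has exactly 3 hamilton cycles and the bound gives (3 − 1)^4 = 16.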
Two standard results
We shall need the following two standard results below.
(2.4)   For any positive integers α and β,

∫_0^1 x^α (1 − x)^β dx = α! β! / (α + β + 1)!.

(2.5)   (Stirling's formula, see for example [3] page 4.) As n → ∞,

n! ∼ (2πn)^{1/2} (n/e)^n.
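The beta-integral identity (2.4) is easy to confirm numerically; the short sketch below (ours, purely illustrative) compares a midpoint-rule approximation of the integral with the factorial formula.

```python
import math

def beta_integral(a, b, steps=200_000):
    """Midpoint-rule approximation of the integral of x^a (1-x)^b over [0, 1]."""
    h = 1.0 / steps
    return h * sum(((i + 0.5) * h) ** a * (1 - (i + 0.5) * h) ** b
                   for i in range(steps))

for a, b in [(1, 1), (3, 2), (5, 7)]:
    exact = math.factorial(a) * math.factorial(b) / math.factorial(a + b + 1)
    assert abs(beta_integral(a, b) - exact) < 1e-8
```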
Proof of theorem (1.5)
It is important to generate the random graph process G̃n in a convenient way. We may assume that we start with a family (X_e : e ∈ E) of independent random variables, each uniformly distributed on [0, 1]; and that G̃n corresponds to ordering the edges e by increasing value of X_e. Note that if 0 < p < 1 then we may form a random graph G_{n,p} by including those edges e with X_e ≤ p.
Now let C be any fixed hamilton cycle in Kn, say C = (1, 2, . . . , n), and let us focus on the edge e = {1, 2}. Write 'e last in G_n^{(2)}' to denote the event that e was the last edge added in G̃n to form G_n^{(2)}. For i = 1, 2 let E_i be the set of n − 3 edges in Kn given by

E_1 = {{1, j} : j = 3, . . . , n − 1},
E_2 = {{2, j} : j = 4, . . . , n}.

For 0 < x < 1 and i = 1, 2 let A_i(x) be the event that X_f ≤ x for each edge f in C other than e, and X_f > x for each edge f in E_i. Note that

P(C in G_n^{(2)}, e last in G_n^{(2)} | X_e = x) = P(A_1(x) ∪ A_2(x)).
Also,

P(A_1(x)) = P(A_2(x)) = x^{n−1}(1 − x)^{n−3},

and

P(A_1(x) ∩ A_2(x)) = x^{n−1}(1 − x)^{2n−6},

so

P(A_1(x) ∪ A_2(x)) = 2x^{n−1}(1 − x)^{n−3} − x^{n−1}(1 − x)^{2n−6}.
Hence

P(C in G_n^{(2)}) = n P(C in G_n^{(2)}, e last in G_n^{(2)})
= n ∫_0^1 P(A_1(x) ∪ A_2(x)) dx
= 2n (n−1)!(n−3)!/(2n−3)! − n (n−1)!(2n−6)!/(3n−6)!   by (2.4)
= n! { 2(n−3)!/(2n−3)! − (2n−6)!/(3n−6)! }.
This yields the first part of theorem (1.5), and we may use Stirling's formula (2.5) to complete the proof. □
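The proof above can also be checked by direct simulation for small n. The Python sketch below is ours (not from the paper): it runs the random graph process to the hitting time of D^{(2)} and compares the sample mean number of hamilton cycles with the exact formula of theorem (1.5), which for n = 6 is about 1.364.

```python
import itertools, random
from math import factorial

def exact_expectation(n):
    """hc(K_n) n! { 2(n-3)!/(2n-3)! - (2n-6)!/(3n-6)! } from theorem (1.5)."""
    hc_kn = factorial(n - 1) // 2
    return hc_kn * factorial(n) * (2 * factorial(n - 3) / factorial(2 * n - 3)
                                   - factorial(2 * n - 6) / factorial(3 * n - 6))

def hamilton_cycles(n, adj):
    count = 0
    for perm in itertools.permutations(range(1, n)):
        if perm[0] > perm[-1]:
            continue  # skip the reversed copy of each cycle
        cycle = (0,) + perm
        if all(cycle[(i + 1) % n] in adj[cycle[i]] for i in range(n)):
            count += 1
    return count

def one_process(n, rng):
    """Add random edges until every degree is >= 2; count hamilton cycles then."""
    edges = list(itertools.combinations(range(n), 2))
    rng.shuffle(edges)
    adj = [set() for _ in range(n)]
    deg = [0] * n
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
        deg[u] += 1; deg[v] += 1
        if min(deg) >= 2:               # the hitting time of D^(2)
            return hamilton_cycles(n, adj)

n, trials, rng = 6, 10000, random.Random(0)
mean = sum(one_process(n, rng) for _ in range(trials)) / trials
assert abs(mean - exact_expectation(n)) < 0.2   # allow for sampling error
```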
3 Perfect matchings in G_n^{(1)}

Let pm(G) denote the number of perfect matchings in a graph G. Thus for n even

pm(Kn) = n!/(2^{n/2} (n/2)!) = n^{(1/2+o(1))n} as n → ∞.
It is known (see Bollobás [3] page 166) that

P(pm(G_n^{(1)}) > 0) → 1 as n → ∞, n even.
(3.1)   Theorem
For n even,

E[pm(G_n^{(1)})] = pm(Kn) (n/2)! { 2(n−2)!/(3n/2 − 2)! − (2n−4)!/(5n/2 − 4)! }
                 ∼ pm(Kn) (27πn/2)^{1/2} (2/27^{1/2})^n.

Proof
Let M be a fixed perfect matching in Kn and focus on a particular edge e = {1, 2} say in M. For i = 1, 2 let E_i be the set of edges {{i, j} : j = 3, . . . , n} in Kn.
For 0 < x < 1 and i = 1, 2 let A_i(x) be the event that X_f ≤ x for each edge f in M other than e, and X_f > x for each edge f in E_i. Note that

P(M in G_n^{(1)}, e last in G_n^{(1)} | X_e = x) = P(A_1(x) ∪ A_2(x)).
Also, much as before we find that

P(A_1(x)) = P(A_2(x)) = x^{n/2−1}(1 − x)^{n−2},

and

P(A_1(x) ∩ A_2(x)) = x^{n/2−1}(1 − x)^{2n−4},

so

P(A_1(x) ∪ A_2(x)) = 2x^{n/2−1}(1 − x)^{n−2} − x^{n/2−1}(1 − x)^{2n−4}.

Hence

P(M in G_n^{(1)}) = (n/2) P(M in G_n^{(1)}, e last in G_n^{(1)})
= (n/2) ∫_0^1 {2x^{n/2−1}(1 − x)^{n−2} − x^{n/2−1}(1 − x)^{2n−4}} dx
= (n/2)! { 2(n−2)!/(3n/2 − 2)! − (2n−4)!/(5n/2 − 4)! }

by (2.4). Now use Stirling's formula (2.5). □
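As a consistency check (ours, not the paper's), the exact and asymptotic forms in theorem (3.1) can be compared in logarithms, using exact rational arithmetic for the bracketed difference; the two should agree up to O(1/n) corrections.

```python
import math
from fractions import Fraction

def log_exact_ratio(n):
    """log of (n/2)! { 2(n-2)!/(3n/2-2)! - (2n-4)!/(5n/2-4)! }, n even."""
    f = math.factorial
    v = Fraction(f(n // 2)) * (2 * Fraction(f(n - 2), f(3 * n // 2 - 2))
                               - Fraction(f(2 * n - 4), f(5 * n // 2 - 4)))
    return math.log(v.numerator) - math.log(v.denominator)

def log_asymptotic_ratio(n):
    """log of (27*pi*n/2)^(1/2) * (2/27^(1/2))^n."""
    return 0.5 * math.log(27 * math.pi * n / 2) + n * math.log(2 / math.sqrt(27))

assert abs(log_exact_ratio(200) - log_asymptotic_ratio(200)) < 0.05
assert abs(log_exact_ratio(2000) - log_asymptotic_ratio(2000)) < 0.01
```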
4 Spanning trees

Let C be the property of being connected, and let st(G) be the number of spanning trees (the complexity) of a graph G. Thus st(Kn) = n^{n−2}. It is well known (see Bollobás [3] page 152) that

(4.1)   P(st(G_n^{(1)}) > 0) = P(G_n^{(1)} ∈ C) → 1 as n → ∞.
We are interested here in E[st(G_n^{(1)})] and in E[st(G_n^{(C)})] and in the difference between these quantities. Of course st(G_n^{(C)}) ≥ st(G_n^{(1)}) always, and by (4.1) equality holds with probability → 1 as n → ∞.
(4.2)   Theorem
For n ≥ 3,

E[st(G_n^{(1)})] = (n − 1)^{n−3} n!(n − 2)!/(2n − 3)!
                 ∼ n^{n−2} (8/e)(πn)^{1/2} 4^{−n},

and

E[st(G_n^{(C)}) − st(G_n^{(1)})] = (1/2) Σ_{k=2}^{n−2} binom(n−2, k−1) k^{k−2} (n − k)^{n−k−2} n!(k(n−k)−1)!/(k(n−k)+n−2)!
                 ∼ n^{n−2} (243/(16e^2))(3πn)^{1/2} (4/27)^n.
Now it is not hard to see that for n ≥ 3, P(G_n^{(1)} is not connected) > 4/(n + 3). (Just consider the process at the stage when the number of components drops to 3.) Thus we have

E[st(G_n^{(C)}) − st(G_n^{(1)})]
= E[st(G_n^{(C)}) | G_n^{(1)} not connected] P(G_n^{(1)} not connected)
> (4/(n + 3)) E[st(G_n^{(C)}) | G_n^{(1)} not connected].

Thus we obtain the surprising (?) result that E[st(G_n^{(C)}) | G_n^{(1)} not connected] is exponentially smaller than E[st(G_n^{(1)})].
Proof
Given a spanning tree T of Kn and an edge e = {i, j} ∈ E where
i < j define f (T, e) as follows: if e is not in T let f (T, e) = 0, and if e is in
T let f (T, e) be the number of vertices in the component containing vertex
i of the graph obtained from T by deleting the edge e. Note that for each
1 ≤ k ≤ n − 1 and for each edge e ∈ E, there are
g(n, k) = binom(n−2, k−1) k^{k−2} (n − k)^{n−k−2}
spanning trees T of Kn such that f (T, e) = k.
For each pair (T, e) let A(T, e) be the event that the spanning tree T is in G_n^{(C)} and the edge e is last in G_n^{(C)}. If f(T, e) = k, 1 ≤ k ≤ n − 1, and 0 < x < 1 then

P(A(T, e) | X_e = x) = x^{n−2}(1 − x)^{k(n−k)−1};

and so

P(A(T, e)) = ∫_0^1 x^{n−2}(1 − x)^{k(n−k)−1} dx = (n−2)!(k(n−k)−1)!/(k(n−k)+n−2)!

by (2.4).
Now let us use A_{T,e} as the indicator for the event A(T, e). Then

st(G_n^{(C)}) = Σ_e Σ_T A_{T,e} = Σ_e Σ_{k=1}^{n−1} Σ{A_{T,e} : T satisfies f(T, e) = k}.

Hence

E[st(G_n^{(C)})] = Σ_{k=1}^{n−1} a_k,
where by the above

a_k = binom(n, 2) g(n, k) (n−2)!(k(n−k)−1)!/(k(n−k)+n−2)!
    = (1/2) binom(n−2, k−1) k^{k−2} (n − k)^{n−k−2} n!(k(n−k)−1)!/(k(n−k)+n−2)!.
Now note that

st(G_n^{(1)}) = Σ_e Σ{A_{T,e} : T satisfies f(T, e) = 1 or n − 1},

and so

E[st(G_n^{(1)})] = a_1 + a_{n−1} = 2a_1.
It is easy to check using Stirling's formula (2.5) that

2a_1 = (n − 1)^{n−3} n!(n − 2)!/(2n − 3)! ∼ n^{n−2} (8/e)(πn)^{1/2} 4^{−n},

and

2a_2 = (n − 2)^{n−3} n!(2n − 5)!/(3n − 6)! ∼ n^{n−2} (243/(16e^2))(3πn)^{1/2} (4/27)^n.

It remains only to check that the sum Σ_{k=3}^{n−3} a_k is o(a_2) as n → ∞.
Let n ≥ 6. The total number of trees containing a given edge e is 2n^{n−3}. Thus certainly g(n, k) ≤ 2n^{n−3} for each k. Also note that

g(n, 2) = (n − 2)^{n−3} ∼ e^{−2} n^{n−3}.
For 3 ≤ k ≤ n − 3,

(g(n,2)/(2n^{n−3})) (a_k/a_2) ≤ (g(n,k)/(2n^{n−3})) ((3n−6)/(k(n−k)+n−2))^{n−1} ≤ ((3n−6)/(4n−11))^{n−1}.
Hence

Σ_{k=3}^{n−3} a_k/a_2 ≤ (2n^{n−3}/(n − 2)^{n−3}) (n − 5) ((3n−6)/(4n−11))^{n−1} = o(1). □
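The first formula of theorem (4.2) can likewise be tested by simulation: run the process to the hitting time of D^{(1)} and count spanning trees with Kirchhoff's matrix-tree theorem. The sketch below is ours (names and parameters our own) and uses n = 5, where the exact expectation is 16 · 5! · 3!/7! ≈ 2.286.

```python
import itertools, random
from math import factorial

def det(m):
    """Integer determinant by cofactor expansion (fine for tiny matrices)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * v * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j, v in enumerate(m[0]) if v)

def spanning_trees(n, edges):
    """st(G) via the matrix-tree theorem: delete row/column 0 of the Laplacian."""
    lap = [[0] * n for _ in range(n)]
    for u, v in edges:
        lap[u][u] += 1; lap[v][v] += 1
        lap[u][v] -= 1; lap[v][u] -= 1
    return det([row[1:] for row in lap[1:]])

def one_process(n, rng):
    """Add random edges until every degree is >= 1; count spanning trees then."""
    edges = list(itertools.combinations(range(n), 2))
    rng.shuffle(edges)
    chosen, deg = [], [0] * n
    for u, v in edges:
        chosen.append((u, v))
        deg[u] += 1; deg[v] += 1
        if min(deg) >= 1:               # the hitting time of D^(1)
            return spanning_trees(n, chosen)

n = 5
exact = (n - 1) ** (n - 3) * factorial(n) * factorial(n - 2) / factorial(2 * n - 3)
rng, trials = random.Random(2), 20000
mean = sum(one_process(n, rng) for _ in range(trials)) / trials
assert abs(mean - exact) < 0.3          # allow for sampling error
```

Note that when G_n^{(1)} is not connected the determinant is 0, as it should be.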
5 Hamilton paths

Let hp(G) denote the number of hamilton paths in a graph G. Thus hp(Kn) = (1/2) n!. We might ask about the expected numbers of hamilton paths in G_n^{(1)} and G_n^{(C)}. If we argue as in the last section we obtain the following result, which resembles theorem (4.2).
(5.1)   Theorem
For n ≥ 2,

E[hp(G_n^{(1)})] = n!((n − 2)!)^2/(2n − 3)!
                 ∼ hp(Kn) 16 (π/n)^{1/2} 4^{−n},

and

E[hp(G_n^{(C)}) − hp(G_n^{(1)})] = Σ_{k=2}^{n−2} (1/2) n!(n − 2)!(k(n−k) − 1)!/(k(n−k) + n − 2)!
                 ∼ hp(Kn) (243/8)(3π/n)^{1/2} (4/27)^n.
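The asymptotic claim in theorem (5.1) for E[hp(G_n^{(1)})] can be checked in logarithms via math.lgamma; the sketch below is ours and simply compares the exact closed form with the stated asymptotic at a moderately large n.

```python
import math

def log_exact(n):
    """log of n! ((n-2)!)^2 / (2n-3)! computed exactly via lgamma."""
    lf = lambda m: math.lgamma(m + 1)
    return lf(n) + 2 * lf(n - 2) - lf(2 * n - 3)

def log_asym(n):
    """log of hp(K_n) * 16 * (pi/n)^(1/2) * 4^(-n), with hp(K_n) = n!/2."""
    return (math.lgamma(n + 1) - math.log(2) + math.log(16)
            + 0.5 * math.log(math.pi / n) - n * math.log(4))

assert abs(log_exact(3000) - log_asym(3000)) < 0.01
```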
6 Directed hamilton cycles

Just as we defined a random graph process G̃n on Vn we may define a random digraph process D̃n on Vn. Let DKn denote the complete digraph on the vertex set Vn with set A of n(n − 1) arcs. Given a permutation π of A the corresponding digraph process is the sequence D_0, D_1, . . . , D_{n(n−1)} of digraphs on Vn where D_i contains the first i arcs in A under π. We assume that each of the (n(n − 1))! permutations on A is equally likely, so that we obtain a random digraph process D̃n on Vn.
Let P be an increasing property of digraphs such that DKn ∈ P. The hitting time τ(D̃n, P) is the least m such that D_m ∈ P. For i, j = 0, 1, let D^{i,j} be the property that each vertex has indegree at least i and outdegree at least j. We define D_n^{i,j} to be the digraph D_m in the random digraph process D̃n where m = τ(D̃n, D^{i,j}) = τ^{i,j} say. Thus D_n^{i,j} is obtained by adding arcs at random until each vertex has indegree at least i and outdegree at least j.
As with random graph processes, we may assume that we start with a family (X_a : a ∈ A) of independent random variables each uniformly distributed on [0, 1]; and that the random digraph process D̃n corresponds to ordering the arcs a by increasing value of X_a.
For any digraph D let hc(D) denote the number of (directed) hamilton cycles in D. Thus hc(DKn) = (n − 1)!. Frieze [5] has shown the beautiful result that

P(hc(D_n^{1,1}) > 0) → 1 as n → ∞.

We are interested in E[hc(D_n^{0,1})] (which equals E[hc(D_n^{1,0})] by symmetry) and E[hc(D_n^{1,1})].
(6.1)   Theorem
We have

E[hc(D_n^{0,1})] = hc(DKn) / binom(2n−2, n)
                 ∼ hc(DKn) 4(πn)^{1/2} 4^{−n},

and

E[hc(D_n^{1,1})] = hc(DKn) n! { 2(n−2)!/(2n−2)! − (2n−4)!/(3n−4)! }
                 ∼ 2 E[hc(D_n^{0,1})].
Observe that from theorems (1.5) and (6.1) we obtain

E[hc(D_n^{1,1})] ∼ E[hc(G_n^{(2)})] as n → ∞.

Observe also from theorem (6.1) that on average as we add arcs in the random digraph process D̃n, about the same number of hamilton cycles appear in the stage when we form D_n^{0,1} as in the succeeding stage (if there is one) when we continue to form D_n^{1,1}. This seems intuitively reasonable, since if the random indicator variable I = 1 when τ^{0,1} = τ^{1,0} and I = 0 otherwise, then

P(I = 1) = O(n^{−2})

as is easily checked, and always

hc(D_n^{0,1}) + hc(D_n^{1,0}) = hc(D_n^{1,1})(1 + I).
Proof
Let C be a fixed hamilton cycle in DKn and let a be an arc in C. Arguing as in the proof of theorem (1.5) we find

P(C in D_n^{0,1}, a last in D_n^{0,1} | X_a = x) = x^{n−1}(1 − x)^{n−2},

and so

P(C in D_n^{0,1}) = n ∫_0^1 x^{n−1}(1 − x)^{n−2} dx = n!(n−2)!/(2n−2)!.

Hence

E[hc(D_n^{0,1})] = hc(DKn) / binom(2n−2, n) ∼ hc(DKn) 4(πn)^{1/2} 4^{−n}.
In a similar way we find

P(C in D_n^{1,1}, a last in D_n^{1,1} | X_a = x) = 2x^{n−1}(1 − x)^{n−2} − x^{n−1}(1 − x)^{2n−4},

and so

P(C in D_n^{1,1}) = n ∫_0^1 (2x^{n−1}(1 − x)^{n−2} − x^{n−1}(1 − x)^{2n−4}) dx
= n! { 2(n−2)!/(2n−2)! − (2n−4)!/(3n−4)! }.

Hence

E[hc(D_n^{1,1})] = hc(DKn) P(C in D_n^{1,1}) ∼ 2 E[hc(D_n^{0,1})]. □
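The closed forms in this proof are easy to verify with exact rational arithmetic. The sketch below is ours: it checks that P(C in D_n^{0,1}) agrees with 1/binom(2n−2, n), and that the ratio of the two expectations in theorem (6.1) tends to 2.

```python
from fractions import Fraction
from math import comb, factorial as f

def p_01(n):
    """P(C in D_n^{0,1}) = n!(n-2)!/(2n-2)!  (closed form from the proof)."""
    return Fraction(f(n) * f(n - 2), f(2 * n - 2))

def p_11(n):
    """P(C in D_n^{1,1}) = n!{ 2(n-2)!/(2n-2)! - (2n-4)!/(3n-4)! }."""
    return f(n) * (2 * Fraction(f(n - 2), f(2 * n - 2))
                   - Fraction(f(2 * n - 4), f(3 * n - 4)))

# the closed form equals 1/binom(2n-2, n); multiplying by hc(DK_n) = (n-1)!
# gives the first part of theorem (6.1)
for n in (5, 10, 40):
    assert p_01(n) == Fraction(1, comb(2 * n - 2, n))

# E[hc(D_n^{1,1})]/E[hc(D_n^{0,1})] = p_11(n)/p_01(n) tends to 2
assert abs(float(p_11(200) / p_01(200)) - 2) < 1e-10
```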
References
[1] M. Ajtai, J. Komlós and E. Szemerédi, First occurrence of hamilton cycles in random graphs, Annals of Discrete Mathematics 27 (1985), 173-178.

[2] B. Bollobás, The evolution of sparse graphs, in Graph Theory and Combinatorics, Proceedings of the Cambridge Combinatorial Conference in honour of Paul Erdös, ed. B. Bollobás, Academic Press, London, 1984, 35-57.

[3] B. Bollobás, Random Graphs, Academic Press, London, 1985.

[4] C. Cooper and A. M. Frieze, On the number of hamilton cycles in a random graph, J. Graph Theory 13 (1989), 719-735.

[5] A. M. Frieze, An algorithm for finding hamilton cycles in random directed graphs, J. Algorithms 9 (1988), 181-204.

[6] J. Komlós and E. Szemerédi, Limit distribution for the existence of hamilton cycles in random graphs, Discrete Mathematics 43 (1983), 55-63.