NBER WORKING PAPER SERIES Darrell Duffie

... population (and with a purely finitely additive sample measure space in [26]). Section 6 provides additional discussion of the literature. A continuum of agents with independent random types is never measurable with respect to the completion of the usual product σ-algebra, except in the trivial cas ...


The cover time of random geometric graphs - CMU Math

... Geometric graphs are widely used as models of ad-hoc wireless networks [18], [19], [24] in which each transmitter has transmission radius r and can only communicate with other transmitters within that radius. In the simplest model of a random geometric graph, the n points representing transmitters, ...

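The model described here is easy to sketch in code. Below is a minimal, self-contained illustration (the function name and parameters are mine, not from the paper): drop n transmitters at random points in the unit square and connect any two within distance r.

```python
import math
import random

def rgg_edges(points, r):
    """Edges of the geometric graph: all pairs within Euclidean distance r."""
    edges = set()
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= r:
                edges.add((i, j))
    return edges

random.seed(0)
n, r = 50, 0.3
points = [(random.random(), random.random()) for _ in range(n)]
print(len(rgg_edges(points, r)))  # number of communicating pairs
```

Connectivity of the resulting graph, and hence coverage properties such as the cover time, then depend on how the radius r scales with n.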

Introductory lecture notes on Markov chains and random walks

... Markov chains generalise this concept of dependence of the future only on the present. The generalisation takes us into the realm of Randomness. We will be dealing with random variables, instead of deterministic objects. Other examples of dynamical systems are the algorithms run, say, by the softwar ...

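As a concrete toy instance of "the future depends only on the present", here is a two-state chain (the transition matrix is an arbitrary choice of mine) whose state distribution evolves by one matrix-vector product per step:

```python
# Two-state Markov chain: the distribution evolves as mu_{t+1} = mu_t P.
P = [[0.9, 0.1],   # P[i][j] = probability of moving from state i to state j
     [0.5, 0.5]]

def step(mu, P):
    """One step of the chain: multiply the row vector mu by P."""
    n = len(P)
    return [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

mu = [1.0, 0.0]          # start surely in state 0
for _ in range(100):
    mu = step(mu, P)
# The distribution converges to the stationary pi solving pi = pi P.
print(mu)  # approximately [5/6, 1/6]
```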

Random walks and electric networks

... However, there are certain conditions under which we can guarantee that a fair game remains fair when stopped at a random time. For our purposes, the following standard result of martingale theory will do: Martingale Stopping Theorem. A fair game that is stopped at a random time will remain fair to ...

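A quick numerical illustration of the stopping theorem, with assumptions of my own choosing: a fair ±1 coin game stopped the first time the fortune hits +5 or −5. By the theorem, the stopped game still has expectation 0, the starting fortune.

```python
import random

def stopped_value(a=5, rng=random):
    """Play a fair +/-1 game until the fortune first hits +a or -a."""
    s = 0
    while abs(s) < a:
        s += rng.choice((1, -1))
    return s

random.seed(1)
trials = 20000
mean = sum(stopped_value() for _ in range(trials)) / trials
print(mean)  # close to 0: the stopped fair game is still fair
```

Note that the theorem requires a condition such as bounded fortunes or a bounded stopping time; here the fortune stays in [−5, 5], which suffices.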

The spacey random walk: a stochastic process for higher-order data

... 2005] or an l² eigenvector [Lim, 2005] of P. Li and Ng [2014] and Gleich et al. [2015] analyze when a solution vector x for Equation 1.4 exists and provide algorithms for computing the vector. These algorithms are guaranteed to converge to a unique solution vector x if P satisfies certain propertie ...


Recurrence vs Transience: An introduction to random walks

... Starting with Pólya’s theorem one can say perhaps that the theory of random walks is concerned with formalizing and answering the following question: What is the relationship between the behavior of a random walk and the geometry of the underlying space? Since it is possible for a drunkard to walk ...

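Pólya's theorem (recurrence in dimensions one and two, transience in dimension three and higher) shows up clearly in simulation; the sketch below (horizon and trial counts are my choice) estimates how often a simple random walk returns to the origin within a fixed horizon:

```python
import random

def returns_to_origin(dim, steps, rng):
    """True if a simple random walk on Z^dim revisits the origin within `steps`."""
    pos = [0] * dim
    for _ in range(steps):
        axis = rng.randrange(dim)
        pos[axis] += rng.choice((1, -1))
        if all(c == 0 for c in pos):
            return True
    return False

rng = random.Random(0)
trials, steps = 400, 2000
freq1 = sum(returns_to_origin(1, steps, rng) for _ in range(trials)) / trials
freq3 = sum(returns_to_origin(3, steps, rng) for _ in range(trials)) / trials
print(freq1, freq3)  # the return frequency is much higher in dimension 1
```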

(pdf)

... Abstract. We define random walks on R^d and recurrent points and demonstrate that a random walk’s recurrence to 0 implies its recurrence to each of its possible points. We then prove two different necessary and sufficient conditions for the recurrence of random walks. Finally, we employ these results ...


Towards Unique Physically Meaningful Definitions of Random and

... probability laws, i.e., all the statements (defined in a certain language L) which are true for almost all sequences. To be more precise, a probability law on the set X of all sequences is an L-definable subset S ⊆ X for which P(S) = 1 – or, equivalently, for whose (similarly definable) complement −S, ...



Members of random closed sets - University of Hawaii Mathematics

... obtain results on infinite subsets of random sets of integers. Here we show that the distributions studied by Barmpalias et al. and by Galton and Watson are actually equivalent, not just classically but in an effective sense. For 0 ≤ γ < 1, let us say that a real x is a Member_γ if x belongs to some ...


(pdf)

... standard solution to this is to consider the sequence {ρ_n}_{n∈ℕ} of uniform measures on balls of radius n about the identity with respect to a word metric. Although the theory of mixing time is sufficiently developed in simple cases such as D₈ to address issues like speed of convergence, the main ques ...


BROWNIAN MOTION Definition 1. A standard Brownian (or a

... and (4) are compatible. This follows from the following elementary property of the normal distributions: If X, Y are independent, normally distributed random variables with means µ_X, µ_Y and variances σ²_X, σ²_Y, then the random variable X + Y is normally distributed with mean µ_X + µ_Y and variance ...

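This additivity of means and variances is easy to check numerically (the parameters below are my own choice):

```python
import random

random.seed(0)
mu_x, sigma_x = 1.0, 1.0
mu_y, sigma_y = 2.0, 2.0
n = 50000
# Sample X + Y with X, Y independent normals and inspect the first two moments.
samples = [random.gauss(mu_x, sigma_x) + random.gauss(mu_y, sigma_y)
           for _ in range(n)]
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n
print(mean, var)  # near mu_x + mu_y = 3 and sigma_x^2 + sigma_y^2 = 5
```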

4 Sums of Independent Random Variables

... 4.2 Nearest Neighbor Random Walks on Z. Definition 4.12. The sequence S_n = Σ_{i=1}^n X_i is said to be a nearest neighbor random walk (or a p-q random walk) on the integers if the random variables X_i are independent, identically distributed and have common distribution P{X_i = +1} = p = 1 − P{X_i = −1} ...

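Writing B for the number of +1 steps, S_n = 2B − n with B ~ Binomial(n, p), so the walk's distribution is available in closed form; a small sanity check (parameters mine):

```python
from math import comb

def pq_walk_dist(n, p):
    """Exact distribution of S_n for a p-q nearest neighbor walk on Z.

    S_n = 2B - n, where B ~ Binomial(n, p) counts the +1 steps.
    """
    return {2 * b - n: comb(n, b) * p**b * (1 - p)**(n - b)
            for b in range(n + 1)}

dist = pq_walk_dist(10, 0.6)
mean = sum(k * pr for k, pr in dist.items())
print(mean)  # E[S_n] = n(2p - 1) = 10 * 0.2 = 2
```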

Ballot theorems for random walks with finite variance

... are positive with probability of order k/n? This latter perspective is philosophically closely tied to work of Andersen (1953, 1954), Spitzer (1956) and others on the amount of time spent above zero by a conditioned random walk and on related questions. Our procedure for constructing the examples sh ...


Exact upper tail probabilities of random series

... real-valued random variables {ξ_j}_{j=1,2,...}. Such series appeared in the literature under the name of linear processes (cf. [14] and [10]), and they are basic objects in time series analysis and in regression models (cf. [4]). Estimates on the upper tail probabilities of ...


(pdf)

... the top branch of the tree (Figure 1) is a tail event. It has probability 1/4. Example 5.2. Let Xn be a simple random walk on Z³. The event that Xn visits the origin infinitely many times is a tail event. It has probability zero. Example 5.3. Let Xn be a simple random walk on Lamp Z³. The event that ...


AJP Journal

... that the velocity, Δx/Δt, of a Brownian particle (the derivative along a Brownian trajectory curve) is everywhere infinite. Therefore, a Brownian trajectory is infinitely jagged and care is needed to mathematically analyze Brownian trajectories. Wiener proved that the distance between any two points ...

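The blow-up of Δx/Δt is already visible from the scaling of Brownian increments, Δx ~ N(0, Δt), which gives E|Δx/Δt| = √(2/(πΔt)) → ∞ as Δt → 0. A quick check (step sizes and sample counts are my choice):

```python
import math
import random

random.seed(0)

def mean_speed(dt, n=20000, rng=random):
    """Average |dx/dt| over n Brownian increments dx ~ N(0, dt)."""
    return sum(abs(rng.gauss(0.0, math.sqrt(dt))) / dt for _ in range(n)) / n

for dt in (1e-1, 1e-2, 1e-3):
    print(dt, mean_speed(dt))  # grows like sqrt(2 / (pi * dt)) as dt shrinks
```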

ENTROPY, SPEED AND SPECTRAL RADIUS OF RANDOM WALKS

... where ⊕ is componentwise sum mod 2 and (x + ξ)(g) := ξ(g − x). We will consider random walks with distribution supported by the symmetric generating set S = {(0, 1), (0, −1), (χ0 , 0)}, where χ0 is the characteristic function of {0}. The way one really should think about this random walk is to imagi ...

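Concretely, the state of this walk is a pair (lamp configuration, lamplighter position): two of the generators move the lamplighter one step, and the third toggles the lamp at the current position. A sketch over the base group Z (the data representation is my own):

```python
import random

def lamplighter_step(state, rng):
    """One step of the walk on the lamplighter group (Z/2Z wreath Z).

    state = (lamps, pos): lamps is the frozenset of lit sites, pos is in Z.
    The three generators (0, +1), (0, -1), (chi_0, 0) are equally likely.
    """
    lamps, pos = state
    g = rng.choice(("right", "left", "toggle"))
    if g == "right":
        return (lamps, pos + 1)
    if g == "left":
        return (lamps, pos - 1)
    # Toggle the lamp at the current position: symmetric difference.
    return (lamps ^ {pos}, pos)

rng = random.Random(0)
state = (frozenset(), 0)          # identity: all lamps off, lamplighter at 0
for _ in range(100):
    state = lamplighter_step(state, rng)
print(sorted(state[0]), state[1])  # lit lamps and final position
```

A frozenset of lit sites keeps the finitely supported lamp configuration hashable, and a toggle is a single symmetric difference.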

(pdf)

... If X is a random variable, then for every Borel subset B of R, X⁻¹(B) ∈ F. We define a measure µ_X, called the distribution of the random variable, on Borel sets by µ_X(B) := P{X ∈ B} = P(X⁻¹(B)). If µ_X is concentrated on a countable subset of the real numbers, X is a discrete random variable; ...


First Return Probabilities - University of California, Berkeley

... has no restrictions other than the probabilistic constraints that we gave the X_i's. Thus, the number of 2n-paths that have their first return at t = 2k is given by (f_{2k} 2^{2k})(u_{2n−2k} 2^{2n−2k}) = f_{2k} u_{2n−2k} 2^{2n}. If we sum the right-hand side of the above equality over k, we find that u_{2n} 2^{2n} = f_0 u_{2n} 2^{2n} + ...

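For the simple walk these quantities are explicit: u_{2n} = C(2n, n)/2^{2n} and f_{2n} = u_{2n}/(2n − 1), and the renewal identity above can be verified exactly in rational arithmetic:

```python
from fractions import Fraction
from math import comb

def u(n):
    """Probability the walk is back at 0 at time 2n: C(2n, n) / 2^(2n)."""
    return Fraction(comb(2 * n, n), 4 ** n)

def f(n):
    """Probability the first return to 0 happens at time 2n."""
    return u(n) / (2 * n - 1)

# Renewal identity: u_{2n} = sum_{k=1}^{n} f_{2k} u_{2n-2k}  (f_0 = 0).
for n in range(1, 15):
    assert u(n) == sum(f(k) * u(n - k) for k in range(1, n + 1))
print("identity verified exactly for n = 1..14")
```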

Real Numbers - Universidad de Buenos Aires

... A naive idea: a sequence x ∈ {0, 1}^ℕ is random if it is in no set of Lebesgue measure 0. Of course, since singletons have measure 0, there is no such sequence. ...


Branching Processes with Negative Offspring Distributions

... Pr[T = ∞|T > N] is asymptotically 1 if N ≫ ε⁻², when ε → 0⁺. We also give a formula that generalizes the Otter-Dwass theorem, and use it to prove that if Z = Po(1 − ε), then N ≫ ε⁻² is both necessary and sufficient for Pr[T = ∞|T > N] ∼ 1. Generalizations of the Galton-Watson branching processes h ...

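A subcritical process with Po(1 − ε) offspring dies out almost surely, which a direct simulation reproduces; the sketch below samples Poisson offspring with Knuth's product method (the parameter choices are mine):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's product method for Poisson(lam) samples (fine for small lam)."""
    limit, k, prod = math.exp(-lam), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def extinct_by(generations, lam, rng):
    """Run a Galton-Watson process; True if it dies out within `generations`."""
    z = 1
    for _ in range(generations):
        if z == 0:
            return True
        z = sum(poisson(lam, rng) for _ in range(z))
    return z == 0

rng = random.Random(0)
eps = 0.2
frac = sum(extinct_by(200, 1 - eps, rng) for _ in range(500)) / 500
print(frac)  # close to 1: the subcritical process dies out
```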

Lecture 1: simple random walk in 1-d Today let`s talk about ordinary

... Theorem 0.9. There exists c > 0 such that with probability one, τ(n) ≤ cn² log n for all large n. Proof. We split the interval [0, cn² log n] into ⌊(c/16) log n⌋ intervals of size 16n². Let R_i = |Y_{16in²+1} + · · · + Y_{(i+1)16n²}|. Then if τ(n) ≥ cn² log n, it must be that all R_i's are no bigger ...

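The n² scale in the theorem is the natural one: the expected first exit time of (−n, n) for a simple random walk started at 0 is exactly n². A simulation (parameters mine) reproduces it:

```python
import random

def exit_time(n, rng):
    """First time a simple random walk started at 0 leaves (-n, n)."""
    s, t = 0, 0
    while abs(s) < n:
        s += rng.choice((1, -1))
        t += 1
    return t

rng = random.Random(0)
n, trials = 10, 2000
mean = sum(exit_time(n, rng) for _ in range(trials)) / trials
print(mean)  # the exact expectation is n^2 = 100
```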

Optional Stopping Theorem. 07/27/2011

... Proof of step 4. It follows from step 3 that B is independent of any event generated by X_1, . . ., X_N, for any N. Let N → ∞. Then B is independent of any event generated by X_1, X_2, . . .. But B is itself generated by these variables; so B is independent of itself! Recall that any two events C_1 ...

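The punchline of this step is a one-line computation: an event independent of itself must have probability 0 or 1, since

```latex
P(B) = P(B \cap B) = P(B)\,P(B) = P(B)^2
\quad\Longrightarrow\quad P(B) \in \{0, 1\}.
```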