Random Processes
Random process = random signal = stochastic process
Map a random experiment outcome into a signal. This corresponds to a
countably infinite sequence of measurements when the signal is discrete-time.
The time index could be discrete
  {Xn(ω)}, n ∈ Z (or n ∈ {0, 1, 2, . . .})
or continuous
  {X(t, ω)}, t ∈ ℝ (or t ∈ ℝ+)
Two common types of notation: Xt(ω) and X(t, ω)
Compare mapping for random vectors and random processes
Fix ω0 ⇒ X(t, ω0): deterministic signal
Fix t0 ⇒ X(t0, ω): random variable
Fix ω0, t0 ⇒ X(t0, ω0): deterministic number
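As a concrete illustration, here is a minimal Python sketch (assuming NumPy; the process used is the sin(ωt) of Example 3 below) of the three views: fix ω to get a deterministic signal, fix t to get a random variable, fix both to get a number.

```python
import numpy as np

rng = np.random.default_rng(0)

def X(t, omega):
    # Example 3 below: X(t, omega) = sin(omega * t), with omega ~ Unif(0, 1)
    return np.sin(omega * t)

t_grid = np.linspace(0, 20, 200)

# Fix omega = omega0: a deterministic signal (an ordinary function of t)
omega0 = rng.uniform(0, 1)
sample_path = X(t_grid, omega0)

# Fix t = t0: a random variable (one number per outcome omega)
t0 = 5.0
omegas = rng.uniform(0, 1, size=10_000)
rv_samples = X(t0, omegas)      # empirical distribution of X(t0)

# Fix both: a single deterministic number
value = X(t0, omega0)
print(sample_path[:3], rv_samples.mean(), value)
```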
Notation (without the ω):
  {X(t)}   random process
  X(t)     random variable
  x(t)     deterministic signal
  x(t0)    deterministic number
Like deterministic signals, random signals can be discrete or continuous time.
Like random variables, random signals can be discrete or continuous valued.
It’s possible for a signal to take values in a continuum while the individual RVs (the values at fixed times) are discrete.
Whether the RVs are discrete- or continuous-valued determines whether you
use pmf’s or pdf’s.
Whether time is discrete or continuous valued determines whether you use
discrete or continuous-time systems analysis (difference vs. differential equations).
Example 1: ω ∈ {H, T}
X(t, ω) =
  3,       ω = H
  sin(t),  ω = T
Example 2: ω ∈ (0, 1) unif
X(n, ω) is the n-th digit in the decimal expansion of ω
Example 3: ω ∈ (0, 1) unif
X(t, ω) = sin(ωt), t ∈ ℝ
Example 4: ω ∈ unit circle, uniform
X(n, ω) = R(ω) cos(Θ(ω)n), n ∈ Z
X(t0 ) is a random variable. Find its probability distribution for the random
process in example 1.
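A worked sketch of the answer, assuming heads and tails are equally likely: X(t0) takes only two values, so it is a discrete RV with PMF P(X(t0) = 3) = 1/2 and P(X(t0) = sin(t0)) = 1/2 (the two values never coincide, since |sin(t0)| ≤ 1 < 3).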
Like random variables and vectors, we characterize a random process with
(Ω, F, P )
For most RPs of practical interest, it is difficult to characterize P for an infinite
collection of random variables, so we work with finite collections with variable
times (random vectors).
Key steps to finding marginals:
• Find the sample space. (Note that it varies with {ti }.) Remember that
[x(t1 ) x(t2 ) · · · x(td )] all come from the same deterministic signal.
• Use equivalent events to find the distributions.
– If the sample space is discrete, find the PMF by finding the probability for each possible discrete outcome.
– If the sample space is continuous, first find the CDF using equivalent
events, then take the derivative(s) to find the PDF.
• For higher-order marginals: look for independence or conditional independence to simplify the problem.
Example 2: Decimal expansion of ω ∼ unif(0, 1)
Find the 1st-order marginals and joint distribution for n = 1, 2. Are these
random variables independent?
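A sketch of the reasoning: for ω uniform on (0, 1), each digit X(n) is uniform on {0, 1, . . . , 9}, and the joint PMF for n = 1, 2 assigns probability 1/100 to every digit pair, which equals the product of the marginals, so the digits are independent. A small Monte Carlo check in Python (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
omega = rng.uniform(0, 1, size=1_000_000)

# n-th decimal digit of omega: floor(10^n * omega) mod 10
d1 = (np.floor(10 * omega) % 10).astype(int)     # X(1)
d2 = (np.floor(100 * omega) % 10).astype(int)    # X(2)

# Empirical marginals should be close to 1/10, the joint close to 1/100
p1 = np.bincount(d1, minlength=10) / d1.size
p2 = np.bincount(d2, minlength=10) / d2.size
joint = np.histogram2d(d1, d2, bins=[np.arange(11), np.arange(11)])[0] / d1.size

print(p1.round(3))
print(np.abs(joint - np.outer(p1, p2)).max())    # near 0 if the digits are independent
```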
Another example: The random experiment is the roll of a fair die, ω ∈ {1, 2, . . . , 6}.
X(t, ω) =
  −2,        ω = 1
  1,         ω = 2
  u(t − 1),  ω = 3
  2,         ω = 4
  t,         ω = 5
  −t,        ω = 6
Find the marginals and joint distribution for X(0) and X(2). Are these
random variables independent?
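A minimal enumeration sketch in Python, using the case assignment written above (each face of the fair die has probability 1/6; u below is the unit step, whose convention at 0 does not matter here since it is only evaluated at t − 1 = ±1). It tabulates the joint PMF of (X(0), X(2)) and compares it with the product of the marginals.

```python
from collections import Counter
from fractions import Fraction

def u(t):
    # unit step
    return 1 if t >= 0 else 0

def X(t, omega):
    # the six equally likely sample paths listed above
    return {1: -2, 2: 1, 3: u(t - 1), 4: 2, 5: t, 6: -t}[omega]

p = Fraction(1, 6)                     # fair die
joint = Counter()
for w in range(1, 7):
    joint[(X(0, w), X(2, w))] += p

px0, px2 = Counter(), Counter()
for (a, b), prob in joint.items():
    px0[a] += prob
    px2[b] += prob

# Independence would require joint[(a, b)] == px0[a] * px2[b] for every pair
independent = all(joint[(a, b)] == px0[a] * px2[b] for a in px0 for b in px2)
print(dict(joint))
print("independent?", independent)
```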
Example: W ∼ exponential with parameter α
Find the first-order marginal for X(t) where
Find the second-order marginal
If you have a dth-order distribution, you can also compute dth-order moments, as for any random vector.
When time is a variable, the moments are functions of time:
• mX(t) = E[X(t)] – mean
• RX(t1, t2) = E[X(t1)X(t2)] – auto-correlation function
• CX(t1, t2) = E[(X(t1) − mX(t1))(X(t2) − mX(t2))] = RX(t1, t2) − mX(t1)mX(t2) – auto-covariance function
If you have joint distributions as a function of time, then you also have conditional distributions (and conditional expectations) as a function of time.
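As a sanity-check sketch (assuming NumPy), the moment functions can be estimated by averaging over realizations. Here this is done for Example 3, X(t) = sin(ωt) with ω ~ Unif(0, 1):

```python
import numpy as np

rng = np.random.default_rng(2)
omegas = rng.uniform(0, 1, size=100_000)        # one omega per realization
t = np.linspace(0, 10, 50)

# Sample paths: one row per realization, one column per time instant
paths = np.sin(np.outer(omegas, t))

m_X = paths.mean(axis=0)                        # estimate of m_X(t) = E[X(t)]
R_X = paths.T @ paths / paths.shape[0]          # estimate of R_X(t1, t2) = E[X(t1)X(t2)]
C_X = R_X - np.outer(m_X, m_X)                  # estimate of the auto-covariance C_X(t1, t2)

print(m_X[:3], R_X[0, 1], C_X[0, 1])
```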
Find the mean E[X(t)] and autocorrelation E[X(t1 )X(t2 )] for the pulse random process.
Example: ω ∈ {H, T}
X(n, ω) =
  (−1)^n,      ω = H
  (−1)^(n+1),  ω = T
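A worked answer, assuming heads and tails are equally likely: the two sample paths are negatives of each other, so mX(n) = (1/2)(−1)^n + (1/2)(−1)^(n+1) = 0, while X(n1)X(n2) = (−1)^(n1+n2) for either outcome, so RX(n1, n2) = (−1)^(n1+n2) = (−1)^(n1−n2).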
Example: Θ ∼ unif(−π, π]
X(t) = cos(ω0 t + Θ)
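For reference, the standard calculation: because Θ is uniform over a full period, E[cos(ω0 t + Θ)] = 0 for every t, and by the product-to-sum identity RX(t1, t2) = E[cos(ω0 t1 + Θ) cos(ω0 t2 + Θ)] = (1/2) cos(ω0 (t1 − t2)), since the E[cos(ω0 (t1 + t2) + 2Θ)] term also averages to zero.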
Specifying P for an entire process is possible when there’s a general “rule”
for computing the probability distribution for the random variables associated
with an arbitrary collection of times. Some examples:
• Independent and identically distributed (i.i.d.) processes (discrete-time):
    PX(n1)X(n2)···X(nd)(x1, x2, . . . , xd) = PX(x1) PX(x2) · · · PX(xd), where PX(k)(x) = PX(x) ∀k
• Independent-increment processes (discrete or continuous time):
    PX(t1)X(t2)···X(td)(x1, x2, . . . , xd) = PX(t1)(x1) PW2(x2 − x1) PW3(x3 − x2) · · · PWd(xd − xd−1),
  assuming WLOG that τi = ti − ti−1 > 0, where the increments Wi = X(ti) − X(ti−1) are independent.
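A minimal Python sketch (assuming NumPy) of one standard independent-increment process, a random walk built from i.i.d. ±1 steps; the specific process is illustrative, not taken from the slides. Increments over disjoint index intervals are independent, which is what makes the factorization above possible.

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps = 50_000, 30

# i.i.d. +/-1 steps; the random walk is X(n) = sum of the first n steps
steps = rng.choice([-1, 1], size=(n_paths, n_steps))
X = np.cumsum(steps, axis=1)

# Increments over disjoint index intervals, e.g. W2 = X(10) - X(5), W3 = X(20) - X(10)
W2 = X[:, 9] - X[:, 4]
W3 = X[:, 19] - X[:, 9]

# Disjoint increments are independent, so their sample correlation should be near 0
print(np.corrcoef(W2, W3)[0, 1])
```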
Example:
{X(n)} and {Y (n)} are i.i.d. Bernoulli processes with parameters p and q,
respectively. The two processes are mutually independent.
W(n) = X(n) ⊕ Y(n)
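A worked sketch of the marginal: since X(n) and Y(n) are independent, P(W(n) = 1) = P(X(n) = 1, Y(n) = 0) + P(X(n) = 0, Y(n) = 1) = p(1 − q) + (1 − p)q, and W(n) = 0 otherwise. Because samples at different times involve disjoint collections of independent RVs, {W(n)} is itself an i.i.d. Bernoulli process with this parameter.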
Example: