Statistical characterization of stationary ergodic random signals
1. Application goal
We study the statistical characteristics of a signal: the cumulative distribution function, the probability density function, and statistical measures such as the mean and the standard deviation.
2. Random signals: basic notions
A signal is a function of time that carries useful information. Signals that can be modeled by functions (or distributions) are called deterministic. Knowing the function that models the signal, we know the signal's value at every moment of time.
In probability theory, a random variable (r.v.) is a variable that can take on different, random values and is described by a probability distribution. In practice we speak of stochastic or random processes.
Random signals cannot be modeled by either a function or a distribution; their instantaneous value cannot be predicted. It is said that random signals have no closed-form analytical representation.
Such signals can be analyzed on the basis of their statistical properties. Knowing the value of the random signal at a given moment of time t, we can state, with a given probability, the interval in which the value of the signal lies at the moment of time t + t0.
A stationary ergodic process is a stochastic process which exhibits both stationarity and
ergodicity.
Stationarity is the property of a random process which guarantees that its statistical properties, such as the mean, the moments and the variance, do not change over time. A stationary process is one whose probability distribution is the same at all times.
A stochastic process is said to be ergodic if its statistical properties (mean and variance) can be deduced from a single, sufficiently long realization of the process. In practice this means that statistical sampling can be performed at one instant across a group of identical processes, or over time on a single process, with no change in the measured result.
Gaussian noise is noise that has a probability density function (pdf) of the normal
distribution (Gaussian).
White noise is a random signal (or process) with a flat power spectral density: the
signal's power spectral density has equal power in any band, at any centre frequency, having a
given bandwidth. White noise is considered analogous to white light which contains all
frequencies.
"Gaussian noise" should not be confused with "white noise": the first term refers to the statistical (amplitude) properties of the noise, while the second refers to its spectral properties. Noise can be both white and Gaussian.
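As a small numerical illustration of this distinction (a sketch, not part of the original lab code), we can generate i.i.d. Gaussian samples and check that they are simultaneously Gaussian in amplitude and white in spectrum:

```python
import numpy as np

# Illustrative sketch: Gaussian white noise as i.i.d. normal samples.
rng = np.random.default_rng(0)
n = 100_000
noise = rng.normal(loc=0.0, scale=1.0, size=n)

# Amplitude statistics should match the normal distribution N(0, 1):
# sample mean near 0, sample standard deviation near 1.
sample_mean = noise.mean()
sample_std = noise.std()

# The power spectral density estimate (periodogram) should be roughly flat:
# compare the average power in the lower and upper halves of the spectrum.
spectrum = np.abs(np.fft.rfft(noise)) ** 2 / n
low = spectrum[1 : len(spectrum) // 2].mean()
high = spectrum[len(spectrum) // 2 :].mean()
```

A non-Gaussian signal (e.g. uniform noise) can also be white, and filtered Gaussian noise is Gaussian but not white, which is exactly the distinction made above.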
3. Statistical characterization of stationary ergodic random signals
The unidimensional Cumulative Distribution Function (CDF) of a random signal is the probability that the random variable (signal) X takes on a value less than or equal to x. The distribution function is sometimes also denoted Q(x) (Kay'93).

F_X(x) = P(X ≤ x)   (1)
The probability density function (pdf) is defined as the derivative of the distribution function:

p_X(x) = dF_X(x)/dx   (2)
The following relations are satisfied:

P(a ≤ X ≤ b) = ∫_a^b p_X(x) dx,   p_X(x) ≥ 0, ∀x
The complementary cumulative distribution function (CCDF), also denoted R(x), is the probability that the random variable X takes on a value higher than x:

P(X ≤ a) + P(X > a) = 1,   CDF ≜ Q(a),   CCDF ≜ R(a)
These are the simplest statistical characteristics of random signals, which can be completely characterized by their n-dimensional CDF.
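Definitions (1) and (2) are easy to check numerically. The sketch below (an assumed example, using Gaussian samples as a stand-in random signal) estimates the CDF and the pdf from samples and verifies that P(a ≤ X ≤ b) equals both the CDF difference and the integral of the pdf estimate:

```python
import numpy as np

# Sketch: empirical CDF and histogram-based pdf of a sampled random signal.
rng = np.random.default_rng(1)
samples = np.sort(rng.normal(size=50_000))  # assumed Gaussian test signal

def ecdf(x):
    """Empirical CDF: fraction of samples less than or equal to x."""
    return np.searchsorted(samples, x, side="right") / len(samples)

# Histogram-based pdf estimate, normalized so that it integrates to 1.
pdf, edges = np.histogram(samples, bins=200, density=True)
dx = edges[1] - edges[0]

# Check P(a <= X <= b) = F_X(b) - F_X(a) = integral of p_X over [a, b].
a, b = -1.0, 1.0
prob_from_cdf = ecdf(b) - ecdf(a)
inside = (edges[:-1] >= a) & (edges[1:] <= b)  # bins fully inside [a, b]
prob_from_pdf = (pdf[inside] * dx).sum()
```

For a standard Gaussian, both estimates should be close to P(−1 ≤ X ≤ 1) ≈ 0.68.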
Using the CDF and the pdf we can compute statistical measures of random signals, such as the mean μ_X and the variance σ_X² (E is the expectation operator):

μ_X = E{X} = ∫_{−∞}^{+∞} x · p_X(x) dx   (3)

σ_X² = E{[X − μ_X]²} = ∫_{−∞}^{+∞} (x − μ_X)² · p_X(x) dx   (4)

For stationary random signals, the pdf does not depend on time:

p_X(x, t) = p_X(x), ∀t
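Formulas (3) and (4) can be evaluated numerically for a known pdf. The sketch below (an assumed example with a Gaussian pdf of mean 1.0 and standard deviation 0.5) approximates both integrals with a simple Riemann sum:

```python
import numpy as np

# Numerical evaluation of the mean (3) and variance (4) from a known pdf.
mu_true, sigma_true = 1.0, 0.5  # assumed Gaussian parameters for illustration
x = np.linspace(mu_true - 8 * sigma_true, mu_true + 8 * sigma_true, 20_001)
p = np.exp(-((x - mu_true) ** 2) / (2 * sigma_true**2)) / (
    sigma_true * np.sqrt(2 * np.pi)
)
dx = x[1] - x[0]

mean = np.sum(x * p) * dx                 # equation (3)
var = np.sum((x - mean) ** 2 * p) * dx    # equation (4)
```

The results should recover μ_X = 1.0 and σ_X² = 0.25 up to discretization error.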
For ergodic random signals, the mean computed over time from one realization is the same as the statistical (ensemble) mean computed over many realizations, at every moment of time.
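This time-average versus ensemble-average equality can be demonstrated with a simple model process (the i.i.d. Gaussian model below is an assumption for illustration; it is stationary and ergodic):

```python
import numpy as np

# Ergodicity sketch: time average over one realization vs. ensemble average
# over many realizations at a single time instant.
rng = np.random.default_rng(2)
mu, sigma = 0.5, 2.0  # assumed process parameters

# Time average: one realization observed over many time samples.
one_realization = rng.normal(mu, sigma, size=200_000)
time_mean = one_realization.mean()

# Ensemble average: the value at one fixed time instant, across many
# independent realizations of the same process.
values_at_t0 = rng.normal(mu, sigma, size=200_000)
ensemble_mean = values_at_t0.mean()
```

Both averages should converge to the true mean μ = 0.5.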
For random signals, the CDF has the following properties:

F_X(−∞) = 0   (5)
F_X(∞) = 1   (6)
P{x₁ < X ≤ x₂} = F_X(x₂) − F_X(x₁)   (7)
x₂ ≥ x₁ ⇒ F_X(x₂) ≥ F_X(x₁)   (8)
The probability density function of a stationary random signal can be approximated as:

p_X(x) = dF_X(x)/dx = lim_{Δx→0} [F_X(x + Δx) − F_X(x)] / Δx = lim_{Δx→0} P{x < X ≤ x + Δx} / Δx   (9)

where the last equality follows from (7).
Figure 1 shows one realization x^(k)(t), of duration T, of a random signal x(t). The relative mean time during which the realization x^(k)(t) stays below the threshold x is

υ^(k)(x) = lim_{T→∞} (Σ_i Δt_i) / T   (10)
For ergodic random signals, this relative mean time equals the CDF F_X(x) (or Q(x)).
According to (9) and (7), for stationary ergodic signals the pdf can be estimated as:

p_X(x) = [F_X(x + Δx) − F_X(x)] / Δx = (1 / (T·Δx)) · Σ_i Δt_i   (11)

where the durations Δt_i are shown in figure 2.
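For a sampled realization, estimate (11) has a simple discrete form: the total time spent in [x, x + Δx] becomes (number of samples in the bin) · Ts, and T = N · Ts, so the estimate reduces to count / (N · Δx). The sketch below (Gaussian noise as an assumed test signal) checks this against the known pdf:

```python
import numpy as np

# Discrete-time version of estimate (11): fraction of samples falling in
# [x0, x0 + dx], divided by the bin width dx, approximates p_X near x0.
rng = np.random.default_rng(3)
signal = rng.normal(size=100_000)  # assumed Gaussian test signal

x0, dx = 0.0, 0.1
count = np.count_nonzero((signal > x0) & (signal <= x0 + dx))
pdf_estimate = count / (len(signal) * dx)

# True standard Gaussian pdf at the bin midpoint, for comparison.
true_pdf_at_midpoint = np.exp(-((x0 + dx / 2) ** 2) / 2) / np.sqrt(2 * np.pi)
```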
Figure 1. One realization x^(k)(t), of duration T, of a random signal x(t).
Figure 2. The durations Δti are the intervals when the realization of the signal is between x and x + Δx
4. Practical part
The practical part of this lab consists in simulations of stationary ergodic random signals.
4.1 First, using the Matlab program random_signals_statistics.m, we generate a Gaussian noise sequence. Plot the probability density function and the cumulative distribution function for different numbers of samples of the signal (N = 100, 1000, 10000).
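The Matlab script itself is not reproduced here; as a rough Python analogue (an assumption, not the original code), one can measure how the empirical CDF of Gaussian noise approaches the true Gaussian CDF as N grows:

```python
import numpy as np
from math import erf, sqrt

# Sketch: largest gap between the empirical CDF of n Gaussian samples and the
# true Gaussian CDF (a one-sided Kolmogorov-style distance), for several n.
rng = np.random.default_rng(4)

def cdf_gap(n):
    samples = np.sort(rng.normal(size=n))
    empirical = np.arange(1, n + 1) / n
    true_cdf = np.array([0.5 * (1 + erf(s / sqrt(2))) for s in samples])
    return np.max(np.abs(empirical - true_cdf))

gaps = {n: cdf_gap(n) for n in (100, 1000, 10000)}
```

The gap should shrink roughly like 1/√N, which is what the plots for N = 100, 1000, 10000 illustrate.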
4.2 Second, using the Matlab program random_sinsum_statistics.m, we generate a sum of sinusoids with random frequencies. Plot the probability density function and the cumulative distribution function for different numbers of sinusoids (5, 25, 50). Is the central limit theorem verified or not?
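The script random_sinsum_statistics.m is likewise not reproduced; the sketch below is an assumed Python analogue of the same idea. The value of a sum of sinusoids with random phases, taken at a fixed time instant, is a sum of i.i.d. random variables, so by the central limit theorem its distribution should approach a Gaussian (kurtosis 3) as the number of sinusoids grows:

```python
import numpy as np

# CLT sketch: kurtosis of the normalized sum of random-phase sinusoids.
rng = np.random.default_rng(5)

def sample_kurtosis(num_sinusoids, num_realizations=100_000):
    """Kurtosis of the sum of `num_sinusoids` sinusoids with random phases."""
    phases = rng.uniform(0, 2 * np.pi, size=(num_realizations, num_sinusoids))
    values = np.sin(phases).sum(axis=1) / np.sqrt(num_sinusoids)
    z = (values - values.mean()) / values.std()
    return np.mean(z**4)  # equals 3 for a Gaussian distribution

k1 = sample_kurtosis(1)    # single sinusoid: arcsine distribution, kurtosis 1.5
k50 = sample_kurtosis(50)  # sum of 50: close to Gaussian, kurtosis near 3
```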
5. Annexe: The central limit theorem
Consider X_1, X_2, X_3, ..., X_n a sequence of n independent and identically distributed (i.i.d.) random variables with finite mean μ and variance σ² > 0. The central limit theorem states that, as the sample size n increases, the distribution of the sample average of these random variables approaches the normal distribution with mean μ and variance σ²/n, irrespective of the shape of the original distribution. The sum of the n random variables is
S_n = X_1 + X_2 + X_3 + ... + X_n

The distribution of the new random variable

Z_n = (S_n − nμ) / (σ√n)

converges towards the standard normal distribution N(0, 1) as n approaches ∞.
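A quick numerical check of the theorem (an illustrative sketch using uniform variables, for which μ = 0.5 and σ² = 1/12) standardizes S_n and verifies that Z_n behaves like N(0, 1):

```python
import numpy as np

# CLT check: standardized sum of n i.i.d. Uniform(0, 1) variables.
rng = np.random.default_rng(6)
n, trials = 200, 50_000
mu, sigma = 0.5, np.sqrt(1 / 12)  # mean and std of Uniform(0, 1)

s_n = rng.uniform(0, 1, size=(trials, n)).sum(axis=1)
z_n = (s_n - n * mu) / (sigma * np.sqrt(n))

# For N(0, 1), about 68.27% of the mass lies within one standard deviation.
fraction_within_1 = np.mean(np.abs(z_n) < 1)
```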