Random Vibration
Random vibration is often encountered in practice. It is the result of fluctuating external
excitation varying unpredictably with time. If excitation data are available, they can be
substituted directly into the equation of motion and a solution obtained using numerical
methods, but this is often impractical. Instead, the methods of probability and statistics are
used to investigate the response of structural systems to random inputs.
1 Probability
Events and sample space are fundamental concepts in probability. A sample space S is
the set of all possible outcomes of an experiment whose outcome cannot be determined
in advance while an event E is a subset of S. The probability of the event E, P (E), is a
number satisfying the following axioms
0 ≤ P (E) ≤ 1
P (S) = 1
P(∪_i Ei) = Σ_i P(Ei)
where the various Ei ’s are mutually exclusive events.
One can associate with each occurrence in S a numerical value. A random variable X
is a function assigning a real number to each member of S. Random variables can adopt
discrete values or continuous values.
2 Probability Density and Distribution Functions
Random data are described by their probability density and distribution functions.
Discrete random variables
If X is a random variable (i.e. a function defined over the elements of a sample space)
with a countable set of possible values xi ∈ RX, i = 1, 2, ..., where RX is the range of
values of the random variable, then it is a discrete random variable.
The probability of X having a specific value xi , p(xi ) = P (X = xi ) is a number such that
p(xi) ≥ 0

for every i = 1, 2, ..., and

Σ_{i=1}^{∞} p(xi) = 1
The collection of pairs (xi , p(xi )), i = 1, 2, ... is called the probability distribution of X.
p(xi ) is the probability mass function of X.
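As a concrete check of the two conditions above, the probability distribution of a fair six-sided die (a hypothetical example, not taken from the text) can be tabulated directly; the pairs (xi, p(xi)) form the distribution and p is the probability mass function:

```python
# Probability mass function of a fair six-sided die (hypothetical example).
pmf = {x: 1 / 6 for x in range(1, 7)}

# First condition: p(xi) >= 0 for every i.
assert all(p >= 0 for p in pmf.values())

# Second condition: the probabilities sum to 1 over the whole range RX.
assert abs(sum(pmf.values()) - 1.0) < 1e-12

# The probability of an event, e.g. E = {X is even}, is the sum over its outcomes.
p_even = sum(p for x, p in pmf.items() if x % 2 == 0)
```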
Continuous random variables
If RX is an interval rather than a discrete set then X is a continuous random variable.
The probability that X ∈ [a, b] is
P(a ≤ X ≤ b) = ∫_{a}^{b} f(x) dx
where f(x) is the probability density function of X satisfying (for all x ∈ RX )
f(x) ≥ 0

∫_{RX} f(x) dx = 1

and, if x ∉ RX,

f(x) = 0
Cumulative distribution function
The probability that X ≤ x, F(x) = P(X ≤ x), is the cumulative distribution
function. The CDF is defined as

F(x) = Σ_{i=1}^{n} p(xi)

for discrete X with xn ≤ x, and as

F(x) = ∫_{−∞}^{x} f(t) dt

for continuous X.
Note that if a < b then F (a) ≤ F (b), limx→∞ F (x) = 1, limx→−∞ F (x) = 0 and P (a ≤
X ≤ b) = F (b) − F (a).
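A small numerical sketch of these relations, assuming an exponential density f(x) = λe^(−λx) (parameter and interval values chosen purely for illustration): the area under f over [a, b] must equal F(b) − F(a).

```python
import math

lam = 2.0                                 # assumed rate parameter
f = lambda x: lam * math.exp(-lam * x)    # probability density function
F = lambda x: 1.0 - math.exp(-lam * x)    # cumulative distribution function

# P(a <= X <= b) by midpoint-rule integration of f over [a, b].
a, b, n = 0.5, 1.5, 10000
h = (b - a) / n
p_integral = sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# The same probability from the CDF: F(b) - F(a).
assert abs(p_integral - (F(b) - F(a))) < 1e-8
```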
Expectation and Moment Generating Function
The expected value of a random variable X, the expectation of X, is

E(X) = Σ_{i=1}^{n} xi p(xi)

for discrete X and

E(X) = ∫_{−∞}^{∞} x f(x) dx

for continuous X. E(X) is also called the mean or the first moment of X.
Generalizing, the nth moment of X may be defined as
E(X^n) = Σ_{i=1}^{n} xi^n p(xi)

for discrete X and

E(X^n) = ∫_{−∞}^{∞} x^n f(x) dx
for continuous X.
A moment generating function of a random variable X can be defined as
ψ(t) = E(e^{tX}) = ∫ e^{tx} dF(x)
Moments of all orders for X are obtained as the derivatives of ψ. The existence of a moment
generating function uniquely determines the distribution of X.
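As an illustration (again with the hypothetical fair-die distribution), the first derivative of ψ at t = 0 should recover E(X); below it is approximated by a central difference:

```python
import math

pmf = {x: 1 / 6 for x in range(1, 7)}     # assumed discrete distribution

# Moment generating function psi(t) = E(e^{tX}) for a discrete X.
psi = lambda t: sum(p * math.exp(t * x) for x, p in pmf.items())

# psi'(0) = E(X), approximated here by a central difference.
h = 1e-6
mean_from_mgf = (psi(h) - psi(-h)) / (2 * h)
mean_direct = sum(x * p for x, p in pmf.items())   # E(X) = 3.5 for a fair die
assert abs(mean_from_mgf - mean_direct) < 1e-4
```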
Moments with respect to the mean are often useful in characterizing the appearance of
probability distributions. The variance of X, V (X) = var(X) = σ 2 is
σ 2 = E((X − E(X))2 ) = E(X 2 ) − (E(X))2
The standard deviation of X is σ = √(σ²).
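Both expressions for σ² can be verified numerically on the same hypothetical die distribution:

```python
pmf = {x: 1 / 6 for x in range(1, 7)}          # assumed discrete distribution
mean = sum(x * p for x, p in pmf.items())      # E(X)

# Variance as the second central moment E((X - E(X))^2) ...
var_central = sum((x - mean) ** 2 * p for x, p in pmf.items())
# ... and as E(X^2) - (E(X))^2; the two must agree.
var_moments = sum(x * x * p for x, p in pmf.items()) - mean ** 2
assert abs(var_central - var_moments) < 1e-12

sigma = var_central ** 0.5                     # standard deviation
```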
The third moment of a distribution about its mean, normalized by σ³, is called its
skewness Sk. It appears as a tilt to the right or left in the density function,
representing an asymmetry, and it is defined as

Sk = (1/σ³) ∫_{−∞}^{∞} (X − E(X))³ f(X) dX
The normal distribution has zero skewness, a tilt to the right is associated with negative
values of Sk and a tilt to the left implies Sk > 0.
3
The fourth moment of a distribution about its mean, normalized by σ⁴, is called its
kurtosis K. It constitutes a description of the peakedness of the density function at
its center and the reach of its tails; it is defined as

K = (1/σ⁴) ∫_{−∞}^{∞} (X − E(X))⁴ f(X) dX
The normal distribution has K = 3; tall and skinny distributions with long tails are
associated with K > 3 while squat distributions with short tails are produced when K < 3.
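The reference values quoted for the normal distribution (zero skewness, K = 3) can be checked by integrating the standard normal density numerically; the integration grid below is an arbitrary choice:

```python
import math

# Standard normal density, mean 0 and sigma 1.
phi = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def central_moment(k, lo=-10.0, hi=10.0, n=20000):
    # Midpoint-rule integration of x^k * phi(x); since the mean is 0,
    # these are the central moments of the distribution.
    h = (hi - lo) / n
    return sum(((lo + (i + 0.5) * h) ** k) * phi(lo + (i + 0.5) * h)
               for i in range(n)) * h

sigma = math.sqrt(central_moment(2))
Sk = central_moment(3) / sigma ** 3
K = central_moment(4) / sigma ** 4
assert abs(Sk) < 1e-9        # the normal distribution has zero skewness
assert abs(K - 3.0) < 1e-6   # and kurtosis K = 3
```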
Another important statistic is the covariance of two random variables X and Y , Cov(X, Y ).
This is defined as
Cov(X, Y ) = E(XY ) − E(X)E(Y )
If Cov(X, Y) = 0 the variables are said to be uncorrelated. Further, the correlation
coefficient ρ(X, Y) is defined as
ρ(X, Y) = Cov(X, Y) / (var(X) var(Y))^{1/2}
The conditional probability of X = x given that Y = y is defined as

P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)
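A minimal numerical sketch of covariance and the correlation coefficient, using an assumed perfectly linear relationship Y = 2X + 1 (so ρ should equal 1):

```python
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]          # assumed sample of X
ys = [2.0 * x + 1.0 for x in xs]        # Y depends linearly on X

mean = lambda v: sum(v) / len(v)
mx, my = mean(xs), mean(ys)

# Cov(X, Y) = E(XY) - E(X)E(Y), with expectations taken as sample averages.
cov = mean([x * y for x, y in zip(xs, ys)]) - mx * my
var_x = mean([(x - mx) ** 2 for x in xs])
var_y = mean([(y - my) ** 2 for y in ys])

# rho = Cov(X, Y) / (var(X) var(Y))^(1/2); a perfect linear relation gives 1.
rho = cov / math.sqrt(var_x * var_y)
assert abs(rho - 1.0) < 1e-12
```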
Law of Large Numbers and the Central Limit Theorem
The following limit theorems are of fundamental and practical importance. They are
given here without proof.
The strong law of large numbers states that if the random variables X1 , X2 , ..., Xn
are independent and identically distributed (iid) with mean µ then the limit
lim_{n→∞} (Σ_{i=1}^{n} Xi)/n = lim_{n→∞} X̄ = µ
with probability P = 1.
Furthermore if the variance of the distribution of the Xi above is σ 2, the central limit
theorem states that
lim_{n→∞} P[(X̄ − µ)/(σ/√n) ≤ a] = ∫_{−∞}^{a} (1/√(2π)) e^{−x²/2} dx
In words, the theorem states that the distribution of the normalized random variable
(X̄ − µ)/(σ/√n) approaches the standard normal distribution of mean 0 and standard
deviation 1.
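Both limit theorems lend themselves to a quick simulation; here the Xi are drawn from a uniform distribution on [0, 1] (an assumed choice, with µ = 0.5 and σ² = 1/12), and the sample sizes are arbitrary:

```python
import random

random.seed(0)                        # fixed seed for a reproducible sketch
mu, sigma = 0.5, (1.0 / 12.0) ** 0.5  # mean and std of the uniform [0, 1]

n, trials = 400, 2000
zs = []
for _ in range(trials):
    xbar = sum(random.random() for _ in range(n)) / n   # sample mean
    zs.append((xbar - mu) / (sigma / n ** 0.5))         # standardized mean

# Law of large numbers: the standardized means average out near zero,
# i.e. each sample mean is close to mu.
grand = sum(zs) / trials
assert abs(grand) < 0.1

# Central limit theorem: about half of the standardized means fall below 0,
# as the standard normal distribution predicts.
frac_below = sum(z <= 0.0 for z in zs) / trials
assert abs(frac_below - 0.5) < 0.05
```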
The Normal (Gaussian) Distribution
A random variable X with mean µ and variance σ² has a normal distribution (X ∼
N(µ, σ)) if its probability density function for x ∈ (−∞, ∞) is

f(x) = (1/(σ√(2π))) exp(−(1/2)((x − µ)/σ)²)
The cumulative distribution function is

F(x) = P(X ≤ x) = ∫_{−∞}^{x} (1/(σ√(2π))) exp(−(1/2)((t − µ)/σ)²) dt
The standardized random variable Z = (X − µ)/σ has mean of zero and standard
deviation of 1. Its probability density function is:
φ(z) = (1/√(2π)) e^{−z²/2}
and the cumulative distribution function is

Φ(z) = P(Z ≤ z) = ∫_{−∞}^{z} (1/√(2π)) e^{−t²/2} dt
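Φ has no elementary closed form, but it can be evaluated through the error function as Φ(z) = (1 + erf(z/√2))/2; a quick check of the standardization, with assumed example parameters:

```python
import math

# Standard normal CDF via the error function.
Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

assert abs(Phi(0.0) - 0.5) < 1e-12   # half the probability lies below the mean

# For X ~ N(mu, sigma), P(a <= X <= b) = Phi((b - mu)/sigma) - Phi((a - mu)/sigma);
# the mass within one standard deviation of the mean is about 68.27%.
mu, sigma = 10.0, 2.0                # assumed parameters
a, b = mu - sigma, mu + sigma
p = Phi((b - mu) / sigma) - Phi((a - mu) / sigma)
assert abs(p - 0.6827) < 1e-3
```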
The Weibull Distribution
A random variable X associated with the three parameters −∞ < ν < ∞ (location),
α > 0 (scale) and β > 0 (shape), has a Weibull distribution if its probability density function
is
f(x) = (β/α)((x − ν)/α)^{β−1} exp(−((x − ν)/α)^β)   x ≥ ν
f(x) = 0                                            otherwise
The cumulative probability distribution function is
F(x) = 0                             x < ν
F(x) = 1 − exp(−((x − ν)/α)^β)       otherwise
If ν = 0, the probability density function becomes
f(x) = (β/α)(x/α)^{β−1} exp(−(x/α)^β)   x ≥ 0
f(x) = 0                                otherwise
The corresponding cumulative distribution function is
F(x) = 0                     x < 0
F(x) = 1 − exp(−(x/α)^β)     otherwise
If ν = 0 and β = 1, the probability density function becomes

f(x) = (1/α) exp(−x/α)   x ≥ 0
f(x) = 0                 otherwise
i.e. the exponential distribution with parameter λ = 1/α.
The mean and variance of the Weibull distribution are, respectively, E(X) = ν + αΓ(1/β + 1)
and V(X) = α²(Γ(2/β + 1) − Γ(1/β + 1)²).
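The mean formula can be checked against a direct numerical integration of x f(x) for the ν = 0 density; the parameter values below are arbitrary:

```python
import math

alpha, beta, nu = 2.0, 1.5, 0.0          # assumed scale, shape, location

def f(x):
    # Weibull probability density with nu = 0.
    if x < 0.0:
        return 0.0
    return (beta / alpha) * (x / alpha) ** (beta - 1.0) \
        * math.exp(-((x / alpha) ** beta))

# E(X) by midpoint-rule integration of x f(x) over [0, 50]; the tail beyond
# is negligible for these parameters.
lo, hi, n = 0.0, 50.0, 100000
h = (hi - lo) / n
mean_numeric = sum((lo + (i + 0.5) * h) * f(lo + (i + 0.5) * h)
                   for i in range(n)) * h

# Closed form: E(X) = nu + alpha * Gamma(1/beta + 1).
mean_formula = nu + alpha * math.gamma(1.0 / beta + 1.0)
assert abs(mean_numeric - mean_formula) < 1e-3
```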
3 The Power Spectrum
Recall that the complete Fourier series representation of a periodic function is given by
x(t) = a0 + Σ_{n=1}^{∞} (an cos(nωt) + bn sin(nωt))
where the Fourier coefficients are
a0 = (1/T) ∫_{−T/2}^{T/2} x(t) dt

and

an = [∫_{−T/2}^{T/2} x(t) cos(nωt) dt] / [∫_{0}^{T} cos²(nωt) dt]

bn = [∫_{−T/2}^{T/2} x(t) sin(nωt) dt] / [∫_{0}^{T} sin²(nωt) dt]

for n = 1, 2, 3, ..., where ω = 2π/T is the fundamental frequency.
The quantities f(n) = n/T (in Hz) are the frequency components of the series. Do not
confuse f(n) with fn, the natural frequency of vibration of the undamped system.
The power associated with each frequency component Pn is defined as
Pn = (1/2)(an² + bn²)
The power spectrum of x(t) is a plot of the discrete values of Pn corresponding to the
various frequencies f(n).
The power spectral density at any discrete frequency f(n), S(f(n)), is defined as

S(f(n)) = (T/2)(an² + bn²)
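These definitions can be exercised numerically. Below, a hypothetical periodic signal with components only at the 2nd and 5th harmonics; all of its power should appear at f(2) and f(5) (signal, period, and grid are chosen purely for illustration):

```python
import math

T = 1.0                        # assumed period
omega = 2.0 * math.pi / T      # fundamental frequency

# Hypothetical signal: 3 cos(2 omega t) + 1.5 sin(5 omega t).
x = lambda t: 3.0 * math.cos(2.0 * omega * t) + 1.5 * math.sin(5.0 * omega * t)

def fourier_coeffs(n, m=1024):
    # an, bn by midpoint-rule integration over one period; for a periodic
    # integrand this grid is effectively exact for harmonics well below m.
    h = T / m
    ts = [-T / 2.0 + (i + 0.5) * h for i in range(m)]
    an = (2.0 / T) * sum(x(t) * math.cos(n * omega * t) for t in ts) * h
    bn = (2.0 / T) * sum(x(t) * math.sin(n * omega * t) for t in ts) * h
    return an, bn

# Pn = (an^2 + bn^2)/2 for the first few harmonics.
P = {}
for n in range(1, 8):
    an, bn = fourier_coeffs(n)
    P[n] = 0.5 * (an ** 2 + bn ** 2)

assert abs(P[2] - 4.5) < 1e-9        # 3^2 / 2
assert abs(P[5] - 1.125) < 1e-9      # 1.5^2 / 2
assert all(P[n] < 1e-9 for n in (1, 3, 4, 6, 7))
```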
4 Response to a Random Input
The function H(f) is called the frequency response function. It is usually expressed in
complex form but it can also be expressed in magnitude and phase form. When many
frequencies f(n) are simultaneously involved, there is one frequency response function for
each frequency, H(f(n)).
Let x(t) be a function representing the input to the system and y(t) the resulting
response to the input. If x(t) is represented by a Fourier series with coefficients
an, bn, then y(t) can also be represented by a Fourier series, but the coefficients of this
series must be multiplied by the modulus of the frequency response function of the system.
The power spectral density of the response is thus given as
Sy(f(n)) = |H(f(n))|² Sx(f(n)) = |H(f(n))|² (T/2)(an² + bn²)
And if the frequencies f(n) are so close together that they can be regarded as a continuous
function of f, then

Sy(f) = |H(f)|² Sx(f)
As an example, consider the case of a single degree of freedom system subjected to random
applied force excitation.
The frequency response function in this case is the receptance
H(f) = ẑ/F̂ = [(1 − Ω²) − i2γΩ] / (k[(1 − Ω²)² + (2γΩ)²]) = [(1 − Ω²) − i2γΩ] / (mωn²[(1 − Ω²)² + (2γΩ)²])

where ωn = 2πfn is the natural frequency of vibration of the undamped system.
Since the modulus of the receptance is

|H(f)| = 1 / (mωn² √((1 − Ω²)² + (2γΩ)²))
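A brief numerical sanity check of this modulus (all parameter values assumed): at f = 0 it reduces to the static flexibility 1/k, and at resonance (Ω = 1) it is amplified by the familiar factor 1/(2γ).

```python
import math

m, fn, gamma = 1.0, 10.0, 0.05   # assumed mass, natural frequency, damping ratio
omega_n = 2.0 * math.pi * fn
k = m * omega_n ** 2             # stiffness, since omega_n^2 = k/m

def H_mod(f):
    # |H(f)| = 1 / (m omega_n^2 sqrt((1 - Omega^2)^2 + (2 gamma Omega)^2))
    Omega = f / fn
    return 1.0 / (m * omega_n ** 2
                  * math.sqrt((1.0 - Omega ** 2) ** 2
                              + (2.0 * gamma * Omega) ** 2))

assert abs(H_mod(0.0) - 1.0 / k) < 1e-12                  # static limit 1/k
assert abs(H_mod(fn) - 1.0 / (2.0 * gamma * k)) < 1e-12   # resonant peak
```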
Then, if the input spectral density of the exciting force is SF(f), the corresponding
displacement response spectral density Sz(f) is simply
Sz(f) = |H(f)|² SF(f) = SF(f) / (m²ωn⁴[(1 − Ω²)² + (2γΩ)²]) = SF(f) / (m²(2πfn)⁴[(1 − Ω²)² + (2γΩ)²])
The mean square displacement response (variance) is given by
σz² = ∫_{0}^{∞} Sz df = [SF / (m²(2πfn)⁴)] ∫_{0}^{∞} df / [(1 − Ω²)² + (2γΩ)²]

where the common assumption made in practice of a constant value of SF(f) for all frequencies has been used.
Since the frequency ratio Ω = ω/ωn = f/fn, df = fn dΩ and
σz² = [SF fn / (m²(2πfn)⁴)] ∫_{0}^{∞} dΩ / [(1 − Ω²)² + (2γΩ)²]
Integration and rearrangement finally yields the following expression for the standard
deviation of the displacement
σz = (1/(8m)) √(SF / (π³ fn³ γ))
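This closed form can be verified against a direct numerical integration of the response spectral density; the parameter values below are assumed purely for the check:

```python
import math

m, fn, gamma, SF = 2.0, 15.0, 0.02, 4.0  # assumed mass, frequency, damping, force PSD

# Evaluate the dimensionless integral I = ∫_0^∞ dΩ / ((1 - Ω^2)^2 + (2γΩ)^2)
# by the midpoint rule on a truncated range (the integrand decays like Ω^-4).
hi, n = 100.0, 500000
h = hi / n
I = 0.0
for i in range(n):
    O = (i + 0.5) * h
    I += h / ((1.0 - O * O) ** 2 + (2.0 * gamma * O) ** 2)

sigma2_numeric = SF * fn / (m ** 2 * (2.0 * math.pi * fn) ** 4) * I

# Closed form from the text: sigma_z = (1/(8m)) sqrt(SF / (pi^3 fn^3 gamma)).
sigma_closed = (1.0 / (8.0 * m)) * math.sqrt(SF / (math.pi ** 3 * fn ** 3 * gamma))

assert abs(math.sqrt(sigma2_numeric) - sigma_closed) / sigma_closed < 1e-2
```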
Using the same procedure, the standard deviation of the relative displacement resulting
from excitation due to random base motion is given by
σy = (1/8) √(Sx / (π³ fn³ γ))