Entropy And Entropy-based Features In Signal Processing
K. Ekštein, T. Pavelka
Laboratory of Intelligent Communication Systems, Dept. of Computer Science and Engineering,
University of West Bohemia, Plzeň, Czech Republic
I. Introduction
Entropy as a thermodynamic state variable was introduced into physics by the German physicist Rudolf
Clausius in the second half of the 19th century. It was originally defined as
dS = \frac{\delta Q}{T},    (1)
where dS is an elementary change of entropy, δQ is a reversibly received elementary heat, and T is
the absolute temperature. Of course, such a definition makes no sense for signal processing. However,
it started the diffusion of the term entropy into other areas. Entropy as a measure of system
disorganisation appeared for the first time in connection with the First postulate of thermodynamics:
“Any macroscopic system which is at time t0 under given time-invariant outer conditions will, after
a relaxation time, reach the so-called thermodynamic equilibrium. It is a state in which no macroscopic
processes proceed and the state variables of the system gain constant, time-invariant values.” The
entropy of a system is maximal when the system has reached the thermodynamic equilibrium.
The key idea depicted above promoted entropy to a generic measure of system disorganisation.
Other definitions of entropy were later proposed for use in mathematics, especially in statistics:
H(x) = -\sum_{i=1}^{N} p(x_i) \log_{10} p(x_i),    (2)
where x = {x_1, x_2, ..., x_N} is a set of random phenomena and p(x_i) is the probability of the random
phenomenon x_i.
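For illustration, if all N phenomena are equally probable, p(x_i) = 1/N, eq. 2 gives the maximal value
H(x) = log_{10} N, whereas a fully deterministic outcome (a single p(x_i) = 1) gives H(x) = 0.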
A proposed relation between entropy and signal processing is based on the hypothesis that noise
(white noise) is a projection of a system in thermodynamic equilibrium into a signal. As a result, the
noise is supposed to have the highest entropy value, while speech (and mainly periodic sounds such as
vowels) has a significantly lower entropy value, as it is more organised and requires extra energy
to be produced in such an organised form¹. According to this presumption, entropy can be
used in signal processing for, e.g., separating the useful signal from background noise.
II. Entropy Computation
Entropy (or an entropy-based feature) can be computed from any finite set of values, e.g. a parametric
vector, a discrete spectral density estimate, or directly from a segment of a digital signal. We used the
following algorithms to compute the entropy:
¹ This principle reflects the Second postulate of thermodynamics, which says that entropy can be lowered if energy is
exerted on the task of organising the examined system.
function y = entropy1(x)
% Type 1: statistical entropy (eq. 2); p(x_i) is approximated by the
% normalised signal power x(i)^2 / sum(x.^2).
tot = 0.0;
ent = 0.0;
for i = 1:length(x)
    tot = tot + x(i)^2;
end
for i = 1:length(x)
    quo = x(i)^2 / tot;
    ent = ent + (quo * log10(quo));
end
y = -ent;

function y = entropy2(x)
% Type 2: spectral entropy; p(x_i) is approximated by the absolute
% deviation of the spectrum component from its mean, |x(i) - mean(x)|.
ent = 0.0;
m = mean(x);
for i = 1:length(x)
    quo = abs(x(i) - m);
    ent = ent + (quo * log10(quo));
end
y = -ent;
The algorithm entropy1 (called Type 1 hereafter) reflects the original statistical definition of entropy
given by eq. 2. The algorithm entropy2 (called Type 2 hereafter) represents a modified entropy computation
scheme which takes the signal spectrum characteristics into account. The probability p(x_i) is
approximated by the absolute difference between the spectrum component and the mean value, p(x_i) ≈ |s[i] − s̄|.
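For illustration only, the following sketch (not part of the original experiments; the frame length, frame
shift, and window are assumptions) shows one way the entropy2 function could be applied frame by frame
to a magnitude spectrum estimate of a signal x:

% Illustrative sketch with assumed parameters: framewise Type 2 spectral entropy.
x = x(:);                                   % force column vector
N = 256;                                    % frame length (assumption)
step = 128;                                 % frame shift (assumption)
w = 0.54 - 0.46*cos(2*pi*(0:N-1)'/(N-1));   % Hamming window
nframes = floor((length(x) - N)/step) + 1;
E = zeros(1, nframes);
for k = 1:nframes
    seg = x((k-1)*step + 1 : (k-1)*step + N);
    S = abs(fft(w .* seg));                 % discrete spectral density estimate
    E(k) = entropy2(S(1:N/2));              % Type 2 entropy of the half-spectrum
end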
Figure 1: Signal with spectral entropy (Type 2) course
Figure 1 shows the Type 2 spectral entropy course over a noisy signal. It can be seen that the
start- and end-points of speech are naturally indicated by a significant entropy drop (the displayed course
is normalised onto ⟨−1, 0⟩; the original entropy values for noise and speech differed by 11 orders of
magnitude).
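As a possible use of this behaviour (a sketch only; the normalisation formula and the threshold value −0.5
are assumptions, not taken from the experiments), a frame could be marked as speech whenever its normalised
entropy falls below a fixed threshold:

% Illustrative sketch: crude start-/end-point decision from the entropy course E.
En = (E - max(E)) / (max(E) - min(E));   % normalise onto <-1, 0> (assumed formula)
thr = -0.5;                              % decision threshold (assumption)
speech = En < thr;                       % 1 = speech frame, 0 = noise frame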
III. Conclusion
We showed that entropy (obtained via a modified entropy computation algorithm) can be used in
the signal processing area to separate the useful signal from intrusive noise. The obvious applications in the
speech recognition area are (i) voice activity (start- and end-point) detection, (ii) spectral
analysis and classification of frames (noise/tonal structure separation), and (iii) hypothesis support
in acoustic-phonetic decoding tasks.
Acknowledgement
This research was supported by Research Grant No. MSM 235200005.