ECE 101
An Introduction to Information Technology
Information Theory
Information Path
[Block diagram: Source of Information → Digital Sensor → Information Processor & Transmitter → Transmission Medium → Information Receiver and Processor → Information Display]
Information Theory
• A source generates information by producing data units called symbols
• Measurement of the information present
  – measure randomness (the value of the information)
  – do this mathematically using probability
  – the amount of information present is a measure of "entropy"
Probability
• Study of random outcomes
• The experiment
• The outcome
• P[Xi] = probability of a particular outcome Xi
  – 0 ≤ P[Xi] ≤ 1
  – \sum_{i=1}^{N} P[X_i] = 1, where N = number of different outcomes
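A quick check of these two axioms in Python (a minimal sketch; the fair six-sided die is an assumed example, not from the slides):

```python
# Hypothetical example: outcome probabilities for a fair six-sided die.
probs = [1/6] * 6

# Axiom 1: every probability lies between 0 and 1.
assert all(0 <= p <= 1 for p in probs)

# Axiom 2: the probabilities of all N outcomes sum to 1.
assert abs(sum(probs) - 1.0) < 1e-9

print("Both probability axioms hold for the die.")
```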
Measuring Information
• Symbol – a data unit of information
• Entropy
  – the average amount of information that a source produces, measured in bits/symbol

H = -\sum_{i=1}^{M} P[X_i] \log_2 P[X_i]   bits/symbol

H = -3.322 \sum_{i=1}^{M} P[X_i] \log_{10} P[X_i]   bits/symbol

or

H = 3.322 \sum_{i=1}^{M} P[X_i] \log_{10} (1/P[X_i])   bits/symbol
Logarithms – Base 2
• In information theory we need logs to the base 2, not 10 (log10 N = x means 10^x = N; logs are exponents)
• log2 N = x means 2^x = N
• 2^0 = 1;  log2 1 = 0
• 2^1 = 2;  log2 2 = 1
• 2^2 = 4;  log2 4 = 2
• 2^3 = 8;  log2 8 = 3
• 2^4 = 16; log2 16 = 4
• 2^5 = 32; log2 32 = 5
Logarithms – Base "a", then a = 2
• Conversion of bases in general:
  – loga N = x means a^x = N
  – so log2 N = x means 2^x = N
  – loga N = (log10 N) / (log10 a)
  – if a = 2, then use log10 2 = 0.301
  – log2 N = 3.322 (log10 N)
• loga (MN) = (loga M) + (loga N)
• loga (M/N) = (loga M) - (loga N)
• loga (N^m) = m (loga N)
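A short Python check of this change-of-base rule (the input value 32 is an assumed example):

```python
import math

def log2_via_log10(n):
    """Change of base: log2(N) = log10(N) / log10(2) = 3.322 * log10(N)."""
    return math.log10(n) / math.log10(2)

print(log2_via_log10(32))      # 5.0
print(3.322 * math.log10(32))  # ~5.0, using the slides' rounded constant
```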
Effective Probability and Entropy
• Measurement of entropy when the probabilities are not known
  – estimate each probability from observed counts
  – effective probability: Pe[Xi] = NXi / N, where NXi is the number of occurrences of Xi in N total symbols

H_e = -\sum_{i=1}^{M} P_e[X_i] \log_2 P_e[X_i]   bits/symbol
Simulating Randomness by Computer
• Information is an unexpected quality
• Model it as an experiment that produces random outcomes
• Common method: the pseudo-random number generator (PRNG)
• A PRNG uses modular arithmetic
Modular Arithmetic
• [B]mod(N) = the modulo-N value of integer B
• Divide B by N: B/N = I + R/N
  – where I is the integer quotient and R is the remainder
  – 0 ≤ R ≤ (N-1)
• [B]mod(N) = R = B - (I × N)
• or R = (B/N - I) × N, where B/N = I.xxx
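A one-function Python sketch of [B]mod(N) as defined above (the values 17 and 5 are assumed examples):

```python
def mod_n(b, n):
    """[B]mod(N): the remainder R = B - (I * N), where I is the quotient."""
    i = b // n          # integer quotient I
    return b - i * n    # remainder R, with 0 <= R <= N - 1

print(mod_n(17, 5))  # 2, since 17 = 3*5 + 2
```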
Pseudo-Random Number Generator
• Create random numbers as a sequence X1, X2, X3, ..., Xn, ..., where Xn is the nth integer in the sequence
• Find Xn = [A × Xn-1 + B]mod(N), where
  – A is an arbitrary multiplier of Xn-1
  – N is the base of the modulus
  – B prevents the sequence from degenerating into a set of zeroes
  – to get started we need an arbitrary X0, or seed
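A minimal Python sketch of this recurrence (a linear congruential generator); the constants A, B, N and the seed below are assumed example values, not from the slides:

```python
def lcg(seed, a=1103515245, b=12345, n=2**31, count=5):
    """Xn = [A * Xn-1 + B]mod(N), starting from the seed X0."""
    x = seed  # X0, the arbitrary starting value
    values = []
    for _ in range(count):
        x = (a * x + b) % n
        values.append(x)
    return values

print(lcg(seed=42))  # the same seed always reproduces this sequence
```

Because the sequence is fully determined by the seed, the numbers only look random, which is why the generator is called pseudo-random.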
Arbitrary Range for Pseudo-Random Numbers
• If a range other than the integers 0 to N-1 is desired, say 0 ≤ Y ≤ M, then scale each Xn:

Y_n = X_n × (M / N)
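A minimal Python sketch of this scaling; N, M, and the stand-in Xn values are assumed examples:

```python
def scale(xn, n=2**31, m=10.0):
    """Map an integer Xn in [0, N-1] to Yn in [0, M]: Yn = Xn * (M / N)."""
    return xn * m / n

# Arbitrary stand-ins for PRNG outputs Xn, with N = 2**31 and M = 10:
for x in (5, 2**30, 2**31 - 1):
    print(scale(x))  # values near 0, M/2, and M
```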