C4: DISCRETE RANDOM VARIABLES
CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007
Longin Jan Latecki
Discrete Random Variables
Discrete random variables are obtained by counting and have sample spaces which
are countable. The values that represent each outcome are usually integers.
Random variables are denoted by capital letters.
X: the number of times we flip a coin until heads (H) comes up
The possible outcomes are denoted by lower case letters:
a=1, a=2, a=3...
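As a quick illustration (a minimal Python sketch, not part of the original slides; the name flips_until_heads is just a placeholder), X can be simulated by flipping until heads appears:

import random

def flips_until_heads(p_heads=0.5):
    # Simulate X: the number of coin flips until heads comes up.
    flips = 0
    while True:
        flips += 1
        if random.random() < p_heads:   # this flip is heads
            return flips

print([flips_until_heads() for _ in range(5)])   # possible outcomes a = 1, 2, 3, ...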
Probability Mass Function
The probability mass function (pmf) of a discrete random variable maps each possible
outcome in the sample space to its corresponding probability.
The sum of the probabilities of all possible outcomes will always be equal to 1.
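As a sketch of this idea (assuming a fair six-sided die as the example, which is not from the slides), a pmf can be written as a dictionary whose values sum to 1:

# A pmf as a dictionary mapping each outcome to its probability (fair six-sided die).
pmf = {a: 1/6 for a in range(1, 7)}

print(pmf[3])                  # P(X = 3) = 1/6
print(sum(pmf.values()))       # probabilities over all outcomes sum to 1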
Probability Distribution Function
The distribution function of a random variable X, also referred to as the
cumulative distribution function (CDF), yields the probability
that X will take a value less than or equal to a.
Hence, the value of F(a) is equal to the sum of all probabilities
of outcomes less than or equal to a:
F(a) = P(X ≤ a) = Σ_{a_i ≤ a} P(X = a_i)
If we are given a CDF, we can get the pmf with the following formula:
P( X  a )  P( X  a )  P( X  a )  P( X  a )  P( X  a   )  F ( a )  F ( a   )
for some sufficiently small ε>0.
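A minimal Python sketch of both formulas (the three-outcome pmf below is an assumed example, not from the slides):

# F(a) sums P(X = a_i) over all outcomes a_i <= a,
# and P(X = a) is recovered as F(a) - F(a - eps) for a small eps > 0.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

def F(a):
    return sum(p for a_i, p in pmf.items() if a_i <= a)

eps = 1e-9
for a in pmf:
    print(a, round(F(a) - F(a - eps), 6))   # recovers pmf[a]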
Graphs of pmf and CDF
[Figure: plots of the probability mass function and the cumulative distribution function F(a) = Σ_{a_i ≤ a} P(X = a_i).]
Bernoulli Distribution


The Bernoulli distribution is used to model an experiment with only two outcomes,
success and failure. The parameter p is the probability of success.
An example is flipping a coin, where “heads” may be success and “tails” may be failure.
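A minimal sketch of the Bernoulli pmf in Python (the function name bernoulli_pmf is an assumption, not from the slides):

def bernoulli_pmf(k, p):
    # P(X = k) for a Bernoulli(p) experiment: k = 1 is success, k = 0 is failure.
    return p if k == 1 else 1 - p

print(bernoulli_pmf(1, 0.5), bernoulli_pmf(0, 0.5))   # fair coin: heads, tails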
Binomial Distribution


The binomial distribution models the number of successes in multiple independent
Bernoulli trials. The parameter n is the number of trials, and the parameter p is the
probability of success in each trial, as in the Bernoulli distribution.
P(X = k) is the probability of exactly k successful outcomes in n trials.
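Using the standard binomial formula P(X = k) = C(n, k) p^k (1 − p)^(n − k), a short Python sketch (the names below are illustrative, not from the slides):

from math import comb

def binomial_pmf(k, n, p):
    # P(X = k): probability of exactly k successes in n independent Bernoulli(p) trials.
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_pmf(3, 10, 0.5))   # e.g. exactly 3 heads in 10 fair-coin flips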
Geometric Distribution



The geometric distribution gives the probability that the first success occurs on the
kth attempt. The parameter p is the probability of success on each individual try;
a first success on the kth try means that all previous k − 1 tries failed.
An example would be finding the probability that you hit a bullseye with a dart
for the first time on your kth toss.
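Using the standard geometric formula P(X = k) = (1 − p)^(k − 1) p, a short Python sketch (the bullseye probability p = 0.1 below is an assumed value for illustration):

def geometric_pmf(k, p):
    # P(X = k): first success on the kth try, i.e. k - 1 failures followed by a success.
    return (1 - p) ** (k - 1) * p

print(geometric_pmf(4, 0.1))   # first bullseye on the 4th toss, assuming p = 0.1 per toss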