Bryn Mawr College
Math 295
October 2, 2001
Mathematical Statistics
Homework #5 (due 10/7/02)
Text, Section 3.2, problems 1, 2, 11;
Section 3.4, problems 2, 3, 7, 10, 12.
Most of these problems require drawing graphs of functions, and graphs are useful even for some
of the problems that don’t require them.
What’s on Exam #1?
Exam 1 will be available at the end of class on October 7 and in the math office after class. You
may take it at a time and place of your own choosing. It is due at the start of class (12:10)
October 9 or in the math office before then. There will be a time limit (not severe, I hope).
Calculators are allowed but not computers, reference materials, or consultation.
What is covered?
1. Material covered in class
2. Handout on descriptive statistics
3. Text:
Sections 2.2 through 2.8
Sections 3.2 and 3.4
4. Homework #1 – #5 and solutions. (If you have any doubts, read the solutions to #5, available at the website on October 7, before starting the exam.)
Key ideas
Experiment; outcomes; sample space; events
Mutually exclusive ( = disjoint ) events
Set operations - union, intersection, set difference (A − B means A ∩ Bᶜ), complement,
subset relationship, role of ∅ and S
Probability function (P(A) for events A)
axioms (P(A) ≥ 0, P(S) = 1, ordinary and countable additivity)
theorems that follow from the axioms
Discrete (finite or countably infinite) sample spaces:
define a probability function by specifying P(s) for single-outcome events s,
and extend to all events by P(A) = Σ_{s ∈ A} P(s)
all probability functions on discrete spaces can be defined this way
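For example: roll a fair die, so S = {1, 2, 3, 4, 5, 6} and P(s) = 1/6 for each single outcome; for the event A = {2, 4, 6} ("the roll is even"), P(A) = 1/6 + 1/6 + 1/6 = 1/2.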
Sample spaces that are subsets of ℝ:
define a probability measure by specifying a probability density function f(y);
get probability of events by P(A) = ∫_{y ∈ A} f(y) dy
in this case single-point events have probability zero
not all probability functions on continuous spaces can be defined this way,
but in this case we’ll concentrate on the ones that can
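For example: if f(y) = 1 for 0 ≤ y ≤ 1 (and 0 elsewhere), then for A = [0.2, 0.5] we get P(A) = ∫_{0.2}^{0.5} 1 dy = 0.3, while any single-point event such as {0.2} has probability 0.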
Conditional probability
definition of P(A|B)
multiplication rule for P(A ∩ B)
Independent events
testing whether events are independent
multiplication rule for independent events
Venn diagrams
Tree diagrams — each branch uses the multiplication rule, P(A ∩ B) = P(A) P(B|A)
Reversing conditionals — given stuff like P(A|B), get stuff like P(B|A)
Bayes formula
Multiply prior probabilities by likelihood to get (unnormalized) posterior
probabilities
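For example, suppose 1% of a population has a disease D, and a test comes back positive with probability 0.95 for those who have D and 0.10 for those who don’t. The unnormalized posteriors are 0.01 × 0.95 = 0.0095 and 0.99 × 0.10 = 0.099, so P(D | positive) = 0.0095 / (0.0095 + 0.099) ≈ 0.088.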
Random variables
Random variable X is a function S → ℝ (assigns real number X(s) to each s)
Defining events in terms of a random variable
“X=3” is an abbreviation for the event { s | X(s) = 3 }
Probability density function
pX(k) = P(X=k) if X is discrete
f(y) if X is continuous
use either form to get probabilities of events defined in terms of X
Cumulative distribution function (cdf) for a random variable
The cdf for a random variable is the function F defined by
F(a) = P(X ≤ a) for all a (so F is a function from ℝ to ℝ)
Use cdf to get probabilities of other events defined in terms of X
(Also descriptive statistics)
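For example, if X is uniform on [0, 1] (f(y) = 1 for 0 ≤ y ≤ 1), then F(a) = 0 for a < 0, F(a) = a for 0 ≤ a ≤ 1, and F(a) = 1 for a > 1, so P(0.2 < X ≤ 0.5) = F(0.5) − F(0.2) = 0.3.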
Some key formulas:
Probabilities of unions:
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)   always
P(A ∪ B) = P(A) + P(B)   if A and B are disjoint, because then P(A ∩ B) = 0.
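For instance, if P(A) = 0.5, P(B) = 0.3, and P(A ∩ B) = 0.1, then P(A ∪ B) = 0.5 + 0.3 − 0.1 = 0.7.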
Probabilities of intersections:
P(A ∩ B) = P(A) P(B | A)   always
P(A ∩ B) = P(A) P(B)   if A and B are independent, because then P(B | A) = P(B)
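For example, drawing two cards without replacement from a standard deck, P(both aces) = P(first is an ace) P(second is an ace | first is an ace) = (4/52)(3/51) = 1/221.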
When events are independent:
If any of these is true, then they all are true and A and B are independent:
P(B|A)=P(B)
P(A|B)=P(A)
P(A ∩ B) = P(A) P(B)   (often this is the easiest one to test)
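For example, draw one card from a standard deck and let A = "heart", B = "ace". Then P(A ∩ B) = P(ace of hearts) = 1/52 = (13/52)(4/52) = P(A) P(B), so A and B are independent.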
What kind of problems might appear?
The homework problems are a good guide. Some good types of problems are these:
1. Given a short list of numbers,
determine:
mean (x̄), median (x̃), percentiles (when unambiguous), quartiles, variance (σ²),
standard deviation (σ) (divide by n, not n − 1, on exam); a short worked example follows this list
draw:
dot plot, histogram
from histogram, be able to estimate mean, standard deviation, maybe guess whether
mean is larger than median or vice versa
2. Given probabilities of some events (including perhaps some conditional probabilities), compute
probabilities of other events (or conditional probabilities). Basic tools: Venn diagrams,
tree diagrams, basic theorems
3. Given the pdf for a random variable, describe (and graph) its cdf
4. Given the cdf for a random variable, describe (and graph) its pdf
5. Given the pmf or cdf for a random variable, determine probabilities of events defined in terms
of the random variable. (For example: given a table of P(X = a), or a graph of F(a),
determine P(−1 ≤ X ≤ +2).)
6. Reversing conditions — given things like P(A) and P(B|A), determine things like P(A|B).
(This is just a subcategory of #2 above; or, you could use Bayes’ formula.)
7. Proofs, within reason.
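Worked example for problem type 1: for the data 1, 2, 2, 5, the mean is x̄ = 10/4 = 2.5 and the median is x̃ = 2; the squared deviations from the mean are 2.25, 0.25, 0.25, 6.25, so (dividing by n = 4) σ² = 9/4 = 2.25 and σ = 1.5.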