CSE 3504: Probabilistic Analysis of Computer Systems
Topics covered:
Course outline and schedule
Introduction
Event Algebra
(Sec. 1.1-1.4)
General information
CSE 3504: Probabilistic Analysis of Computer Systems
Instructor: Swapna S. Gokhale
Phone: 6-2772
Email: [email protected]
Office: ITEB 237
Lecture time: Wed/Fri 9:30 – 10:45 am
Office hours: By appointment (I will hang around for a few minutes at the end of each class)
Web page: http://www.engr.uconn.edu/~ssg/cse3054.html
(Lecture notes, homeworks, and general announcements will be posted on the web page)
Course goals
 Appreciation and motivation for the study of probability
theory.
 Definition of a probability model
 Application of discrete and continuous random variables
 Computation of expectation and moments
 Application of discrete and continuous time Markov chains.
 Estimation of parameters of a distribution.
 Testing hypotheses on distribution parameters
Expected learning outcomes
 Sample space and events:
 Define a sample space (outcomes) of a random experiment and identify events of interest and independent events on the sample space.
 Compute conditional and posterior probabilities using Bayes
rule.
 Identify and compute probabilities for a sequence of Bernoulli
trials.

 Discrete random variables:
 Define a discrete random variable on a sample space along with the associated probability mass function.
 Compute the distribution function of a discrete random variable.
 Apply special discrete random variables to real-life problems.
 Compute the probability generating function of a discrete random variable.
 Compute the joint pmf of a vector of discrete random variables.
 Determine whether a set of random variables is independent.
Expected learning outcomes (contd..)
 Continuous random variables:
 Define general distribution and density functions.
 Apply special continuous random variables to real problems.
 Define and apply the concepts of reliability, conditional failure rate, hazard rate, and inverse bath-tub curve.
 Expectation and moments:
 Obtain the expectation, moments, and transforms of special and general random variables.
 Stochastic processes:
 Define and classify stochastic processes.
 Derive the metrics for Bernoulli and Poisson processes.
Expected learning outcomes (contd..)
 Discrete time Markov chains:
 Define the state space, state transitions, and transition probability matrix.
 Compute the steady state probabilities.
 Analyze the performance and reliability of a software application based on its architecture.
 Statistical inference:
 Understand the role of statistical inference in applying probability theory.
 Derive the maximum likelihood estimators for general and special random variables.
 Test a two-sided hypothesis concerning the mean of a random variable.
Expected learning outcomes (contd..)
 Continuous time Markov chains:
 Define the state space, state transitions, and generator matrix.
 Compute the steady state or limiting probabilities.
 Model real-world phenomena as birth-death processes and compute limiting probabilities.
 Model real-world phenomena as pure birth and pure death processes.
 Model and compute system availability.
Textbooks
Required textbook:
1. K. S. Trivedi, Probability and Statistics with Reliability, Queuing and
Computer Science Applications, Second Edition, John Wiley.
Course topics
 Introduction (Ch. 1, Sec. 1.1-1.5, 1.7-1.11):
 Sample space and events, Event algebra, Probability axioms, Combinatorial problems, Independent events, Bayes rule, Bernoulli trials
 Discrete random variables (Ch. 2, Sec. 2.1-2.4, 2.5.1-2.5.3, 2.5.5, 2.5.7, 2.7-2.9):
 Definition of a discrete random variable, Probability mass and distribution functions, Bernoulli, Binomial, Geometric, Modified Geometric, Poisson, and Uniform pmfs, Probability generating function, Discrete random vectors, Independent random variables.
 Continuous random variables (Ch. 3, Sec. 3.1-3.3, 3.4.6, 3.4.7):
 Probability density function and cumulative distribution functions, Exponential and uniform distributions, Reliability and failure rate, Normal distribution
Course topics (contd..)
 Expectation (Ch. 4, Sec. 4.1-4.4, 4.5.2-4.5.7):
 Expectation of single and multiple random variables, Moments and transforms
 Stochastic processes (Ch. 6, Sec. 6.1, 6.3 and 6.4):
 Definition and classification of stochastic processes, Bernoulli and Poisson processes.
 Discrete time Markov chains (Ch. 7, Sec. 7.1-7.3):
 Definition, transition probabilities, steady state concept; application of discrete time Markov chains to software performance and reliability analysis.
 Statistical inference (Ch. 10, Sec. 10.1, 10.2.2, 10.3.1):
 Motivation, Maximum likelihood estimates for the parameters of Bernoulli, Binomial, Geometric, Poisson, Exponential, and Normal distributions, Parameter estimation of Discrete Time Markov Chains (DTMCs), Hypothesis testing.
Course topics (contd..)
 Continuous time Markov chains (Ch. 8, Sec. 8.1-8.3, 8.4.1):
 Definition, Generator matrix, Computation of steady state/limiting probabilities, Birth-death process, M/M/1 and M/M/m queues, Pure birth and pure death process, Availability analysis.
Course topics and exams calendar
Week #1 (Jan. 21):
1. Jan 21: Logistics, Introduction, Sample Space, Events, Event algebra
2. Jan 23: Probability axioms, combinatorial problems
Week #2 (Jan. 28):
3. Jan 28: Conditional probability, Independent events, Bayes rule, Bernoulli trials
4. Jan 30: Discrete random variables, Probability mass and Distribution function
Week #3 (Feb. 4):
5. Feb. 4: Special discrete distributions
6. Feb. 6: Poisson pmf, Uniform pmf, Probability Generating Function
Week #4 (Feb. 11):
7. Feb. 11: Discrete random vectors, Independent random variables
8. Feb. 13: Continuous random variables, Uniform and Normal distributions
Week #5 (Feb. 18):
9. Feb. 18: Exponential distribution, reliability and failure rate
10. Feb. 20: Expectations of random variables, moments
Course topics and exams calendar (contd..)
Week #6 (Feb. 25):
11. Feb. 25: Multiple random variables, transform methods
12. Feb. 27: Moments and transforms of special distributions
Week #7 (Mar. 4):
13. Mar 4: Stochastic processes, Bernoulli and Poisson processes
14. Mar 6: Discrete time Markov chains
Week #8 (Mar. 11):
Spring break, no class.
Week #9 (Mar. 18):
15. Mar 18: Discrete time Markov chains (contd..)
16. Mar 20: Analysis of software reliability and performance
Week #10 (Mar. 25):
17. Mar 25: Statistical inference
18. Mar 27: Statistical inference (contd..)
Week #11 (Apr. 1):
19. Apr. 1: Confidence intervals
20. Apr. 3: Hypothesis testing
Course topics and exams calendar (contd..)
Week #12 (Apr. 8):
21. Apr. 8: Hypothesis testing (contd..)
22. Apr. 10: Continuous time Markov Chains
Week #13 (Apr. 15):
23. Apr. 15: Continuous time Markov chains, applications (contd..)
24. Apr. 17: Simple queuing models
Week #14 (Apr. 22):
25. Apr. 22: Pure death processes, availability models
26. Apr. 24: Lognormal distribution and its applications
Week #15 (Apr. 29):
Apr. 29: Make-up class
May 1: Final exam handed out.
Assignment/Homework logistics
 There will be approximately one homework based on each topic.
 One week will be allocated to complete each homework.
 Homeworks will not be graded, but I encourage you to do them since the exam problems will be similar to the homework problems.
 Solutions to each homework will be provided after one week.
Exam logistics
 Exams will have problems similar to those in the homeworks.
 Midterm exam: Before Spring break.
 Final exam: Handed out on the last day of classes, due at the time the final exam for the class is scheduled.
 Exams will be take-home.
Project logistics
 The project will be handed out in the first week of April and will be due in the last week of classes.
 2-3 problems:
 Experimenting with design options to explore tradeoffs and to determine which system has better performance/reliability, etc.
 Parameter estimation and hypothesis testing with real data.
 May involve some programming (can be done using Java, Matlab, etc.).
 Project report must describe:
 Approach used to solve the problem.
 Results and analysis.
Grading system
Homeworks – 0% (ungraded)
Midterm – 20%
Project – 25% (two to three problems)
Final – 55% (heavy emphasis on the final)
Attendance policy
 Attendance is not mandatory.
 Attending classes helps!
 Many examples and derivations (not in the book) are covered in class.
 Problems and examples covered in class are fair game for the exams.
 Not everything is in the lecture notes.
Feedback
Please provide informal feedback early and often, before the formal
review process.
Introduction and motivation
 Why study probability theory?
 Answer questions such as:
Probability model
 Examples of random/chance phenomenon:
 What is a probability model?
Sample space
 Definition:
 Example: Status of a computer system
 Example: Status of two components: CPU, Memory
 Example: Outcomes of three coin tosses
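As an illustration (my own addition, not part of the original slides), the sample spaces for these three examples can be enumerated in a few lines of Python; the variable names are made up for this sketch.

from itertools import product

# Status of a single computer system: it is either up or down.
system_status = {"up", "down"}

# Status of two components (CPU, Memory): each is up or down, so the
# sample space is the set of ordered (CPU, Memory) pairs.
cpu_memory_status = set(product(["up", "down"], repeat=2))

# Outcomes of three coin tosses: each toss is a head (H) or a tail (T).
three_tosses = set(product("HT", repeat=3))

print(len(system_status), len(cpu_memory_status), len(three_tosses))  # 2 4 8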
Types of sample space
 Based on the number of elements in the sample space:
 Example: Coin toss
 Countably finite/infinite
 Uncountably infinite
Events
 Definition of an event:
 Example: Sequence of three coin tosses:
 Example: System up.
Events (contd..)
 Universal event
 Null event
 Elementary event
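For instance (a note added here, not on the original slide): in the three-coin-toss experiment, the universal event is the entire sample space S and always occurs, the null event is the empty set and never occurs, and an elementary event contains exactly one outcome, such as {HHH}.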
Example
 Sequence of three coin tosses:
 Event E1 – at least two heads
 Complement of event E1 – at most one head (zero or one
head)
 Event E2 – at most two heads
Example (contd..)
 Event E3 – Intersection of events E1 and E2.
 Event E4 – First coin toss is a head
 Event E5 – Union of events E1 and E4
 Mutually exclusive events
Example (contd..)
 Collectively exhaustive events:
 Defining each sample point to be an event
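To tie this example together (my own sketch, not from the slides), the following Python code enumerates the eight outcomes and builds events E1 through E5 as sets; the helper name num_heads is made up for this illustration.

from itertools import product

# Sample space: all sequences of three coin tosses (H = head, T = tail).
S = set(product("HT", repeat=3))

def num_heads(outcome):
    return outcome.count("H")

E1 = {o for o in S if num_heads(o) >= 2}  # at least two heads
E2 = {o for o in S if num_heads(o) <= 2}  # at most two heads
E3 = E1 & E2                              # intersection of E1 and E2 (exactly two heads)
E4 = {o for o in S if o[0] == "H"}        # first toss is a head
E5 = E1 | E4                              # union of E1 and E4

not_E1 = S - E1                           # complement of E1: at most one head

# E1 and its complement are mutually exclusive (their intersection is empty)
# and collectively exhaustive (their union is the whole sample space).
print((E1 & not_E1) == set())             # True
print((E1 | not_E1) == S)                 # True

# Treating each sample point as an elementary event gives a mutually
# exclusive, collectively exhaustive collection of events.
print(set().union(*({o} for o in S)) == S)  # True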