MODULE TITLE: Probability Modelling
LEVEL: Junior Sophister
CODE: EE3E3
CREDITS: 5
PREREQUISITES: SF
TERMS: Michaelmas
LECTURES/WEEK: 3
TUTORIALS/WEEK: 1
DURATION (WEEKS): 11
TOTAL LECTURES: 33
TOTAL TUTORIALS: 11
LECTURER(S): Associate Professor Anthony Quinn
AIMS/OBJECTIVES
This module provides a thorough grounding in probability for electrical, computer and bioengineering students, and a pathway into the design of statistics. In particular, it equips the
student with methods for dealing with uncertainty in engineering practice, notably in the
analysis of experiments, and the interpretation and processing of data. The keystone of the
module is a philosophical one. The relationship between uncertainty and information
(learning) is explored from the start. A full review of propositional logic is provided, so that
the student can be confident in formulating propositions associated with an uncertain
experiment, and in understanding the logical relationships between propositions. The
probability calculus is developed as a consistent means of quantifying and manipulating
belief in these uncertain propositions (i.e. the Bayesian perspective). In this way, the
foundation of the module is a unified one, with uncertainty, logic, information, observation
and imprecision (noise) all embraced within a Bayesian notion of probability.
The main aim is to confront engineering contexts that induce the canonical probability
models, in both the discrete case (Bernoulli, geometric, binomial, multinomial, Poisson) and
the continuous case (rectangular, exponential, m-Erlang, normal). Their mixtures and
transformations are developed as a response to practical modelling needs. There is a special
emphasis on the concept of dependence (conditioning), and its relationship to the key
engineering notions of correlation and prediction. Sequential dependence is handled in the
discrete case only, via an introduction to Markov chains. A main learning outcome for the
student is the capability to choose the appropriate model to apply in a range of engineering
contexts, knowing the assumptions justifying the deployment of each model.
This theory is not separated from the engineering practice it aims to serve. Rather, the
module carries a number of extended case studies throughout, each progressively refined
as our new probability tools become available to us. The main case studies are (i) the noisy
digital communication system, using discrete models and, later, the additive Gaussian noise
model; (ii) Poisson count bio-imaging contexts (e.g. FLIM, SPECT); (iii) reliability, lifetime and
traffic modelling in large device assemblies; and (iv) quantization error analysis in analogue-to-digital converters (ADCs).
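As a rough illustration of case study (i), the minimal Python sketch below (not part of the module materials; the amplitude A and noise level sigma are assumed values) simulates binary antipodal signalling in additive Gaussian noise with a zero-threshold detector, and compares the empirical bit-error rate with the theoretical value Q(A/sigma).

```python
# Illustrative sketch only: binary antipodal signalling in additive Gaussian
# noise, detected by thresholding at zero. A and sigma are assumed values.
import math
import random

A = 1.0        # transmitted amplitude for bit 1 (-A for bit 0); assumed value
sigma = 0.5    # noise standard deviation; assumed value
n_bits = 100_000

errors = 0
for _ in range(n_bits):
    bit = random.randint(0, 1)
    tx = A if bit else -A
    rx = tx + random.gauss(0.0, sigma)   # additive Gaussian noise
    decision = 1 if rx > 0 else 0        # zero-threshold detector
    errors += (decision != bit)

empirical_ber = errors / n_bits
theoretical_ber = 0.5 * math.erfc(A / (sigma * math.sqrt(2)))  # Q(A / sigma)
print(f"empirical BER   ≈ {empirical_ber:.4f}")
print(f"theoretical BER = {theoretical_ber:.4f}")
```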
A feature of the module is that it develops statistics consistently, using the same inductive
inference principles described above. Typically a blind spot in the formation of the
engineering student, statistical inference is re-cast in this module as an application of
probability modelling. The simple nonparametric case is considered, using the elegant
device of the empirical distribution. This allows students to derive appropriate descriptive
statistics for their data, to estimate probabilities, and to describe dependence and
regression phenomena quantitatively. An accessible introduction to parametric estimation
is also provided, via moment matching techniques. Finally, binary hypothesis testing for
known models is considered. Together, these topics allow themes of importance to
engineers within the data analysis context to be considered.
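The moment-matching idea can be sketched as follows for an exponential lifetime model; this is illustrative only, the data are simulated under an assumed rate, and the estimator simply equates the model mean 1/lambda with the sample mean.

```python
# Illustrative sketch only: moment matching for an exponential lifetime model.
# The true rate is an assumed value, used solely to simulate a sample.
import random

true_rate = 2.0                        # assumed value for data generation
data = [random.expovariate(true_rate) for _ in range(10_000)]

sample_mean = sum(data) / len(data)    # first sample moment
rate_hat = 1.0 / sample_mean           # moment-matching estimate of the rate

print(f"sample mean    = {sample_mean:.4f}")
print(f"estimated rate = {rate_hat:.4f}  (true rate = {true_rate})")
```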
The module primes the student for later modules on statistical signal processing and
communications. In its own terms, it is an invitation to confront uncertainty as a
fundamental phenomenon – and resource – in engineering systems, and to appreciate
probability as a consistent framework for the design, analysis and optimization of such
systems.
SYLLABUS

Review of Propositional Logic
Uncertain experiments in electrical, computer and bioengineering
Sample space, propositions and events
Propositional logic: equivalence, necessity, sufficiency, mutual exclusivity

The Foundation of Probability Modelling
The axioms of probability and the probability triple
Conditional probability; independence
Key relationships: chain rule, Bayes’ rule, theorem of total probability
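For reference, the key relationships listed above take the following standard forms, stated for events of positive probability and a partition A_1, ..., A_n of the sample space:

```latex
\begin{align*}
  P(A \mid B) &= \frac{P(A \cap B)}{P(B)}
    && \text{(conditional probability)}\\
  P(A_1 \cap \cdots \cap A_n) &= P(A_1)\,P(A_2 \mid A_1)\cdots
    P(A_n \mid A_1 \cap \cdots \cap A_{n-1})
    && \text{(chain rule)}\\
  P(B) &= \sum_{i=1}^{n} P(B \mid A_i)\,P(A_i)
    && \text{(theorem of total probability)}\\
  P(A_i \mid B) &= \frac{P(B \mid A_i)\,P(A_i)}{\sum_{j=1}^{n} P(B \mid A_j)\,P(A_j)}
    && \text{(Bayes' rule)}
\end{align*}
```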

Sequential Experiments
Independent sequential experiments: geometric, binomial and multinomial
probability laws
Homogeneous Markov chains
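A minimal sketch of a homogeneous Markov chain, using an assumed two-state transition matrix purely for illustration: the state distribution is propagated by p_{k+1} = p_k P and settles towards the stationary distribution.

```python
# Illustrative sketch only: the transition matrix below is an assumed example.
P = [[0.9, 0.1],     # row i holds the transition probabilities out of state i
     [0.2, 0.8]]

p = [1.0, 0.0]       # initial distribution: start in state 0
for _ in range(50):
    # p_{k+1}[j] = sum_i p_k[i] * P[i][j]
    p = [sum(p[i] * P[i][j] for i in range(2)) for j in range(2)]

# approaches the stationary distribution (2/3, 1/3) for this matrix
print(f"distribution after 50 steps ≈ [{p[0]:.4f}, {p[1]:.4f}]")
```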

Univariate Random Variables
Probability functions for random variables (cdf, pdf, pmf)
Key discrete probability models (Bernoulli, geometric, binomial, Poisson)
Key continuous probability models (rectangular, exponential, m-Erlang, normal)
Functions of random variables
Expectation
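For reference, the expectation entry above refers to the standard definitions, in the discrete (pmf) and continuous (pdf) cases and for a function of a random variable:

```latex
\begin{align*}
  E[X] &= \sum_{x} x\, p_X(x), &
  E[X] &= \int_{-\infty}^{\infty} x\, f_X(x)\,\mathrm{d}x, &
  E[g(X)] &= \int_{-\infty}^{\infty} g(x)\, f_X(x)\,\mathrm{d}x.
\end{align*}
```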

Multiple Random Variables
Marginal and conditional distributions
Discrete-continuous case: finite mixture models
The bivariate normal distribution
Correlation and linear regression (see the summary formulas after this list)
Introduction to graphical models
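The correlation and regression entries can be summarised by the standard forms below; the linear predictor is the minimum mean-square-error linear estimate of Y from X, and coincides with the conditional mean E[Y | X] in the bivariate normal case:

```latex
\begin{align*}
  \rho_{XY} &= \frac{\operatorname{Cov}(X,Y)}{\sigma_X\,\sigma_Y},
  &
  \hat{Y} &= \mu_Y + \rho_{XY}\,\frac{\sigma_Y}{\sigma_X}\,(X - \mu_X).
\end{align*}
```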

Statistics and Data Analysis
Random sampling: the empirical distribution and its moments (sampling statistics)
Probability estimation from survey data
Quantification of error; quantization noise (see the sketch after this list)
Design of statistics for hypothesis testing, and for description
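A minimal sketch of the quantization-noise model, with an assumed step size and input signal: the error of a uniform quantizer is approximately rectangular on (-Delta/2, Delta/2), so its variance is close to Delta^2 / 12.

```python
# Illustrative sketch only: step size and input distribution are assumed.
import random

delta = 0.1                                                    # assumed step size
samples = [random.uniform(-1.0, 1.0) for _ in range(100_000)]  # assumed input

errors = [x - delta * round(x / delta) for x in samples]       # quantization error
emp_var = sum(e * e for e in errors) / len(errors)

print(f"empirical error variance ≈ {emp_var:.6f}")
print(f"Delta^2 / 12             = {delta**2 / 12:.6f}")
```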
RECOMMENDED TEXTS
The main recommended text for the module is:
1. Leon-Garcia, A., Probability, Statistics, and Random Processes for Electrical
Engineering, 3rd ed., Prentice Hall, 2008.
Secondary recommended texts are as follows:
2. Bertsekas, D.P. and Tsitsiklis, J.N., Introduction to Probability, 2nd ed., Athena
Scientific, 2008.
3. Applebaum, D., Probability and Information, 2nd ed., Cambridge University Press, 2008.
LEARNING OUTCOMES
On successful completion of this module, the student will be able to:
1. Quantify beliefs in uncertain propositions related to key electrical, computer and bioengineering contexts, such as noisy communication, bio-imaging, and large assemblies
2. Distinguish between the vital notions of independence and dependence, and relate the
latter to the idea of prediction
3. Apply and analyze the key parametric probability models (distributions) governing
uncertainty in these contexts
4. Evaluate measures of location, spread and dependence for these distributions
5. Convert random experimental data (samples and surveys) into quantified beliefs,
summarize these data via sampling statistics, assess dependence between data, and test
competing hypotheses
TEACHING STRATEGIES
There is a 3:1 ratio between lectures and tutorials. Archive lecture notes are provided
regularly, via scans uploaded to the webpage (www.mee.tcd.ie/~aquinn/3e3). Problem-solving experience is vital, and is gained primarily via the tutorial periods, but also via regular
homework sheets, with solutions provided on the webpage. Students are reminded that
attendance at all timetabled activities is compulsory.
ASSESSMENT MODES
70% of the final mark is determined via the annual examination. The remaining 30% is
reserved for continuous assessment, by means of about 5 in-lecture tests during the term,
and a one-hour end-of-term quiz.