Experimental Uncertainties:
A Practical Guide
• What you should already know well
• What you need to know, and use, in this lab
More details available in handout ‘Introduction to
Experimental Error’ in your folders.
• In what follows I will use the convention:
– Error = deviation of measurement from true value
– Uncertainty = measure of likely error
Why are Uncertainties
Important?
• Uncertainties absolutely central to the
scientific method.
• Uncertainty on a measurement at least as
important as measurement itself!
• Example 1:
“The observed frequency of the emission line
was 8956 GHz. The expectation from
quantum mechanics was 8900 GHz”
• Nobel Prize?
Why are Uncertainties
Important?
• Example 2:
“The observed frequency of the emission line
was 8956 ± 10 GHz. The expectation from
quantum mechanics was 8900 GHz”
• Example 3:
“The observed frequency of the emission line
was 8956 ± 10 GHz. The expectation from
quantum mechanics was 8900 GHz ± 50 GHz”
Types of Uncertainty
• Statistical Uncertainties:
– Quantify random errors in measurements between
repeated experiments
– Mean of measurements from large number of
experiments gives correct value for measured
quantity
– Measurements often approximately gaussian-distributed
• Systematic Uncertainties:
– Quantify systematic shift in measurements away
from ‘true’ value
– Mean of measurements is also shifted → ‘bias’
Examples
• Statistical Errors:
– Measurements gaussian-distributed
– No systematic error (bias)
– Quantify uncertainty in measurement with standard deviation (see later)
– In case of gaussian-distributed measurements std. dev. = s in formula:
$$\frac{1}{\sqrt{2\pi s^2}}\exp\left(-\frac{(x-\bar{x})^2}{2s^2}\right)$$
– Probability interpretation (gaussian case only): 68% of measurements will lie within ± 1 s of mean.
[Figure: gaussian distribution of measurements centred on the true value]
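A quick numerical check of the 68% figure (a minimal sketch assuming Python with numpy; the 8900 GHz centre and 10 GHz spread simply reuse the numbers from the earlier emission-line example):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

true_value = 8900.0   # illustrative 'true' frequency in GHz (from the earlier example)
s = 10.0              # illustrative standard deviation of a single measurement, in GHz

# Simulate many repeated, gaussian-distributed measurements with no bias
measurements = rng.normal(loc=true_value, scale=s, size=100_000)

# Fraction of measurements lying within +/- 1 s of the mean: ~0.68 for a gaussian
mean = measurements.mean()
fraction = np.mean(np.abs(measurements - mean) < s)
print(f"fraction within +/- 1 s of the mean: {fraction:.3f}")
```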
Examples
• Statistical + Systematic Errors:
– Measurements still gaussian-distributed
– Measurements biased
– Still quantify statistical uncertainty in measurement with standard deviation
$$\frac{1}{\sqrt{2\pi s^2}}\exp\left(-\frac{(x-\bar{x})^2}{2s^2}\right)$$
– Probability interpretation (gaussian case only): 68% of measurements will lie within ± 1 s of mean.
– Need to quantify systematic error (uncertainty) separately → tricky!
[Figure: gaussian distribution of measurements whose mean is shifted away from the true value]
Systematic Errors
• How to quantify uncertainty?
• What is the ‘true’ systematic
error in any given
measurement?
– If we knew that we could correct
for it (by addition / subtraction)
• What is the probability
distribution of the systematic
error?
– Often assume gaussian-distributed and quantify with $s_{\mathrm{syst}}$
– Best practice: propagate and quote separately
[Figure: assumed gaussian distribution of the systematic error, offset from the true value]
Calculating Statistical
Uncertainty
• Mean and standard deviation of set of independent
measurements (unknown errors, assumed
uniform):
$$\bar{x} = \frac{1}{N}\sum_i x_i\,; \qquad s = \left[\frac{1}{N-1}\sum_i \left(x_i - \bar{x}\right)^2\right]^{1/2}$$
• Standard deviation estimates the likely error of
any one measurement
• Uncertainty in the mean is what is quoted:
$$s_{\bar{x}} = \frac{s}{\sqrt{N}} = \left[\frac{1}{N(N-1)}\sum_i \left(x_i - \bar{x}\right)^2\right]^{1/2}.$$
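A sketch of these three formulae in code (Python with numpy assumed; the measurement values are invented for illustration):

```python
import numpy as np

# Hypothetical repeated measurements of the same quantity (illustrative values, kg)
x = np.array([3.71, 3.85, 3.79, 3.69, 3.81, 3.77])
N = len(x)

mean = x.sum() / N                              # x_bar = (1/N) * sum_i x_i
s = np.sqrt(((x - mean) ** 2).sum() / (N - 1))  # std. dev., note the 1/(N-1) factor
s_mean = s / np.sqrt(N)                         # uncertainty in the mean, s / sqrt(N)

# numpy's built-in with ddof=1 gives the same s
assert np.isclose(s, x.std(ddof=1))

print(f"mean = {mean:.3f}, s = {s:.3f}, uncertainty in mean = {s_mean:.3f}")
```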
Propagating Uncertainties
• Functions of one variable (general formula):
$$\Delta F = \left|\frac{df}{dx}\right|\Delta x$$
• Specific cases:
$$\Delta\left(x^2\right) = 2x\,\Delta x \quad\text{or}\quad \frac{\Delta\left(x^2\right)}{x^2} = 2\,\frac{\Delta x}{x}$$
$$\Delta\left(x^n\right) = n\,x^{n-1}\,\Delta x \quad\text{or}\quad \frac{\Delta\left(x^n\right)}{x^n} = n\,\frac{\Delta x}{x}$$
$$\Delta\left(\sin x\right) = \left|\cos x\right|\Delta x$$
$$\Delta\left(\ln x\right) = \frac{\Delta x}{x}$$
Propagating Uncertainties
• Functions of >1 variable (general formula):
$$\left(\Delta f\right)^2 = \left(\frac{\partial f}{\partial x}\,\Delta x\right)^2 + \left(\frac{\partial f}{\partial y}\,\Delta y\right)^2.$$
• Specific cases:
$$f = x \pm y:\quad \left(\Delta f\right)^2 = \left(\Delta x\right)^2 + \left(\Delta y\right)^2$$
$$f = xy:\quad \left(\Delta f\right)^2 = y^2\left(\Delta x\right)^2 + x^2\left(\Delta y\right)^2 \;\Rightarrow\; \left(\frac{\Delta f}{f}\right)^2 = \left(\frac{\Delta x}{x}\right)^2 + \left(\frac{\Delta y}{y}\right)^2$$
$$f = \frac{x}{y}:\quad \left(\Delta f\right)^2 = \frac{\left(\Delta x\right)^2}{y^2} + \frac{x^2\left(\Delta y\right)^2}{y^4} \;\Rightarrow\; \left(\frac{\Delta f}{f}\right)^2 = \left(\frac{\Delta x}{x}\right)^2 + \left(\frac{\Delta y}{y}\right)^2$$
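A minimal numerical sketch of these propagation rules, for one and two variables (Python assumed; the measured values and uncertainties are invented for illustration):

```python
import math

# Illustrative measurements with uncertainties (hypothetical numbers)
x, dx = 4.0, 0.10
y, dy = 2.0, 0.05

# One variable: dF = |df/dx| * dx, e.g. f = x^2 and f = ln x
df_x2 = abs(2 * x) * dx          # = 2 x dx
df_ln = dx / x                   # = dx / x

# Two variables, f = x + y (or x - y): add absolute uncertainties in quadrature
f_sum, df_sum = x + y, math.sqrt(dx**2 + dy**2)

# Two variables, f = x*y or x/y: add fractional uncertainties in quadrature
f_prod = x * y
df_prod = f_prod * math.sqrt((dx / x) ** 2 + (dy / y) ** 2)
f_quot = x / y
df_quot = f_quot * math.sqrt((dx / x) ** 2 + (dy / y) ** 2)

print(f"x^2   = {x**2:.2f} ± {df_x2:.2f}")
print(f"ln x  = {math.log(x):.3f} ± {df_ln:.3f}")
print(f"x + y = {f_sum:.2f} ± {df_sum:.2f}")
print(f"x * y = {f_prod:.2f} ± {df_prod:.2f}")
print(f"x / y = {f_quot:.2f} ± {df_quot:.2f}")
```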
Combining Uncertainties
• What about if have two or more
measurements of the same quantity, with
different uncertainties?
• Obtain combined mean and uncertainty with:
$$\bar{x} = \frac{\sum_i x_i / s_i^2}{\sum_i 1 / s_i^2}\,; \qquad \frac{1}{s^2} = \sum_i \frac{1}{s_i^2}$$
• Remember we are using the uncertainty in the
mean here:
$$s_i = \frac{s}{\sqrt{N}}$$
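A minimal sketch of this inverse-variance weighting (Python with numpy assumed; the three measurements below are invented, loosely echoing the earlier frequency example):

```python
import numpy as np

# Hypothetical measurements of the same quantity, each with its own uncertainty in the mean
x = np.array([8956.0, 8940.0, 8950.0])   # illustrative values, GHz
s = np.array([10.0, 20.0, 15.0])         # illustrative uncertainties, GHz

w = 1.0 / s**2                           # inverse-variance weights
x_comb = np.sum(w * x) / np.sum(w)       # combined mean: sum(x_i/s_i^2) / sum(1/s_i^2)
s_comb = 1.0 / np.sqrt(np.sum(w))        # combined uncertainty: 1/s^2 = sum(1/s_i^2)

print(f"combined result: {x_comb:.1f} ± {s_comb:.1f} GHz")
```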
Fitting
• Often we make measurements of several quantities, from which we wish to
1. determine whether the measured values follow a pattern
2. derive a measurement of one or more parameters describing that pattern (or model)
• This can be done using curve-fitting
• E.g. EXCEL function linest.
• Performs linear least-squares fit
Method of Least Squares
• This involves taking measurements $y_i$ and comparing with the equivalent fitted value $y_i^f$
• Linest then varies the model parameters and hence $y_i^f$ until the following quantity is minimised:
$$\sum_{i=1}^{N}\left(y_i - y_i^f\right)^2$$
In this example the model is a straight line $y_i^f = mx + c$. The model parameters are $m$ and $c$.
[Figure: straight-line least-squares fit of ln η [cPs] against 1/T [1/K]]
• Linest will return the fitted
parameter values (=mean)
and their uncertainties (in
the mean)
In the second year lab never use
the equations returned by ‘Add
Trendline’ or linest to estimate your
parameters!!!
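For reference only, the least-squares idea itself can be sketched outside EXCEL (a minimal Python/numpy sketch; the 1/T and ln η values below are invented to mimic the plot above, and in this lab you should still use chisquare.xls as described on the next slides):

```python
import numpy as np

# Invented data mimicking the slide's plot of ln(eta) [cPs] against 1/T [1/K]
inv_T = np.array([0.0026, 0.0028, 0.0030, 0.0032, 0.0034, 0.0036, 0.0038])
ln_eta = np.array([-1.35, -0.95, -0.55, -0.20, 0.20, 0.60, 0.95])

# Straight-line model y_i^f = m*x + c; polyfit minimises sum_i (y_i - y_i^f)^2
(m, c), cov = np.polyfit(inv_T, ln_eta, deg=1, cov=True)
dm, dc = np.sqrt(np.diag(cov))   # parameter uncertainties from the scatter about the fit

print(f"gradient  m = {m:.0f} ± {dm:.0f}")
print(f"intercept c = {c:.2f} ± {dc:.2f}")
```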
Weighted Fitting
• Those still awake will have noticed that the least-squares method does not depend on the uncertainties (error bars) on each point.
• Q: Where do the uncertainties in the parameters
come from?
– A: From the scatter in the measured means about the
fitted curve
• Equivalent to:
$$s = \left[\frac{1}{N-1}\sum_i \left(x_i - \bar{x}\right)^2\right]^{1/2}$$
• Assumes errors on points all the same
• What about if they’re not?
Weighted Fitting
• To take non-uniform uncertainties (error bars) on
points into account must use e.g. chi-squared fit.
• Similar to least-squares but minimises:
$$\chi^2 = \sum_{i=1}^{N}\left(\frac{y_i - y_i^f}{s_i}\right)^2$$
• Enables you to propagate uncertainties all the way
to the fitted parameters and hence your final
measurement (e.g. derived from gradient).
• This is what is used by chisquare.xls (download from Second Year web-page) → this is what we expect you to use in this lab!
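For those curious what this minimisation looks like in code, here is a minimal sketch of a straight-line chi-squared fit (a hypothetical Python helper using the standard weighted least-squares formulae; an illustration only, not a substitute for chisquare.xls in the lab):

```python
import numpy as np

def chi2_line_fit(x, y, s):
    """Hypothetical helper: fit y = m*x + c by minimising chi-squared,
    with per-point uncertainties s (standard weighted least-squares formulae)."""
    w = 1.0 / s**2
    S, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
    Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
    delta = S * Sxx - Sx**2
    m = (S * Sxy - Sx * Sy) / delta            # gradient
    c = (Sxx * Sy - Sx * Sxy) / delta          # intercept
    dm, dc = np.sqrt(S / delta), np.sqrt(Sxx / delta)   # parameter uncertainties
    chi2 = (w * (y - (m * x + c))**2).sum()    # minimised chi-squared value
    return m, dm, c, dc, chi2

# Invented data with non-uniform error bars
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
s = np.array([0.2, 0.1, 0.3, 0.2, 0.4])

m, dm, c, dc, chi2 = chi2_line_fit(x, y, s)
print(f"m = {m:.2f} ± {dm:.2f}, c = {c:.2f} ± {dc:.2f}, chi2 = {chi2:.1f}")
```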
General Guidelines
Always:
• Calculate uncertainties on measurements and plot
them as error bars on your graphs
• Use chisquare.xls when curve fitting to calculate
uncertainties on parameters (e.g. gradient).
• Propagate uncertainties correctly through derived
quantities
• Quote uncertainties on all measured numerical
values
• Quote means and uncertainties to a level of precision consistent with the uncertainty, e.g.: 3.77±0.08 kg, not 3.77547574568±0.08564846795768 kg (see the sketch after this list).
• Quote units on all numerical values
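A sketch of the precision rule above (a hypothetical Python helper, assuming the convention of rounding the uncertainty to one significant figure and the mean to the matching decimal place):

```python
import math

def quote(value, uncertainty, sig_figs=1):
    """Hypothetical helper: round the uncertainty to sig_figs significant figures
    and quote the value to the same decimal place."""
    exponent = math.floor(math.log10(abs(uncertainty)))
    decimals = max(0, sig_figs - 1 - exponent)
    return f"{value:.{decimals}f} ± {uncertainty:.{decimals}f}"

print(quote(12.3456789, 0.0234))   # -> "12.35 ± 0.02"
print(quote(3.77, 0.08), "kg")     # -> "3.77 ± 0.08 kg"
```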
General Guidelines
Always:
• Think about the meaning of your results
– A mean which differs from an expected value by more than
1-2 multiples of the uncertainty is, if the latter is correct,
either suffering from a hidden systematic error (bias), or is
due to new physics (maybe you’ve just won the Nobel
Prize!)
Never:
• Ignore your possible sources of error: do not just
say that any discrepancy is due to error (these
should be accounted for in your uncertainty)
• Quote means to too few significant figures, e.g.:
3.77±0.08 kg not 4±0.08 kg