STATISTICAL NATURE OF RADIATION COUNTING
OBJECTIVE
The objective of this experiment is to explore the statistics of random, independent events in physical measurements and to test the hypothesis that the distribution of counts of random emission events from a gamma source, recorded with a Geiger-Muller detector, follows a Poisson or a Gaussian distribution, depending on the length of the counting interval and the range of the mean number of counts.
INTRODUCTION AND THEORY
Radioactive decay, and consequently the emission of particles, is a randomly occurring process, and therefore any count of the emitted radiation is subject to some degree of statistical fluctuation. The activity of a source fluctuates from one instant to the next because of the random nature of radioactive decay, and in addition the mean decay rate decreases with time according to the half-life of the radionuclide.
Suppose a sample contains n radioactive nuclei with a known half-life and a known probability of being registered by a detector when they decay. What is the probability of recording k counts during a given time interval? The answer is provided by the binomial distribution. The Poisson distribution is its limiting form for large n and small decay probability, and the Gaussian distribution applies when, in addition, the mean number of counts is large.
Let us denote:
• p - the probability that a single given nucleus decays (and is therefore counted) during a given time interval. It can also be interpreted as the "average number of counts from a single nucleus".
• q - the probability that a nucleus survives the radioactive transformation (does not decay and is therefore not counted) during the same time interval.
Let us label the n nuclei in the sample 1, 2, …, k, k + 1, …, n. Then the probability that the nuclei labeled 1 through k decay and are counted, while the nuclei labeled k + 1 through n survive and are not counted, is

P(p, k) = p^k \, q^{\,n-k}    (1)
Since decay occurs randomly, a different set of k nuclei would give the same number of counts with the same probability. The total number of ways of obtaining k counts is the number of distinct ways of choosing k nuclei out of n, which is

\frac{n!}{k!(n-k)!}    (2)
It follows that the probability of counting k decaying nuclei out of n (given that each nucleus decays with probability p) is

P(n, p, k) = \frac{n!}{k!(n-k)!} \, p^k q^{\,n-k}    (3)
This is the general form of the binomial distribution. The binomial distribution function gives the probability that an event occurs k times in n independent trials, where p is the probability of the event occurring in a single trial. It is an exact probability distribution for any number of discrete trials. If n and the mean number of events are very large, k may be treated as a continuous variable and the distribution approaches the Gaussian distribution. If the probability p is so small that the distribution has significant values only for k much smaller than n, it can be approximated by the Poisson distribution.
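As a numerical illustration of these limits (a minimal sketch in Python; the values of n and p below are arbitrary and not taken from this experiment), the exact binomial probabilities of equation (3) can be compared with their Poisson and Gaussian approximations:

```python
import math

def binomial_pmf(k, n, p):
    """Exact binomial probability of k counts out of n nuclei, equation (3)."""
    return math.comb(n, k) * p**k * (1.0 - p)**(n - k)

def poisson_pmf(k, mu):
    """Poisson approximation with mean mu = n*p, equation (10)."""
    return mu**k * math.exp(-mu) / math.factorial(k)

def gaussian_pdf(k, mu):
    """Gaussian approximation with mean mu and variance mu, equation (11)."""
    return math.exp(-(k - mu)**2 / (2.0 * mu)) / math.sqrt(2.0 * math.pi * mu)

# Arbitrary illustrative values: many nuclei, very small decay probability.
n, p = 100_000, 5.0e-5            # mean number of counts mu = n*p = 5
mu = n * p
print(" k   binomial   Poisson    Gaussian")
for k in range(11):
    print(f"{k:2d}   {binomial_pmf(k, n, p):.6f}   {poisson_pmf(k, mu):.6f}   {gaussian_pdf(k, mu):.6f}")
```

For values like these the binomial and Poisson columns should agree to several decimal places, while the Gaussian column is only a rough match because the mean is still small.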
Therefore, for a large number of atoms in the sample and a very small probability of decay (p → 0, n >> k), the binomial distribution becomes the Poisson distribution. Let us denote the average number of counts by μ. Since the probability p has the meaning of the "average number of counts from a single nucleus", the average number of counts from all n nuclei is

\mu = p \, n    (4)
According to the definition of the two probabilities p and q,

q = 1 - p    (5)

and so

q^{\,n-k} = (1-p)^{\,n-k} = (1-p)^{\frac{1}{p} \, p(n-k)}    (6)

Furthermore, if n is large compared to k (n >> k), then n - k ≈ n and p(n - k) ≈ p\,n = \mu, so the equation above becomes

q^{\,n-k} = (1-p)^{\,n-k} = (1-p)^{\frac{1}{p} \, \mu}    (7)
In the limit p → 0 (p << 1), this becomes

\lim_{p \to 0,\; n \gg k} q^{\,n-k} = \lim_{p \to 0,\; n \gg k} (1-p)^{\frac{1}{p} \, p(n-k)} = \lim_{p \to 0} (1-p)^{\frac{1}{p} \, \mu} = \left( \lim_{p \to 0} (1-p)^{\frac{1}{p}} \right)^{\mu} = e^{-\mu}    (8)

where we have used the definition of the number e,

\lim_{x \to 0} (1 + x)^{\frac{1}{x}} = e ,

applied here with x = -p.
Therefore:

\lim_{p \to 0} P(n, k, p) = \frac{n!}{k!(n-k)!} \, p^k \, e^{-\mu}    (9)
We can also, in the limit n >> k, approximate

\frac{n!}{(n-k)!} = \frac{1 \cdot 2 \cdots (n-k) \cdot (n-k+1)(n-k+2) \cdots n}{1 \cdot 2 \cdots (n-k)} = (n-k+1)(n-k+2) \cdots n \approx \underbrace{n \cdot n \cdots n}_{k \text{ times}} = n^k
therefore,

P(n, k, p) = \frac{n!}{k!(n-k)!} \, p^k q^{\,n-k} \;\xrightarrow{\; n \gg k,\; p \to 0 \;}\; P(k, \mu) = \frac{\mu^k e^{-\mu}}{k!} , \qquad k = 0, 1, 2, \ldots    (10)
This is the Poisson distribution. Generally speaking, it is applied to situations where only a few events are counted. It remains valid when large numbers of events are counted, but in that case the Gaussian distribution can be used instead and is easier to work with.
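A useful property of the Poisson distribution for this experiment is that its variance equals its mean. The following sketch (assuming NumPy is available; the true mean and the number of runs are arbitrary illustrative choices) simulates repeated counting runs and checks this numerically:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Simulate 50 counting runs with a small true mean (arbitrary illustrative value),
# roughly the situation of Part I where only background radiation is counted.
mu_true = 3.0
counts = rng.poisson(lam=mu_true, size=50)

mean = counts.mean()
variance = counts.var(ddof=1)        # sample variance, N - 1 in the denominator

print(f"sample mean       = {mean:.2f}")
print(f"sample variance   = {variance:.2f}   (should be close to the mean)")
print(f"sqrt(sample mean) = {np.sqrt(mean):.2f}  (estimate of the standard deviation)")
```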
For n, k, and n - k all large, with p no longer required to approach zero, we obtain the Gaussian distribution, given here without derivation:

P(k, \mu) = \frac{1}{\sqrt{2\pi\mu}} \, e^{-\frac{(k-\mu)^2}{2\mu}} , \qquad k = 0, 1, 2, \ldots    (11)

Here μ is the mean (and most probable) number of counts, and k is the actual number counted. This is a very practical relation, permitting a graphical determination of P in cases where calculating the individual terms would take a long time.
If n, and therefore k, is very large, the argument of equation (11) is no longer discrete and can be treated as a continuous variable x; equation (11) then becomes

P(x) = \frac{1}{\sigma \sqrt{2\pi}} \, e^{-\frac{1}{2} \left( \frac{x - \mu}{\sigma} \right)^{2}}    (12)

where μ and σ are the mean and standard deviation.

Figure 1: Examples of the Poisson and Gaussian distributions.
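A short plotting sketch (assuming NumPy, SciPy, and Matplotlib are available; the two mean values are arbitrary) can reproduce curves like those in Figure 1 and show how the Poisson distribution approaches the Gaussian as the mean grows:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm, poisson

# Compare the Poisson distribution with its Gaussian approximation
# for a small and a large mean (arbitrary illustrative values).
for mu in (3, 30):
    k = np.arange(0, int(mu + 5 * np.sqrt(mu)) + 1)
    plt.bar(k, poisson.pmf(k, mu), alpha=0.4, label=f"Poisson, mu = {mu}")
    x = np.linspace(0, k.max(), 400)
    plt.plot(x, norm.pdf(x, loc=mu, scale=np.sqrt(mu)),
             label=f"Gaussian, mu = {mu}, sigma = sqrt(mu)")

plt.xlabel("number of counts k")
plt.ylabel("probability")
plt.legend()
plt.show()
```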
STATISTICS DEFINITIONS:

Statistic              Sample    Population
Mean                   x̄         μ
Variance               s²        σ²
Standard deviation     s         σ

The definitions of these symbols are:

Sample mean:        \bar{x} = \frac{1}{N} \sum_{i=1}^{N} x_i ;  \qquad  \mu = \lim_{N \to \infty} \bar{x}

Sample variance:    s^2 = \frac{1}{N-1} \sum_{i=1}^{N} (x_i - \bar{x})^2 ;  \qquad  \sigma^2 = \lim_{N \to \infty} s^2
(Note the N - 1 in the definition of s2 ; this makes s2 a correct estimator of σ2.)
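These estimators are straightforward to compute with a short script; a minimal sketch in plain Python (the counts list is a placeholder for illustration, not measured data) is:

```python
import math

def sample_statistics(counts):
    """Return the sample mean, variance (N - 1 denominator), and standard deviation."""
    n = len(counts)
    mean = sum(counts) / n
    variance = sum((x - mean) ** 2 for x in counts) / (n - 1)
    return mean, variance, math.sqrt(variance)

# Placeholder values for illustration only; use the counts recorded in Table 1 or Table 2.
counts = [2, 4, 3, 5, 1, 3, 2, 4, 3, 3]
mean, variance, std_dev = sample_statistics(counts)
print(f"mean = {mean:.2f}, variance = {variance:.2f}, standard deviation = {std_dev:.2f}")
```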
EQUIPMENT
• The ST150 Nuclear Lab Station provides a self-contained unit that includes a versatile timer/counter, a GM tube, and a source stand; the high voltage is fully variable from 0 to +800 volts.

Figure 4: The ST150 Nuclear Lab Station
PROCEDURE
PART I. BACKGROUND RADIATION ONLY AND SHORT TIME INTERVALS
1. Use only the background radiation for this part. This ensures a small number of counts per interval (small k) and a very small decay probability (p → 0), the conditions under which the Poisson distribution applies.
2. Adjust the high voltage to the value obtained in the previous experiment. (Make sure you are using
the same lab station!)
3. Adjust the time interval to 10 seconds.
4. Record all your data in the table below.
Table 1: Data for Background Radiation and "Short" Time Intervals (Count Rates Histogram)

Normal Operating Voltage (V): ________        Time (sec.): ________

Run No.   Counts
1.
2.
3.
...
50.

Mean: ________      Variance: ________      Standard Deviation: ________
DATA ANALYSIS
1. Using the data in the table above, draw a histogram of the counts, and analyze its shape.
2. How well does the shape of your histogram resemble the characteristic skewed shape of the
Poisson distribution?
3. Use the statistics definitions to find the mean and the standard deviation of your data, as in the sketch after this list.
4. How many points (expressed as a percentage) fall within one standard deviation of the mean?
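The sketch below (assuming NumPy and Matplotlib; the counts shown are arbitrary placeholders standing in for the 50 values of Table 1) carries out steps 1, 3, and 4:

```python
import numpy as np
import matplotlib.pyplot as plt

# Arbitrary placeholder values; replace with the 50 background counts from Table 1.
counts = np.array([2, 1, 3, 0, 2, 4, 1, 2, 3, 2])

mean = counts.mean()
std = counts.std(ddof=1)                       # sample standard deviation (N - 1)
within_1sigma = np.abs(counts - mean) <= std
print(f"mean = {mean:.2f}, standard deviation = {std:.2f}")
print(f"within 1 sigma: {100 * within_1sigma.mean():.0f}%")

# Histogram with one bin per integer count; for a small mean the shape
# should be skewed, as expected for a Poisson distribution.
bins = np.arange(counts.min() - 0.5, counts.max() + 1.5, 1.0)
plt.hist(counts, bins=bins, edgecolor="black")
plt.xlabel("counts per 10 s interval")
plt.ylabel("number of runs")
plt.show()
```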
PART II. 60Co GAMMA SOURCE AND LONG TIME INTERVALS
5. Sign out a 60Co source from your TA and place it on the plastic tray in the second lowest position
from the GM mica window.
6. It is assumed that the high voltage has been adjusted to the normal operation value from part one.
Check again that the value of the high voltage is correct.
7. Adjust the recording time interval to 100 seconds.
8. Record all your data in the table below.
Table 2: Data for 60Co Gamma Source and "Long" Time Intervals (Count Rates Histogram)

Normal Operating Voltage (V): ________        Time (sec.): ________

Run No.   Counts
1.
2.
3.
...
50.

Mean: ________      Variance: ________      Standard Deviation: ________
DATA ANALYSIS
1. Using the data in the table above, draw a histogram of the counts and analyze its shape.
2. How well does the shape of your histogram resemble the expected shape of the Gaussian distribution? Comment on the shape of this histogram compared to the one obtained in Part I.
3. Find the mean and the standard deviation of your data.
4. How many points (expressed as a percentage) fall within one standard deviation of the mean? How many fall within 2σ of the mean? Is this in agreement with the theoretical predictions? (See the sketch after this list.)
5. Compare the two sets of results and comment on the general shape of the two histograms.
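A similar sketch for Part II (again with arbitrary placeholder counts standing in for the 50 values of Table 2) compares the 1σ and 2σ fractions with the Gaussian expectations of roughly 68% and 95%:

```python
import numpy as np

# Arbitrary placeholder values; replace with the 50 counts of the 60Co source from Table 2.
counts = np.array([412, 398, 405, 420, 391, 407, 415, 402, 399, 410])

mean = counts.mean()
std = counts.std(ddof=1)
print(f"mean = {mean:.1f}, standard deviation = {std:.1f}")

for n_sigma, expected in [(1, 68.3), (2, 95.4)]:
    inside = np.abs(counts - mean) <= n_sigma * std
    print(f"within {n_sigma} sigma: {100 * inside.mean():.0f}%  "
          f"(Gaussian prediction: about {expected}%)")
```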