Performance of Statistical Learning Methods
Jens Zimmermann
[email protected]
Max-Planck-Institut für Physik, München
Forschungszentrum Jülich GmbH
Performance Examples from Astrophysics
Performance vs. Control
H1 Neural Network Trigger
Controlling Statistical Learning Methods
Overtraining
Efficiencies
Uncertainties
Comparison of Learning Methods
Artificial Intelligence
Higgs Parity Measurement at the ILC
Performance of Statistical Learning Methods: MAGIC
Significance and number of excess events scale the uncertainties in the flux calculation.
Performance of Statistical Learning Methods: XEUS
Pileup vs. single photon: pileups not recognised by the classical algorithm („XMM“) but by the NN.
Control of Statistical Learning Methods
There may be many different successful applications of statistical learning methods.
There may be great performance improvements compared to classical methods.
This does not impress people who fear that statistical learning methods are not well under control.
First talk: Understanding and Interpretation
Now: Control and correct Evaluation
The Neural Network Trigger in the H1 Experiment
Trigger Scheme
H1 at HERA ep Collider, DESY
10 MHz → L1 (2.3 µs) → 500 Hz → L2 (20 µs) → 50 Hz → L4 (100 ms) → 10 Hz
The neural networks run at level 2 („L2NN“).
Each neural network on L2 verifies a specific L1 sub-trigger.
Triggering Deeply Virtual Compton Scattering
Theory: signal (DVCS) vs. background (upstream beam-gas interaction)
L1 sub-trigger 41 triggers DVCS by requiring
• significant energy deposition in the SpaCal
• within the time window
The L2 neural network uses additional information:
• liquid argon energies
• SpaCal centre energies
• z-vertex information
Triggering with 4 Hz; must be reduced to 0.8 Hz (i.e. 80% of the triggered rate has to be rejected).
Determine the correct efficiency
50% training set
25% selection set
25% test set
Tune training parameters to
• avoid overtraining
• optimise performance
Signal output should peak at 1, background output should peak at 0.
(A sketch of this split follows below.)
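To make the split concrete, here is a minimal Python sketch; the toy data and the simple logistic model standing in for the neural network are assumptions for illustration, not from the slides. Half of the sample is used for training, a quarter (the selection set) picks the stopping point to avoid overtraining, and the remaining quarter is kept untouched for quoting the efficiency.

```python
import numpy as np

# Minimal sketch: 50% training / 25% selection / 25% test split with
# early stopping on the selection set.  Toy data and a logistic model
# stand in for the real inputs and the neural network (assumptions).
rng = np.random.default_rng(0)
n = 8000
x = rng.normal(size=(n, 3))
y = (x[:, 0] + 0.5 * x[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(float)

idx = rng.permutation(n)
train, select, test = np.split(idx, [n // 2, 3 * n // 4])

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
w = np.zeros(3)
best_w, best_loss = w.copy(), np.inf
for epoch in range(200):
    p = sigmoid(x[train] @ w)
    w = w - 0.1 * x[train].T @ (p - y[train]) / len(train)   # gradient step
    p_sel = sigmoid(x[select] @ w)
    loss = -np.mean(y[select] * np.log(p_sel) + (1 - y[select]) * np.log(1 - p_sel))
    if loss < best_loss:                                      # keep the best epoch
        best_w, best_loss = w.copy(), loss

out = sigmoid(x[test] @ best_w)                               # never used during training
eff = np.mean(out[y[test] == 1] > 0.5)                        # signal efficiency at a cut of 0.5
print(f"signal efficiency on the test set: {eff:.3f}")
```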
Determine the Correct Efficiency
Efficiencies evaluated on the training set [%] vs. on the test set [%]
Check Statistical Uncertainties
Efficiency ε = k/N; propagation of uncertainties gives the statistical uncertainty of the efficiency σ_ε = √(ε(1−ε)/N), e.g. 80% ± 4% for 80 of 100 (see the sketch below).
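As a small worked example of that formula (the standard binomial error, reproducing the 80-of-100 case from the slide):

```python
import numpy as np

# Binomial statistical uncertainty of an efficiency: sigma = sqrt(eps*(1-eps)/N).
def efficiency_with_error(k, n):
    eps = k / n
    return eps, np.sqrt(eps * (1.0 - eps) / n)

eps, err = efficiency_with_error(80, 100)   # the 80-of-100 example from the slide
print(f"{eps:.0%} ± {err:.0%}")             # -> 80% ± 4%
```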
Check Systematic Uncertainties
There is only a propagation of the systematic uncertainties of the inputs (sketched below).
Assuming
x1 with absolute error σ1
x2 with relative error σ2 = 5%
x3 with relative error σ3 = 10%
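A minimal sketch of such a propagation, assuming a toy stand-in network and an arbitrary value for the absolute error σ1 (both hypothetical): each input is shifted by its systematic error, the output is re-evaluated, and the shifts are added in quadrature.

```python
import numpy as np

def net(x):
    # hypothetical stand-in for the trained neural network (output in [0, 1])
    return 1.0 / (1.0 + np.exp(-(1.2 * x[0] - 0.8 * x[1] + 0.5 * x[2])))

x = np.array([2.0, 1.5, 0.7])        # one event's inputs x1, x2, x3 (toy values)
sys_err = np.array([0.1,             # x1: absolute error sigma1 (assumed 0.1)
                    0.05 * x[1],     # x2: 5% relative error
                    0.10 * x[2]])    # x3: 10% relative error

nominal = net(x)
shifts = []
for i, s in enumerate(sys_err):
    shifted = x.copy()
    shifted[i] += s                  # shift one input by its systematic error
    shifts.append(net(shifted) - nominal)

sigma_out = np.sqrt(np.sum(np.square(shifts)))   # add the shifts in quadrature
print(f"output {nominal:.3f} with systematic uncertainty ±{sigma_out:.3f}")
```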
Check Systematic Uncertainties
Example: the DVCS dataset
Comparison of Hypotheses
NN: 96.5% vs. SVM: 95.7%
Statistically significant? Build a 95% confidence interval!
σ_m is the variation over different parts of the test set (see the sketch below).
Efficiencies for a fixed rejection of 80%.
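A sketch of that comparison with invented per-part efficiencies (the real per-part values are not given on the slide): the test set is split into m parts, σ_m is the spread of the per-part efficiency differences, and the 95% interval is built with a normal approximation on the mean difference.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 10
eff_nn  = rng.normal(0.965, 0.008, size=m)   # hypothetical per-part NN efficiencies
eff_svm = rng.normal(0.957, 0.008, size=m)   # hypothetical per-part SVM efficiencies

diff = eff_nn - eff_svm
mean = diff.mean()
sigma_m = diff.std(ddof=1)                   # variation over the test-set parts
half_width = 1.96 * sigma_m / np.sqrt(m)     # 95% interval on the mean difference

print(f"difference {mean:.3f} ± {half_width:.3f} (95% CL), "
      f"significant: {abs(mean) > half_width}")
```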
Comparison of Learning Methods
Compare performances over different training sets!
σ_m is the variation over the different trainings.
Cross-validation: divide the dataset into k parts and train k classifiers, using each part once as the test set (see the sketch below).
Efficiencies for a fixed rejection of 60%.
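A minimal k-fold cross-validation sketch with toy data and a trivial stand-in classifier (both assumptions, only to show the mechanics of the slide's recipe):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 5000, 5
x = rng.normal(size=(n, 2))
y = (x[:, 0] - x[:, 1] > 0).astype(int)      # toy signal/background labels

folds = np.array_split(rng.permutation(n), k)
effs = []
for i in range(k):
    test = folds[i]
    train = np.concatenate([folds[j] for j in range(k) if j != i])
    # stand-in "training": project onto the difference of the class means
    w = x[train][y[train] == 1].mean(axis=0) - x[train][y[train] == 0].mean(axis=0)
    pred = (x[test] @ w > 0).astype(int)
    effs.append(np.mean(pred[y[test] == 1] == 1))   # signal efficiency on this fold

effs = np.array(effs)
print(f"efficiency {effs.mean():.3f} ± {effs.std(ddof=1):.3f} over {k} trainings")
```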
Artificial Intelligence
H1-L2NN: Triggering Charged Current (CC)
Two events with low NN output (cosmic overlay, cosmic).
Artificial Intelligence
H1-L2NN: Triggering J/ψ
Background found in the J/ψ selection
Higgs Parity Measurement at the ILC
H/A → τ+ τ− → ρν ρν → ππν ππν
Classical approach: fit the angular distribution.
Parity induces a favoured ρ-configuration:
• anti-parallel for H
• parallel for A
Angular distribution shown from 0 to 2π (for A).
Significance is the amplitude divided by its uncertainty.
Significance measured for 500 events and averaged over 600 pseudo-experiments: s = 5.09
Higgs Parity Measurement at the ILC
Statistical learning approach: direct discrimination (one parity trained towards 0, the other towards 1).
Significance is the difference of the measured means divided by its uncertainty (see the sketch below).
Significance measured for 500 events and averaged over 600 pseudo-experiments: s = 6.26
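A sketch of that estimator with invented output distributions (the true network outputs are not reproduced here): for each pseudo-experiment the difference of the mean network outputs for the two hypotheses is divided by its standard error, and the significance is averaged over the pseudo-experiments.

```python
import numpy as np

rng = np.random.default_rng(3)
n_events, n_pseudo = 500, 600                # as quoted on the slide

significances = []
for _ in range(n_pseudo):
    # hypothetical NN outputs for the two parity hypotheses (targets 0 and 1)
    out_h = rng.normal(0.45, 0.25, n_events).clip(0, 1)
    out_a = rng.normal(0.55, 0.25, n_events).clip(0, 1)
    diff = out_a.mean() - out_h.mean()
    err = np.hypot(out_a.std(ddof=1) / np.sqrt(n_events),
                   out_h.std(ddof=1) / np.sqrt(n_events))
    significances.append(diff / err)

print(f"average significance over {n_pseudo} pseudo-experiments: "
      f"{np.mean(significances):.2f}")
```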
Conclusion
Statistical learning methods are successful in many applications in high-energy physics and astrophysics.
Significant performance improvements compared to classical algorithms.
Statistical learning methods are well under control:
- efficiencies can be determined
- uncertainties can be calculated.
Comparison of learning methods reveals statistically significant differences.
Statistical learning methods sometimes show more artificial intelligence than expected.