Exploring Artificial Neural Networks to Discover Higgs at LHC
Using Neural Networks for b-tagging
By Rohan Adur
www.hep.ucl.ac.uk/~radur

Outline:
• What are Neural Networks and how do they work?
• How can Neural Networks be used in b-jet tagging to discover the Higgs boson?
• What results have I obtained using Neural Networks to find b-jets?

Neural Networks - Introduction
• Neural Networks simulate neurons in biological systems
• They are made up of neurons connected by synapses
• They are able to solve non-linear problems by learning from experience, rather than being explicitly programmed for a particular problem

The Simple Perceptron
• The Simple Perceptron is the simplest form of a Neural Network
• It consists of one layer of input units and one layer of output units, connected by weighted synapses
[diagram: input layer connected to output layer by weighted synapses]

The Simple Perceptron contd.
• Requires a training set, for which the required output is known
• Synapse weights start at random values
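The perceptron training procedure described on these slides — random initial weights, adjusted by a learning rule until every training example gives the correct output, then frozen — can be sketched in Python. This is a minimal illustration, not code from the talk; NumPy, the toy AND problem, the learning rate and the random seed are all my choices:

```python
import numpy as np

# Toy training set: the AND function, which is linearly separable.
# A constant 1 is appended to each input so the last weight acts as a bias.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
targets = np.array([0.0, 0.0, 0.0, 1.0])

def step(z):
    # Threshold activation: fire (1) if the weighted sum is non-negative
    return (np.asarray(z) >= 0).astype(float)

rng = np.random.default_rng(0)
w = rng.normal(size=3)              # synapse weights start at random values

# Perceptron learning rule: nudge the weights whenever an example is
# misclassified, and freeze them once every training output is correct.
for epoch in range(1000):
    mistakes = 0
    for x, t in zip(X, targets):
        y = step(x @ w)
        if y != t:
            w += 0.1 * (t - y) * x  # move the decision boundary toward x
            mistakes += 1
    if mistakes == 0:               # correct on the whole training set
        break

print(step(X @ w))  # [0. 0. 0. 1.] once training has converged
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop terminates with all four outputs correct.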
• A learning algorithm then changes the weights until they give the correct output, at which point the weights are frozen
• The trained network can then be used on data it has never seen before

The Multilayer Perceptron
• The main drawback of the simple perceptron is that it is only able to solve linearly-separable problems
• Introducing a hidden layer produces the Multilayer Perceptron
• The Multilayer Perceptron is able to solve non-linear problems
[diagram: input layer, hidden layer and output layer connected by synapses]

Finding Higgs
• The Higgs boson is expected to decay to b-quarks, which will produce b-jets
• b-jet detection at LHC is therefore important in detecting Higgs
• 40 million events happen per second
• b-taggers must reject light-quark jets

b-tagging
• B mesons are able to travel a short distance (~1 mm) before decaying, so b-jets will originate away from the primary vertex
• Several b-taggers exist
• The IP3D tagger uses the Impact Parameter (IP) of the b-jets
[diagram: b-jet with impact parameter between the primary vertex and the secondary vertex]
• The SecVtx tagger reconstructs the secondary vertex and rejects jets which have a low probability of coming from this vertex

IP3D Tagger
• Good amount of separation between b-jets and light jets

b-tagger performance

Neural Network for b-tagging
• The current best tagger is a combination of the IP3D and SV1 tag weights
• Using Neural Networks, can this tagger be combined with others to provide better separation?

The Multilayer Perceptron and b-tagging
• The TMultiLayerPerceptron class is an implementation of a Neural Network built into the ROOT framework
• It contains several learning methods
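The claim that a hidden layer lets a network solve non-linearly-separable problems can be made concrete with XOR, the classic function a single-layer perceptron cannot represent. The sketch below uses hand-chosen weights rather than training, and plain NumPy rather than ROOT's TMultiLayerPerceptron; it only demonstrates that a two-layer network can realise XOR:

```python
import numpy as np

def step(z):
    # Threshold activation, as in the simple perceptron
    return (z >= 0).astype(float)

# Hand-chosen (not trained) weights: the two hidden units compute
# OR and AND of the inputs, and the output unit computes OR AND NOT(AND),
# i.e. XOR -- a non-linearly-separable function.
W_hidden = np.array([[1.0, 1.0],   # hidden unit 1 fires if x1 + x2 >= 0.5 (OR)
                     [1.0, 1.0]])  # hidden unit 2 fires if x1 + x2 >= 1.5 (AND)
b_hidden = np.array([-0.5, -1.5])
w_out = np.array([1.0, -1.0])      # output fires if OR - AND >= 0.5
b_out = -0.5

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
hidden = step(X @ W_hidden.T + b_hidden)   # hidden-layer activations
output = step(hidden @ w_out + b_out)      # output-layer activation
print(output)  # [0. 1. 1. 0.] -- XOR of the two inputs
```

No choice of weights in a single-layer perceptron can reproduce this output, which is exactly why the hidden layer is introduced.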
• The best was found to be the default BFGS method
• Train with output = 1 for signal and output = 0 for background
• The b-tagging weights were obtained using the ATHENA 10.0.1 release
• The data was obtained from Rome ttbar AOD files
• Once extracted, the weights were used to train the Neural Network

Results
• 5 inputs used: transverse momentum, IP3D tag, SV1 tag, SecVtx tag and mass
• 12 hidden units and 1 output unit

Results Contd.

Rejection rates:

Efficiency | IP3D+SV1 | Neural Network
60%        | 88       | 136
50%        | 175      | 387

Mistagging efficiency:

Efficiency | IP3D+SV1 | Neural Network
60%        | 1.14%    | 0.73%
50%        | 0.57%    | 0.26%

At fixed rejection:

Rejection | IP3D+SV1 | Neural Network
100       | 57%      | 62%

Discussion of Results
• Using a Neural Network, b-taggers can be combined to provide up to double the purity at fixed efficiency
• At a fixed rejection rate, the Neural Network provides 5% more signal than the IP3D+SV1 tagger alone
• Neural Network performance is not always reproducible: each time training is undertaken, a different network is produced

Conclusions
• Neural Networks are a powerful tool for b-jet classification
• Neural Networks can be used to significantly increase b-tagging efficiency/rejection ratios and could be useful in the search for Higgs
• Training a Neural Network on real data will be the next hurdle
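As a consistency check on the results tables: the light-jet rejection rate is the reciprocal of the mistagging efficiency (rejecting 1 light jet in N means mistagging a fraction 1/N), so the two tables describe the same measurement. A quick sketch of the check (my own illustration, not from the original slides):

```python
# Light-jet rejection is the reciprocal of the mistagging efficiency.
def rejection(mistag_efficiency):
    return 1.0 / mistag_efficiency

# Mistagging efficiencies from the results tables,
# keyed by (b-tag efficiency, tagger)
mistag = {
    ("60%", "IP3D+SV1"): 0.0114,
    ("60%", "Neural Network"): 0.0073,
    ("50%", "IP3D+SV1"): 0.0057,
    ("50%", "Neural Network"): 0.0026,
}

for key, eps in mistag.items():
    # Close to the quoted rejections of 88, 136, 175 and 387; the small
    # differences come from the quoted percentages being rounded.
    print(key, round(rejection(eps)))
```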