Time: 4:00 - 5:00pm
Venue: BR 3.01, Bancroft Road Teaching Room, QMUL
Title: Understanding the Standard Model of Particle Physics and ‘seeing’ new Physics
Speaker: Dr Adrian Bevan from the School of Physics and Astronomy
This is the second part of an overview of machine learning techniques used in the particle physics community. I will concentrate on the use of neural networks, support vector machines and deep learning in the HEP context, including our recent work with TensorFlow on searches for new particles using MLP-style deep neural networks and CNNs applied to data from the ATLAS and MoEDAL experiments at the LHC. We have also started asking what the machine is learning; for this we are using the Grad-CAM algorithm to understand which regions of an image lead to a classification decision.
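The Grad-CAM step mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration of the core weighting step only, assuming the last convolutional layer's activations and the gradients of the class score with respect to them have already been computed; the function and array names are hypothetical, not taken from the speaker's code:

```python
import numpy as np

def grad_cam_heatmap(activations, gradients):
    """Core Grad-CAM weighting step (illustrative sketch).

    activations: (H, W, K) feature maps from the last conv layer.
    gradients:   (H, W, K) gradients of the target class score
                 with respect to those feature maps.
    Returns an (H, W) heatmap highlighting class-relevant regions.
    """
    # Global-average-pool the gradients to get one weight per channel.
    alphas = gradients.mean(axis=(0, 1))  # shape (K,)
    # Weighted sum of the feature maps, then ReLU: only features with
    # a positive influence on the class score are kept.
    heatmap = np.maximum((activations * alphas).sum(axis=-1), 0.0)
    return heatmap
```

In practice the activations and gradients would come from a framework such as TensorFlow (e.g. via `tf.GradientTape`), and the resulting heatmap would be upsampled to the input image size and overlaid on it.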
Dr Adrian Bevan
Adrian is an experimental particle physicist with a long-standing interest in machine learning and optimisation problems. He worked on an experiment at CERN for his PhD, then moved to an experiment at the Stanford Linear Accelerator Center in Menlo Park, California, before returning to CERN after the LHC started up. For the first half of his career Adrian focused on understanding the difference between matter and antimatter – a fundamental issue that must be understood to bridge the gap between our understanding of cosmology and that of particle physics: why are we here, rather than in a universe of photons left over from a cataclysmic annihilation of everything after the Big Bang? He now searches for new physics using both direct and indirect techniques, and for the past 10 years he has worked on the upgrade of the ATLAS experiment.