Title: Identification of hadronic tau lepton decays using a deep neural network
Authors: Tümasyan, A.
Adam, W.
Andrejkovic, J. W.
Bergauer, T.
Chatterjee, S.
Gürpınar, Emine Güler
Güler, Yalçın
Keywords: Large detector systems for particle and astroparticle physics
Particle identification methods
Pattern recognition, cluster finding, calibration and fitting methods
Issue Date: 2022
Publisher: IOP Publishing Ltd
Abstract: A new algorithm is presented to discriminate reconstructed hadronic decays of tau leptons (τ_h) that originate from genuine tau leptons in the CMS detector against τ_h candidates that originate from quark or gluon jets, electrons, or muons. The algorithm takes as input information from all reconstructed particles in the vicinity of a τ_h candidate and employs a deep neural network with convolutional layers to process the inputs efficiently. This algorithm leads to significantly improved performance compared with the previously used one. For example, the efficiency for a genuine τ_h to pass the discriminator against jets increases by 10–30% for a given efficiency for quark and gluon jets. Furthermore, a more efficient τ_h reconstruction is introduced that incorporates additional hadronic decay modes. The superior performance of the new algorithm in discriminating against jets, electrons, and muons and the improved τ_h reconstruction method are validated with LHC proton-proton collision data at √s = 13 TeV.
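The abstract describes a discriminator that processes features of all reconstructed particles near a τ_h candidate with convolutional layers and outputs a classification score. Below is a minimal, hypothetical numpy sketch of that general idea only: a shared kernel-size-1 layer applied per particle, permutation-invariant pooling, and a logistic output. All layer sizes, weights, and feature choices are illustrative assumptions, not the published CMS model.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyTauDiscriminator:
    """Toy discriminator: shared per-particle layer + sum pooling.

    Hypothetical stand-in for the paper's DNN; sizes and weights are random.
    """

    def __init__(self, n_features=4, hidden=8):
        # Shared weights applied to every particle (equivalent to a
        # convolution with kernel size 1 over the particle list).
        self.W1 = rng.normal(0.0, 0.5, size=(n_features, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.5, size=(hidden,))
        self.b2 = 0.0

    def score(self, particles):
        """particles: (n_particles, n_features) array -> scalar in (0, 1)."""
        h = relu(particles @ self.W1 + self.b1)  # same layer for each particle
        pooled = h.sum(axis=0)                   # order-independent pooling
        return sigmoid(pooled @ self.W2 + self.b2)

# Example: 5 reconstructed particles near a candidate, 4 features each
# (e.g. pT, delta-eta, delta-phi, charge -- hypothetical feature choices).
cand = rng.normal(size=(5, 4))
print(TinyTauDiscriminator().score(cand))
```

Because the pooling step sums over particles, the score does not depend on the order in which the particles are listed, which is one reason such architectures suit variable-length particle collections.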
ISSN: 1748-0221
Appears in Collections: Faculty of Engineering and Natural Sciences Collection
Scopus Indexed Publications Collection
WoS Indexed Publications Collection

Files in This Item:
File: Tumasyan_2022_J._Inst._17_P07023.pdf (1.47 MB, Adobe PDF)


Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.