Authors: Chekhovsky, V.; Hayrapetyan, A.; Makarenko, V.; Tumasyan, A.; Adam, W.; Andrejkovic, J. W.; Druzhkin, D.
Title: Development of Systematic Uncertainty-Aware Neural Network Trainings for Binned-Likelihood Analyses at the LHC
Type: Article
Date: 2026-01-10; Year: 2025
ISSN: 1434-6044; eISSN: 1434-6052
DOI: https://doi.org/10.1140/epjc/s10052-025-14713-w
Scopus EID: 2-s2.0-105024350483
Language: en; Access: open access (info:eu-repo/semantics/openAccess)

ORCID iDs: Muñoz Díaz, Conrado/0009-0001-3417-4557; Painesis, Haris/0000-0001-5061-7031; Saidmakhamadov, Nosir/0000-0002-7460-5972; Ruales, Anderson/0000-0003-0826-0803; Giacomo, Bolini/0000-0001-5490-605X; Pereira, Miguel/0000-0003-4296-7028; Figueiredo, Diego/0000-0003-2514-6930; Monsch, Artur Artemij/0009-0007-3529-1644; Fernández Ramos, Juan Pablo/0000-0002-0122-313X; De Souza Lemos, Dener/0000-0003-1982-8978; Consuegra Rodríguez, Sandra/0000-0002-1383-1837; Jaramillo Gallego, Johny/0000-0003-3885-6608; Noll, Dennis Daniel Nick/0000-0002-0176-2360; Ribeiro Lopes, Beatriz/0000-0003-0823-447X; Dragicevic, Marko/0000-0003-1967-6783; Ruiz, Jose/0000-0002-3306-0363; Hernández Calama, José María/0000-0001-6436-7547; Erice Cid, Carlos Francisco/0000-0002-6469-3200; Dreimanis, Karlis/0000-0003-0972-5641; Thachayath Sugunan, Aravind/0000-0001-6545-0350; Palencia Cortezon, Jose Enrique/0000-0001-8264-0287

Abstract: We propose a neural network training method that accounts for the effects of systematic variations of the data model during training, and we describe its extension to neural network multiclass classification. The procedure is evaluated on the realistic case of the measurement of Higgs boson production via gluon fusion and vector boson fusion in the tau tau decay channel at the CMS experiment. The neural network output functions are used to infer the signal strengths for inclusive Higgs boson production as well as for production via gluon fusion and vector boson fusion. We observe improvements of 12% and 16% in the uncertainty in the signal strengths for gluon fusion and vector boson fusion, respectively, compared with a conventional neural network training based on cross-entropy.
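The abstract contrasts an uncertainty-aware training objective with plain cross-entropy but does not spell out the loss. A minimal NumPy sketch of the general idea — scoring a classifier by its cross-entropy on nominal data plus a penalty on how much its outputs move under systematic up/down variations of the inputs — might look like the following. The function names, the quadratic penalty, and the toy linear classifiers are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # numerically stable softmax over the class axis
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(p, y):
    # mean negative log-likelihood of the true class labels
    return -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))

def aware_loss(W, x_nom, x_up, x_down, y, lam=1.0):
    """Cross-entropy on nominal data plus a penalty on the
    classifier's sensitivity to the systematic up/down shifts
    (an assumed, illustrative form of 'uncertainty awareness')."""
    p_nom = softmax(x_nom @ W)
    p_up = softmax(x_up @ W)
    p_down = softmax(x_down @ W)
    ce = cross_entropy(p_nom, y)
    penalty = np.mean((p_up - p_nom) ** 2) + np.mean((p_down - p_nom) ** 2)
    return ce + lam * penalty

# Toy data: 2 classes; the label depends only on feature 1,
# while a hypothetical systematic shifts feature 0 up/down.
x = rng.normal(size=(200, 3))
y = (x[:, 1] > 0).astype(int)
x_up = x + np.array([0.3, 0.0, 0.0])
x_down = x - np.array([0.3, 0.0, 0.0])

# Two fixed linear classifiers: one leans on the systematically
# shifted feature, one ignores it entirely.
W_sens = np.array([[-2.0, 2.0], [-1.5, 1.5], [0.0, 0.0]])
W_rob = np.array([[0.0, 0.0], [-1.5, 1.5], [0.0, 0.0]])

l_sens = aware_loss(W_sens, x, x_up, x_down, y)
l_rob = aware_loss(W_rob, x, x_up, x_down, y)
```

Under this toy objective the robust classifier scores better (`l_rob < l_sens`): it pays no sensitivity penalty and its cross-entropy is not degraded by the noisy, shifted feature. Minimizing such a loss over the network weights is one simple way a training could trade raw separation power for stability under systematic variations.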