Fast kernel methods for Data Quality Monitoring as a goodness-of-fit test

Gaia Grosso, Nicolò Lai, Marco Letizia, Jacopo Pazzini, Marco Rando, Lorenzo Rosasco, Andrea Wulzer, Marco Zanetti

arXiv.org Artificial Intelligence 

Modern high-energy physics experiments operating at colliders are extremely sophisticated devices consisting of millions of sensors sampled every few nanoseconds, producing an enormous throughput of complex data. Several types of technologies are employed, devoted to identifying and measuring the particles produced in the collisions; in all cases, the environmental conditions are severe, making the required performance challenging to achieve. Although the various subsystems are designed to offer redundancy, measurements can be undermined by malfunctions of parts of the experiment, whether due to critical inefficiencies or to spurious signals that may be misinterpreted. In addition to supervising the status (powering, electronic configuration, temperature, etc.) of the various hardware components, data from all sources must therefore be monitored continuously to assess their quality and to promptly detect any faults, possibly providing indications about their causes. Given the rate of tens of MHz at which data are gathered and the number of sensors to be checked, the monitoring process needs to be as automated as possible: approaches based on Machine Learning (ML) techniques are particularly suited to this task and have started being employed by the experimental collaborations [1-4], complementing more traditional methods [5-9].
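The framing of data quality monitoring as a goodness-of-fit test can be illustrated with a toy kernel two-sample test: a batch of monitored data is compared against a reference sample of known good quality, and a permutation test on the Maximum Mean Discrepancy (MMD) statistic flags batches whose distribution deviates from the reference. This is a minimal sketch of the general idea, not the authors' method; the function names, the Gaussian toy data, and the fixed kernel bandwidth are illustrative assumptions, and a production system would use fast large-scale kernel approximations rather than the exact kernel matrices built here.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    # Biased estimate of the squared Maximum Mean Discrepancy
    Kxx = rbf_kernel(X, X, sigma)
    Kyy = rbf_kernel(Y, Y, sigma)
    Kxy = rbf_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

def permutation_pvalue(X, Y, sigma=1.0, n_perm=200, rng=None):
    # Null distribution of the statistic via random relabeling of the pooled sample
    rng = np.random.default_rng(rng)
    observed = mmd2(X, Y, sigma)
    pooled = np.vstack([X, Y])
    n = len(X)
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        if mmd2(pooled[idx[:n]], pooled[idx[n:]], sigma) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

if __name__ == "__main__":
    gen = np.random.default_rng(0)
    reference = gen.normal(0.0, 1.0, size=(200, 1))  # good-quality reference batch
    monitored = gen.normal(0.5, 1.0, size=(200, 1))  # shifted batch: a simulated fault
    p = permutation_pvalue(reference, monitored, sigma=1.0, n_perm=200, rng=1)
    print(f"p-value: {p:.3f}")  # a small p-value flags the batch as anomalous
```

A monitoring pipeline would run such a test per sensor (or per group of channels) on each data batch and raise an alert when the p-value falls below a chosen threshold; the exact kernel matrices here cost O(n²), which is why fast approximate kernel solvers matter at collider data rates.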
