Innovation
- North America > United States > California (0.14)
- North America > United States > Georgia > Fulton County > Atlanta (0.04)
- Europe > United Kingdom > England > Oxfordshire > Oxford (0.04)
- Europe > Italy > Calabria > Catanzaro Province > Catanzaro (0.04)
- Research Report > Experimental Study (0.93)
- Research Report > Promising Solution (0.60)
- Overview > Innovation (0.60)
- North America > United States > California (0.04)
- Asia > China > Shanghai > Shanghai (0.04)
- Research Report > Promising Solution (0.50)
- Overview > Innovation (0.40)
- Asia > China (0.04)
- North America > United States > New Jersey > Hudson County > Hoboken (0.04)
- Research Report > Promising Solution (0.64)
- Overview > Innovation (0.40)
- Research Report > Promising Solution (0.34)
- Overview > Innovation (0.34)
Low Rank Transformer for Multivariate Time Series Anomaly Detection and Localization
Shimillas, Charalampos, Malialis, Kleanthis, Fokianos, Konstantinos, Polycarpou, Marios M.
Multivariate time series (MTS) anomaly diagnosis, which encompasses both anomaly detection and localization, is critical for the safety and reliability of complex, large-scale real-world systems. The vast majority of existing anomaly diagnosis methods offer limited theoretical insights, especially for anomaly localization, which is a vital but largely unexplored area. The aim of this contribution is to study the learning process of a Transformer when applied to MTS by revealing connections to statistical time series methods. Based on these theoretical insights, we propose the Attention Low-Rank Transformer (ALoRa-T) model, which applies low-rank regularization to self-attention, and we introduce the Attention Low-Rank score, which effectively captures the temporal characteristics of anomalies. Finally, to enable anomaly localization, we propose the ALoRa-Loc method, a novel approach that associates anomalies with specific variables by quantifying interrelationships among time series. Extensive experiments and real-data analysis show that the proposed methodology significantly outperforms state-of-the-art methods in both detection and localization tasks.
- Europe > Middle East > Cyprus > Nicosia > Nicosia (0.04)
- North America > United States > Florida > Miami-Dade County > Coral Gables (0.04)
- Research Report > Promising Solution (0.54)
- Overview > Innovation (0.34)
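The abstract does not give ALoRa-T's equations, but the general idea it names — a low-rank regularizer on self-attention plus a score reflecting the attention matrix's temporal structure — can be sketched in numpy. Everything below is illustrative: the nuclear-norm penalty, the spectral-tail score, and all function names are assumptions, not the paper's actual formulation.

```python
import numpy as np

def attention_matrix(X, Wq, Wk):
    """Self-attention weights for one window of a multivariate series.
    X: (T, d) time window; Wq, Wk: (d, k) projection matrices."""
    Q, K = X @ Wq, X @ Wk
    scores = Q @ K.T / np.sqrt(K.shape[1])
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    P = np.exp(scores)
    return P / P.sum(axis=1, keepdims=True)       # (T, T), rows sum to 1

def nuclear_norm_penalty(A, lam=0.1):
    """A common convex low-rank regularizer: lam times the sum of
    singular values of the attention matrix A."""
    return lam * np.linalg.svd(A, compute_uv=False).sum()

def spectral_tail_score(A, r=2):
    """Fraction of spectral energy outside the top-r singular directions;
    a window whose attention breaks the dominant temporal pattern
    would raise this score."""
    s = np.linalg.svd(A, compute_uv=False)
    return s[r:].sum() / s.sum()

rng = np.random.default_rng(0)
T, d, k = 16, 5, 4
X = rng.standard_normal((T, d))
Wq, Wk = rng.standard_normal((d, k)), rng.standard_normal((d, k))
A = attention_matrix(X, Wq, Wk)
penalty, score = nuclear_norm_penalty(A), spectral_tail_score(A)
```

In a real model the penalty term would be added to the training loss so the learned attention stays close to low-rank, and the score would be computed per window at inference time.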
- Information Technology (0.92)
- Water & Waste Management > Water Management (0.46)
- Oceania > Australia > Victoria > Melbourne (0.04)
- North America > United States > Virginia > Arlington County > Arlington (0.04)
- North America > United States > Texas (0.04)
- (5 more...)
- Research Report > Promising Solution (0.40)
- Overview > Innovation (0.40)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Search (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.46)
- Europe > France (0.14)
- Asia > Middle East > Jordan (0.04)
- Europe > United Kingdom (0.04)
- Research Report > Promising Solution (0.34)
- Overview > Innovation (0.34)
- Research Report > Promising Solution (0.41)
- Overview > Innovation (0.41)
Representer Point Selection via Local Jacobian Expansion for Post-hoc Classifier Explanation of Deep Neural Networks and Ensemble Models
Explaining the influence of training data on deep neural network predictions is a critical tool for debugging models through data curation. A recent tractable and appealing approach for this task was provided via the concept of Representer Point Selection (RPS), i.e. a method that leverages the dual form of $l_2$-regularized optimization in the last layer of the neural network to identify the contribution of training points to the prediction. However, two key drawbacks of RPS are that it (i) introduces disagreement between the originally trained network and its RPS-regularized modification and (ii) often yields a static ranking of training data for the same class, independent of the data being classified. Inspired by the RPS approach, we propose an alternative method based on a local Taylor expansion of the Jacobian (LJE). We empirically compared RPS-LJE with the original RPS-$l_2$ on image classification (with ResNet), text classification (with Bi-LSTM recurrent neural networks), and tabular classification (with XGBoost) tasks. Quantitatively, we show that RPS-LJE slightly outperforms RPS-$l_2$ and other state-of-the-art data explanation methods by up to 3\% on a data debugging task. Qualitatively, we observe that RPS-LJE provides individualized explanations for each test data point rather than the class-specific static ranking of points in the original approach. Overall, RPS-LJE represents a novel approach to RPS that provides a powerful tool for data-oriented explanation and debugging.
- Research Report > Promising Solution (0.59)
- Overview > Innovation (0.59)
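The representer decomposition that RPS-LJE builds on can be sketched concretely: for an $l_2$-regularized last layer at its optimum, the stationarity condition gives $w = \sum_i \alpha_i f_i$ with $\alpha_i = -\frac{1}{2\lambda n}\,L'_i$, so the test logit splits into per-training-point contributions $\alpha_i \langle f_i, f_{\text{test}}\rangle$. The numpy sketch below shows this for a logistic last layer; the setup, names, and hyperparameters are illustrative, not the paper's code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def representer_values(F, y, w, x_test, lam):
    """Per-training-point contributions to a test logit under the
    representer decomposition of an l2-regularized logistic last layer.
    F: (n, d) penultimate-layer features; y: (n,) labels in {0, 1};
    w: (d,) last-layer weights; lam: l2 regularization strength."""
    n = F.shape[0]
    residual = sigmoid(F @ w) - y          # dL/d(logit) at each train point
    alpha = -residual / (2.0 * lam * n)    # representer coefficients
    return alpha * (F @ x_test)            # (n,) contributions

# Fit the regularized last layer by gradient descent so the
# stationarity condition (and hence the decomposition) holds.
rng = np.random.default_rng(1)
n, d, lam = 40, 3, 0.1
F = rng.standard_normal((n, d))
y = (F[:, 0] > 0).astype(float)
w = np.zeros(d)
for _ in range(5000):
    grad = F.T @ (sigmoid(F @ w) - y) / n + 2 * lam * w
    w -= 0.5 * grad

x_test = rng.standard_normal(d)
contrib = representer_values(F, y, w, x_test, lam)
# At the optimum, the contributions sum exactly to the test logit.
print(np.allclose(contrib.sum(), w @ x_test, atol=1e-4))   # → True
```

Ranking training points by |contribution| gives the RPS-style explanation; the static-ranking drawback the abstract mentions arises because $\alpha_i$ does not depend on the test point, only the inner product $\langle f_i, x_{\text{test}}\rangle$ does.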