SCALAR: Self-Calibrating Adaptive Latent Attention Representation Learning
Farwa Abbas, Hussain Ahmad, Claudia Szabo
arXiv.org Artificial Intelligence
High-dimensional, heterogeneous data with complex feature interactions pose significant challenges for traditional predictive modeling approaches. While Projection to Latent Structures (PLS) remains a popular technique, it struggles to model complex non-linear relationships, especially in multivariate systems with high-dimensional correlation structures. This challenge is further compounded by simultaneous interactions across multiple scales, where local processing fails to capture cross-group dependencies. Additionally, static feature weighting limits adaptability to contextual variations because it ignores sample-specific relevance. To address these limitations, we propose a method that enhances predictive performance through novel architectural innovations. Our architecture introduces an adaptive kernel-based attention mechanism that processes distinct feature groups separately before integration, capturing local patterns while preserving global relationships. Experimental results show substantial improvements in performance metrics over state-of-the-art methods across diverse datasets.
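The abstract's core idea, sample-adaptive, kernel-based attention applied per feature group before a global integration step, can be illustrated with a minimal NumPy sketch. This is a hypothetical reconstruction from the abstract alone, not the authors' implementation: the RBF kernel, the prototype vectors, and the linear integration layer are all assumptions standing in for unspecified design details.

```python
import numpy as np

def rbf_attention_weights(x_group, prototype, gamma=1.0):
    """Sample-specific attention: RBF-kernel similarity of each feature
    to a group prototype, normalized to sum to 1 per sample."""
    sim = np.exp(-gamma * (x_group - prototype) ** 2)
    return sim / sim.sum(axis=-1, keepdims=True)

def grouped_attention_forward(x, groups, prototypes, w_out, gamma=1.0):
    """Process each feature group separately (local patterns),
    then concatenate and project (global integration)."""
    parts = []
    for idx, proto in zip(groups, prototypes):
        xg = x[:, idx]
        attn = rbf_attention_weights(xg, proto, gamma)  # adapts per sample
        parts.append(attn * xg)                         # reweighted local view
    z = np.concatenate(parts, axis=1)                   # preserve all groups
    return z @ w_out                                    # global linear head

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 6))                   # 4 samples, 6 features
groups = [np.arange(0, 3), np.arange(3, 6)]   # two feature groups (assumed)
prototypes = [np.zeros(3), np.zeros(3)]       # placeholder learned prototypes
w_out = rng.normal(size=(6, 1))               # placeholder integration weights
y = grouped_attention_forward(x, groups, prototypes, w_out)
print(y.shape)
```

Because the attention weights depend on each sample's own feature values, the weighting varies per input, in contrast to the static feature weighting the abstract criticizes.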
Oct-21-2025
- Country:
  - Europe > United Kingdom > England
    - Greater London > London (0.04)
    - Oxfordshire > Oxford (0.04)
  - Oceania > Australia
    - South Australia > Adelaide (0.04)
- Genre:
  - Research Report
    - New Finding (0.66)
    - Promising Solution (0.54)