AF-MAT: Aspect-aware Flip-and-Fuse xLSTM for Aspect-based Sentiment Analysis

Adamu Lawan, Juhua Pu, Haruna Yunusa, Muhammad Lawan, Mahmoud Basi, Muhammad Adam

arXiv.org Artificial Intelligence 

Aspect-based Sentiment Analysis (ABSA) is a crucial NLP task that extracts fine-grained opinions and sentiments from text such as product reviews and customer feedback. Existing methods often trade off efficiency against performance: traditional LSTM or RNN models struggle to capture long-range dependencies, transformer-based methods are computationally costly, and Mamba-based approaches rely on custom CUDA kernels and weaken local dependency modeling. The recently proposed Extended Long Short-Term Memory (xLSTM) model offers a promising alternative, effectively capturing long-range dependencies through exponential gating and two enhanced memory variants: sLSTM for modeling local dependencies and mLSTM for scalable, parallelizable memory. However, xLSTM's application to ABSA remains unexplored. To address this gap, we introduce Aspect-aware Flip-and-Fuse xLSTM (AF-MAT), a framework that leverages xLSTM's strengths. AF-MAT features an Aspect-aware matrix LSTM (AA-mLSTM) mechanism that introduces a dedicated aspect gate, enabling the model to selectively emphasize tokens semantically relevant to the target aspect during memory updates. To model multi-scale context, we incorporate a FlipMix block that sequentially applies a partially flipped Conv1D (pf-Conv1D) to capture short-range dependencies in reverse order, followed by a fully flipped mLSTM (ff-mLSTM) to model long-range dependencies via full sequence reversal. Additionally, we propose MC2F, a lightweight Multihead Cross-Feature Fusion module based on mLSTM gating that dynamically fuses AA-mLSTM outputs (queries and keys) with FlipMix outputs (values) for adaptive representation integration. Experiments on three benchmark datasets demonstrate that AF-MAT outperforms state-of-the-art baselines, achieving higher accuracy on ABSA tasks.
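
The abstract specifies the AA-mLSTM only at a high level. As a rough illustration of where a dedicated aspect gate could sit in an mLSTM-style matrix-memory update, here is a minimal PyTorch sketch; the class name, the gate parameterization, and the pooled aspect embedding are assumptions made for illustration, not the authors' implementation, and the log-space stabilization that xLSTM uses for its exponential input gate is omitted for brevity.

    import torch
    import torch.nn as nn

    class AspectGatedMLSTMCell(nn.Module):  # hypothetical name
        """mLSTM-style cell with an extra aspect gate a_t that scales the
        write to the matrix memory, so that tokens semantically close to
        the target aspect contribute more to the update (assumed form)."""
        def __init__(self, d_model: int):
            super().__init__()
            self.W_q = nn.Linear(d_model, d_model)   # query projection
            self.W_k = nn.Linear(d_model, d_model)   # key projection
            self.W_v = nn.Linear(d_model, d_model)   # value projection
            self.w_i = nn.Linear(d_model, 1)         # input gate (exponential)
            self.w_f = nn.Linear(d_model, 1)         # forget gate
            self.w_a = nn.Linear(2 * d_model, 1)     # aspect gate (assumption)

        def forward(self, x, aspect, C, n):
            # x: (B, d) current token; aspect: (B, d) pooled aspect embedding
            # C: (B, d, d) matrix memory; n: (B, d) normalizer state
            q, k, v = self.W_q(x), self.W_k(x), self.W_v(x)
            i_t = torch.exp(self.w_i(x))              # (B, 1), unstabilized
            f_t = torch.sigmoid(self.w_f(x))          # (B, 1)
            a_t = torch.sigmoid(self.w_a(torch.cat([x, aspect], dim=-1)))
            # The aspect gate scales the outer-product write v k^T.
            C = f_t.unsqueeze(-1) * C \
                + (i_t * a_t).unsqueeze(-1) * torch.einsum('bd,be->bde', v, k)
            n = f_t * n + (i_t * a_t) * k
            denom = (n * q).sum(-1, keepdim=True).abs().clamp(min=1.0)
            h = torch.einsum('bde,be->bd', C, q) / denom  # normalized readout
            return h, C, n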
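The FlipMix block is likewise described only in words, and "partially flipped" is ambiguous. The sketch below assumes one plausible reading: reverse the sequence for half of the channels before a depthwise Conv1D (pf-Conv1D), then reverse the whole sequence around an mLSTM layer and restore order afterwards (ff-mLSTM). Both readings, and the module names, are guesses from the abstract.

    import torch
    import torch.nn as nn

    class FlipMix(nn.Module):
        def __init__(self, d_model: int, mlstm: nn.Module, kernel_size: int = 3):
            super().__init__()
            # Depthwise Conv1D for short-range mixing; 'same' padding keeps length.
            self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                                  padding=kernel_size // 2, groups=d_model)
            self.mlstm = mlstm  # any (B, T, d) -> (B, T, d) sequence module

        def forward(self, x):               # x: (B, T, d)
            half = x.size(-1) // 2
            # pf-Conv1D: flip time for half the channels only (assumed reading).
            x_pf = torch.cat([x[..., :half], x[..., half:].flip(dims=[1])], dim=-1)
            x_pf = self.conv(x_pf.transpose(1, 2)).transpose(1, 2)
            # ff-mLSTM: process the fully reversed sequence, then flip back.
            return self.mlstm(x_pf.flip(dims=[1])).flip(dims=[1])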
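Finally, a hypothetical sketch of MC2F. Queries and keys are drawn from the AA-mLSTM stream and values from the FlipMix stream, as the abstract states; a standard multihead cross-attention with a per-head sigmoid gate stands in for the paper's mLSTM-based gating, whose exact form the abstract does not give.

    import torch
    import torch.nn as nn

    class MC2F(nn.Module):
        def __init__(self, d_model: int, n_heads: int = 4):
            super().__init__()
            assert d_model % n_heads == 0
            self.h, self.dk = n_heads, d_model // n_heads
            self.W_q = nn.Linear(d_model, d_model)
            self.W_k = nn.Linear(d_model, d_model)
            self.W_v = nn.Linear(d_model, d_model)
            self.gate = nn.Linear(d_model, n_heads)  # per-head fusion gate
            self.out = nn.Linear(d_model, d_model)

        def forward(self, a, f):  # a: AA-mLSTM out, f: FlipMix out, (B, T, d)
            B, T, _ = a.shape
            def split(t):         # (B, T, d) -> (B, h, T, dk)
                return t.view(B, T, self.h, self.dk).transpose(1, 2)
            q, k = split(self.W_q(a)), split(self.W_k(a))   # from AA-mLSTM
            v = split(self.W_v(f))                          # from FlipMix
            attn = torch.softmax(q @ k.transpose(-1, -2) / self.dk ** 0.5, dim=-1)
            fused = attn @ v                                # (B, h, T, dk)
            g = torch.sigmoid(self.gate(a)).transpose(1, 2).unsqueeze(-1)
            fused = g * fused + (1 - g) * split(a)          # gated residual mix
            return self.out(fused.transpose(1, 2).reshape(B, T, -1))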