FraudTransformer: Time-Aware GPT for Transaction Fraud Detection
Gholamali Aminian, Andrew Elliott, Tiger Li, Timothy Cheuk Hin Wong, Victor Claude Dehon, Lukasz Szpruch, Carsten Maple, Christopher Read, Martin Brown, Gesine Reinert, Mo Mamouei
Detecting payment fraud in real-world banking streams requires models that can exploit both the order of events and the irregular time gaps between them. We introduce FraudTransformer, a sequence model that augments a vanilla GPT-style architecture with (i) a dedicated time encoder that embeds either absolute timestamps or inter-event time values, and (ii) a learned positional encoder that preserves relative order. Experiments on a large industrial dataset of tens of millions of transactions and auxiliary events show that FraudTransformer surpasses strong classical baselines (Logistic Regression, XGBoost, and LightGBM) as well as transformer ablations that omit either the time or positional component. On the held-out test set it delivers the highest AUROC and PRAUC.
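The abstract describes combining a time encoder (over timestamps or inter-event gaps) with a learned positional encoder on top of a GPT-style stack. A minimal NumPy sketch of one plausible input representation is shown below; the paper's actual architecture is not specified here, so all names, dimensions, and the sinusoidal treatment of continuous time gaps are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, D_MODEL, MAX_LEN = 1000, 64, 128

# Hypothetical parameter tables (learned in a real model; random here).
tok_emb = rng.normal(size=(VOCAB, D_MODEL))    # transaction/event-type embeddings
pos_emb = rng.normal(size=(MAX_LEN, D_MODEL))  # learned positional encoder

def time_encoding(delta_t, d_model=D_MODEL):
    """Sinusoidal encoding applied to continuous inter-event gaps
    (seconds) instead of integer positions -- one common way to make
    a transformer time-aware."""
    i = np.arange(d_model // 2)
    freqs = 1.0 / (10000.0 ** (2 * i / d_model))
    angles = np.outer(delta_t, freqs)                      # (seq_len, d_model/2)
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

def embed_sequence(token_ids, delta_t):
    """Sum of token, learned positional, and time embeddings -- the
    input that would be fed to the GPT-style decoder stack."""
    n = len(token_ids)
    return tok_emb[token_ids] + pos_emb[:n] + time_encoding(delta_t)

# Toy sequence: 5 events with irregular gaps (seconds since previous event).
tokens = np.array([3, 17, 42, 42, 9])
gaps = np.array([0.0, 30.0, 3600.0, 5.0, 86400.0])
x = embed_sequence(tokens, gaps)
print(x.shape)  # (5, 64)
```

Because the time encoding is a function of the raw gap rather than the index, two sequences with identical event orders but different pacing receive different representations, which is the property the abstract motivates.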
Sep-30-2025