Inference-Time Decomposition of Activations (ITDA): A Scalable Approach to Interpreting Large Language Models
Patrick Leask, Neel Nanda, Noura Al Moubayed
arXiv.org Artificial Intelligence
Sparse autoencoders (SAEs) are a popular method for decomposing large language model (LLM) activations into interpretable latents. However, due to their substantial training cost, most academic research uses open-source SAEs, which are only available for a restricted set of models of up to 27B parameters. SAE latents are also learned from a dataset of activations, which means they do not transfer between models. Motivated by relative representation similarity measures, we introduce Inference-Time Decomposition of Activations (ITDA) models, an alternative method for decomposing language model activations. To train an ITDA, we greedily construct a dictionary of language model activations on a dataset of prompts, selecting those activations which were worst approximated by matching pursuit on the existing dictionary. ITDAs can be trained in just 1% of the time required for SAEs, using 1% of the data. This allowed us to train ITDAs on Llama-3.1 70B and 405B on a single consumer GPU. ITDAs can achieve reconstruction performance similar to SAEs on some target LLMs, but generally incur a performance penalty. However, ITDA dictionaries enable cross-model comparisons, and a simple Jaccard similarity index on ITDA dictionaries outperforms existing methods like CKA, SVCCA, and relative representation similarity metrics. ITDAs provide a cheap alternative to SAEs where computational resources are limited, or when cross-model comparisons are necessary. Code available at https://github.com/pleask/itda.
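The abstract's greedy dictionary construction can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes a relative-error threshold decides when an activation is "worst approximated" and added as a new atom, and that cross-model comparison takes the Jaccard index over the prompt/token indices whose activations were selected; the actual selection rule, batching, and hyperparameters may differ, and all function names here are hypothetical.

```python
import numpy as np

def matching_pursuit(x, dictionary, k=8):
    """Greedily approximate x as a sparse combination of unit-norm dictionary atoms."""
    residual = x.copy()
    recon = np.zeros_like(x)
    used = []
    for _ in range(min(k, len(dictionary))):
        scores = dictionary @ residual              # inner products with the current residual
        idx = int(np.argmax(np.abs(scores)))        # best-matching atom
        recon += scores[idx] * dictionary[idx]
        residual -= scores[idx] * dictionary[idx]
        used.append(idx)
    return recon, used

def train_itda(activations, error_threshold=0.1, k=8):
    """Greedily build an ITDA dictionary from a stream of activations (sketch).

    An activation is added (unit-normalised) whenever matching pursuit on the
    current dictionary reconstructs it poorly. The threshold criterion is an
    assumption for illustration.
    """
    atoms = [activations[0] / np.linalg.norm(activations[0])]
    selected = [0]
    for i, x in enumerate(activations[1:], start=1):
        recon, _ = matching_pursuit(x, np.stack(atoms), k=k)
        rel_error = np.linalg.norm(x - recon) / (np.linalg.norm(x) + 1e-8)
        if rel_error > error_threshold:
            atoms.append(x / np.linalg.norm(x))
            selected.append(i)
    return np.stack(atoms), selected

def jaccard_similarity(selected_a, selected_b):
    """Jaccard index over the activation indices chosen as atoms by two models
    (one plausible reading of the cross-model comparison described above)."""
    a, b = set(selected_a), set(selected_b)
    return len(a & b) / len(a | b)
```

Because no encoder or decoder is learned, "training" reduces to a single pass of matching pursuit over the activation dataset, which is consistent with the reported 1% training cost relative to SAEs.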
Jun-13-2025