Alternatives To Next Token Prediction In Text Generation -- A Survey
Wyatt, Charlie, Joshi, Aditya, Salim, Flora
arXiv.org Artificial Intelligence
The paradigm of Next Token Prediction (NTP) has driven the unprecedented success of Large Language Models (LLMs), but it is also the source of their most persistent weaknesses, such as poor long-term planning, error accumulation, and computational inefficiency. Acknowledging the growing interest in this direction, this survey describes the emerging ecosystem of alternatives to NTP. We categorise these approaches into five main families: (1) Multi-Token Prediction, which targets a block of future tokens instead of a single one; (2) Plan-then-Generate, where a global, high-level plan is created upfront to guide token-level decoding; (3) Latent Reasoning, which shifts the autoregressive process itself into a continuous latent space; (4) Continuous Generation Approaches, which replace sequential generation with iterative, parallel refinement through diffusion, flow matching, or energy-based methods; and (5) Non-Transformer Architectures, which sidestep NTP through their inherent model structure. By synthesising insights across these methods, this survey offers a taxonomy to guide research into models that address the known limitations of token-level generation and to inform the development of new, transformative models for natural language processing.
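To make the first family concrete, the sketch below illustrates the core idea of Multi-Token Prediction: a shared trunk computes one hidden state, and k independent output heads each predict one of the next k tokens in parallel, rather than a single next token. All sizes, weights, and function names here are illustrative assumptions, not drawn from any specific model in the survey.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, hidden, k = 50, 16, 4  # assumed toy sizes, not from the survey

# Shared trunk weights and one output head per future-token offset.
W_trunk = rng.standard_normal((hidden, hidden)) * 0.1
W_heads = rng.standard_normal((k, vocab, hidden)) * 0.1

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def predict_block(h):
    """Given the current hidden state h, return k next-token distributions."""
    z = np.tanh(W_trunk @ h)   # shared trunk computation, done once
    logits = W_heads @ z       # (k, vocab): one logit row per head
    return softmax(logits)     # one distribution per future position

h0 = rng.standard_normal(hidden)
probs = predict_block(h0)
print(probs.shape)  # (4, 50): distributions over the next 4 tokens at once
```

The key design point the sketch shows is amortisation: the trunk runs once per block, and the k heads read the same hidden state, which is what lets a single forward pass propose several future tokens in parallel.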
Sep-30-2025