

There Will Be a Scientific Theory of Deep Learning

Simon, Jamie, Kunin, Daniel, Atanasov, Alexander, Boix-Adserà, Enric, Bordelon, Blake, Cohen, Jeremy, Ghosh, Nikhil, Guth, Florentin, Jacot, Arthur, Kamb, Mason, Karkada, Dhruva, Michaud, Eric J., Ottlik, Berkan, Turnbull, Joseph

arXiv.org Machine Learning

In this paper, we make the case that a scientific theory of deep learning is emerging. By this we mean a theory which characterizes important properties and statistics of the training process, hidden representations, final weights, and performance of neural networks. We pull together major strands of ongoing research in deep learning theory and identify five growing bodies of work that point toward such a theory: (a) solvable idealized settings that provide intuition for learning dynamics in realistic systems; (b) tractable limits that reveal insights into fundamental learning phenomena; (c) simple mathematical laws that capture important macroscopic observables; (d) theories of hyperparameters that disentangle them from the rest of the training process, leaving simpler systems behind; and (e) universal behaviors shared across systems and settings which clarify which phenomena call for explanation. Taken together, these bodies of work share certain broad traits: they are concerned with the dynamics of the training process; they primarily seek to describe coarse aggregate statistics; and they emphasize falsifiable quantitative predictions. We argue that the emerging theory is best thought of as a mechanics of the learning process, and suggest the name learning mechanics. We discuss the relationship between this mechanics perspective and other approaches for building a theory of deep learning, including the statistical and information-theoretic perspectives. In particular, we anticipate a symbiotic relationship between learning mechanics and mechanistic interpretability. We also review and address common arguments that fundamental theory will not be possible or is not important. We conclude with a portrait of important open directions in learning mechanics and advice for beginners. We host further introductory materials, perspectives, and open questions at learningmechanics.pub.



Games with loot boxes to get minimum 16 age rating across Europe

BBC News

Games which feature loot boxes will soon be given an age rating of 16 across Europe, including in the UK, under a host of changes by the European video game ratings organisation. The Pan-European Game Information body (PEGI)'s age ratings are displayed on games sold in the UK and other countries in Europe to indicate their suitability for children of different ages. Loot boxes are an in-game feature allowing players to buy random mystery items with real or virtual currency, but recent research has found they blur the line between gaming and gambling. The new ratings, taking effect from June, could see games containing loot box systems, such as EA Sports FC, receive a much higher age rating. The PEGI system is used in 38 countries to help consumers and particularly parents make informed decisions about the games they purchase.



Thermodynamic Isomorphism of Transformers: A Lagrangian Approach to Attention Dynamics

Kim, Gunn

arXiv.org Machine Learning

We propose an effective field-theoretic framework for analyzing Transformer attention through a thermodynamic lens. By constructing a Lagrangian on the information manifold equipped with the Fisher metric, we show that, within the Shannon--Boltzmann entropy framework, the Softmax function arises as a stationary solution minimizing a Helmholtz free energy functional. This establishes a formal correspondence between scaled dot-product attention and canonical ensemble statistics. Extending this mapping to macroscopic observables, we define an effective specific heat associated with fluctuations of the attention energy landscape. In controlled experiments on the modular addition task ($p = 19$--$113$), we observe a robust peak in this fluctuation measure that consistently precedes the onset of generalization. While no asymptotic power-law divergence is detected in this finite-depth regime, the reproducible enhancement of energy variance suggests a critical-like crossover accompanying representational reorganization. Our framework provides a unified statistical-mechanical perspective on attention scaling, training dynamics, and positional encoding, interpreting the phenomena as emergent properties of an effective thermodynamic system rather than isolated heuristics. Although the present results indicate finite-size crossover behavior rather than a strict phase transition, they motivate further investigation into scaling limits of deep architectures through fluctuation-based observables.
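The abstract's central claim — that softmax is the stationary solution minimizing a Helmholtz free energy — is a standard fact from statistical mechanics that can be checked numerically. The sketch below (not the paper's code; names and the toy setup are illustrative) verifies that the Boltzmann distribution, i.e. a softmax of negated energies, attains a lower free energy F(p) = ⟨E⟩ − T·S(p) than randomly drawn alternative distributions. In attention terms, the "energies" would be the negated scaled dot-products between a query and the keys.

```python
import numpy as np

def free_energy(p, energies, T=1.0):
    """Helmholtz free energy F(p) = <E>_p - T * S(p) over discrete states."""
    p = np.asarray(p, dtype=float)
    entropy = -np.sum(p * np.log(p))
    return np.dot(p, energies) - T * entropy

def softmax(x):
    """Numerically stable softmax."""
    z = np.exp(x - np.max(x))
    return z / z.sum()

rng = np.random.default_rng(0)
energies = rng.normal(size=5)  # toy energy levels (illustrative)
T = 1.0

# Canonical (Boltzmann) distribution: p_i proportional to exp(-E_i / T),
# which is exactly a softmax of the negated, temperature-scaled energies.
p_star = softmax(-energies / T)
f_star = free_energy(p_star, energies, T)

# Every perturbed distribution should have free energy >= the softmax's.
for _ in range(1000):
    q = rng.dirichlet(np.ones(5))
    assert free_energy(q, energies, T) >= f_star - 1e-9
```

This is the generic variational characterization of the canonical ensemble; the paper's contribution is mapping attention onto it and tracking fluctuation observables during training.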




d6ef5f7fa914c19931a55bb262ec879c-Paper.pdf

Neural Information Processing Systems

A recently proposed class of models attempts to learn latent dynamics from high-dimensional observations, like images, using priors informed by Hamiltonian mechanics.
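The structural prior such models build into the latent space is energy-conserving Hamiltonian flow. A minimal, generic illustration (not this paper's model; the harmonic-oscillator setup is an assumption for the demo) is a symplectic leapfrog integrator for H(q, p) = p²/2 + V(q), which approximately conserves energy over long rollouts:

```python
import numpy as np

def leapfrog(q, p, grad_V, dt=0.01, steps=100):
    """Leapfrog (symplectic) integration of Hamiltonian dynamics
    with H(q, p) = p^2 / 2 + V(q). Symplectic integrators conserve
    energy approximately over long horizons, which is the physical
    prior Hamiltonian latent-dynamics models exploit."""
    p = p - 0.5 * dt * grad_V(q)   # initial half-step for momentum
    for _ in range(steps - 1):
        q = q + dt * p             # full position step
        p = p - dt * grad_V(q)     # full momentum step
    q = q + dt * p
    p = p - 0.5 * dt * grad_V(q)   # final half-step for momentum
    return q, p

# Harmonic oscillator: V(q) = q^2 / 2, so grad_V(q) = q.
q, p = leapfrog(1.0, 0.0, lambda q: q, dt=0.01, steps=628)
energy = 0.5 * p**2 + 0.5 * q**2   # should stay near the initial 0.5
```

In the learned-dynamics setting, grad_V would come from a neural network; the integrator's structure, not the potential, is what encodes the mechanics prior.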