
Collaborating Authors: bordelon


Two-Point Deterministic Equivalence for Stochastic Gradient Dynamics in Linear Models

Atanasov, Alexander, Bordelon, Blake, Zavatone-Veth, Jacob A., Paquette, Courtney, Pehlevan, Cengiz

arXiv.org Machine Learning

Modern deep learning practice is governed by the surprising predictability of performance improvement with increases in the scale of data, model size, and compute [17]. Often, performance scales with these quantities according to remarkably regular power laws, termed neural scaling laws [2, 6, 12, 13, 15, 16, 18, 19, 22, 32]. Here, performance is usually measured by some differentiable loss on the model's predictions over a held-out test set representative of the population. Given the relatively universal behavior of the exponents across architectures and optimizers [11, 18, 19], one might hope that relatively simple models of information processing systems could recover the same types of scaling laws. Recent works [7, 20, 26] analyzed (stochastic) gradient descent (SGD) dynamics in random feature models, which exhibit a surprising breadth of scaling behavior and capture several interesting phenomena in deep network training. Each of these works isolated various effects that can hurt performance relative to the idealized infinite-data and infinite-model-size limits. The model was first studied in [7], where the bottlenecks due to finite width and finite dataset size were computed and, for certain data structures, resulted in a Chinchilla-type scaling result as in [18].
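To make the power-law form of a neural scaling law concrete, here is a minimal sketch of how such a law is typically estimated from loss measurements. The dataset sizes, exponent, and prefactor below are synthetic illustrations, not values from the paper; the fit simply exploits the fact that L(N) = A·N^(−α) is a straight line in log-log coordinates.

```python
import numpy as np

# Hypothetical test-loss measurements at several dataset sizes N.
# The exponent (0.34) and prefactor (2.5) are illustrative only.
N = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
loss = 2.5 * N ** (-0.34)  # synthetic losses following an exact power law

# A power law is linear in log-log space:
#   log L = log A - alpha * log N,
# so a degree-1 least-squares fit recovers the scaling exponent.
slope, intercept = np.polyfit(np.log(N), np.log(loss), 1)
alpha = -slope
A = np.exp(intercept)

print(f"fitted exponent alpha = {alpha:.3f}")
print(f"fitted prefactor A = {A:.3f}")
```

In practice, empirical scaling-law fits often include an irreducible-loss offset, L(N) = A·N^(−α) + L∞, which requires a nonlinear fit (e.g. `scipy.optimize.curve_fit`) rather than this log-log linearization.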


Will We Have to Relinquish Some Privacy for the Best AI?

#artificialintelligence

Social media giant Meta Platforms, formerly known as Facebook, is only the latest company to draw legal heat over its technology -- specifically, its artificial intelligence (AI) innovations. In this episode of "The AI/ML Show" on Motley Fool Live, recorded on Feb. 16, Fool.com contributors Toby Bordelon and Jason Hall discuss how the debate over AI versus privacy continues to rage. Toby Bordelon: We talked about data protection and privacy, I think, a decent amount with Facebook, and you can see what happens when that goes badly. If you don't follow those rules, it's $650 million with maybe more to come, and that can put a damper on what you can do. You want data to train AI well.