Learning (Approximately) Equivariant Networks via Constrained Optimization

Manolache, Andrei, Chamon, Luiz F. O., Niepert, Mathias

arXiv.org Artificial Intelligence

Equivariant neural networks are designed to respect symmetries through their architecture, boosting generalization and sample efficiency when those symmetries are present in the data distribution. Real-world data, however, often departs from perfect symmetry because of noise, structural variation, measurement bias, or other symmetry-breaking effects. Strictly equivariant models may struggle to fit the data, while unconstrained models lack a principled way to leverage partial symmetries. Even when the data is fully symmetric, enforcing equivariance can hurt training by limiting the model to a restricted region of the parameter space. Guided by homotopy principles, in which an optimization problem is solved by gradually transforming a simpler problem into a harder one, we introduce Adaptive Constrained Equivariance (ACE), a constrained optimization approach that starts with a flexible, non-equivariant model and gradually reduces its deviation from equivariance. This gradual tightening smooths training early on and settles the model at a data-driven equilibrium between equivariance and non-equivariance. Across multiple architectures and tasks, our method consistently improves performance metrics, sample efficiency, and robustness to input perturbations compared with strictly equivariant models and heuristic equivariance relaxations.
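The idea of starting unconstrained and gradually tightening an equivariance constraint can be illustrated with a toy sketch (this is an illustrative dual-ascent scheme, not the paper's actual ACE method). The model is a line f(x) = w1*x + w0; sign-flip equivariance f(-x) = -f(x) holds exactly when w0 = 0, so |w0| measures the deviation from equivariance. An augmented-Lagrangian term with a dual variable `lam` applies growing pressure toward equivariance only as long as the deviation persists:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a nearly odd function y = 3x plus a small symmetry-breaking offset.
x = rng.uniform(-1.0, 1.0, 200)
y = 3.0 * x + 0.05

# Model f(x) = w1*x + w0. Sign-flip equivariance f(-x) = -f(x) holds iff w0 = 0,
# so w0 is this model's deviation from equivariance.
w1, w0 = 0.0, 0.0
lam, rho = 0.0, 1.0      # dual variable and augmented-Lagrangian weight (illustrative)
lr, dual_lr = 0.1, 0.05

for _ in range(3000):
    pred = w1 * x + w0
    g1 = np.mean(2.0 * (pred - y) * x)
    # Loss gradient on w0 plus the constraint terms lam + rho*w0.
    g0 = np.mean(2.0 * (pred - y)) + lam + rho * w0
    w1 -= lr * g1
    w0 -= lr * g0
    # Dual ascent: pressure toward equivariance grows while w0 != 0,
    # gradually tightening the constraint instead of enforcing it from the start.
    lam += dual_lr * w0

print(round(w1, 3), round(w0, 4), round(lam, 3))
```

Early in training the penalty is inactive and the model fits freely; the dual variable then ramps up until the deviation w0 is driven to zero, while lam settles at the value that exactly offsets the data's symmetry-breaking pull, which is the "data-driven equilibrium" flavor the abstract describes.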


Jeff Bezos' New AI Venture Quietly Acquired an Agentic Computing Startup

WIRED

Project Prometheus has raised over $6 billion in funding and hired over 100 employees, a handful of whom joined through its acquisition of General Agents, according to records and sources. In early June, tech entrepreneur Vik Bajaj took over Saison, a two-Michelin-star restaurant in San Francisco, for an off-the-record dinner to talk about AI with journalists and a handful of scientists. In attendance was Sherjil Ozair, a late addition who had previously held senior research roles at DeepMind and Tesla. The following day, Bajaj and Ozair were on their way to making a deal, public records show. Bajaj didn't mention it at the dinner, but earlier this year he had begun working with Amazon executive chairman Jeff Bezos on a new AI venture called Project Prometheus.


Mitigating Participation Imbalance Bias in Asynchronous Federated Learning

Chang, Xiangyu, Yao, Manyi, Krishnamurthy, Srikanth V., Shelton, Christian R., Chakraborty, Anirban, Swami, Ananthram, Oymak, Samet, Roy-Chowdhury, Amit

arXiv.org Artificial Intelligence

In Asynchronous Federated Learning (AFL), the central server immediately updates the global model with each arriving client's contribution. As a result, clients perform their local training on different model versions, causing information staleness (delay). In federated environments with non-IID local data distributions, this asynchronous pattern amplifies the adverse effect of client heterogeneity (due to differing data distributions, local objectives, etc.), as faster clients contribute more frequent updates, biasing the global model. We term this phenomenon heterogeneity amplification. Our work provides a theoretical analysis that maps AFL design choices to their resulting error sources when heterogeneity amplification occurs. Guided by our analysis, we propose ACE (All-Client Engagement AFL), which mitigates participation imbalance through immediate, non-buffered updates that use the latest information available from all clients. We also introduce a delay-aware variant, ACED, to balance client diversity against update staleness. Experiments with multiple models and tasks across diverse heterogeneity and delay settings validate our analysis and demonstrate the robust performance of our approaches.
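The all-client-engagement idea can be sketched in a small simulation (an illustrative toy, not the paper's implementation; the client objectives, speeds, and step sizes below are all made up). Each client holds a non-IID quadratic objective, fast clients report every round while slow ones report rarely, and the server aggregates the latest cached contribution from every client on each step, so fast clients do not dominate the update direction:

```python
import numpy as np

rng = np.random.default_rng(0)
n_clients, dim = 4, 3
# Non-IID toy objectives: client k minimizes ||w - targets[k]||^2,
# so the fair global optimum is the mean of the targets.
targets = rng.normal(size=(n_clients, dim))

w_global = np.zeros(dim)
latest = np.zeros((n_clients, dim))   # last gradient received from each client
speeds = np.array([1, 1, 4, 8])       # client k reports every speeds[k] rounds
lr = 0.02

for t in range(400):
    for k in range(n_clients):
        if t % speeds[k] == 0:        # client k's (possibly stale) update arrives
            latest[k] = 2.0 * (w_global - targets[k])
    # All-client engagement: aggregate the latest info from *every* client
    # immediately, instead of stepping with only the arriving client's gradient.
    w_global -= lr * latest.mean(axis=0)

print(np.round(w_global, 4))
print(np.round(targets.mean(axis=0), 4))
```

Stepping with only each arriving gradient would pull the model toward the fast clients' targets; averaging the cached per-client entries keeps all clients represented at the cost of using stale gradients from slow ones. A delay-aware variant in the spirit of ACED could additionally down-weight entries in `latest` by their age, trading client diversity against staleness.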