Could AI Data Centers Be Moved to Outer Space?

WIRED

Massive data centers for generative AI are bad for the Earth. Data centers are being built at a frantic pace all over the world, driven by the AI boom. These facilities consume staggering amounts of electricity. By 2028, AI servers alone may use as much energy as 22 percent of US households.


AI-Newton: A Concept-Driven Physical Law Discovery System without Prior Physical Knowledge

Fang, You-Le, Jian, Dong-Shan, Li, Xiang, Ma, Yan-Qing

arXiv.org Artificial Intelligence

Advances in artificial intelligence (AI) have made AI-driven scientific discovery a highly promising new paradigm [1]. Although AI has achieved remarkable results in tackling domain-specific challenges [2, 3], the ultimate aspiration from a paradigm-shifting perspective still lies in developing reliable AI systems capable of autonomous scientific discovery directly from a large collection of raw data without supervision [4, 5]. Current approaches to automated physics discovery focus on individual experiments, employing either neural network (NN)-based methods [6-25] or symbolic techniques [26-33]. By analyzing data from a single experiment, these methods can construct a specific model capable of predicting future data from the same experiment; if sufficiently simple, such a model may even be expressed in symbolic form [34-36]. Although these methods represent a crucial and successful stage towards automated scientific discovery, they have not yet reached a discovery capacity comparable to that of human physicists.


Energy-Conserving Neural Network Closure Model for Long-Time Accurate and Stable LES

van Gastelen, Toby, Edeling, Wouter, Sanderse, Benjamin

arXiv.org Artificial Intelligence

Machine learning-based closure models for LES have shown promise in capturing complex turbulence dynamics but often suffer from instabilities and physical inconsistencies. In this work, we develop a novel skew-symmetric neural architecture as closure model that enforces stability while preserving key physical conservation laws. Our approach leverages a discretization that ensures mass, momentum, and energy conservation, along with a face-averaging filter to maintain mass conservation in coarse-grained velocity fields. We compare our model against several conventional data-driven closures (including unconstrained convolutional neural networks), and the physics-based Smagorinsky model. Performance is evaluated on decaying turbulence and Kolmogorov flow for multiple coarse-graining factors. In these test cases we observe that unconstrained machine learning models suffer from numerical instabilities. In contrast, our skew-symmetric model remains stable across all tests, though at the cost of increased dissipation. Despite this trade-off, we demonstrate that our model still outperforms the Smagorinsky model in unseen scenarios. These findings highlight the potential of structure-preserving machine learning closures for reliable long-time LES.
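The stability guarantee rests on a simple linear-algebra fact: if the learned closure term has the form c(u) = K(u) u with K skew-symmetric, then u · c(u) = 0, so the closure redistributes energy among resolved scales without ever injecting any. The sketch below illustrates that property in isolation; it is not the authors' architecture (the raw matrix B stands in for a neural network's output), just a minimal demonstration of the skew-symmetry trick.

```python
import numpy as np

def skew_closure(u, B):
    # B: raw (n x n) matrix, standing in for the output of a learned
    # network evaluated at state u. Forming K = B - B.T makes K exactly
    # skew-symmetric, so the closure term c = K @ u satisfies u . c = 0:
    # it cannot add kinetic energy to the resolved field.
    K = B - B.T
    return K @ u

rng = np.random.default_rng(0)
n = 8
u = rng.standard_normal(n)            # coarse-grained state
B = rng.standard_normal((n, n))       # stand-in for a NN output
c = skew_closure(u, B)
print(abs(u @ c))                     # zero up to round-off
```

Because the energy identity holds for any B, the network can be trained freely while stability is enforced by construction, which is why the constrained model stays stable where unconstrained CNN closures blow up.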


A Variational Manifold Embedding Framework for Nonlinear Dimensionality Reduction

Vastola, John J., Gershman, Samuel J., Rajan, Kanaka

arXiv.org Artificial Intelligence

Dimensionality reduction algorithms like principal component analysis (PCA) are workhorses of machine learning and neuroscience, but each has well-known limitations. Variants of PCA are simple and interpretable, but not flexible enough to capture nonlinear data manifold structure. More flexible approaches have other problems: autoencoders are generally difficult to interpret, and graph-embedding-based methods can produce pathological distortions in manifold geometry. Motivated by these shortcomings, we propose a variational framework that casts dimensionality reduction algorithms as solutions to an optimal manifold embedding problem. By construction, this framework permits nonlinear embeddings, allowing its solutions to be more flexible than PCA. Moreover, the variational nature of the framework has useful consequences for interpretability: each solution satisfies a set of partial differential equations, and can be shown to reflect symmetries of the embedding objective. We discuss these features in detail and show that solutions can be analytically characterized in some cases. Interestingly, one special case exactly recovers PCA.
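The PCA special case is easy to verify numerically: the optimal linear embedding is spanned by the top eigenvectors of the data covariance, and the mean squared reconstruction error equals the sum of the discarded eigenvalues. The snippet below checks that identity on synthetic data; it illustrates only the linear case the framework recovers, not the variational machinery itself.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))
Xc = X - X.mean(axis=0)               # center the data
C = Xc.T @ Xc / len(Xc)               # sample covariance

# eigh returns eigenvalues in ascending order; take the top k
eigvals, eigvecs = np.linalg.eigh(C)
k = 2
V = eigvecs[:, ::-1][:, :k]           # top-k principal directions
Z = Xc @ V                            # k-dimensional embedding

# mean squared reconstruction error = sum of discarded eigenvalues
err = ((Xc - Z @ V.T) ** 2).sum() / len(Xc)
print(err, eigvals[::-1][k:].sum())   # the two values agree
```

This eigenvalue identity is exactly what a nonlinear embedding must beat to justify its extra flexibility, which is one reason recovering PCA as a limiting case is a useful sanity check on the framework.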


PIPHEN: Physical Interaction Prediction with Hamiltonian Energy Networks

Chen, Kewei, Long, Yayu, Shang, Mingsheng

arXiv.org Artificial Intelligence

Multi-robot systems in complex physical collaborations face a "shared brain dilemma": transmitting high-dimensional multimedia data (e.g., video streams at ~30MB/s) creates severe bandwidth bottlenecks and decision-making latency. To address this, we propose PIPHEN, an innovative distributed physical cognition-control framework. Its core idea is to replace "raw data communication" with "semantic communication" by performing "semantic distillation" at the robot edge, reconstructing high-dimensional perceptual data into compact, structured physical representations. This idea is primarily realized through two key components: (1) a novel Physical Interaction Prediction Network (PIPN), derived from large model knowledge distillation, to generate this representation; and (2) a Hamiltonian Energy Network (HEN) controller, based on energy conservation, to precisely translate this representation into coordinated actions. Experiments show that, compared to baseline methods, PIPHEN can compress the information representation to less than 5% of the original data volume and reduce collaborative decision-making latency from 315ms to 76ms, while significantly improving task success rates. This work provides a fundamentally efficient paradigm for resolving the "shared brain dilemma" in resource-constrained multi-robot systems.
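The abstract does not give the HEN controller's equations, but the appeal of Hamiltonian structure in control is standard: if dynamics follow Hamilton's equations and are integrated symplectically, the system's energy is (near-)conserved rather than numerically drifting. The toy below shows this on a harmonic oscillator with a leapfrog step; it is an illustration of the general principle, not PIPHEN's controller.

```python
import numpy as np

def hamiltonian(q, p, k=1.0, m=1.0):
    # toy separable Hamiltonian: kinetic p^2/(2m) + potential k q^2 / 2
    return p**2 / (2 * m) + k * q**2 / 2

def leapfrog(q, p, dt, k=1.0, m=1.0):
    # symplectic (kick-drift-kick) step: follows the Hamiltonian flow,
    # so energy stays bounded over long horizons instead of drifting
    p = p - 0.5 * dt * k * q
    q = q + dt * p / m
    p = p - 0.5 * dt * k * q
    return q, p

q, p = 1.0, 0.0
H0 = hamiltonian(q, p)
for _ in range(1000):
    q, p = leapfrog(q, p, 0.01)
drift = abs(hamiltonian(q, p) - H0)
print(drift)  # stays small after 1000 steps
```

For a controller, this conservation property is what makes compact physical representations trustworthy over long rollouts: the dynamics model cannot silently gain or lose energy.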



New wolf snake honors the late Steve Irwin

Popular Science

Lycodon irwini is the latest species named after The Crocodile Hunter. Breakthroughs, discoveries, and DIY tips sent every weekday. Conservationists have discovered a previously unknown species of snake, slithering around one of Earth's most unique environments. In naming their new reptile, researchers decided to honor one of popular culture's most unique and beloved wildlife educators: the late, great Steve Irwin. The snake was discovered in the Nicobar Islands.