invariant manifold
Compositional Symmetry as Compression: Lie Pseudogroup Structure in Algorithmic Agents
In the algorithmic (Kolmogorov) view, agents are programs that track and compress sensory streams using generative programs. We propose a framework where the relevant structural prior is simplicity (Solomonoff) understood as \emph{compositional symmetry}: natural streams are well described by (local) actions of finite-parameter Lie pseudogroups on geometrically and topologically complex low-dimensional configuration manifolds (latent spaces). Modeling the agent as a generic neural dynamical system coupled to such streams, we show that accurate world-tracking imposes (i) \emph{structural constraints} -- equivariance of the agent's constitutive equations and readouts -- and (ii) \emph{dynamical constraints}: under static inputs, symmetry induces conserved quantities (Noether-style labels) in the agent dynamics and confines trajectories to reduced invariant manifolds; under slow drift, these manifolds move but remain low-dimensional. This yields a hierarchy of reduced manifolds aligned with the compositional factorization of the pseudogroup, providing a geometric account of the ``blessing of compositionality'' in deep models. We connect these ideas to the Spencer formalism for Lie pseudogroups and formulate a symmetry-based, self-contained version of predictive coding in which higher layers receive only \emph{coarse-grained residual transformations} (prediction-error coordinates) along symmetry directions unresolved at lower layers.
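To make the structural constraint (i) concrete, here is a minimal illustrative sketch of equivariance for the simplest one-parameter group, circular shifts acting on 1-D signals. The abstract concerns general finite-parameter Lie pseudogroups, so this toy (names and setup are my own, not the paper's) only shows the defining commutation property.

```python
import numpy as np

# Illustrative only: equivariance (constraint (i)) for the simplest
# one-parameter group -- circular shifts acting on 1-D signals. A circular
# convolution commutes with the shift action, conv(g.x) = g.conv(x);
# a generic dense map would not.

rng = np.random.default_rng(0)
x = rng.standard_normal(16)   # "sensory" signal
k = rng.standard_normal(5)    # convolution kernel (an equivariant readout)

def conv(x):
    n = len(x)
    return np.array([sum(k[j] * x[(i + j) % n] for j in range(len(k)))
                     for i in range(n)])

shift = lambda x: np.roll(x, 1)  # the group action g

print(np.allclose(conv(shift(x)), shift(conv(x))))  # True: f(g.x) = g.f(x)
```

The same check fails for a random dense map, which is why equivariance is a genuine constraint on the agent's constitutive equations rather than an automatic property.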
Modified Loss of Momentum Gradient Descent: Fine-Grained Analysis
Cattaneo, Matias D., Shigida, Boris
We analyze gradient descent with Polyak heavy-ball momentum (HB) whose fixed momentum parameter $\beta \in (0, 1)$ provides exponential decay of memory. Building on Kovachki and Stuart (2021), we prove that on an exponentially attractive invariant manifold the algorithm is exactly plain gradient descent with a modified loss, provided that the step size $h$ is small enough. Although the modified loss does not admit a closed-form expression, we describe it with arbitrary precision and prove global (finite "time" horizon) approximation bounds $O(h^{R})$ for any finite order $R \geq 2$. We then conduct a fine-grained analysis of the combinatorics underlying the memoryless approximations of HB, in particular uncovering a rich family of polynomials in $\beta$ that contains the Eulerian and Narayana polynomials. We derive continuous modified equations of arbitrary approximation order (with rigorous bounds) and the principal flow that approximates the HB dynamics, generalizing Rosca et al. (2023). Approximation theorems cover both full-batch and mini-batch HB. Our theoretical results shed new light on the main features of gradient descent with heavy-ball momentum, and outline a roadmap for similar analysis of other optimization algorithms.
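The memoryless picture can be previewed numerically at the crudest order. The sketch below (a toy of my own, not the paper's construction) runs heavy-ball momentum on a quadratic and compares it with plain gradient descent at the rescaled step size $h/(1-\beta)$, the simplest memoryless approximation of HB; the higher-order corrections bounded in the paper shrink the remaining gap.

```python
# Toy preview (assumed example, not the paper's construction): Polyak
# heavy-ball momentum on the quadratic f(x) = x^2/2, compared with plain
# gradient descent at the rescaled step size h/(1-beta) -- the simplest
# memoryless approximation of HB on its attractive invariant manifold.

def grad(x):
    return x  # gradient of f(x) = x^2 / 2

h, beta, steps = 0.01, 0.5, 500
x_hb, v = 1.0, 0.0   # heavy-ball state: position and velocity
x_gd = 1.0           # plain GD with the rescaled step

for _ in range(steps):
    v = beta * v - h * grad(x_hb)
    x_hb += v
    x_gd -= (h / (1 - beta)) * grad(x_gd)

# After the initial transient, the two trajectories nearly coincide.
print(abs(x_hb - x_gd))
```

The step size is kept small relative to $(1-\beta)^2$ so that the invariant manifold exists (real eigenvalues of the HB recursion); for aggressive steps the oscillatory regime sets in and the comparison degrades.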
Learning Optimal Control and Dynamical Structure of Global Trajectory Search Problems with Diffusion Models
Graebner, Jannik, Li, Anjian, Sinha, Amlan, Beeson, Ryne
Spacecraft trajectory design is a global search problem, where previous work has revealed specific solution structures that can be captured with data-driven methods. This paper explores two global search problems in the circular restricted three-body problem: a hybrid minimum-fuel/time-of-flight cost function, and transfers to energy-dependent invariant manifolds. Each problem exhibits fundamental structure, either in the optimal control profile or in the dynamical structures exploited. We build on our prior generative machine learning framework, applying diffusion models to learn the conditional probability distribution of the search problem, and we analyze the model's capability to capture these structures.
Connectivity Shapes Implicit Regularization in Matrix Factorization Models for Matrix Completion
Bai, Zhiwei, Zhao, Jiajie, Zhang, Yaoyu
Matrix factorization models have been extensively studied as a valuable test-bed for understanding the implicit biases of overparameterized models. Although both low nuclear norm and low rank regularization have been studied for these models, a unified understanding of when, how, and why they achieve different implicit regularization effects remains elusive. In this work, we systematically investigate the implicit regularization of matrix factorization for solving matrix completion problems. We empirically discover that the connectivity of observed data plays a crucial role in the implicit bias, with a transition from low nuclear norm to low rank as data shifts from disconnected to connected with increased observations. We identify a hierarchy of intrinsic invariant manifolds in the loss landscape that guide the training trajectory to evolve from low-rank to higher-rank solutions. Based on this finding, we theoretically characterize the training trajectory as following the hierarchical invariant manifold traversal process, generalizing the characterization of Li et al. (2020) to include the disconnected case. Furthermore, we establish conditions that guarantee minimum nuclear norm, closely aligning with our experimental findings, and we provide a dynamics characterization condition for ensuring minimum rank. Our work reveals the intricate interplay between data connectivity, training dynamics, and implicit regularization in matrix factorization models.
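The hierarchical traversal from low-rank to higher-rank solutions can be seen in a miniature experiment. The sketch below (an assumed setup of my own, not the paper's experiments) runs gradient descent on a two-factor model from a small initialization; singular directions grow only as needed to fit the observed entries.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's experiments): gradient
# descent on a two-factor model U @ V.T for matrix completion from a
# small initialization. Higher-rank directions grow only when needed to
# fit the observations, a miniature of hierarchical invariant manifold
# traversal in the loss landscape.

rng = np.random.default_rng(0)
n, r = 10, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-2 target
mask = rng.random((n, n)) < 0.6                                # observed entries

U = 1e-3 * rng.standard_normal((n, n))   # small, roughly balanced init
V = 1e-3 * rng.standard_normal((n, n))
lr = 0.02
for _ in range(4000):
    R = mask * (U @ V.T - M)             # residual on observed entries only
    U, V = U - lr * (R @ V), V - lr * (R.T @ U)

s = np.linalg.svd(U @ V.T, compute_uv=False)
# Typically the leading two singular values dominate: the trained model is
# close to rank 2 even though the square factors could express full rank.
print(s[:4])
```

With fewer observations (a disconnected observation pattern), the abstract's finding suggests the bias shifts toward low nuclear norm rather than low rank; the sketch above only probes the connected, well-observed regime.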
Machine-learning invariant foliations in forced systems for reduced order modelling
We identify reduced-order models (ROMs) of forced systems from data using invariant foliations. The forcing can be external, parametric, periodic or quasi-periodic. The process has four steps: 1. identify an approximate invariant torus and the linear dynamics about the torus; 2. identify a globally defined invariant foliation about the torus; 3. identify a local foliation about an invariant manifold that complements the global foliation; 4. extract the invariant manifold as the leaf going through the torus and interpret the result. We combine steps 2 and 3 so that we can track the location of the invariant torus and scale the invariance equations appropriately. We highlight some fundamental limitations of invariant manifolds and foliations when fitting them to data, which require further mathematics to resolve.
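The relationship between foliation leaves and the invariant manifold is easiest to see in a linear toy (my own example, far simpler than the paper's forced nonlinear setting). For a linear map with slow and fast eigenvalues, left eigenvectors give the foliation's leaf coordinates and the slow right eigenvector spans the invariant manifold, the leaf through the fixed point; below both are recovered from noiseless trajectory data by least squares, loosely mirroring steps 2–4.

```python
import numpy as np

# Toy linear illustration (assumed example): for x -> A x with slow/fast
# eigenvalues, left eigenvectors define the invariant foliation (leaf
# coordinates that evolve autonomously) and the slow right eigenvector
# spans the invariant manifold -- the leaf through the fixed point.

A = np.array([[0.95, 0.10],
              [0.00, 0.20]])          # slow mode 0.95, fast mode 0.20
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 200))
Y = A @ X                             # one-step data pairs (x_k, x_{k+1})

A_fit = Y @ np.linalg.pinv(X)         # least-squares identification from data
evals, Rv = np.linalg.eig(A_fit)
Lv = np.linalg.inv(Rv)                # rows: left eigenvectors
slow = np.argmax(np.abs(evals))
manifold_dir = Rv[:, slow]            # tangent to the slow invariant manifold
leaf_coord = Lv[slow]                 # foliation coordinate u = leaf_coord @ x

# u evolves autonomously: leaf_coord @ A_fit = evals[slow] * leaf_coord
print(np.allclose(leaf_coord @ A_fit, evals[slow] * leaf_coord))
```

In the paper's setting both objects are nonlinear and anchored to an invariant torus rather than a fixed point, which is exactly why the global/local foliation split and the data-scaling issues arise.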
Learning effective dynamics from data-driven stochastic systems
Feng, Lingyu, Gao, Ting, Dai, Min, Duan, Jinqiao
Numerous complex systems in science, engineering, chemistry and materials science exhibit multiscale properties in their dynamical evolution [1-4]. By considering models at different scales simultaneously, we would like to obtain both the efficiency of macroscopic models and the accuracy of microscopic models. For example, approaches in chemistry often couple quantum-mechanical models in the reaction region with classical molecular models elsewhere [5]. Moreover, since noisy observations arise in all kinds of systems from internal or external factors, stochastic dynamical systems play an important role in modeling such phenomena, and it is therefore of great importance to study multiscale stochastic dynamical systems [5, 6]. To better understand the intrinsic nature of such complex systems, researchers usually investigate their effective dynamics, such as invariant manifolds, global attractors, tipping points, noise-induced bifurcations, transition pathways, and so on [7-11]. These dynamical behaviors capture the fundamental structures as the system evolves over time or parameter space.