1b33d16fc562464579b7199ca3114982-AuthorFeedback.pdf
We would like to thank all the reviewers for their effort and their thoughtful comments.
- Being formal, it should be "the gradient associated to the pullback of [...]".
- We will change it to "on which standard convergence results still apply".
- Thm 4.3: We will change "is equivalent" to [...].
- The same can be said about higher-order methods. We chose not to mention them in the main paper for simplicity.
- In l.138 we do mean "in almost all the manifold" in a measure-theoretical sense, with respect to a measure induced by [...]. These two things indeed deserve a clarifying footnote.
Towards Automatic Identification of Globally Valid Geometric Flat Outputs via Numerical Optimization
Differential flatness enables efficient planning and control for underactuated robotic systems, but we lack a systematic and practical means of identifying a flat output (or determining whether one exists) for an arbitrary robotic system. In this work, we leverage recent results elucidating the role of symmetry in constructing flat outputs for free-flying robotic systems. Using the tools of Riemannian geometry, Lie group theory, and differential forms, we cast the search for a globally valid, equivariant flat output as an optimization problem. An approximate transcription of this continuum formulation to a quadratic program is performed, and its solutions for two example systems achieve precise agreement with the known closed-form flat outputs. Our results point towards a systematic, automated approach to numerically identify geometric flat outputs directly from the system model, particularly useful when complexity renders pen-and-paper analysis intractable.
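To make the notion of a flat output concrete, here is a toy illustration that is not from the paper: for the standard unicycle model, position is a flat output, and the remaining state and input are algebraic functions of the flat output and its derivatives, so planning reduces to choosing a smooth curve in flat-output space. The trajectory below is an assumed example.

```python
import numpy as np

# Toy differential-flatness sketch (assumed example, not the paper's code):
# unicycle dynamics  xdot = v*cos(theta),  ydot = v*sin(theta).
# The position (x, y) is a flat output: heading theta and speed v are
# recovered algebraically from the flat output's time derivatives.

t = np.linspace(0.0, 2.0 * np.pi, 400)
x, y = np.cos(t), np.sin(t)        # desired flat-output (position) trajectory
xd, yd = -np.sin(t), np.cos(t)     # its analytic time derivatives

theta = np.arctan2(yd, xd)         # heading, recovered from the flat output
v = np.hypot(xd, yd)               # speed, recovered from the flat output

# the recovered state and input reproduce the dynamics exactly
residual = max(np.max(np.abs(xd - v * np.cos(theta))),
               np.max(np.abs(yd - v * np.sin(theta))))
```

For this circular trajectory the recovered speed is constant, and the dynamics residual vanishes up to floating-point rounding, which is exactly the property that makes flatness-based planning attractive.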
The Role of Symmetry in Constructing Geometric Flat Outputs for Free-Flying Robotic Systems
Jake Welde, Matthew D. Kvalheim, Vijay Kumar
Mechanical systems naturally evolve on principal bundles describing their inherent symmetries. The ensuing factorization of the configuration manifold into a symmetry group and an internal shape space has provided deep insights into the locomotion of many robotic and biological systems. On the other hand, the property of differential flatness has enabled efficient, effective planning and control algorithms for various robotic systems. Yet, a practical means of finding a flat output for an arbitrary robotic system remains an open question. In this work, we demonstrate surprising new connections between these two domains, for the first time employing symmetry directly to construct a flat output. We provide sufficient conditions for the existence of a trivialization of the bundle in which the group variables themselves are a flat output. We call this a geometric flat output, since it is equivariant (i.e. maintains the symmetry) and is often global or almost-global, properties not typically enjoyed by other flat outputs. In such a trivialization, the motion planning problem is easily solved, since a given trajectory for the group variables will fully determine the trajectory for the shape variables that exactly achieves this motion. We provide a partial catalog of robotic systems with geometric flat outputs and worked examples for the planar rocket, planar aerial manipulator, and quadrotor.
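The claim that a trajectory for the group variables determines the full motion echoes the classical quadrotor flatness result of Mellinger and Kumar (position and yaw form a flat output). A minimal numpy sketch of that well-known relation, offered as an illustrative assumption rather than the paper's catalog code:

```python
import numpy as np

# Hypothetical sketch of the classical quadrotor flatness relation (not the
# paper's code): given a desired position trajectory, the body z-axis
# (thrust direction) follows algebraically from the mass-normalized thrust
# f = r_ddot + g*e3, so attitude need not be planned separately.

g = 9.81
t = np.linspace(0.0, 1.0, 100)
r_ddot = np.stack([np.zeros_like(t),        # x acceleration
                   np.zeros_like(t),        # y acceleration
                   np.full_like(t, 2.0)])   # climb at 2 m/s^2
f = r_ddot + np.array([[0.0], [0.0], [g]])  # add gravity compensation
b3 = f / np.linalg.norm(f, axis=0)          # unit thrust direction per sample
```

For this pure vertical climb the thrust axis stays aligned with the world z-axis, as expected; a lateral acceleration would tilt b3 accordingly.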
Trivializations for Gradient-Based Optimization on Manifolds
We introduce a framework to study the transformation of problems with manifold constraints into unconstrained problems through parametrizations in terms of a Euclidean space. We call these parametrizations "trivializations". We prove conditions under which a trivialization is sound in the context of gradient-based optimization and we show how two large families of trivializations have overall favorable properties, but also suffer from a performance issue. We then introduce "dynamic trivializations", which solve this problem, and we show how these form a family of optimization methods that lie between trivializations and Riemannian gradient descent, and combine the benefits of both of them. We then show how to implement these two families of trivializations in practice for different matrix manifolds. To this end, we prove a formula for the gradient of the exponential of matrices, which can be of practical interest on its own. Finally, we show how dynamic trivializations improve the performance of existing methods on standard tasks designed to test long-term memory within neural networks.
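One standard trivialization of this kind can be sketched numerically. The following is an illustrative assumption, not the authors' implementation: it parametrizes SO(n) as phi(A) = expm(A - Aᵀ) and pulls the Euclidean gradient back through the matrix exponential using SciPy's Fréchet-derivative routine together with the identity that, for the Frobenius inner product, the adjoint of the Fréchet derivative of expm at S is the Fréchet derivative at Sᵀ. The target matrix, step size, and iteration count are arbitrary demo choices.

```python
import numpy as np
from scipy.linalg import expm, expm_frechet

# Illustrative sketch (not the paper's code): the exponential trivialization
# phi(A) = expm(A - A^T) maps any square matrix onto SO(n), so a problem
# constrained to rotations becomes unconstrained gradient descent over A.
# The chain rule needs the adjoint of the Frechet derivative of expm; for
# the Frobenius inner product this is the Frechet derivative at S^T.

rng = np.random.default_rng(0)
n = 3
target = np.eye(n)[[1, 2, 0]]          # an even permutation: a rotation in SO(3)
A = 0.1 * rng.standard_normal((n, n))  # unconstrained Euclidean parameter

lr = 0.05
for _ in range(4000):
    S = A - A.T                        # skew-symmetric lift
    Q = expm(S)                        # always orthogonal with det = +1
    G = 2.0 * (Q - target)             # Euclidean gradient of ||Q - target||_F^2
    _, gS = expm_frechet(S.T, G)       # adjoint: pull G back through expm
    gA = gS - gS.T                     # chain rule through A |-> A - A^T
    A -= lr * gA

loss = float(np.sum((expm(A - A.T) - target) ** 2))
```

Every iterate Q is exactly orthogonal by construction, which is the soundness property the framework asks of a trivialization; the descent itself runs in the flat parameter A with ordinary (non-Riemannian) steps.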