
Decomposing the Metaverse Digital Twins NFTs Infrastructure and architecture

#artificialintelligence

"Metaverse" is composed of two words: "Meta" and "Verse". To better understand the complex topic of the metaverse, it is worth looking at the etymological origin of the ancient Greek word "meta". Its original meaning is "beyond", "transcending", "after", or "behind". Metaphysics and metacognition are examples of words that use the prefix "meta". Metacognition, for example, means having an awareness and understanding of one's own thought processes.


Decomposing your complex AI problem: Hierarchy

#artificialintelligence

Problem worlds often come with an innate hierarchy. Naturally, this may prompt the question: which level(s) of the hierarchy should be modelled? For example, the US stock market can be modelled as a whole, at the index level (think the Dow Jones), or at the level of individual stocks. In a linear system, the lower levels interact with the upper levels "linearly", i.e. they are directly correlated. Take the example of an analytics system for business intelligence and reporting: sales, inventories, etc.
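The linear case described above can be made concrete with a minimal sketch: when an upper level of the hierarchy (an index) is a weighted sum of the lower level (individual stocks), a change at the lower level propagates linearly upward, so either level can be modelled consistently. All numbers and weights below are made up for illustration.

```python
import numpy as np

# Hypothetical lower level: three made-up stock prices.
stock_prices = np.array([150.0, 320.0, 45.0])
# Hypothetical index weights (an assumption, not real index data).
weights = np.array([0.5, 0.3, 0.2])

# Upper level derived linearly from the lower level.
index_level = weights @ stock_prices

# A 10% shock to the first stock propagates linearly to the index.
shocked = stock_prices * np.array([1.10, 1.00, 1.00])
shocked_index = weights @ shocked

print(index_level, shocked_index)
```

Because the aggregation is linear, the shocked index equals the old index plus the weighted shock, which is what makes modelling at either level interchangeable in this setting.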


Decomposing the Prediction Problem; Autonomous Navigation by neoRL Agents

Leikanger, Per R.

arXiv.org Artificial Intelligence

Navigating the world is a fundamental ability for any living entity. Accomplishing the same degree of freedom in technology has proven to be difficult. The brain is the only known mechanism capable of voluntary navigation, making neuroscience our best source of inspiration toward autonomy. Assuming that state representation is key, we explore the difference in how the brain and the machine represent the navigational state. Where Reinforcement Learning (RL) requires a monolithic state representation in accordance with the Markov property, Neural Representation of Euclidean Space (NRES) reflects navigational state via distributed activation patterns. We show how NRES-Oriented RL (neoRL) agents are possible before verifying our theoretical findings by experiments. Ultimately, neoRL agents are capable of behavior synthesis across state spaces -- allowing for decomposition of the problem into smaller spaces, alleviating the curse of dimensionality.


On Decomposing the Proximal Map

Yu, Yao-Liang

Neural Information Processing Systems

The proximal map is the key step in gradient-type algorithms, which have become prevalent in large-scale high-dimensional problems. For simple functions this proximal map is available in closed-form while for more complicated functions it can become highly nontrivial. Motivated by the need of combining regularizers to simultaneously induce different types of structures, this paper initiates a systematic investigation of when the proximal map of a sum of functions decomposes into the composition of the proximal maps of the individual summands. We not only unify a few known results scattered in the literature but also discover several new decompositions obtained almost effortlessly from our theory.
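One well-known instance of such a decomposition (used here purely as an illustration, not as a claim about the paper's specific results) is the elastic-net penalty: the proximal map of the sum λ‖·‖₁ + (μ/2)‖·‖² equals the proximal map of the quadratic term composed with soft thresholding. The sketch below checks this numerically against a brute-force grid minimization.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal map of lam * ||.||_1 (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def prox_squared(x, mu):
    """Proximal map of (mu/2) * ||.||^2 (a simple rescaling)."""
    return x / (1.0 + mu)

def prox_sum_bruteforce(x, lam, mu):
    """Coordinate-wise grid search for the prox of lam*|z| + (mu/2)*z^2."""
    grid = np.linspace(-5.0, 5.0, 200001)
    out = np.empty_like(x)
    for i, xi in enumerate(x):
        obj = lam * np.abs(grid) + 0.5 * mu * grid**2 + 0.5 * (grid - xi) ** 2
        out[i] = grid[np.argmin(obj)]
    return out

x = np.array([-2.0, -0.3, 0.1, 1.5])
lam, mu = 0.5, 2.0

# Composition of the two individual prox maps...
composed = prox_squared(soft_threshold(x, lam), mu)
# ...agrees with the prox of the sum, up to grid resolution.
brute = prox_sum_bruteforce(x, lam, mu)
print(np.max(np.abs(composed - brute)))
```

The maximum discrepancy printed is on the order of the grid spacing, consistent with the closed form prox(x) = soft(x, λ) / (1 + μ); note the order of composition matters, which is exactly the kind of question the paper's theory addresses.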