

Sparsely Changing Latent States for Prediction and Planning in Partially Observable Domains

Neural Information Processing Systems

A common approach to prediction and planning in partially observable domains is to use recurrent neural networks (RNNs), which ideally develop and maintain a latent memory about hidden, task-relevant factors. We hypothesize that many of these hidden factors in the physical world are constant over time, changing only sparsely. To study this hypothesis, we propose Gated $L_0$ Regularized Dynamics (GateL0RD), a novel recurrent architecture that incorporates the inductive bias to maintain stable, sparsely changing latent states. The bias is implemented by means of a novel internal gating function and a penalty on the $L_0$ norm of latent state changes. We demonstrate that GateL0RD can compete with or outperform state-of-the-art RNNs in a variety of partially observable prediction and control tasks. GateL0RD tends to encode the underlying generative factors of the environment, ignores spurious temporal dependencies, and generalizes better, improving sampling efficiency and overall performance in model-based planning and reinforcement learning tasks. Moreover, we show that the developing latent states can be easily interpreted, which is a step towards better explainability in RNNs.
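
The core mechanism described here lends itself to a compact illustration. Below is a minimal PyTorch-style sketch of a recurrent cell with this inductive bias: a rectified-tanh gate (the ReTanh gate named in the supplementary material below) decides, per latent dimension, whether the state is rewritten or copied forward unchanged. The class and parameter names (SparseGatedCell, noise_std) are illustrative assumptions, not the authors' published implementation.

    import torch
    import torch.nn as nn

    def retanh(x):
        # Rectified tanh, max(0, tanh(x)): outputs exact zeros for x <= 0,
        # so a sparsity penalty on the gate can actually be driven to zero.
        return torch.tanh(x).clamp(min=0.0)

    class SparseGatedCell(nn.Module):
        """Minimal sketch of a recurrent cell with sparsely changing latent state."""
        def __init__(self, input_size, hidden_size, noise_std=0.1):
            super().__init__()
            self.gate = nn.Linear(input_size + hidden_size, hidden_size)       # gate network
            self.candidate = nn.Linear(input_size + hidden_size, hidden_size)  # candidate update
            self.noise_std = noise_std

        def forward(self, x, h):
            z = torch.cat([x, h], dim=-1)
            pre = self.gate(z)
            if self.training:  # assumed: noise during training encourages decisive, binary-like gating
                pre = pre + self.noise_std * torch.randn_like(pre)
            lam = retanh(pre)                     # gate in [0, 1), exactly 0 when closed
            h_new = lam * torch.tanh(self.candidate(z)) + (1.0 - lam) * h
            return h_new, lam                     # gate values are returned for the sparsity penalty

Because retanh outputs exact zeros, a closed gate leaves its latent dimension bit-identical across time steps, which is what makes a penalty on the number of state changes attainable in the first place.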



Supplementary Material for: Sparsely Changing Latent States for Prediction and Planning in Partially Observable Domains (A: Relation to other RNNs)

Neural Information Processing Systems

In Sec. 3 we set out to create an RNN that maintains sparsely changing latent states. LSTMs use two gates, i.e. a forget gate and an input gate, both with sigmoid activation functions. Nonetheless, it is not straightforward to apply our approach, outlined in Sec. 3, to GRUs and LSTMs: our loss (see Eq. 5) penalizes non-zero gate activations, so their gating functions would need to be modified or replaced, e.g. by our ReTanh gate. GateL0RD attempts to overcome the outlined downsides of LSTMs and GRUs with our proposed latent state regularization. Secs. B.2 - B.5 provide further details specific to each experiment, e.g. the scheduled sampling used for the Billiard Ball task. The best learning rates for all experiments are listed in Table 1.
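
The point about sigmoid gates can be made concrete in a few lines. A sigmoid is strictly positive for any finite input, so a penalty on non-zero gate activations can never be fully satisfied by a sigmoid-gated cell; a rectified tanh, by contrast, outputs exact zeros over half its domain. A small comparison in plain PyTorch (a sketch for illustration, not code from the paper):

    import torch

    def retanh(x):
        # max(0, tanh(x)): a closed gate is exactly zero
        return torch.tanh(x).clamp(min=0.0)

    x = torch.linspace(-3, 3, 7)
    print(torch.sigmoid(x))  # strictly positive everywhere: a sigmoid gate never fully closes
    print(retanh(x))         # exact zeros for x <= 0: a sparsity penalty can reach its minimum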




Developing hierarchical anticipations via neural network-based event segmentation

Gumbsch, Christian, Adam, Maurits, Elsner, Birgit, Martius, Georg, Butz, Martin V.

arXiv.org Artificial Intelligence

Humans can make predictions on various time scales and hierarchical levels, and the learning of event encodings seems to play a crucial role in this ability. In this work we model the development of hierarchical predictions via autonomously learned latent event codes. We present a hierarchical recurrent neural network architecture whose inductive learning biases foster the development of sparsely changing latent states that compress sensorimotor sequences. A higher-level network learns to predict the situations in which the latent states tend to change. Using a simulated robotic manipulator, we demonstrate that the system (i) learns latent states that accurately reflect the event structure of the data, (ii) develops meaningful temporally abstract predictions on the higher level, and (iii) generates goal-anticipatory behavior similar to the gaze behavior found in eye-tracking studies with infants. The architecture offers a step towards the autonomous learning of compressed hierarchical encodings of gathered experiences and the exploitation of these encodings to generate adaptive behavior.
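
To make the event-segmentation idea tangible: if the lower-level network exposes its gate activations over time, event boundaries fall exactly at the time steps where some latent dimension is rewritten. The following toy sketch (the helper event_boundaries is hypothetical, not from the paper) shows how such a gate trace could be segmented:

    import torch

    def event_boundaries(gates, eps=0.0):
        """Given gate activations of shape (T, hidden) from a sparsely gated RNN,
        return the time steps at which any latent dimension is rewritten."""
        changed = (gates > eps).any(dim=-1)          # True where the latent state changes
        return changed.nonzero(as_tuple=True)[0]     # indices of candidate event boundaries

    # toy gate trace: the latent state changes only at t = 2 and t = 5
    gates = torch.zeros(8, 4)
    gates[2, 1] = 0.9
    gates[5, 0] = 0.7
    print(event_boundaries(gates))  # tensor([2, 5])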


Sparsely Changing Latent States for Prediction and Planning in Partially Observable Domains

Gumbsch, Christian, Butz, Martin V., Martius, Georg

arXiv.org Artificial Intelligence

A common approach to prediction and planning in partially observable domains is to use recurrent neural networks (RNNs), which ideally develop and maintain a latent memory about hidden, task-relevant factors. We hypothesize that many of these hidden factors in the physical world are constant over time, changing only sparsely. To study this hypothesis, we propose Gated $L_0$ Regularized Dynamics (GateL0RD), a novel recurrent architecture that incorporates the inductive bias to maintain stable, sparsely changing latent states. The bias is implemented by means of a novel internal gating function and a penalty on the $L_0$ norm of latent state changes. We demonstrate that GateL0RD can compete with or outperform state-of-the-art RNNs in a variety of partially observable prediction and control tasks. GateL0RD tends to encode the underlying generative factors of the environment, ignores spurious temporal dependencies, and generalizes better, improving sampling efficiency and overall performance in model-based planning and reinforcement learning tasks. Moreover, we show that the developing latent states can be easily interpreted, which is a step towards better explainability in RNNs.
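
For completeness, the regularization term itself can be sketched. The $L_0$ norm counts non-zero gate activations and is non-differentiable, so training needs a surrogate gradient; a straight-through estimator is one common choice. This is a hedged sketch of that generic trick, and the weight name reg_weight is an assumption, not necessarily the estimator or notation used by the authors:

    import torch

    def gate_l0_penalty(lam):
        """Differentiable surrogate for the L0 norm of gate activations.

        The hard count (lam > 0) has zero gradient almost everywhere, so a
        straight-through estimator is used: the forward pass counts open gates,
        the backward pass lets gradients flow through the gate values themselves.
        """
        hard = (lam > 0).float()                    # 1 where the gate is open, 0 elsewhere
        return (hard + lam - lam.detach()).mean()   # forward value = hard count, grad flows via lam

    # total loss sketch: prediction error plus the weighted sparsity penalty
    # loss = mse(pred, target) + reg_weight * gate_l0_penalty(lam)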