Reviews: Episodic Memory in Lifelong Language Learning

Neural Information Processing Systems

The paper addresses the important topic of lifelong learning and proposes an episodic memory to avoid catastrophic forgetting. The memory uses a key-value representation built on a BERT-based encoder-decoder architecture. Training is performed on the concatenation of different datasets, with no need to specify dataset identifiers. The work is highly significant and the contribution is notably novel. One point that would have deserved more attention is the strategies for reading from and writing to the episodic memory (see also comments below).
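The key-value memory the review describes can be sketched minimally as below. This is an illustrative assumption of the design (class and method names are hypothetical, and a plain dot-product nearest-neighbor read stands in for the paper's exact retrieval), not the authors' implementation:

```python
import numpy as np

class EpisodicMemory:
    """Minimal key-value memory: keys are encoder vectors, values are examples."""

    def __init__(self, key_dim):
        self.keys = np.empty((0, key_dim))  # one row per stored example
        self.values = []                    # raw (input, label) pairs

    def write(self, key, example):
        # Write every incoming example; real systems may subsample (see below).
        self.keys = np.vstack([self.keys, key])
        self.values.append(example)

    def read(self, query, k=4):
        # Retrieve the k most similar stored examples by dot product.
        scores = self.keys @ query
        top = np.argsort(-scores)[:k]
        return [self.values[i] for i in top]

# Toy usage with random vectors standing in for encoder outputs.
mem = EpisodicMemory(key_dim=8)
rng = np.random.default_rng(0)
for i in range(20):
    mem.write(rng.normal(size=8), (f"text {i}", i % 3))
neighbors = mem.read(rng.normal(size=8), k=4)
```

The retrieved neighbors are what a local-adaptation step would fine-tune on before answering a query.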


Reviews: Episodic Memory in Lifelong Language Learning

Neural Information Processing Systems

This paper proposes the use of memory in lifelong learning to prevent catastrophic forgetting by means of experience replay and local adaptation. The idea is simple, yet it is an interesting new step in this line of work. The paper would be a good addition to the conference and has support from the reviewers.


How Relevant is Selective Memory Population in Lifelong Language Learning?

Araujo, Vladimir, Balabin, Helena, Hurtado, Julio, Soto, Alvaro, Moens, Marie-Francine

arXiv.org Artificial Intelligence

Lifelong language learning seeks to have models continuously learn multiple tasks in a sequential order without suffering from catastrophic forgetting. State-of-the-art approaches rely on sparse experience replay as the primary approach to prevent forgetting. Experience replay usually adopts sampling methods for the memory population; however, the effect of the chosen sampling strategy on model performance has not yet been studied. In this paper, we investigate how relevant the selective memory population is in the lifelong learning process of text classification and question-answering tasks. We found that methods that randomly store a uniform number of samples from the entire data stream lead to high performance, especially at low memory sizes, which is consistent with findings in computer vision.
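A standard way to keep a uniform random subset of a stream of unknown length, as the random-population baselines above require, is reservoir sampling. This is a generic sketch of that technique, not necessarily the exact sampler used in the paper:

```python
import random

def reservoir_sample(stream, capacity, seed=0):
    """Keep a uniform random sample of `capacity` items from a stream."""
    rng = random.Random(seed)
    memory = []
    for i, item in enumerate(stream):
        if i < capacity:
            memory.append(item)
        else:
            # Each item replaces a stored one with probability capacity/(i+1),
            # which keeps every item equally likely to remain in memory.
            j = rng.randrange(i + 1)
            if j < capacity:
                memory[j] = item
    return memory

sample = reservoir_sample(range(10_000), capacity=100)
```

The appeal for lifelong learning is that the memory footprint is fixed in advance while coverage of the whole stream stays uniform, regardless of how many tasks arrive.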


Lifelong Machine Learning of Functionally Compositional Structures

Mendez, Jorge A.

arXiv.org Artificial Intelligence

A hallmark of human intelligence is the ability to construct self-contained chunks of knowledge and reuse them in novel combinations for solving different problems. Learning such compositional structures has been a challenge for artificial systems, due to the underlying combinatorial search. To date, research into compositional learning has largely proceeded separately from work on lifelong or continual learning. This dissertation integrated these two lines of work to present a general-purpose framework for lifelong learning of functionally compositional structures. The framework separates the learning into two stages: learning how to combine existing components to assimilate a novel problem, and learning how to adapt the existing components to accommodate the new problem. This separation explicitly handles the trade-off between stability and flexibility. This dissertation instantiated the framework into various supervised and reinforcement learning (RL) algorithms. Supervised learning evaluations found that 1) compositional models improve lifelong learning of diverse tasks, 2) the multi-stage process permits lifelong learning of compositional knowledge, and 3) the components learned by the framework represent self-contained and reusable functions. Similar RL evaluations demonstrated that 1) algorithms under the framework accelerate the discovery of high-performing policies, and 2) these algorithms retain or improve performance on previously learned tasks. The dissertation extended one lifelong compositional RL algorithm to the nonstationary setting, where the task distribution varies over time, and found that modularity permits individually tracking changes to different elements in the environment. The final contribution of this dissertation was a new benchmark for compositional RL, which exposed that existing methods struggle to discover the compositional properties of the environment.
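The two-stage separation the abstract describes (assimilate a new problem with components frozen, then accommodate it by adapting the components) could be sketched generically as below. All names are hypothetical, and the toy structure search and adaptation callbacks stand in for real learners:

```python
class Component:
    """A reusable module; `frozen` marks whether its parameters may change."""
    def __init__(self, name):
        self.name = name
        self.frozen = False

def lifelong_step(components, learn_structure, adapt_components, task):
    # Stage 1 (assimilation): freeze components and learn only how to
    # combine them for the new task, protecting earlier knowledge.
    for c in components:
        c.frozen = True
    structure = learn_structure(components, task)
    # Stage 2 (accommodation): with the structure fixed, unfreeze and
    # adapt the components to incorporate the new task.
    for c in components:
        c.frozen = False
    adapt_components(components, structure, task)
    return structure

components = [Component("encoder"), Component("classifier")]
structure = lifelong_step(
    components,
    learn_structure=lambda cs, t: [c.name for c in cs],  # toy: use all components
    adapt_components=lambda cs, s, t: None,              # toy: no-op adaptation
    task="task-0",
)
```

The explicit freeze/unfreeze split is what realizes the stability-flexibility trade-off the dissertation highlights: stage 1 cannot corrupt old components, and stage 2 cannot rewire the task's structure.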


Episodic Memory in Lifelong Language Learning

de Masson d'Autume, Cyprien, Ruder, Sebastian, Kong, Lingpeng, Yogatama, Dani

Neural Information Processing Systems

We introduce a lifelong language learning setup where a model needs to learn from a stream of text examples without any dataset identifier. We propose an episodic memory model that performs sparse experience replay and local adaptation to mitigate catastrophic forgetting in this setup. Experiments on text classification and question answering demonstrate the complementary benefits of sparse experience replay and local adaptation to allow the model to continuously learn from new datasets. We also show that the space complexity of the episodic memory module can be reduced significantly (50-90%) by randomly choosing which examples to store in memory with a minimal decrease in performance. We consider an episodic memory component as a crucial building block of general linguistic intelligence and see our model as a first step in that direction.
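The training side of sparse experience replay can be sketched as a stream loop that only occasionally revisits memory. This is a schematic, assuming a fixed replay interval and uniform sampling from memory; the interval, batch size, and store-everything policy are illustrative, not the paper's exact settings:

```python
import random

def train_with_sparse_replay(stream, train_step, memory,
                             replay_every=100, replay_batch=8, seed=0):
    """Train on a stream, occasionally replaying stored examples."""
    rng = random.Random(seed)
    for step, example in enumerate(stream, start=1):
        train_step([example])       # ordinary online update on the new example
        memory.append(example)      # here: store everything; could subsample
        if step % replay_every == 0 and len(memory) >= replay_batch:
            # Sparse replay: one extra update on a random batch from memory,
            # amortized over many online steps to keep the cost low.
            train_step(rng.sample(memory, replay_batch))

# Toy run: record the batch size of every update instead of training a model.
batch_sizes = []
memory = []
train_with_sparse_replay(range(500), lambda batch: batch_sizes.append(len(batch)), memory)
```

Replay being sparse (one memory batch per hundred stream examples here) is what keeps the overhead small; the local-adaptation half of the method acts separately at inference time, fine-tuning briefly on retrieved neighbors of each test query.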