Scripts & Frames

Continual Learning with Tiny Episodic Memories Machine Learning

Learning with less supervision is a major challenge in artificial intelligence. One sensible approach to decrease the amount of supervision is to leverage prior experience and transfer knowledge from tasks seen in the past. However, a necessary condition for successful transfer is the ability to remember how to perform previous tasks. The Continual Learning (CL) setting, whereby an agent learns from a stream of tasks without seeing any example twice, is an ideal framework to investigate how to accrue such knowledge. In this work, we consider supervised learning tasks and methods that leverage a very small episodic memory for continual learning. Through an extensive empirical analysis across four benchmark datasets adapted to CL, we observe that a very simple baseline, which jointly trains on examples from the current task and examples stored in the memory, outperforms state-of-the-art CL approaches with and without episodic memory. Surprisingly, repeated learning over tiny episodic memories does not harm generalization on past tasks, as joint training on data from subsequent tasks acts like a data-dependent regularizer. We discuss and evaluate different approaches to write into the memory. Most notably, reservoir sampling works remarkably well across the board, except when the memory size is extremely small; in that case, writing strategies that guarantee an equal representation of all classes work better. Overall, these methods should be considered a strong baseline candidate when benchmarking new CL approaches.
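The reservoir-sampling write strategy highlighted in this abstract is easy to state concretely. A minimal sketch follows (function and variable names are illustrative, not taken from the paper); it guarantees that after seeing n examples, each one sits in the memory with equal probability capacity / n:

```python
import random

def reservoir_update(memory, capacity, example, n_seen):
    """Classic reservoir-sampling write.

    memory   -- list holding the stored examples (mutated in place)
    capacity -- maximum number of examples the memory may hold
    example  -- the new example arriving from the task stream
    n_seen   -- how many examples have been seen so far, including this one
    """
    if len(memory) < capacity:
        # Memory not full yet: always store the example.
        memory.append(example)
    else:
        # Replace a random slot with probability capacity / n_seen.
        j = random.randint(0, n_seen - 1)  # uniform over [0, n_seen)
        if j < capacity:
            memory[j] = example
```

Because each stored example is equally likely to come from anywhere in the stream, classes that appear in few examples may vanish from a very small memory, which matches the abstract's observation that class-balanced writing works better at tiny capacities.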

Fused Gromov-Wasserstein distance for structured objects: theoretical foundations and mathematical properties Machine Learning

Optimal transport theory has recently found many applications in machine learning thanks to its capacity to compare various machine learning objects considered as distributions. The Kantorovich formulation, leading to the Wasserstein distance, focuses on the features of the elements of the objects but treats them independently, whereas the Gromov-Wasserstein distance focuses only on the relations between the elements, depicting the structure of the object yet discarding its features. In this paper we propose to extend these distances in order to encode both feature and structure information simultaneously, resulting in the Fused Gromov-Wasserstein distance. We develop the mathematical framework for this novel distance, prove its metric and interpolation properties, and provide a concentration result for the convergence of finite samples. We also illustrate and interpret its use in various contexts where structured objects are involved.
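For concreteness, one commonly cited discrete formulation of the Fused Gromov-Wasserstein objective can be sketched as follows (notation is ours, not quoted from the abstract). For objects with feature distances $M_{ij} = d(a_i, b_j)$, structure matrices $C_1, C_2$, and histograms $h, g$:

```latex
\mathrm{FGW}_{q,\alpha}(\mu,\nu)
  \;=\; \min_{\pi \in \Pi(h,g)}
  \sum_{i,j,k,l}
  \Big( (1-\alpha)\, M_{ij}^{\,q}
        \;+\; \alpha\, \big| C_1(i,k) - C_2(j,l) \big|^{q} \Big)\,
  \pi_{ij}\,\pi_{kl}
```

The trade-off parameter $\alpha \in [0,1]$ interpolates between a purely feature-based (Wasserstein-like) cost at $\alpha = 0$ and a purely structural (Gromov-Wasserstein) cost at $\alpha = 1$, which is what lets the distance encode both kinds of information at once.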

Episodic memory for continual model learning Machine Learning

Both the human brain and artificial learning agents operating in real-world or comparably complex environments are faced with the challenge of online model selection. In principle this challenge can be overcome: hierarchical Bayesian inference provides a principled method for model selection and it converges on the same posterior for both offline (i.e. batch) and online learning. However, maintaining a parameter posterior for each model in parallel generally has an even higher memory cost than storing the entire data set and is consequently clearly infeasible. Alternatively, maintaining only a limited set of models in memory could limit memory requirements. However, sufficient statistics for one model will usually be insufficient for fitting a different kind of model, meaning that the agent loses information with each model change. We propose that episodic memory can circumvent the challenge of online model selection under limited memory capacity by retaining a selected subset of data points. We design a method to compute the quantities necessary for model selection even when the data is discarded and only statistics of one (or a few) learnt models are available. We demonstrate on a simple model that a limited-size episodic memory buffer, when its content is optimised to retain data with statistics not matching the current representation, can resolve the fundamental challenge of online model selection.
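The final idea, retaining the data points whose statistics least match the current representation, can be illustrated with a toy sketch. Here we assume a one-dimensional Gaussian as the "current representation" and score each point by its negative log-likelihood; this is our simplification for illustration, not the paper's actual method:

```python
import heapq
import math

def surprise(x, mean, var):
    """Negative log-likelihood of x under a Gaussian fit (constant dropped)."""
    return 0.5 * ((x - mean) ** 2 / var + math.log(var))

def update_buffer(buffer, capacity, x, mean, var):
    """Keep the `capacity` points the current model explains worst.

    `buffer` is a min-heap of (surprise, point) pairs, so the least
    surprising retained point is always at the top and is the first
    to be evicted when a more surprising point arrives.
    """
    s = surprise(x, mean, var)
    if len(buffer) < capacity:
        heapq.heappush(buffer, (s, x))
    elif s > buffer[0][0]:
        heapq.heapreplace(buffer, (s, x))
```

Points far in the tails of the current model accumulate in the buffer, which is exactly the data a competing model would need to justify a model switch.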

Your dog can remember all those silly things you've done: Canines have 'episodic' memories, just like humans

Daily Mail - Science & tech

Dogs have a remarkable ability to recall events from the past, in a similar way to humans. That's according to a new study which found evidence that canines have an 'episodic memory' similar to that of their human counterparts. Dogs can recall a person's actions even when they do not expect to have their memory tested, says the research. Previously, evidence that animals use episodic memory has been hard to come by, as it's impossible to ask an animal, in this case a dog, what it remembers. Dogs trained using the trick can watch a person perform an action and carry out the action themselves. For example, if their owner jumps in the air and then gives the command 'do it', the dog would jump in the air.

A Unified Bayesian Model of Scripts, Frames and Language

AAAI Conferences

We present the first probabilistic model to capture all levels of the Minsky Frame structure, with the goal of corpus-based induction of scenario definitions. Our model unifies prior efforts in discourse-level modeling with Fillmore's related notion of frame, as captured in sentence-level FrameNet semantic parses; as part of this, we resurrect the coupling among Minsky's frames, Schank's scripts and Fillmore's frames, as originally laid out by those authors. Empirically, our approach yields improved scenario representations, reflected quantitatively in lower surprisal and more coherent latent scenarios.

Application of Recent Episodic Memory Function for Preparing and Presenting Topics of Group Conversation Supported by Coimagination Method

AAAI Conferences

Few evaluation techniques exist for the coimagination method, one of several group conversation techniques proposed for the purpose of cognitive function training. Episodic memory can serve as one indicator of the usefulness of cognitive function training, so we have proposed an analytical method for measuring the utilization of episodic memory in the coimagination method. We then conducted a group conversation experiment based on a shared walk, in order to give the participants a common experience, and analyzed the results with the proposed method. The analysis reveals the occurrence of past episodic memory and, in terms of memory taxonomy, quantitatively indicates individual differences in episodic memory utilization.

A Multi-Domain Evaluation of Scaling in a General Episodic Memory

AAAI Conferences

Episodic memory endows agents with numerous general cognitive capabilities, such as action modeling and virtual sensing. However, for long-lived agents, there are numerous unexplored computational challenges in supporting useful episodic-memory functions while maintaining real-time reactivity. In this paper, we review the implementation of episodic memory in Soar and present an expansive evaluation of that system. We demonstrate useful applications of episodic memory across a variety of domains, including games, mobile robotics, planning, and linguistics. In these domains, we characterize properties of environments, tasks, and episodic cues that affect performance, and evaluate the ability of Soar’s episodic memory to support hours to days of real-time operation.

Ziggurat: Steps Toward a General Episodic Memory

AAAI Conferences

Evidence indicates that episodic memory plays an important role in general cognition. A modest body of research exists for creating artificial episodic memory systems. To date, research has focused on exploring their benefits. As a result, existing episodic memory systems rely on a small, relevant memory cue for effective memory retrieval. We present Ziggurat, a domain-independent episodic memory structure and accompanying episodic learning algorithm that learns the temporal context of recorded episodes. Ziggurat's context-based memory retrieval means that it does not have to rely on relevant agent cues for effective memory retrieval; it also allows an agent to dynamically make plans using past experiences. In our experimental trials in two different domains, Ziggurat performs as well as or better than an episodic memory implementation based on most other systems.

An Investigation into the Utility of Episodic Memory for Cognitive Architectures

AAAI Conferences

In most cognitive architectures, episodic memory is either not implemented, or plays a secondary role. In contrast, in the Xapagy architecture episodic memory is the primary means of acquiring and using knowledge. Shadowing, the main reasoning method of the system, relies on unprocessed historical recordings of concrete events to determine the agent's behavior. This paper outlines the use of episodic memory in Xapagy, and investigates whether episodic memory might play a wider role in cognitive architectures at large.