Sequential Memory with Temporal Predictive Coding Supplementary Materials
In Algorithm 1 we present the memorizing and recalling procedures of the single-layer tPC.
Algorithm 1: Memorizing and recalling with the single-layer tPC.
Here we present the proof of Property 1 in the main text, namely that the single-layer tPC can be viewed as a "whitened" version of the AHN: when applied to a data sequence, it implicitly whitens the data (Eq. 16 in the main text). These observations are consistent with our numerical results shown in Figure 1: MCAHN has a much larger MSE than tPC because its recalls are entirely wrong. In Figure 1 we also present the online recall results of the models on MovingMNIST, CIFAR10 and UCF101. In Figure 4 we show a natural example of an aliased sequence, in which a movie of a human doing push-ups is memorized and recalled by the model.
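The memorize/recall loop described here can be sketched as follows. This is a minimal NumPy sketch, not the paper's exact Algorithm 1: it assumes a linear single-layer tPC with prediction error e_t = x_t − W f(x_{t−1}), a Hebbian-style gradient update on W during memorization, and forward prediction from the first frame during recall; the function names, learning rate, and epoch count are illustrative choices, not taken from the paper.

```python
import numpy as np

def memorize(seq, lr=0.05, epochs=300, f=np.tanh):
    """Learn W by reducing the prediction errors e_t = x_t - W f(x_{t-1})."""
    n = seq.shape[1]
    W = np.zeros((n, n))
    for _ in range(epochs):
        for t in range(1, len(seq)):
            e = seq[t] - W @ f(seq[t - 1])        # temporal prediction error
            W += lr * np.outer(e, f(seq[t - 1]))  # Hebbian-style weight update
    return W

def recall(W, x0, T, f=np.tanh):
    """Roll the learned dynamics forward from the cue x0 for T steps."""
    out = [x0]
    for _ in range(T - 1):
        out.append(W @ f(out[-1]))
    return np.array(out)
```

For a short sequence of modest amplitude, `recall(W, seq[0], len(seq))` approximately reproduces the stored frames once the prediction errors have been driven near zero.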
Sequential Memory with Temporal Predictive Coding
Forming accurate memory of sequential stimuli is a fundamental function of biological agents. However, the computational mechanism underlying sequential memory in the brain remains unclear. Inspired by neuroscience theories and recent successes in applying predictive coding (PC) to \emph{static} memory tasks, in this work we propose a novel PC-based model for \emph{sequential} memory, called \emph{temporal predictive coding} (tPC). We show that our tPC models can memorize and retrieve sequential inputs accurately with a biologically plausible neural implementation. Importantly, our analytical study reveals that tPC can be viewed as a classical Asymmetric Hopfield Network (AHN) with an implicit statistical whitening process, which leads to more stable performance in sequential memory tasks of structured inputs. Moreover, we find that tPC exhibits properties consistent with behavioral observations and theories in neuroscience, thereby strengthening its biological relevance. Our work establishes a possible computational mechanism underlying sequential memory in the brain that can also be theoretically interpreted using existing memory model frameworks.
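The claimed relationship to the AHN can be illustrated numerically. The sketch below is only a hedged illustration of the idea, not the paper's derivation: it stores a sequence of correlated (structured) patterns with the classical asymmetric Hebbian rule, then applies the same rule to decorrelated inputs, using an assumed symmetric whitening transform built from a regularized covariance (the regularization constant and the specific whitening matrix are my choices, not the paper's Eq. 16). For structured inputs, one-step recall with the whitened rule suffers far less cross-talk.

```python
import numpy as np

rng = np.random.default_rng(1)
# Structured patterns: a shared component plus small per-frame noise,
# so raw patterns are strongly correlated with each other.
base = rng.standard_normal(20)
seq = np.array([base + 0.3 * rng.standard_normal(20) for _ in range(6)])

# Classical AHN: asymmetric Hebbian weights on the raw patterns.
W_ahn = sum(np.outer(seq[t + 1], seq[t]) for t in range(5))

# "Whitened" variant: the same rule applied to decorrelated patterns.
C = seq.T @ seq + 1e-4 * np.eye(20)          # regularized second-moment matrix
evals, evecs = np.linalg.eigh(C)
M = evecs @ np.diag(evals ** -0.5) @ evecs.T # symmetric whitening transform
white = seq @ M
W_w = sum(np.outer(seq[t + 1], white[t]) for t in range(5))

def mse(W, inp):
    """Mean squared error of one-step recall x_{t+1} = W inp_t."""
    preds = np.array([W @ inp[t] for t in range(5)])
    return np.mean((preds - seq[1:]) ** 2)
```

Here `mse(W_w, white)` stays near zero while `mse(W_ahn, seq)` blows up, because whitening makes the stored patterns nearly orthogonal, suppressing the interference that corrupts the raw AHN recalls.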
Tang, Mufeng, Barron, Helen, Bogacz, Rafal
Illustrated Guide to Recurrent Neural Networks
I'm Michael, also known as LearnedVector. If you are just getting started in ML and want to get some intuition behind recurrent neural networks, this post is for you. You can also watch the video version of this post if you prefer. If you want to get into machine learning, recurrent neural networks are a powerful technique that is important to understand. If you use a smartphone or frequently surf the internet, odds are you've used applications that leverage RNNs.