Nearly Optimal Bounds for Cyclic Forgetting

Neural Information Processing Systems

One challenge of continual learning is "catastrophic forgetting" [Had+20; VT19; Kem+18]: a model trained on a new task can lose performance on a previously learned task A. If contexts similar to A arise repeatedly, this forgetting is undesirable. In machine learning, many data sets display cyclic or periodic patterns.
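The cyclic setting can be illustrated with a small simulation. The following is a minimal sketch, not the paper's algorithm or bounds: it assumes the linear-task model in which fitting a task amounts to projecting the current iterate onto that task's solution set, and it measures an illustrative "forgetting" as the loss on task 0 after each full cycle. All names and parameter values here (`tasks`, `n_cycles`, the dimensions) are assumptions for illustration.

```python
# Minimal sketch (illustrative only): cyclic presentation of linear tasks.
# Fitting task i projects the current iterate w onto the solution set of
# A_i w = b_i (the minimal-norm correction). "Forgetting" is measured as
# the loss on task 0 after each cycle; these are assumed definitions.
import numpy as np

rng = np.random.default_rng(0)
d, k, n_tasks, n_cycles = 10, 3, 4, 5

# Each task: a linear system A_i w = b_i with k constraints in d dims.
tasks = [(rng.normal(size=(k, d)), rng.normal(size=k)) for _ in range(n_tasks)]

w = np.zeros(d)
for cycle in range(n_cycles):
    for A, b in tasks:
        # Fit the task exactly while moving minimally from the current w:
        # w <- w + A^+ (b - A w), a projection onto the task's solution set.
        w = w + np.linalg.pinv(A) @ (b - A @ w)
    A0, b0 = tasks[0]
    forgetting = np.linalg.norm(A0 @ w - b0) ** 2  # task-0 loss after cycle
    print(f"cycle {cycle}: task-0 loss = {forgetting:.3e}")
```

Running the sketch shows how the loss on the first task evolves as the same tasks recur cyclically, which is the quantity the paper's bounds concern in the linear setting.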


Supplementary Materials for "Multi-Agent Meta-Reinforcement Learning": A. Technical Lemmas

Neural Information Processing Systems

From the three-points identity of the Bregman divergence (Lemma 3.1 of [9]),

$\mathrm{KL}(x \,\|\, y) - \mathrm{KL}(\hat{x} \,\|\, y) = \mathrm{KL}(x \,\|\, \hat{x}) + \langle \ln \hat{x} - \ln y,\ x - \hat{x} \rangle$. (12)

The first term in (12), $\mathrm{KL}(x \,\|\, \hat{x})$, can be bounded directly; by Hölder's inequality, the second term in (12) is bounded as $\langle \ln \hat{x} - \ln y,\ x - \hat{x} \rangle \le \| \ln \hat{x} - \ln y \|_{\infty}\, \| x - \hat{x} \|_{1}$. Lemma 5 concerns a block diagonal matrix and is proved via induction on N. Lemma 6 introduces one more piece of notation before presenting the proof. This leads us to the initialization-dependent convergence rate of Algorithm 1, which we restate and prove as follows. In addition, if we initialize the players' policies to be uniform policies, the rest of the proof follows by putting all the aforementioned results together.
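For context, (12) is an instance of the general Bregman three-points identity. A brief sketch of the derivation, assuming the standard negative-entropy generator on the probability simplex (the choice that recovers the KL divergence):

```latex
% Sketch: three-points identity for KL from the Bregman definition.
% Generator: negative entropy h(x) = \sum_j x_j \ln x_j, so
% \nabla h(x) = \mathbf{1} + \ln x, and on the probability simplex
% D_h(a, b) = \mathrm{KL}(a \,\|\, b).
\begin{align*}
D_h(a, b)
  &= h(a) - h(b) - \langle \nabla h(b),\, a - b \rangle. \\
\intertext{Expanding $D_h(x, \hat{x}) + D_h(\hat{x}, y)$ and cancelling
the $h(\hat{x})$ terms yields the generic three-points identity}
D_h(x, y)
  &= D_h(x, \hat{x}) + D_h(\hat{x}, y)
   + \langle \nabla h(\hat{x}) - \nabla h(y),\, x - \hat{x} \rangle. \\
\intertext{Specializing to the generator above (the constant vector
$\mathbf{1}$ cancels in the gradient difference) and rearranging gives}
\mathrm{KL}(x \,\|\, y) - \mathrm{KL}(\hat{x} \,\|\, y)
  &= \mathrm{KL}(x \,\|\, \hat{x})
   + \langle \ln \hat{x} - \ln y,\, x - \hat{x} \rangle,
\end{align*}
```

which is exactly the form used in (12).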


Streaming PCA for Markovian Data

Neural Information Processing Systems

Since its inception in 1982, Oja's algorithm has become an established method for streaming principal component analysis (PCA).
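Oja's update itself is a one-line stochastic rule. Below is a minimal sketch of estimating the top principal component from a stream, here fed by correlated AR(1) samples to mimic the Markovian-data setting; the step size `eta`, the AR coefficient, and the synthetic covariance are illustrative assumptions, not values or constructions from the paper.

```python
# Minimal sketch of Oja's algorithm for the top principal component,
# driven by Markovian (AR(1)) samples rather than i.i.d. draws.
# eta and the AR coefficient 0.5 are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(1)
d, n_steps, eta = 20, 20000, 0.01

# Synthetic covariance factor with one dominant direction for reference.
u_true = rng.normal(size=d)
u_true /= np.linalg.norm(u_true)
cov_sqrt = np.eye(d) + 2.0 * np.outer(u_true, u_true)

w = rng.normal(size=d)
w /= np.linalg.norm(w)

x = rng.normal(size=d)
for _ in range(n_steps):
    # AR(1) chain: consecutive samples are correlated (Markovian data).
    x = 0.5 * x + cov_sqrt @ rng.normal(size=d)
    # Oja's update: w <- w + eta * x x^T w, then renormalize.
    w = w + eta * (x @ w) * x
    w /= np.linalg.norm(w)

print("alignment with top eigenvector:", abs(w @ u_true))
```

The printed alignment approaching 1 indicates the streaming estimate recovering the dominant direction despite the dependence between consecutive samples, which is the regime the paper analyzes.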