BMU-MoCo: Bidirectional Momentum Update for Continual Video-Language Modeling - Supplementary Material - Yizhao Gao

Neural Information Processing Systems 

We provide the pseudocode of BMU-MoCo in Algorithm 1. We report R@5 results together with their corresponding FR/HM metrics. The memory data are simply replayed as additional training samples during training, and the model architecture is identical to that of Base-MoCo.
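To make the momentum-update step concrete, the following is a minimal sketch of a MoCo-style exponential-moving-average (EMA) parameter update, the core operation a momentum encoder relies on. This is a generic illustration under stated assumptions, not the exact BMU-MoCo procedure of Algorithm 1; the function name `momentum_update` and the flat parameter lists are hypothetical simplifications.

```python
def momentum_update(query_params, key_params, m=0.999):
    """MoCo-style EMA update of the momentum (key) encoder.

    key <- m * key + (1 - m) * query

    Args:
        query_params: parameters of the online (query) encoder.
        key_params: parameters of the momentum (key) encoder.
        m: momentum coefficient in [0, 1); larger m means slower updates.

    Returns:
        Updated key-encoder parameters.
    """
    return [m * k + (1.0 - m) * q for q, k in zip(query_params, key_params)]


# Example: with m = 0.5, the key parameter moves halfway toward the query.
updated = momentum_update([1.0], [0.0], m=0.5)
```

The "bidirectional" aspect of BMU-MoCo extends this one-way EMA flow; the precise direction and schedule of those updates are given in Algorithm 1.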
