A rationale from frequency perspective for grokking in training neural network

Zhangchen Zhou, Yaoyu Zhang, Zhi-Qin John Xu

arXiv.org Machine Learning 

Grokking is the phenomenon in which neural networks (NNs) first fit the training data and only later, after further training, generalize to the test data. In this paper, we empirically provide a frequency perspective to explain the emergence of this phenomenon in NNs. The core insight is that the networks initially learn the less salient frequency components present in the test data. We observe this behavior on both synthetic and real datasets, offering a novel viewpoint on grokking by characterizing it through the lens of frequency dynamics during training.
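The frequency-dynamics lens described above can be made concrete by decomposing a model's error over frequency components. The following is a minimal illustrative sketch, not code from the paper: it assumes a 1-D target function sampled on a uniform grid and a stand-in `pred` array representing a network's outputs partway through training, then compares target and prediction per frequency bin with an FFT. The function name `frequency_errors` and the toy target are hypothetical.

```python
import numpy as np

def frequency_errors(target, pred):
    """Relative error in each frequency bin between a target signal
    and a model's predictions on the same uniform grid."""
    T = np.fft.rfft(target)
    P = np.fft.rfft(pred)
    return np.abs(T - P) / (np.abs(T) + 1e-12)

# Toy target: a dominant 1-cycle component plus a weaker 10-cycle component.
x = np.linspace(0, 1, 256, endpoint=False)
target = np.sin(2 * np.pi * x) + 0.3 * np.sin(2 * np.pi * 10 * x)

# Stand-in for a network that has so far captured only the dominant
# low-frequency component (no actual training is performed here).
pred = np.sin(2 * np.pi * x)

err = frequency_errors(target, pred)
print(err[1], err[10])  # bin 1 is fitted (error near 0); bin 10 is not (error near 1)
```

Tracking such per-bin errors over training steps is one way to visualize which frequency components a network has captured at each stage, which is the kind of measurement a frequency-based account of grokking relies on.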
