Learning Compressed Transforms with Low Displacement Rank
Anna Thomas, Albert Gu, Tri Dao, Atri Rudra, Christopher Ré
Neural Information Processing Systems
The low displacement rank (LDR) framework for structured matrices represents a matrix through two displacement operators and a low-rank residual. Existing use of LDR matrices in deep learning has applied fixed displacement operators encoding forms of shift invariance akin to convolutions. We introduce a class of LDR matrices with more general displacement operators, and explicitly learn over both the operators and the low-rank component.
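To make the representation concrete, the sketch below illustrates the LDR idea under a standard choice of operators (not necessarily the learned operators of this paper): a matrix M is stored as two displacement operators A, B plus low-rank factors G, H satisfying the Sylvester displacement equation AM − MB = GHᵀ. Here A and B are taken to be f-unit-circulant shift matrices Z₁ and Z₋₁ (a classic fixed choice from the Toeplitz-like setting), and the sizes n, r are illustrative; M is recovered with SciPy's Sylvester solver.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def f_circulant_shift(n, f):
    """f-unit-circulant shift matrix Z_f: ones on the subdiagonal, f in the top-right corner."""
    Z = np.diag(np.ones(n - 1), -1)
    Z[0, -1] = f
    return Z

n, r = 8, 2  # illustrative matrix size and displacement rank
rng = np.random.default_rng(0)

A = f_circulant_shift(n, 1.0)    # displacement operator A = Z_1 (fixed, shift-invariant choice)
B = f_circulant_shift(n, -1.0)   # displacement operator B = Z_{-1}
G = rng.standard_normal((n, r))  # low-rank factors of the residual
H = rng.standard_normal((n, r))

# Recover the dense matrix M from its compressed parameters (A, B, G, H)
# by solving the Sylvester displacement equation  A M - M B = G H^T.
# (solve_sylvester solves A X + X B = Q, so negate B.)
M = solve_sylvester(A, -B, G @ H.T)

# The displacement residual A M - M B has rank at most r,
# even though M itself is generically full rank.
residual = A @ M - M @ B
print(np.linalg.matrix_rank(residual))
```

The compression comes from storing only A, B, G, H (O(nr) parameters for structured operators) instead of the n² entries of M; this paper's contribution is to learn A and B rather than fixing them as above.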