Learning Compressed Transforms with Low Displacement Rank
Neural Information Processing Systems
The low displacement rank (LDR) framework for structured matrices represents a matrix through two displacement operators and a low-rank residual. Prior work applying LDR matrices in deep learning has used fixed displacement operators that encode forms of shift invariance akin to convolutions. We introduce a richer class of LDR matrices with more general displacement operators, and explicitly learn over both the operators and the low-rank component.
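As a concrete illustration of the displacement idea described above (not the paper's learned operators), the following NumPy sketch checks the classical fact that a Toeplitz matrix has Sylvester displacement rank at most 2 under the standard unit-circulant shift operators Z_1 and Z_{-1}; the function and variable names are illustrative only.

```python
import numpy as np

def shift_matrix(n, f):
    """Unit-f-circulant Z_f: ones on the subdiagonal, f in the top-right corner."""
    Z = np.diag(np.ones(n - 1), -1)
    Z[0, -1] = f
    return Z

n = 6
rng = np.random.default_rng(0)
# Random Toeplitz matrix: entry (i, j) depends only on the difference i - j.
t = rng.standard_normal(2 * n - 1)
T = np.array([[t[i - j + n - 1] for j in range(n)] for i in range(n)])

# Sylvester displacement with fixed operators (Z_1, Z_{-1}):
# D = Z_1 @ T - T @ Z_{-1} is nonzero only in the first row and last column.
D = shift_matrix(n, 1) @ T - T @ shift_matrix(n, -1)
print(np.linalg.matrix_rank(D))  # at most 2 for any Toeplitz matrix
```

The paper's contribution, per the abstract, is to treat the two operators themselves as learnable rather than fixing them to such shift matrices.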