Learning-to-Cache: Accelerating Diffusion Transformer via Layer Caching

Neural Information Processing Systems 

Notably, for U-ViT-H/2 on ImageNet, approximately 93.68% of layers are cacheable in the cache step.
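The claim above is that, at a "cache step" of the diffusion sampler, most transformer layers can skip recomputation and reuse their output from the previous step. The following is a minimal, hypothetical sketch of that idea with NumPy stand-in layers: the `cacheable` mask here is hand-picked for illustration, not learned as in the paper, and `layer` is a toy residual map, not a real transformer block.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w):
    # Toy stand-in for a transformer layer: a residual nonlinear map.
    return x + np.tanh(x @ w)

def forward(x, weights, cache=None, cacheable=None):
    """Run all layers. At a cache step (cache and cacheable given),
    reuse the cached output for layers marked cacheable instead of
    recomputing them."""
    outputs = []
    for i, w in enumerate(weights):
        if cache is not None and cacheable is not None and cacheable[i]:
            x = cache[i]      # skip: reuse the previous step's output
        else:
            x = layer(x, w)   # recompute this layer
        outputs.append(x)
    return x, outputs

dim, n_layers = 8, 4
weights = [rng.standard_normal((dim, dim)) * 0.1 for _ in range(n_layers)]
x = rng.standard_normal((1, dim))

# Non-cache step: compute every layer and record per-layer outputs.
out_full, cache = forward(x, weights)

# Cache step: a per-layer mask decides which layers are skipped
# (illustrative values; Learning-to-Cache learns such a mask).
cacheable = [True, True, False, True]
out_cached, _ = forward(x, weights, cache=cache, cacheable=cacheable)
```

In this toy setup the cached pass reproduces the full pass while executing only the non-cacheable layers; in an actual sampler the input changes between steps, so caching trades a small approximation error for skipped compute.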
