CapProNet: Deep Feature Learning via Orthogonal Projections onto Capsule Subspaces

Liheng Zhang, Marzieh Edraki, Guo-Jun Qi

Neural Information Processing Systems

Then, one can adopt the principle of separating the presence of an entity and its instantiation parameters into capsule length and orientation, respectively. In particular, we use the lengths of capsules to score the presence of entity classes corresponding to different subspaces, while their orientations are used to instantiate the parameters of entity properties such as poses, scales, deformations and textures.
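The length/orientation separation above can be sketched with plain orthogonal projections: project a feature vector onto each class's capsule subspace, read the projection's norm as the class presence score and its direction as the instantiation parameters. This is a minimal NumPy illustration under assumed fixed subspace bases, not the paper's trained formulation.

```python
import numpy as np

def capsule_scores(x, subspaces):
    """Project x onto each capsule subspace; return lengths and orientations.

    x: feature vector of shape (d,)
    subspaces: list of (d, c) basis matrices, one per entity class (assumed given)
    """
    scores, orientations = [], []
    for W in subspaces:
        # Orthogonal projection onto span(W): P = W (W^T W)^{-1} W^T
        P = W @ np.linalg.inv(W.T @ W) @ W.T
        v = P @ x
        scores.append(np.linalg.norm(v))                      # length -> class presence
        orientations.append(v / (np.linalg.norm(v) + 1e-12))  # orientation -> properties
    return np.array(scores), orientations

# Toy check: a vector lying in subspace 1 gets its highest score there.
rng = np.random.default_rng(0)
d, c, L = 8, 2, 3
subspaces = [rng.standard_normal((d, c)) for _ in range(L)]
x = subspaces[1] @ np.array([1.0, 2.0])
scores, orientations = capsule_scores(x, subspaces)
```

Because `x` lies exactly in subspace 1, its projection there preserves the full norm, while projections onto the other (random) subspaces shrink it, so class 1 wins.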







Neural Information Processing Systems

Although recently proposed parameter-efficient transfer learning (PETL) techniques allow updating a small subset of parameters (e.g., only a few percent) of a large pre-trained backbone network, their training memory requirement remains high. This is because the gradient computation for the trainable parameters still requires backpropagation through the large pre-trained backbone model.
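Why freezing the backbone does not remove the backpropagation cost can be seen in a toy chain-rule calculation: with a trainable adapter sandwiched between frozen layers, the adapter's gradient still involves the frozen upper layer's weights and the stored backbone activation. A minimal NumPy sketch, with hypothetical names (`W1`, `W2` for frozen backbone layers, `A` for the adapter):

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.standard_normal((4, 3))   # frozen backbone layer (bottom)
W2 = rng.standard_normal((2, 4))   # frozen backbone layer (top)
A = np.eye(4)                      # trainable adapter inserted between them

x = rng.standard_normal(3)
y = rng.standard_normal(2)

h = W1 @ x                         # backbone activation: must be stored for backprop
out = W2 @ (A @ h)
loss = 0.5 * np.sum((out - y) ** 2)

# Chain rule: dL/dA = W2^T (out - y) h^T.
# Even though W1 and W2 receive no updates, the gradient for A depends on the
# frozen top layer W2 and the cached activation h, so the backward pass still
# traverses (and the forward pass still caches) the backbone.
grad_A = np.outer(W2.T @ (out - y), h)
```

The same structure scales up: the memory spent on cached activations and backward passes through the frozen backbone dominates, which is the cost PETL methods with in-backbone modules cannot avoid.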