Polyhistor: Parameter-Efficient Multi-Task Adaptation for Dense Vision Tasks Appendix 1 Additional analyses

Neural Information Processing Systems

We show that our method generalizes to different backbones. We use SwinTransformer-Base pretrained on ImageNet-1k as the feature backbone. As shown in Tables 1 and 2, we find that the trends of all methods are similar to the results on SwinTransformer-Tiny.

Another important hyper-parameter in our model is the rank of the hyper-network outputs. In Figure 1b of the main paper, we presented the results of different baseline methods with different hyper-parameters.

We summarize the differences between Visual Prompt Tuning and our method in the following points.
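To make the role of the hyper-network output rank concrete, the following is a minimal sketch of the general idea: a hyper-network maps a task embedding to the factors of a low-rank (LoRA-style) weight update, so the rank directly controls the size of the generated adapter. All names, dimensions, and the single-linear-layer hyper-network here are illustrative assumptions, not Polyhistor's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r, emb = 64, 4, 8  # feature dim, adapter rank, task-embedding dim (hypothetical)

# Hypothetical hyper-network: one linear map from a task embedding to the
# flattened low-rank factors A (d x r) and B (r x d).
W_hyper = rng.normal(0.0, 0.02, size=(emb, d * r + r * d))

def adapter_factors(task_emb):
    """Generate the low-rank adapter factors for one task."""
    flat = task_emb @ W_hyper
    A = flat[: d * r].reshape(d, r)
    B = flat[d * r:].reshape(r, d)
    return A, B

W_frozen = rng.normal(size=(d, d))   # frozen backbone weight
task_emb = rng.normal(size=emb)      # embedding for one dense-prediction task
A, B = adapter_factors(task_emb)
W_adapted = W_frozen + A @ B         # rank-r update applied per task

print(W_adapted.shape)
print(np.linalg.matrix_rank(A @ B))  # at most r
```

Because every task shares the same hyper-network and differs only in its embedding, the number of trainable parameters grows with `emb` and `r` rather than with the full backbone size, which is the trade-off the rank hyper-parameter controls.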