PVeRA: Probabilistic Vector-Based Random Matrix Adaptation
Fillioux, Leo, Ferrante, Enzo, Cournède, Paul-Henry, Vakalopoulou, Maria, Christodoulidis, Stergios
–arXiv.org Artificial Intelligence
Large foundation models have emerged in recent years and are pushing performance boundaries for a variety of tasks. Training or even finetuning such models demands vast datasets and computational resources, which are often scarce and costly. Adaptation methods provide a computationally efficient solution to these limitations by allowing such models to be finetuned on small amounts of data and computing power. This is achieved by appending new trainable modules to frozen backbones, with only a fraction of the original parameter count, and fitting only these modules on novel tasks. Recently, the VeRA adapter was shown to excel in parameter-efficient adaptation by utilizing a pair of frozen random low-rank matrices shared across all layers. In this paper, we propose PVeRA, a probabilistic version of the VeRA adapter, which modifies the low-rank matrices of VeRA in a probabilistic manner. This modification naturally allows handling inherent ambiguities in the input and allows for different sampling configurations during training and testing. A comprehensive evaluation was performed on the VTAB-1k benchmark and seven adapters, with PVeRA outperforming VeRA and other adapters. Our code for training models with PVeRA and benchmarking all adapters is available here.
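The abstract summarizes VeRA's mechanism: a frozen backbone weight is adapted through a pair of frozen random low-rank matrices shared across layers, with only small per-layer scaling vectors being trained. The following is a minimal NumPy sketch of that update, assuming the standard VeRA formulation y = W0·x + diag(b)·B·diag(d)·A·x; all dimensions and names here are illustrative, and PVeRA's probabilistic treatment of the low-rank matrices is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 8, 4  # illustrative sizes

# Frozen pretrained weight, plus frozen random low-rank matrices.
# In VeRA, A and B are shared across all adapted layers.
W0 = rng.normal(size=(d_out, d_in))
A = rng.normal(size=(r, d_in))    # frozen, shared
B = rng.normal(size=(d_out, r))   # frozen, shared

# Trainable per-layer scaling vectors (the only new parameters).
d_vec = np.ones(r)
b_vec = np.zeros(d_out)  # zero init: the adapter starts as a no-op

def vera_forward(x):
    # y = W0 x + diag(b_vec) B diag(d_vec) A x
    return W0 @ x + b_vec * (B @ (d_vec * (A @ x)))

x = rng.normal(size=d_in)
y = vera_forward(x)
```

With `b_vec` initialized to zero, the adapted layer initially reproduces the frozen backbone exactly; training then adjusts only `b_vec` and `d_vec`, which is what makes the parameter count so small.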
Dec-9-2025