Parameter Competition Balancing for Model Merging
Junlin Lee, Jing Li
– Neural Information Processing Systems
While fine-tuning pretrained models has become common practice, these models often underperform outside their specific domains. Recently developed model merging techniques enable the direct integration of multiple models, each fine-tuned for a distinct task, into a single model. This strategy provides multitasking capability without retraining on the original datasets. However, existing methods fall short in addressing potential conflicts and complex correlations between tasks, especially at the level of individual parameter adjustments, which makes it challenging to balance parameter competition across tasks effectively.
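The abstract describes the generic setting: several checkpoints fine-tuned from one pretrained model are combined directly in parameter space, without retraining. A minimal sketch of the common baseline, naive task-vector averaging in PyTorch, is shown below. The function name and the uniform averaging are illustrative assumptions and do not reproduce the paper's own balancing method; the sketch only makes concrete where parameter-level competition arises, since every task's delta is weighted equally at every parameter.

```python
# Minimal sketch of naive model merging via task-vector averaging.
# Illustrates the generic setting from the abstract, not the paper's method;
# all names below are hypothetical.
from typing import Dict, List
import torch


def merge_by_task_vectors(
    pretrained: Dict[str, torch.Tensor],
    finetuned_models: List[Dict[str, torch.Tensor]],
    scaling: float = 1.0,
) -> Dict[str, torch.Tensor]:
    """Average the task vectors (finetuned - pretrained) and add them back.

    Every parameter is treated identically across tasks, which is exactly
    where conflicts (parameter competition) between tasks show up.
    """
    merged = {}
    for name, base_param in pretrained.items():
        # Task vector for each model: the delta introduced by fine-tuning.
        task_vectors = [ft[name] - base_param for ft in finetuned_models]
        avg_delta = torch.stack(task_vectors).mean(dim=0)
        merged[name] = base_param + scaling * avg_delta
    return merged


# Example usage (hypothetical checkpoints sharing the same architecture):
# merged_state = merge_by_task_vectors(base.state_dict(),
#                                      [m1.state_dict(), m2.state_dict()])
```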
- Genre:
  - Research Report
    - Experimental Study (0.93)
    - New Finding (1.00)
- Industry:
- Information Technology > Security & Privacy (0.67)
- Technology:
  - Information Technology
    - Artificial Intelligence
      - Cognitive Science > Problem Solving (0.67)
      - Machine Learning
        - Evolutionary Systems (0.93)
        - Neural Networks > Deep Learning (0.46)
      - Natural Language (1.00)
      - Representation & Reasoning (1.00)
      - Vision (1.00)
    - Communications (0.93)