Fine-tuning Factor Augmented Neural Lasso for Heterogeneous Environments

Chai, Jinhang, Fan, Jianqing, Gao, Cheng, Yin, Qishuo

arXiv.org Machine Learning

Fine-tuning is a widely used strategy for adapting pre-trained models to new tasks, yet its methodology and theoretical properties in high-dimensional nonparametric settings with variable selection have not yet been developed. This paper introduces the fine-tuning factor augmented neural Lasso (FAN-Lasso), a transfer learning framework for high-dimensional nonparametric regression with variable selection that simultaneously handles covariate and posterior shifts. We use a low-rank factor structure to manage high-dimensional dependent covariates and propose a novel residual fine-tuning decomposition in which the target function is expressed as a transformation of a frozen source function and other variables to achieve transfer learning and nonparametric variable selection. This augmented feature from the source predictor allows for the transfer of knowledge to the target domain and reduces model complexity there. We derive minimax-optimal excess risk bounds for the fine-tuning FAN-Lasso, characterizing the precise conditions, in terms of relative sample sizes and function complexities, under which fine-tuning yields statistical acceleration over single-task learning. The proposed framework also provides a theoretical perspective on parameter-efficient fine-tuning methods. Extensive numerical experiments across diverse covariate- and posterior-shift scenarios demonstrate that the fine-tuning FAN-Lasso consistently outperforms standard baselines and achieves near-oracle performance even under severe target sample size constraints, empirically validating the derived rates.
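The residual fine-tuning idea described above — freeze a predictor trained on the source domain, feed its output in as an augmented feature, and fit only a small target model on top — can be illustrated with a minimal numpy sketch. This is not the paper's FAN-Lasso (no factor structure, no neural network, plain least squares instead of Lasso); all data, dimensions, and function names here are hypothetical, chosen only to show the augmentation mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: abundant source data, scarce target data.
n_src, n_tgt, p = 500, 60, 10
X_src = rng.normal(size=(n_src, p))
f_true = lambda X: np.sin(X[:, 0]) + 0.5 * X[:, 1]
y_src = f_true(X_src) + 0.1 * rng.normal(size=n_src)

# "Source function": a linear fit, frozen after training.
w_src, *_ = np.linalg.lstsq(X_src, y_src, rcond=None)
f_source = lambda X: X @ w_src  # never updated on target data

# Target task: a shifted transformation of the source regression function.
X_tgt = rng.normal(size=(n_tgt, p))
y_tgt = 1.5 * f_true(X_tgt) + 0.3 * X_tgt[:, 2] + 0.1 * rng.normal(size=n_tgt)

# Residual fine-tuning: augment the target covariates with the frozen
# source prediction and fit only the small target model on top.
Z = np.column_stack([f_source(X_tgt), X_tgt])
w_tgt, *_ = np.linalg.lstsq(Z, y_tgt, rcond=None)
rss_aug = np.mean((y_tgt - Z @ w_tgt) ** 2)

# Baseline: single-task least squares on the target sample alone.
w_single, *_ = np.linalg.lstsq(X_tgt, y_tgt, rcond=None)
rss_single = np.mean((y_tgt - X_tgt @ w_single) ** 2)
print(rss_aug, rss_single)
```

Because the augmented design contains the original covariates, its in-sample fit can only improve on the single-task baseline; the paper's contribution is characterizing when this augmentation yields genuine out-of-sample (excess-risk) acceleration.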


Confident-Anchor-Induced Multi-Source-Free Domain Adaptation

Neural Information Processing Systems

Unsupervised domain adaptation has attracted considerable academic attention by transferring knowledge from a labeled source domain to an unlabeled target domain.





Optimal Transport-Guided Conditional Score-Based Diffusion Model

Gu, Xiang, Yang, Liwei

Neural Information Processing Systems

The conditional score-based diffusion model (SBDM) generates target data conditioned on paired data and has achieved great success in image translation. However, it requires paired data as the condition, and sufficient paired data may not be available in real-world applications.





d5c04aa72b92c53bda5b525b60958295-Supplemental-Conference.pdf

Neural Information Processing Systems

We study linear regression under covariate shift, where the marginal distribution over the input covariates differs in the source and the target domains, while the conditional distribution of the output given the input covariates is similar across the two domains.
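The covariate-shift setting this abstract describes can be made concrete with a small numpy sketch (not from the paper itself; the coefficient vector, sample sizes, and distributions below are illustrative assumptions): the input distributions differ across domains, but the conditional law of y given x is the same linear model, so a source-fit estimator still targets the shared coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)
beta = np.array([2.0, -1.0])  # shared regression coefficients

# Covariate shift: the marginal distributions of the inputs differ...
X_src = rng.normal(loc=0.0, scale=1.0, size=(2000, 2))
X_tgt = rng.normal(loc=3.0, scale=0.5, size=(2000, 2))

# ...but the conditional distribution y | x is identical in both domains.
y_src = X_src @ beta + 0.1 * rng.normal(size=2000)
y_tgt = X_tgt @ beta + 0.1 * rng.normal(size=2000)

# OLS on the source domain still estimates the shared coefficient vector,
# because only the covariate marginal (not the conditional) has shifted.
beta_hat, *_ = np.linalg.lstsq(X_src, y_src, rcond=None)
target_mse = np.mean((y_tgt - X_tgt @ beta_hat) ** 2)
print(beta_hat, target_mse)
```

With a well-specified linear model the source estimator transfers; the interesting statistical questions (which this line of work addresses) arise when the model is misspecified or the shift is severe, so that the source-optimal fit is no longer target-optimal.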