Adaptive and Robust Multi-task Learning

Yaqi Duan, Kaizheng Wang

arXiv.org Machine Learning 

Multi-task learning (MTL) solves multiple learning tasks jointly. It has become increasingly popular in modern applications where data are generated by multiple sources. When the tasks share common structures, a properly chosen MTL algorithm can leverage them to improve performance. However, task relatedness is usually unknown and hard to quantify in practice; heterogeneity can even make multi-task approaches perform worse than single-task learning, which trains a separate model on each task's own dataset. In this paper, we study MTL from a statistical perspective and develop a family of reliable approaches that adapt to the unknown task relatedness and are robust against outlier tasks with possibly contaminated data.
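To make the single-task vs. multi-task contrast concrete, the following is a minimal illustrative sketch, not the paper's actual algorithm: per-task least-squares estimates are shrunk toward a common center, a standard way MTL can exploit relatedness when the tasks' coefficients are close. All names and parameters here (the shrinkage weight `lam`, the pooled center, the simulation settings) are hypothetical choices for illustration.

```python
import numpy as np

# Illustrative sketch (not the paper's method): related linear regression
# tasks, compared under single-task OLS vs. shrinkage toward a common center.
rng = np.random.default_rng(0)
T, n, d = 10, 30, 5                 # tasks, samples per task, dimension
center = rng.normal(size=d)         # shared signal across tasks
betas = center + 0.1 * rng.normal(size=(T, d))  # task coefficients, close together

single = []
for t in range(T):
    X = rng.normal(size=(n, d))
    y = X @ betas[t] + rng.normal(size=n)
    # single-task learning: fit each task on its own dataset
    single.append(np.linalg.lstsq(X, y, rcond=None)[0])
single = np.array(single)

lam = 0.5                           # hypothetical shrinkage strength
pooled = single.mean(axis=0)        # crude estimate of the common center
multi = (1 - lam) * single + lam * pooled  # shrink each task toward the center

err_single = np.mean(np.sum((single - betas) ** 2, axis=1))
err_multi = np.mean(np.sum((multi - betas) ** 2, axis=1))
print(err_single, err_multi)
```

When the tasks are genuinely close, as in this simulation, the shrunk estimates typically have smaller average error than per-task OLS; when an outlier task is far from the center, a fixed `lam` can hurt, which is the failure mode the paper's adaptive and robust procedures are designed to avoid.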