On the relationship between multitask neural networks and multitask Gaussian Processes
Karthikeyan K, Shubham Kumar Bharti, Piyush Rai
Multitask learning (MTL) is a learning paradigm in which multiple tasks are learned jointly, aiming to improve the performance of the individual tasks by sharing information across them [4, 26]. Various information-sharing mechanisms have been used: MTL models based on deep neural networks commonly share hidden layers across all the tasks; probabilistic MTL models are usually based on shared priors over the parameters of the multiple tasks [16, 5]; Gaussian Process based models, e.g., multitask Gaussian Processes (GPs) and their extensions [2, 23], commonly employ covariance functions that model similarity across both inputs and tasks. Multi-label, multi-class, and multi-output learning can be seen as special cases of multitask learning in which each task has the same set of inputs. Transfer learning is also similar to MTL, except that the objective of MTL is to improve performance over all the tasks, whereas the objective of transfer learning is usually to improve the performance of a target task by leveraging information from source tasks [26]. Zero-shot learning and few-shot learning are also closely related to MTL. Prior works [14, 24] have shown that a fully connected Bayesian neural network (NN) [13, 15] with a single, infinitely wide hidden layer and independent and identically distributed (i.i.d.) priors on the weights is equivalent to a Gaussian Process. This result has recently been generalized to deep Bayesian neural networks with any number of hidden layers [9]. These connections between Bayesian neural networks and GPs offer several benefits, such as a theoretical understanding of neural networks and efficient Bayesian inference for deep NNs via the equivalent GP. Motivated by this equivalence, in this work we investigate whether a similar connection exists between deep multitask Bayesian neural networks [18] and multitask Gaussian Processes.
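To make the two ingredients concrete, the following is a minimal sketch under assumed notation (not taken verbatim from the paper). A common multitask GP covariance, in the style of [2], factorizes into an input kernel and a task-similarity matrix,

k\big((x, t), (x', t')\big) = k_{\text{input}}(x, x') \, B_{t t'},

where $B$ is a positive semi-definite task-covariance matrix. For the NN-GP equivalence of [14, 24], consider a one-hidden-layer network $f(x) = b + \sum_{j=1}^{H} v_j \,\phi(w_j^\top x + a_j)$ with i.i.d. zero-mean priors, $v_j \sim \mathcal{N}(0, \sigma_v^2 / H)$ and $b \sim \mathcal{N}(0, \sigma_b^2)$. As the width $H \to \infty$, the outputs converge to a GP with covariance

K(x, x') = \sigma_b^2 + \sigma_v^2 \,\mathbb{E}_{w, a}\big[\phi(w^\top x + a)\,\phi(w^\top x' + a)\big].

The question studied in this work is whether an analogous infinite-width limit of a multitask Bayesian NN recovers a multitask GP with a covariance of the factorized form above.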
Dec-11-2019