Appendix for "Episodic Multi-Task Learning with Heterogeneous Neural Processes"

Neural Information Processing Systems

In this section, we list frequently asked questions from researchers who helped proofread this manuscript. As shown in Table 1, we use "Heterogeneous tasks" to distinguish the different branches of multi-task learning, while "Episodic training" describes the data-feeding strategy; thus "Heterogeneous tasks" is not applicable there (-). In episodic multi-task learning, we restrict the scope of the problem to the case where tasks in the same episode are related and share the same target space. This also implies that tasks with the same target space are related.






Bi-level Score Matching for Learning Energy-based Latent Variable Models

Neural Information Processing Systems

However, learning energy-based latent variable models (EBLVMs) remains largely open, except in some special cases. This paper presents a bi-level score matching (BiSM) method to learn EBLVMs with general structures by reformulating score matching (SM) as a bi-level optimization problem. The higher level introduces a variational posterior of the latent variables and optimizes a modified SM objective, and the lower level optimizes the variational posterior to fit the true posterior.
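The alternating structure described above can be illustrated with a toy sketch. This is an assumed simplification, not the actual BiSM objective: the objectives f and g, the learning rates, and the step counts are all hypothetical stand-ins for the score-matching terms; only the bi-level alternation (inner fit, then outer update with the inner solution held fixed) mirrors the scheme in the abstract.

```python
# Toy illustration of alternating bi-level optimization.
# Lower level: fit phi (standing in for the variational posterior) to
# approximately minimize an inner objective f(theta, phi) = (phi - theta)^2,
# whose optimum is phi* = theta (i.e., "fit the true posterior").
# Higher level: update theta (the model) on an outer objective
# g(theta, phi) = (theta - 3)^2 + (phi - theta)^2 with phi held fixed.

def inner_grad(theta, phi):
    # d/dphi of (phi - theta)^2
    return 2.0 * (phi - theta)

def outer_grad(theta, phi):
    # d/dtheta of (theta - 3)^2 + (phi - theta)^2, phi fixed
    return 2.0 * (theta - 3.0) - 2.0 * (phi - theta)

theta, phi = 0.0, 5.0
for _ in range(200):
    # lower level: a few gradient steps on phi toward the inner optimum
    for _ in range(5):
        phi -= 0.1 * inner_grad(theta, phi)
    # higher level: one gradient step on theta using the fitted phi
    theta -= 0.05 * outer_grad(theta, phi)

print(theta, phi)  # both approach the outer optimum at 3.0
```

Because the inner problem is only approximately solved between outer steps, this matches the practical regime BiSM targets, where the variational posterior tracks the true posterior rather than being solved exactly.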




Variational Bayesian Monte Carlo with Noisy Likelihoods

Neural Information Processing Systems

In the original formulation, observations are assumed to be exact (non-noisy), so the GP likelihood only included a small observation noise σ²_obs for numerical stability [32].
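A minimal sketch of the role of that σ²_obs term: with exact observations it acts purely as a jitter on the kernel diagonal so the Cholesky factorization stays well conditioned, and the GP posterior mean still reproduces the training targets almost exactly. The RBF kernel, lengthscale, and data below are assumptions for illustration, not from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=0.2):
    # Squared-exponential kernel on 1-D inputs
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

X = np.linspace(0.0, 1.0, 5)
y = np.sin(2 * np.pi * X)

sigma2_obs = 1e-6  # small observation-noise / jitter term
K = rbf_kernel(X, X) + sigma2_obs * np.eye(len(X))

# The jitter keeps K safely positive definite for the Cholesky factorization.
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

# With tiny sigma2_obs, the posterior mean at the training inputs
# interpolates the (assumed exact) observations almost perfectly.
mean = rbf_kernel(X, X) @ alpha
print(np.max(np.abs(mean - y)))
```

Treating observations as noisy instead means σ²_obs becomes a genuine, possibly input-dependent variance rather than a fixed stability constant, which is the setting the paper extends VBMC to.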