Neural Information Processing Systems
Let $X$, $Y$, $Z$ be random variables. Let $g:\mathcal{X}\to\mathbb{R}$ be a measurable function such that $\mathbb{E}_{x\sim Q}[\exp g(x)]<\infty$. Then, by the Donsker–Varadhan variational representation,
$$D_{\mathrm{KL}}(P\,\|\,Q)=\sup_{g}\big\{\mathbb{E}_{x\sim P}[g(x)]-\log \mathbb{E}_{x\sim Q}[\exp g(x)]\big\}.$$

Their work builds a connection between PAC-Bayes meta-learning and hierarchical variational Bayes. In Appendix A.3 of [1], they give the generative graphical model for meta-learning, $U \to W \to S$ (their notation uses $\psi$ instead of $U$).

The proof technique is analogous to that of Theorem 5.1. Let $\Phi=(U,W_{1:n})$ be a collection of random variables with $\Phi\in\mathcal{U}\times\mathcal{W}^{n}$, such that $\Phi$ and $S_{1:n}$ follow the joint distribution $P_{\Phi,S_{1:n}}$.

Based on Theorem 5.2, for the Meta-SGLD algorithm satisfying Assumption 1, if we set … In fact, the algorithm has a nested-loop structure; we list only the simple sub-structures above for the first step of the proof.
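As a sanity check, the Donsker–Varadhan representation can be verified numerically on a finite space, where the supremum is attained at $g^\*(x)=\log\frac{p(x)}{q(x)}$ and any other $g$ yields a lower bound on the KL divergence. The sketch below uses arbitrary example distributions chosen for illustration; they are not taken from the paper.

```python
import numpy as np

# Two example distributions on a 3-point space (illustrative, not from the paper).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.25, 0.25, 0.5])

# Direct computation of D_KL(P || Q).
kl_direct = np.sum(p * np.log(p / q))

# The DV supremum is attained at g*(x) = log(p(x)/q(x)):
# E_P[g*] - log E_Q[exp g*] = KL - log E_Q[p/q] = KL - log 1 = KL.
g_star = np.log(p / q)
dv_optimal = np.sum(p * g_star) - np.log(np.sum(q * np.exp(g_star)))

# Any other measurable g only gives a lower bound on the KL divergence.
rng = np.random.default_rng(0)
g_random = rng.normal(size=3)
dv_lower = np.sum(p * g_random) - np.log(np.sum(q * np.exp(g_random)))

assert np.isclose(kl_direct, dv_optimal)
assert dv_lower <= kl_direct + 1e-12
```

The same identity is what makes DV-style change-of-measure arguments work in PAC-Bayes and information-theoretic generalization proofs: bounding an expectation under $P$ by a log-moment-generating term under $Q$ plus the KL divergence.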