Bootstrapping Neural Processes
Juho Lee, Yoonho Lee, Jungtaek Kim, Eunho Yang, Sung Ju Hwang, Yee Whye Teh
Unlike traditional statistical modeling, in which a user typically hand-specifies a prior, Neural Processes (NPs) implicitly define a broad class of stochastic processes with neural networks. Given a data stream, an NP learns a stochastic process that best describes the data. While this "data-driven" way of learning stochastic processes has proven capable of handling various types of data, NPs still rely on the assumption that uncertainty in the stochastic process is modeled by a single latent variable, which potentially limits their flexibility. To this end, we propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap. The bootstrap is a classical data-driven technique for estimating uncertainty, which allows BNP to learn the stochasticity in NPs without assuming a particular form. We demonstrate the efficacy of BNP on various types of data and its robustness in the presence of model-data mismatch.
Oct-27-2020
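To give a concrete sense of the bootstrap idea the abstract builds on, the sketch below applies the classical bootstrap to a toy regression problem: the observed context set is resampled with replacement, a simple model is refit on each replicate, and predictive uncertainty is read off the spread of the refitted predictions. This is a minimal illustration of the classical technique, not the paper's BNP architecture; the polynomial fit merely stands in for a learned predictor, and all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy context set: noisy observations of an unknown function.
x = np.linspace(-2.0, 2.0, 30)
y = np.sin(x) + 0.3 * rng.standard_normal(x.shape)

# Classical bootstrap: resample the context with replacement,
# refit a simple model on each replicate, and estimate uncertainty
# from the spread of the refitted predictions.
x_test = np.linspace(-2.0, 2.0, 100)
preds = []
for _ in range(200):
    idx = rng.integers(0, len(x), size=len(x))  # sample with replacement
    coefs = np.polyfit(x[idx], y[idx], deg=3)   # stand-in for a learned model
    preds.append(np.polyval(coefs, x_test))

preds = np.stack(preds)
mean = preds.mean(axis=0)  # bootstrap mean prediction
std = preds.std(axis=0)    # bootstrap uncertainty estimate
print(mean[:5], std[:5])
```

Roughly speaking, BNP transfers this resampling principle to the NP setting, so that the stochasticity of the learned process need not be captured by a single latent variable.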