Efficient Deep Gaussian Process Models for Variable-Sized Input
Laradji, Issam H., Schmidt, Mark, Pavlovic, Vladimir, Kim, Minyoung
Deep Gaussian processes (DGPs) have appealing Bayesian properties, can handle variable-sized data, and learn deep features. Their limitation is that they do not scale well with the size of the data. Existing approaches address this with a deep random feature (DRF) expansion model, which makes inference tractable by approximating the DGP. However, DRF is not suitable for variable-sized input data such as trees, graphs, and sequences. We introduce GP-DRF, a novel Bayesian model with an input layer of GPs followed by DRF layers. The key advantage is that combining GP and DRF yields a tractable model that can both handle variable-sized inputs and learn deep long-range dependency structures in the data. We provide a novel, efficient method to simultaneously infer the posterior of the GP's latent vectors and the posterior of the DRF's internal weights and random frequencies. Our experiments show that GP-DRF outperforms the standard GP model and the DRF model across many datasets. Furthermore, they demonstrate that GP-DRF enables improved uncertainty quantification compared to GP and DRF alone, as measured by the Bhattacharyya distance. Source code is available at https://github.com/IssamLaradji/GP_DRF.
May-16-2019
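The random feature expansion at the core of DRF rests on approximating a stationary GP kernel with a finite set of random frequencies (Bochner's theorem). The sketch below is a minimal, hypothetical illustration of that idea for an RBF kernel; it is not the paper's model, and all names (`rff_features`, `lengthscale`, `n_features`) are illustrative choices, not identifiers from the authors' code.

```python
import numpy as np

def rff_features(X, n_features=100, lengthscale=1.0, seed=0):
    """Random Fourier feature map phi such that
    phi(x) @ phi(y) approximates exp(-||x - y||^2 / (2 * lengthscale^2)).

    Illustrative sketch only; parameter names are assumptions,
    not the GP-DRF implementation.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Sample frequencies from the spectral density of the RBF kernel.
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    # Random phases make the cosine features an unbiased kernel estimate.
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# With enough features, the inner product of the feature maps
# closely matches the exact RBF kernel matrix.
X = np.random.default_rng(1).normal(size=(5, 3))
Phi = rff_features(X, n_features=5000)
K_approx = Phi @ Phi.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq_dists / 2.0)
```

Stacking such feature maps, with the weights and frequencies treated as random variables, is what makes inference in the deep model tractable; GP-DRF keeps an exact GP at the input layer so that kernels over variable-sized objects (trees, graphs, sequences) can be used there.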