Conditional Graph Neural Processes: A Functional Autoencoder Approach
Marcel Nassar, Xin Wang, Evren Tumer
We introduce a novel encoder-decoder architecture to embed functional processes into latent vector spaces. This embedding can then be decoded to sample the encoded functions over any arbitrary domain. This autoencoder generalizes the recently introduced Conditional Neural Process (CNP) model of random processes. Our architecture employs the latest advances in graph neural networks to process irregularly sampled functions. Thus, we refer to our model as the Conditional Graph Neural Process (CGNP). Graph neural networks can effectively exploit the "local" structure of the metric spaces over which the functions/processes are defined. The contributions of this paper are twofold: (i) a novel graph-based encoder-decoder architecture for functional and process embeddings, and (ii) a demonstration of the importance of using the structure of metric spaces for this type of representation.
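To make the encoder-decoder idea concrete, here is a minimal sketch in PyTorch of a CNP-style functional autoencoder: context pairs (x, y) are encoded into latent representations, aggregated into a summary, and decoded at arbitrary target locations. The distance-weighted aggregation below is only a crude stand-in for the paper's graph message passing, and the class name `CGNPSketch`, layer sizes, and overall structure are assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn

class CGNPSketch(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, latent_dim=64):
        super().__init__()
        # Encoder: embeds each (x, y) context pair into a latent vector.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, latent_dim), nn.ReLU(),
            nn.Linear(latent_dim, latent_dim),
        )
        # Decoder: maps (latent summary, target x) to a predicted y,
        # so the encoded function can be sampled over any domain.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + x_dim, latent_dim), nn.ReLU(),
            nn.Linear(latent_dim, y_dim),
        )

    def forward(self, x_ctx, y_ctx, x_tgt):
        # x_ctx: (n_ctx, x_dim), y_ctx: (n_ctx, y_dim), x_tgt: (n_tgt, x_dim)
        h = self.encoder(torch.cat([x_ctx, y_ctx], dim=-1))  # (n_ctx, latent)
        # Distance-weighted aggregation: a simple proxy for graph message
        # passing that lets nearby context points dominate each target's
        # summary, exploiting the "local" structure of the metric space.
        d = torch.cdist(x_tgt, x_ctx)                        # (n_tgt, n_ctx)
        w = torch.softmax(-d, dim=-1)
        r = w @ h                                            # (n_tgt, latent)
        return self.decoder(torch.cat([r, x_tgt], dim=-1))   # (n_tgt, y_dim)

# Usage: encode an irregularly sampled function, decode on a dense grid.
x_ctx = torch.rand(10, 1)
y_ctx = torch.sin(6 * x_ctx)
x_tgt = torch.linspace(0, 1, 100).unsqueeze(-1)
model = CGNPSketch()
y_pred = model(x_ctx, y_ctx, x_tgt)                          # (100, 1)
```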
arXiv.org Artificial Intelligence
Dec-12-2018