Approximately Equivariant Neural Processes

Ashman, Matthew, Diaconu, Cristiana, Weller, Adrian, Bruinsma, Wessel, Turner, Richard E.

arXiv.org Machine Learning

Equivariant deep learning architectures exploit symmetries in learning problems to improve the sample efficiency of neural-network-based models and their ability to generalise. However, when modelling real-world data, learning problems are often not exactly equivariant, but only approximately. For example, when estimating the global temperature field from weather station observations, local topographical features like mountains break translation equivariance. In these scenarios, it is desirable to construct architectures that can flexibly depart from exact equivariance in a data-driven way. In this paper, we develop a general approach to achieving this using existing equivariant architectures. Our approach is agnostic to both the choice of symmetry group and model architecture, making it widely applicable. We consider the use of approximately equivariant architectures in neural processes (NPs), a popular family of meta-learning models. We demonstrate the effectiveness of our approach on a number of synthetic and real-world regression experiments, showing that approximately equivariant NP models can outperform both their non-equivariant and strictly equivariant counterparts.
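
The abstract does not spell out the relaxation mechanism, but one common way to depart from exact equivariance in a data-driven fashion is to add a learned, gated symmetry-breaking term to an otherwise equivariant layer. The sketch below illustrates this idea for 1D translations; the class name, the gating scheme, and all sizes are our own assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class ApproxEquivariantLayer(nn.Module):
    """Hypothetical sketch: an exactly translation-equivariant convolution
    plus a gated, learned absolute-position embedding that breaks the
    symmetry only as far as the data demands."""

    def __init__(self, channels: int, grid_size: int):
        super().__init__()
        self.conv = nn.Conv1d(channels, channels, kernel_size=5, padding=2)  # equivariant part
        self.pos_embed = nn.Parameter(torch.zeros(1, channels, grid_size))   # symmetry-breaking part
        self.gate = nn.Parameter(torch.tensor(-4.0))  # sigmoid(-4) ~ 0.02: start near exact equivariance

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, grid); the gate learns how non-equivariant to be.
        return self.conv(x) + torch.sigmoid(self.gate) * self.pos_embed

layer = ApproxEquivariantLayer(channels=16, grid_size=64)
print(layer(torch.randn(8, 16, 64)).shape)  # torch.Size([8, 16, 64])
```

Because the gate is trainable, the same construction in principle applies to other equivariant backbones and symmetry groups: the equivariant term is kept, and a group-specific symmetry-breaking feature is blended in.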


Group Equivariant Conditional Neural Processes

Kawano, Makoto, Kumagai, Wataru, Sannai, Akiyoshi, Iwasawa, Yusuke, Matsuo, Yutaka

arXiv.org Machine Learning

We present the group equivariant conditional neural process (EquivCNP), a meta-learning method that, like conventional conditional neural processes (CNPs), is permutation-invariant in the data set, and that is additionally equivariant to transformations of the data space. Incorporating group equivariance, such as rotation and scaling equivariance, provides a way to exploit the symmetry of real-world data. We give a decomposition theorem for permutation-invariant and group-equivariant maps, which leads us to construct EquivCNPs with an infinite-dimensional latent space to handle group symmetries. For a practical implementation, we build the architecture using Lie group convolutional layers. We show that EquivCNP with translation equivariance achieves performance comparable to conventional CNPs on a 1D regression task. Moreover, we demonstrate that, by selecting an appropriate Lie group equivariance, EquivCNP is capable of zero-shot generalization in an image-completion task. Data symmetry has played a significant role in deep neural networks. In particular, convolutional neural networks, which play an important part in the recent achievements of deep learning, have translation equivariance, preserving the symmetry of the translation group. From the same point of view, many studies have aimed to incorporate various group symmetries into neural networks, especially into the convolution operation (Cohen et al., 2019; Defferrard et al., 2019; Finzi et al., 2020).
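
For the translation-equivariant case, the key ingredient is an encoding of the context set that is permutation-invariant and commutes with translations, as in convolutional CNPs. Below is a minimal sketch of such a kernel-weighted ("SetConv"-style) embedding onto a grid, which a CNN can then process without breaking the symmetry; the function name, lengthscale, and density/signal channel split are our assumptions, not the paper's exact implementation.

```python
import torch

def set_conv_encode(xc, yc, grid, lengthscale=0.1):
    # Embed a 1D context set onto a grid with an RBF kernel.
    # Permutation-invariant in (xc, yc); shifting xc and grid together
    # leaves the features unchanged, i.e. translation equivariance.
    w = torch.exp(-0.5 * (grid[:, None] - xc[None, :]) ** 2 / lengthscale ** 2)  # (m, n)
    density = w.sum(-1)   # context mass near each grid point
    signal = w @ yc       # kernel-weighted sum of observed values
    return torch.stack([density, signal / (density + 1e-8)], dim=-1)  # (m, 2)

xc = torch.tensor([-1.0, 0.2, 0.9])   # context inputs
yc = torch.sin(xc)                    # context outputs
grid = torch.linspace(-2.0, 2.0, 100)
print(set_conv_encode(xc, yc, grid).shape)  # torch.Size([100, 2])
```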


Equivariant Conditional Neural Processes

Holderrieth, Peter, Hutchinson, Michael, Teh, Yee Whye

arXiv.org Machine Learning

We introduce Equivariant Conditional Neural Processes (EquivCNPs), a new member of the Neural Process family that models vector-valued data equivariantly with respect to isometries of $\mathbb{R}^n$. In addition, we study multi-dimensional Gaussian Processes (GPs) from the perspective of equivariance and derive the necessary and sufficient constraints that ensure a GP over $\mathbb{R}^n$ is equivariant. We test EquivCNPs on the inference of vector fields using Gaussian process samples and real-world weather data, and observe that our model significantly outperforms previous models. Imposing equivariance as a constraint increases the parameter and data efficiency of these models. Moreover, we find that EquivCNPs are more robust against overfitting to local conditions of the training data.
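
As a concrete instance of the kind of constraint the abstract refers to: a stationary, isotropic matrix-valued kernel such as $K(x, x') = k(\|x - x'\|)\,I$ satisfies $K(gx, gx') = g\,K(x, x')\,g^\top$ for every rotation $g$, which is the equivariance condition on the covariance of a vector-valued GP. The numerical check below verifies this for one such kernel; the kernel choice and function names are ours, not taken from the paper.

```python
import math
import torch

def rbf_eye_kernel(x, xp, lengthscale=1.0):
    # Matrix-valued kernel K(x, x') = k(||x - x'||) * I on R^2:
    # stationary and isotropic, hence isometry-equivariant.
    k = torch.exp(-0.5 * torch.sum((x - xp) ** 2) / lengthscale ** 2)
    return k * torch.eye(2)

def is_equivariant(K, g, x, xp, atol=1e-6):
    # Check K(g x, g x') == g K(x, x') g^T for one rotation g.
    return torch.allclose(K(g @ x, g @ xp), g @ K(x, xp) @ g.T, atol=atol)

theta = 0.7
g = torch.tensor([[math.cos(theta), -math.sin(theta)],
                  [math.sin(theta),  math.cos(theta)]])
x, xp = torch.randn(2), torch.randn(2)
print(is_equivariant(rbf_eye_kernel, g, x, xp))  # True
```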