Regularizing Towards Permutation Invariance in Recurrent Models
–Neural Information Processing Systems
In many machine learning problems the output should not depend on the order of the inputs. Such "permutation invariant" functions have been studied extensively in recent years. Here we argue that temporal architectures such as RNNs are highly relevant for such problems, despite the inherent dependence of RNNs on order. We show that RNNs can be regularized towards permutation invariance, and that this can result in compact models, as compared to non-recurrent architectures. By contrast, existing solutions (e.g., DeepSets) mostly suggest restricting the learning problem to hypothesis classes which are permutation invariant by design.
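The abstract does not spell out the form of the regularizer, but a minimal sketch of the general idea, penalizing the gap between an RNN's output on a sequence and on a randomly permuted copy of it, might look as follows. This is an illustrative construction, not the paper's exact method; the model, the MSE-based penalty, and the weight `lam` are all assumptions for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SetRNN(nn.Module):
    """A plain GRU encoder whose final hidden state summarizes the input sequence."""
    def __init__(self, input_dim, hidden_dim, output_dim):
        super().__init__()
        self.rnn = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, output_dim)

    def forward(self, x):               # x: (batch, set_size, input_dim)
        _, h = self.rnn(x)              # h: (1, batch, hidden_dim)
        return self.head(h.squeeze(0))  # (batch, output_dim)

def permutation_invariance_penalty(model, x):
    """Hypothetical regularizer: penalize the difference between the model's
    output on x and on a randomly permuted copy of x (not the paper's exact term)."""
    perm = torch.randperm(x.size(1), device=x.device)
    return F.mse_loss(model(x), model(x[:, perm]))

# Training-step sketch: task loss plus the invariance penalty, weighted by lam.
model = SetRNN(input_dim=8, hidden_dim=32, output_dim=1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 0.1                                        # regularization weight (assumed)
x = torch.randn(16, 10, 8)                       # toy batch: 16 sets of size 10
y = x.sum(dim=(1, 2)).unsqueeze(-1)              # toy permutation-invariant target

optimizer.zero_grad()
loss = F.mse_loss(model(x), y) + lam * permutation_invariance_penalty(model, x)
loss.backward()
optimizer.step()
```

Driving the penalty to zero over random permutations pushes the recurrent model towards permutation-invariant behavior without restricting the hypothesis class by design.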