Compositional generalization through meta sequence-to-sequence learning
People can learn a new concept and use it compositionally, understanding how to "blicket twice" after learning how to "blicket." In contrast, powerful sequence-to-sequence (seq2seq) neural networks fail such tests of compositionality, especially when composing new concepts together with existing concepts. In this paper, I show that neural networks can be trained to generalize compositionally through meta seq2seq learning. In this approach, models train on a series of seq2seq problems to acquire the compositional skills needed to solve new seq2seq problems. Meta seq2seq learning solves several of the SCAN tests for compositional learning and can learn to apply rules to variables.
arXiv:1906.05381
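The episodic setup described in the abstract can be illustrated with a small sketch: in each training episode, primitive words are randomly reassigned to output symbols, so a model must infer word meanings from that episode's support examples and apply them compositionally to the queries. The names and the "twice" modifier below are illustrative choices in the spirit of the SCAN-style tasks, not the paper's released code.

```python
import random

# Illustrative vocabularies: nonsense primitive words and output symbols.
PRIMITIVES = ["dax", "wif", "lug", "zup"]
ACTIONS = ["RED", "GREEN", "BLUE", "YELLOW"]

def make_episode(rng):
    """Sample one meta-learning episode.

    Returns a support set of primitive demonstrations (word -> one symbol)
    and a query set that composes each primitive with the modifier 'twice'
    (word twice -> that symbol repeated). Because the word-to-symbol
    assignment is re-sampled per episode, the mapping cannot be memorized
    across episodes and must be read off the support set.
    """
    assignment = dict(zip(PRIMITIVES, rng.sample(ACTIONS, len(PRIMITIVES))))
    support = [(word, [act]) for word, act in assignment.items()]
    queries = [(f"{word} twice", [act, act]) for word, act in assignment.items()]
    return support, queries

rng = random.Random(0)
support, queries = make_episode(rng)
```

A model meta-trained on many such episodes is then evaluated on episodes with held-out primitives or compositions, which is the sense in which it "learns to apply rules to variables."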