Latent Attention For If-Then Program Synthesis
–Neural Information Processing Systems
Automatic translation from natural language descriptions into programs is a long-standing and challenging problem. In this work, we consider a simple yet important sub-problem: translation from textual descriptions to If-Then programs. We devise a novel neural network architecture for this task, which we train end-to-end. Specifically, we introduce Latent Attention, which computes multiplicative weights for the words in the description in a two-stage process, with the goal of better leveraging the natural language structure that indicates which parts of the description are relevant for predicting each program element. Our architecture reduces the error rate by 28.57% compared to prior art [3]. We also propose a one-shot learning scenario of If-Then program synthesis and simulate it with our existing dataset. We demonstrate a variation on the training procedure for this scenario that outperforms the original procedure, significantly closing the gap to the model trained on the full dataset.
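As a rough illustration of the two-stage idea described in the abstract, the sketch below shows one plausible way a "latent" weighting over words could gate a per-word ("active") attention before summarizing a description. This is a minimal NumPy sketch under assumed shapes and parameter names (`tokens_latent`, `tokens_active`, `tokens_output`, `u` are hypothetical); it is not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def two_stage_attention(tokens_latent, tokens_active, tokens_output, u):
    """Two-stage ("latent") attention over a token sequence (sketch).

    tokens_latent : (J, d) embeddings scored against a global query
    tokens_active : (J, d) embeddings used for per-word attention proposals
    tokens_output : (J, d) embeddings mixed into the final summary vector
    u             : (d,)   trainable query vector (hypothetical name)
    """
    J, d = tokens_latent.shape

    # Stage 1: latent weights -- how strongly each word should influence
    # where the model ultimately looks.
    latent = softmax(tokens_latent @ u)                     # (J,)

    # Stage 2: active attention -- each word j proposes its own
    # distribution over the whole sequence.
    scores = tokens_active @ tokens_active.T / np.sqrt(d)   # (J, J)
    active = softmax(scores, axis=0)                        # column j: proposal of word j

    # Mix the per-word proposals by the latent weights to get the final
    # multiplicative weights, then summarize the description.
    weights = active @ latent                               # (J,)
    summary = weights @ tokens_output                       # (d,)
    return weights, summary

# Usage with random embeddings standing in for learned parameters:
rng = np.random.default_rng(0)
J, d = 12, 32
w, s = two_stage_attention(rng.normal(size=(J, d)),
                           rng.normal(size=(J, d)),
                           rng.normal(size=(J, d)),
                           rng.normal(size=d))
```

In such a scheme, the final weight of each word depends both on its own importance (stage 1) and on which words the rest of the sequence points to (stage 2), which is one way to realize the "multiplicative weights computed in a two-stage process" that the abstract describes.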