On Conditional and Compositional Language Model Differentiable Prompting
Jonathan Pilault, Can Liu, Mohit Bansal, Markus Dreyer
–arXiv.org Artificial Intelligence
Prompts have been shown to be an effective method to adapt a frozen Pretrained Language Model (PLM) to perform well on downstream tasks. Prompts can be represented by a human-engineered word sequence or by a learned continuous embedding. In this work, we investigate conditional and compositional differentiable prompting. We propose a new model, Prompt Production System (PRopS), which learns to transform task instructions or input metadata into continuous prompts that elicit task-specific outputs from the PLM. Our model uses a modular network structure based on our neural formulation of Production Systems, which allows the model to learn discrete rules: neural functions that learn to specialize in transforming particular prompt input patterns, making it suitable for compositional transfer learning and few-shot learning. We present extensive empirical and theoretical analysis and show that PRopS consistently surpasses other PLM adaptation techniques, and often improves upon fully fine-tuned models, on compositional generalization tasks, controllable summarization, and multilingual translation, while needing fewer trainable parameters.
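The abstract's core idea, a set of learned "rules" (small neural functions) that a controller selects among to turn an encoded instruction into a continuous prompt, can be sketched in a few lines. The following is a minimal NumPy toy, not the paper's implementation: it uses a soft mixture over linear rules in place of the model's learned discrete selection, and all dimensions, names, and the random initialization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D_IN, D_PROMPT, N_RULES, PROMPT_LEN = 8, 16, 4, 3

# Each "rule" is a tiny neural function: here a single linear map from the
# instruction encoding to PROMPT_LEN continuous prompt vectors (illustrative).
rule_weights = 0.1 * rng.normal(size=(N_RULES, D_IN, PROMPT_LEN * D_PROMPT))
# Rule keys the controller matches against to decide which rule fires.
rule_keys = rng.normal(size=(N_RULES, D_IN))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def props_prompt(instruction_vec):
    """Map an encoded instruction to a continuous prompt of shape
    (PROMPT_LEN, D_PROMPT). A soft mixture over rules stands in for the
    model's discrete rule selection."""
    scores = rule_keys @ instruction_vec        # condition matching
    weights = softmax(scores)                   # soft rule selection
    per_rule = instruction_vec @ rule_weights   # each rule's candidate prompt
    mixed = weights @ per_rule                  # combine selected rules
    return mixed.reshape(PROMPT_LEN, D_PROMPT)

instruction = rng.normal(size=D_IN)  # stand-in for an encoded task instruction
prompt = props_prompt(instruction)
print(prompt.shape)  # (3, 16)
```

In the full model, the resulting prompt vectors would be prepended to the frozen PLM's input embeddings, so only the rule parameters and controller are trained, which is why the method needs far fewer trainable parameters than full fine-tuning.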
Jul-3-2023