Neural Rule-Execution Tracking Machine for Transformer-Based Text Generation
– Neural Information Processing Systems
Sequence-to-Sequence (Seq2Seq) neural text generation models, especially pre-trained ones (e.g., BART and T5), have exhibited compelling performance on various natural language generation tasks. However, the black-box nature of these models limits their application in tasks where specific rules (e.g., controllable constraints, prior knowledge) need to be executed.
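To make the notion of "executing rules" during generation concrete, here is a minimal, hypothetical sketch of tracking lexical constraints at decoding time: the decoder checks which required words are still unmet and prefers candidate tokens that discharge them. All function names and the greedy selection logic are illustrative assumptions for exposition; this is not the paper's actual method.

```python
# Toy sketch of rule-execution tracking during text generation.
# Illustrative assumption only -- not the paper's proposed machine.

def satisfies_rules(text, required_words):
    """Report which lexical constraints the partial output already satisfies."""
    tokens = text.split()
    return {w: (w in tokens) for w in required_words}

def greedy_generate(candidates_per_step, required_words):
    """Greedy decoding that prefers candidates satisfying unmet constraints.

    candidates_per_step: list of candidate-token lists, one list per step
    required_words: lexical constraints that should appear in the output
    """
    output = []
    for candidates in candidates_per_step:
        status = satisfies_rules(" ".join(output), required_words)
        unmet = [w for w, ok in status.items() if not ok]
        # Prefer a candidate that discharges an unmet constraint; otherwise
        # fall back to the top-ranked (first) candidate.
        pick = next((c for c in candidates if c in unmet), candidates[0])
        output.append(pick)
    return " ".join(output)
```

For example, with per-step candidates `[["the", "a"], ["cat", "dog"], ["sat", "ran"]]` and the constraint `["dog"]`, the tracker steers the second step toward "dog", yielding "the dog sat" instead of the unconstrained greedy "the cat sat".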