Where does In-context Translation Happen in Large Language Models
Suzanna Sia, David Mueller, Kevin Duh
arXiv.org Artificial Intelligence
Self-supervised large language models have demonstrated the ability to perform Machine Translation (MT) via in-context learning, but little is known about where the model performs the task with respect to prompt instructions and demonstration examples. In this work, we attempt to characterize the region where large language models transition from in-context learners to translation models.

Prior work on in-context MT has focused on prompt engineering, treating GPT models as black boxes by focusing on which examples to provide in-context (Moslem et al., 2023). Agrawal et al. (2022) apply similarity-based retrieval to select in-context examples, while Sia & Duh (2023) suggest a coherence-based approach. However, these works apply surface-level interventions, leaving the internal mechanism of MT in GPT models largely not understood.
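One way to probe where this transition happens, sketched below as an illustration rather than the authors' exact procedure, is to block the model's access to the prompt context (instructions and demonstrations) from a chosen layer onward and check whether translation still succeeds. The model name (`gpt2`), the prompt format, and the `zero_context` hook that zeroes context hidden states are all assumptions made for this sketch.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # illustrative stand-in for the GPT-style models studied
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).eval()

# Prompt = instruction + one demonstration (the "context"), then a query.
context = "Translate English to French.\nsea otter => loutre de mer\n"
query = "cheese =>"
ctx_len = tok(context, return_tensors="pt")["input_ids"].shape[1]
ids = tok(context + query, return_tensors="pt")["input_ids"]

def zero_context(module, inputs, output):
    # Zero the hidden states at the context positions so that layers above
    # this one can no longer read the instructions/examples.
    hidden = output[0]
    hidden[:, :ctx_len, :] = 0.0
    return (hidden,) + tuple(output[1:])

def generate_masking_from(layer_idx, n_new=4):
    # Attach the masking hook to every decoder block from layer_idx upward.
    hooks = [blk.register_forward_hook(zero_context)
             for blk in model.transformer.h[layer_idx:]]
    try:
        # use_cache=False so every decoding step re-runs the full sequence
        # through the hooked layers.
        out = model.generate(ids, max_new_tokens=n_new, do_sample=False,
                             use_cache=False, pad_token_id=tok.eos_token_id)
    finally:
        for h in hooks:
            h.remove()
    return tok.decode(out[0, ids.shape[1]:])

# Sweep the masking start layer: if the output is unchanged when masking
# from layer L upward, the task was already "recognized" below layer L.
for layer in range(0, len(model.transformer.h) + 1, 4):
    print(f"mask from layer {layer:2d}: {generate_masking_from(layer)!r}")
```

Under this framing, the lowest layer at which masking stops hurting translation marks roughly where the model has transitioned from in-context learner to translation model; zeroing hidden states via hooks is only a crude stand-in for a proper attention-masking intervention.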
Mar-7-2024
- Country:
  - Asia > Middle East > UAE (0.14)
  - Europe > Italy (0.14)
  - North America > Canada (0.14)
- Genre:
  - Research Report > New Finding (0.92)
- Technology: