Where does In-context Translation Happen in Large Language Models

Sia, Suzanna, Mueller, David, Duh, Kevin

arXiv.org Artificial Intelligence 

Self-supervised large language models have demonstrated the ability to perform Machine Translation (MT) via in-context learning, but little is known about where the model performs the task with respect to prompt instructions and demonstration examples. In this work, we attempt to characterize the region where large language models transition from in-context learners to translation models.

Prior work on in-context MT has focused on prompt engineering, treating GPT models as black boxes by focusing on which examples to provide in-context (Moslem et al., 2023). Agrawal et al. (2022) apply similarity-based retrieval to select in-context examples, while Sia & Duh (2023) suggest a coherence-based approach. However, these works apply surface-level interventions, leaving the internal mechanism of MT in GPT models largely not understood.
