Explain-then-Translate: An Analysis on Improving Program Translation with Self-generated Explanations

Zilu Tang, Mayank Agarwal, Alex Shypula, Bailin Wang, Derry Wijaya, Jie Chen, Yoon Kim

arXiv.org Artificial Intelligence 

This work explores the use of self-generated natural language explanations as an intermediate step for code-to-code translation with language models. Across three types of explanations and 19 programming languages, on translation tasks constructed from the MultiPL-E dataset, we find the explanations to be particularly effective in the zero-shot case, improving performance by 12% on average. Improvements with natural language explanations are especially pronounced on difficult programs. We release our dataset, code, and canonical solutions in all 19 languages.
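As a rough illustration of the two-stage prompting described in the abstract (not the authors' exact prompt templates), the sketch below chains an "explain" call and a "translate" call through a hypothetical `llm_generate` helper standing in for any completion-style language-model API.

```python
# Minimal sketch of the Explain-then-Translate idea: first ask the model to
# explain the source program in natural language, then condition the
# translation on that self-generated explanation. `llm_generate` is a
# hypothetical stand-in for any text-completion LLM call; the prompt wording
# is illustrative, not the paper's exact template.

def llm_generate(prompt: str) -> str:
    """Placeholder: call your language model of choice and return its completion."""
    raise NotImplementedError

def explain_then_translate(source_code: str, src_lang: str, tgt_lang: str) -> str:
    # Stage 1: self-generated natural-language explanation of the source program.
    explain_prompt = (
        f"Explain, step by step, what the following {src_lang} program does:\n\n"
        f"{source_code}\n\nExplanation:"
    )
    explanation = llm_generate(explain_prompt)

    # Stage 2: translate, conditioning on both the source and its explanation.
    translate_prompt = (
        f"Source program ({src_lang}):\n{source_code}\n\n"
        f"Explanation:\n{explanation}\n\n"
        f"Rewrite the program in {tgt_lang}:\n"
    )
    return llm_generate(translate_prompt)
```

In the zero-shot setting, the first stage supplies the only additional context the translator sees, which is where the abstract reports the largest gains.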
