Self-Constructed Context Decompilation with Fine-grained Alignment Enhancement
Yunlong Feng, Yang Xu, Dechuan Teng, Honglin Mu, Xiao Xu, Libo Qin, Wanxiang Che, Qingfu Zhu
arXiv.org Artificial Intelligence
Decompilation transforms compiled code back into a high-level programming language for analysis when source code is unavailable. Previous work has primarily focused on improving decompilation performance by scaling up model parameters or pre-training data. Based on the characteristics of the decompilation task, we propose two methods: (1) Without fine-tuning, the Self-Constructed Context Decompilation (sc²dec) method recompiles the LLM's decompilation results to construct assembly–source pairs for in-context learning, helping the model improve decompilation performance. (2) Fine-grained Alignment Enhancement (FAE), which aligns assembly code with source code at the statement level by leveraging debugging information, is applied during fine-tuning to achieve further improvements. By integrating these two methods, we achieve a Re-Executability improvement of approximately 7.35% on the Decompile-Eval benchmark, establishing a new state-of-the-art performance of 55.03%.
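The sc²dec loop described above can be sketched in a few lines. This is a minimal, hedged illustration only: `llm_decompile` and `compile_to_asm` are hypothetical placeholders for an LLM call and a compiler invocation (e.g. `gcc -S`), not APIs from the paper.

```python
# Hypothetical sketch of Self-Constructed Context Decompilation (sc2dec):
# decompile once, recompile the draft to obtain a matched (assembly, source)
# pair, then decompile again using that pair as an in-context example.

def sc2dec(asm_input, llm_decompile, compile_to_asm):
    # Step 1: first decompilation attempt, with no in-context example.
    draft_src = llm_decompile(asm_input, context=None)

    # Step 2: recompile the draft. A successful recompilation yields an
    # (assembly, source) pair aligned with the model's own output style.
    draft_asm = compile_to_asm(draft_src)
    if draft_asm is None:
        # Draft did not compile; fall back to the zero-shot result.
        return draft_src

    # Step 3: decompile again, conditioned on the self-constructed pair.
    return llm_decompile(asm_input, context=(draft_asm, draft_src))
```

The self-constructed pair requires no external labeled data, since both the example assembly and its source come from the model's own (recompiled) draft.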
Jun-24-2024