Collaborating Authors: Dade, Nii Osae Osae


StarCoder 2 and The Stack v2: The Next Generation

arXiv.org Artificial Intelligence

The BigCode project, an open-scientific collaboration focused on the responsible development of Large Language Models for Code (Code LLMs), introduces StarCoder2. In partnership with Software Heritage (SWH), we build The Stack v2 on top of the digital commons of their source code archive. Alongside the SWH repositories spanning 619 programming languages, we carefully select other high-quality data sources, such as GitHub pull requests, Kaggle notebooks, and code documentation. This results in a training set that is 4x larger than the first StarCoder dataset. We train StarCoder2 models with 3B, 7B, and 15B parameters on 3.3 to 4.3 trillion tokens and thoroughly evaluate them on a comprehensive set of Code LLM benchmarks. We find that our small model, StarCoder2-3B, outperforms other Code LLMs of similar size on most benchmarks, and also outperforms StarCoderBase-15B. Our large model, StarCoder2-15B, significantly outperforms other models of comparable size. In addition, it matches or outperforms CodeLlama-34B, a model more than twice its size. Although DeepSeekCoder-33B is the best-performing model at code completion for high-resource languages, we find that StarCoder2-15B outperforms it on math and code reasoning benchmarks, as well as on several low-resource languages. We make the model weights available under an OpenRAIL license and ensure full transparency regarding the training data by releasing the SoftWare Heritage persistent IDentifiers (SWHIDs) of the source code data.
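Since the abstract notes that the model weights are released, a minimal code-completion sketch with one of the checkpoints is shown below. It assumes the weights are hosted on the Hugging Face Hub under the bigcode organization; the repository name bigcode/starcoder2-3b is an assumption here, not something stated in the abstract.

    # Minimal code-completion sketch with a released StarCoder2 checkpoint.
    # The Hub repository name below is an assumption; the abstract only states
    # that the weights are released under an OpenRAIL license.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    checkpoint = "bigcode/starcoder2-3b"  # assumed Hugging Face Hub ID

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForCausalLM.from_pretrained(
        checkpoint,
        torch_dtype=torch.bfloat16,  # the 3B model fits on a single modern GPU
        device_map="auto",
    )

    prompt = "def fibonacci(n: int) -> int:\n"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Swapping the checkpoint string for a 7B or 15B variant would change only the memory footprint, not the interface.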


Optimizing Large Language Models to Expedite the Development of Smart Contracts

arXiv.org Artificial Intelligence

Programming has been at the heart of technological innovation in the 21st century. With the advent of blockchain technologies and the proliferation of web3 paradigms of decentralised applications, smart contracts have been instrumental in enabling developers to build applications that reside on decentralised blockchains. Despite the huge interest in and potential of smart contracts, there is still a significant knowledge and skill gap that developers need to cross in order to build web3 applications. In light of this, we introduce MazzumaGPT, a large language model that has been optimised to generate smart contract code and help developers scaffold development and improve productivity. As part of this research, we outline the optimisation and fine-tuning parameters, evaluate the model's performance on functional correctness, and address the limitations and broader impacts of our research.
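The abstract reports performance on functional correctness; for code-generation models this is commonly summarised with the unbiased pass@k estimator introduced with HumanEval (Chen et al., 2021). The sketch below shows that standard formula as a point of reference, not the specific evaluation harness used for MazzumaGPT, and the example numbers are hypothetical.

    # Unbiased pass@k estimator (Chen et al., 2021), a common way to report
    # functional correctness of generated code. Generic reference sketch, not
    # the evaluation harness used in the MazzumaGPT paper.
    import numpy as np

    def pass_at_k(n: int, c: int, k: int) -> float:
        """n: samples generated per task, c: samples that pass the unit tests,
        k: evaluation budget. Returns P(at least one of k sampled solutions passes)."""
        if n - c < k:
            return 1.0
        # 1 - C(n-c, k) / C(n, k), computed as a numerically stable product
        return 1.0 - float(np.prod(1.0 - k / np.arange(n - c + 1, n + 1)))

    # Hypothetical numbers: 200 completions per contract task, 47 pass their tests.
    print(pass_at_k(n=200, c=47, k=1))   # ≈ 0.235 (= c/n when k = 1)
    print(pass_at_k(n=200, c=47, k=10))  # ≈ 0.94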