AI Code Hallucinations Increase the Risk of 'Package Confusion' Attacks
AI-generated computer code is rife with references to non-existent third-party libraries, newly published research shows, creating a golden opportunity for supply-chain attacks that poison legitimate programs with malicious packages capable of stealing data, planting backdoors, and carrying out other nefarious actions. The study used 16 of the most widely used large language models to generate 576,000 code samples and found that roughly 440,000 of the package dependencies they contained were "hallucinated," meaning they referenced packages that do not exist. A dependency is an essential code component that a separate piece of code requires to work properly; dependencies save developers the hassle of rewriting code and are an essential part of the modern software supply chain. These non-existent dependencies threaten the software supply chain by exacerbating so-called dependency confusion attacks: an attacker who notices a commonly hallucinated name can publish a malicious package under that name, so that any developer who trusts the AI's suggestion and installs the dependency pulls in the attacker's code.
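One practical mitigation is to screen AI-suggested dependencies against a vetted allowlist before anything is installed. The sketch below is an illustrative example, not part of the research described above; the function name, the allowlist, and the package name `flask-gpt-utils` (standing in for a hallucinated dependency) are all hypothetical.

```python
def find_unvetted_packages(requirements, known_packages):
    """Return requirement names that are absent from a vetted allowlist.

    A hallucinated dependency suggested by an LLM would be flagged here
    before `pip install` ever runs against a public registry.
    """
    unvetted = []
    for line in requirements:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comment lines
        # Drop environment markers ("pkg; python_version < '3.9'")
        name = line.split(";")[0]
        # Strip version specifiers and extras: "requests>=2.0" -> "requests"
        for sep in ("==", ">=", "<=", "~=", "!=", ">", "<", "["):
            name = name.split(sep)[0]
        name = name.strip().lower()
        if name not in known_packages:
            unvetted.append(name)
    return unvetted


requirements = [
    "requests>=2.31",
    "numpy",
    "flask-gpt-utils",  # hypothetical hallucinated package name
]
known = {"requests", "numpy", "flask"}
print(find_unvetted_packages(requirements, known))  # ['flask-gpt-utils']
```

In practice the allowlist could be derived from an organization's lockfiles or an internal mirror; the point is simply that unknown names surface for human review instead of being installed blindly.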
Apr-30-2025, 19:08:33 GMT