AI/ML, Supply chain

Supply chain at risk of AI-hallucinated code dependencies

Extensive dependence on large language models in the code development process could increase the risk of slopsquatting supply chain intrusions, in which attackers register malicious open source packages under names that LLMs hallucinate, luring targets into downloading them, reports Infosecurity Magazine.

Researchers from the University of Texas at San Antonio, Virginia Tech, and the University of Oklahoma who tasked more than a dozen code-generating LLMs with producing 576,000 Python and JavaScript code samples found that 20% of recommended packages were hallucinated, according to a report from Socket. When each prompt was repeated 10 times, 43% of the hallucinated packages were recommended again in every run, indicating that such an attack is viable. "This threat scales. If a single hallucinated package becomes widely recommended by AI tools, and an attacker has registered that name, the potential for widespread compromise is real," said Socket, which urged developers to properly track and verify their dependencies to prevent compromise.
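
The tracking and verification Socket recommends can be partially automated. Below is a minimal sketch, in Python, of a pre-install check that looks up each LLM-suggested package name against PyPI's public JSON metadata endpoint (https://pypi.org/pypi/<name>/json); the script name, the 90-day "suspiciously new" threshold, and the output wording are illustrative assumptions, not part of the Socket report. Note that registration alone is not proof of safety: a hallucinated name an attacker has already claimed will pass an existence check, which is why the sketch also flags very new packages for manual review.

```python
import json
import sys
import urllib.error
import urllib.request
from datetime import datetime, timedelta, timezone

# PyPI's public JSON metadata endpoint; returns HTTP 404 for unregistered names.
PYPI_JSON = "https://pypi.org/pypi/{name}/json"

# Illustrative threshold (an assumption): treat packages first published within
# the last 90 days as worth a closer look. Tune to your own risk tolerance.
RECENT = timedelta(days=90)


def check_package(name: str) -> str:
    """Classify a suggested dependency name using PyPI metadata."""
    try:
        with urllib.request.urlopen(PYPI_JSON.format(name=name), timeout=10) as resp:
            meta = json.load(resp)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            # Unregistered name: a likely hallucination, and exactly the kind
            # of name a slopsquatter could claim.
            return "NOT REGISTERED - likely hallucinated; do not install"
        raise
    # Collect upload times across all releases to estimate the project's age.
    uploads = [
        datetime.fromisoformat(f["upload_time_iso_8601"].replace("Z", "+00:00"))
        for files in meta["releases"].values()
        for f in files
    ]
    if not uploads:
        return "registered but has no released files - review manually"
    first = min(uploads)
    if datetime.now(timezone.utc) - first < RECENT:
        # Registration alone is not proof of safety: an attacker may have
        # claimed a commonly hallucinated name only recently.
        return f"first published {first.date()} (recent) - review before installing"
    return f"registered on PyPI since {first.date()}"


if __name__ == "__main__":
    # Usage: python check_deps.py requests some-suggested-package
    for name in sys.argv[1:]:
        print(f"{name}: {check_package(name)}")
```

Pairing a check like this with pinned, hash-verified installs (for example, pip's --require-hashes mode against a fully pinned requirements file) further narrows the window for a squatted package to slip into a build.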
