Archive: https://archive.today/DaKWH

From the post:

>Cybersecurity researchers are warning of a new type of supply chain attack, Slopsquatting, induced by a hallucinating generative AI model recommending non-existent dependencies. According to research by a team from the University of Texas at San Antonio, Virginia Tech, and the University of Oklahoma, package hallucination is a common occurrence in Large Language Model (LLM)-generated code, and one that threat actors can take advantage of. “The reliance of popular programming languages such as Python and JavaScript on centralized package repositories and open-source software, combined with the emergence of code-generating LLMs, has created a new type of threat to the software supply chain: package hallucinations,” the researchers said in a paper.

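The excerpt describes the attack mechanics (an LLM invents a plausible package name, an attacker registers it on PyPI or npm) but not a defense. As a minimal sketch of one sanity check, not something from the post: before installing a dependency an LLM suggested, confirm the name is actually registered on PyPI via its public JSON metadata endpoint. The function and script names here are illustrative.

```python
import json
import sys
import urllib.error
import urllib.request

# Public PyPI metadata endpoint; returns JSON for registered projects, 404 otherwise.
PYPI_JSON_URL = "https://pypi.org/pypi/{name}/json"


def package_exists_on_pypi(name: str) -> bool:
    """Return True if `name` is a registered PyPI project, False on a 404."""
    try:
        with urllib.request.urlopen(PYPI_JSON_URL.format(name=name), timeout=10) as resp:
            json.load(resp)  # valid metadata implies the project exists
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False  # unregistered name: exactly what a slopsquatter would grab
        raise  # other HTTP errors: surface them rather than guess


if __name__ == "__main__":
    # Example: python check_pkg.py requests flask some-hallucinated-pkg
    for candidate in sys.argv[1:]:
        status = "exists" if package_exists_on_pypi(candidate) else "NOT on PyPI -- do not install blindly"
        print(f"{candidate}: {status}")
```

Note that existence alone is not proof of safety: the whole point of slopsquatting is that an attacker may have already registered the hallucinated name, so an unfamiliar package that does resolve still deserves a look at its maintainer, release history, and source before installing.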
