According to a recent LinkedIn post from Snyk, large language models are reshaping how developers consume open-source software. The post warns that AI tools may increasingly recommend long-abandoned or unmaintained code, which it calls the “Dormant Majority” of the open-source ecosystem.
The company’s LinkedIn post suggests this trend could elevate security risks, as obsolete packages may lack recent patches and CVE-based monitoring may miss threats in them. It also notes the emergence of “slopsquatting,” in which attackers register malicious packages under names they expect AI systems to hallucinate, further complicating traditional vulnerability management.
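The defensive pattern implied by these risks can be sketched in a few lines: gate AI-suggested dependencies against a vetted list (catching hallucinated, slopsquattable names) and flag packages with no recent releases (catching dormant ones). This is a minimal illustration, not Snyk's product; the package names, dates, and the two-year staleness threshold are all hypothetical assumptions, and a real tool would pull metadata from a registry API rather than a hardcoded dictionary.

```python
from datetime import date, timedelta

# Hypothetical vetted-package metadata: name -> last release date.
# In practice this would come from a package registry or security vendor feed.
VETTED_PACKAGES = {
    "requests": date(2024, 5, 20),
    "flask": date(2024, 4, 7),
    "leftpad-legacy": date(2016, 3, 1),  # illustrative dormant package
}

# Assumed policy: flag anything unmaintained for two-plus years.
STALENESS_LIMIT = timedelta(days=365 * 2)


def check_dependency(name: str, today: date) -> str:
    """Classify an AI-suggested dependency before it is installed.

    Returns "unknown" for names absent from the vetted set (possible
    hallucination, and therefore a slopsquatting target), "dormant" for
    packages past the staleness limit, and "ok" otherwise.
    """
    if name not in VETTED_PACKAGES:
        return "unknown"
    if today - VETTED_PACKAGES[name] > STALENESS_LIMIT:
        return "dormant"
    return "ok"
```

For example, a hallucinated name like `reqeusts` would come back `"unknown"` and could be blocked before an attacker-registered lookalike is ever fetched, while `leftpad-legacy` would be surfaced as `"dormant"` for review.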
For investors, the post implies a potential expansion in demand for more proactive software supply chain governance and health-based security tooling. If Snyk can position its platform as addressing AI-driven open-source risks, this narrative may support higher customer spending on security solutions and reinforce the firm’s role in next-generation application security.
The commentary also underscores broader industry implications, as enterprises adopting AI-assisted development may face new categories of exposure. Vendors that can quantify and mitigate these AI-influenced open-source risks could gain competitive advantage, potentially reshaping budgets within DevSecOps and cloud-native security over time.

