Aniket Verma’s Post

I almost shipped malware because of GitHub Copilot. Here's how.

Not clickbait. This actually happened to a dev on my team.

Copilot suggested `fast-crypto-utils`. Sounded legit. He ran npm install without checking.

Turns out, that package doesn't exist in any real library. But it did exist on npm. Uploaded 3 days ago. 11 downloads. All from people who made the same mistake.

This is called AI package hallucination (also known as "slopsquatting"), and it's the supply chain attack vector nobody's talking about enough.

Here's the playbook attackers are running right now:

→ Feed AI tools prompts until they hallucinate plausible-sounding package names
→ Register those names on PyPI / npm before anyone else does
→ Sit back and wait for developers to blindly install

We've already seen this in the wild: the LiteLLM compromise, the ForceMemo campaign, dozens of silent incidents that never made the news.

3 rules I now live by:

1. Google every package you've never heard of. Low download count + recent creation date = immediate red flag. Walk away.

2. Commit your lock files. package-lock.json, poetry.lock: these aren't optional. They're your paper trail.

3. Run npm audit / pip-audit like it's brushing your teeth. Daily, not just when something breaks.

AI makes us 10x faster. It also makes us 10x more careless.

One hallucinated package name + one blind install = your company's next breach.

Verify. Lock. Audit. Repeat.

#SoftwareEngineering #CyberSecurity #OpenSource #AI #WebDevelopment #Python #NodeJS
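Rule 1 can even be scripted. Here's a minimal sketch (in Python) of the "new + unpopular" red-flag check. The thresholds (30 days, 500 weekly downloads) and the function name are my own assumptions, not official guidance; in practice you'd feed it metadata from `npm view <pkg>` or the PyPI JSON API.

```python
from datetime import datetime, timedelta, timezone

# Assumed thresholds -- tune for your own risk tolerance.
RECENT_DAYS = 30
MIN_WEEKLY_DOWNLOADS = 500

def looks_suspicious(created_at: datetime, weekly_downloads: int) -> bool:
    """Flag packages matching the 'created recently + barely downloaded' pattern."""
    age = datetime.now(timezone.utc) - created_at
    return age < timedelta(days=RECENT_DAYS) and weekly_downloads < MIN_WEEKLY_DOWNLOADS

# A package uploaded 3 days ago with 11 downloads (like the one above):
fresh = datetime.now(timezone.utc) - timedelta(days=3)
print(looks_suspicious(fresh, 11))        # True -> walk away

# A years-old, heavily downloaded package:
old = datetime.now(timezone.utc) - timedelta(days=900)
print(looks_suspicious(old, 250_000))     # False
```

It won't catch everything (a patient attacker can park a package and wait), but it turns "Google every package" into a five-second habit instead of a judgment call.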

Which one of these have you been skipping? 👇
