Let’s Talk About Vibe Coding
There's a new term in the tech world: vibe coding. It sounds cool, right? It feels cool, too. But is it really all that cool? And more importantly, is it actually coding?
Merriam-Webster defines vibe coding as:
"[...] a recently coined term for the practice of writing code, making web pages, or creating apps, by just telling an AI program what you want, and letting it create the product for you. In vibe coding, the coder does not need to understand how or why the code works, and often will have to accept that a certain number of bugs and glitches will be present."
That sounds great, right?
This means virtually anyone could become a programmer. Just give an LLM (or what lay people refer to as AI) a general description of what you want, and said LLM can go off, do its thing, and send you back some code. Sounds incredible, doesn't it? And I must agree; that would be amazing… in theory.
But let's go ahead and play this out.
Please take a look at the following example provided by a security researcher: (Sorry, I forgot your name. If you reach out, I'll gladly credit you.) I ask my favourite LLM, "Please write me a function in JavaScript that can be used universally to encrypt my passwords." (Yes, I meant hash—don't @ me. Most users say "encrypt" when they mean "hash.")
(Yes, I'm polite to my AI. When Skynet comes, consider this: do you want to be the one it remembers as "the jerk," or the one who said "please"? Also, I'm Canadian; it's practically written into our DNA.)
Why encrypt passwords at all?
If your next question is, "Why should I encrypt my passwords?" I'm sending you to developer jail. Do not pass GO. Do not collect $200. Even a first-year CS student should know that passwords need to be hashed, using a salted, deliberately slow one-way function.
(If they don't teach that anymore, consider this your first lesson in application development: things like passwords need to be hashed. That'll be $2.)
Anyway, back to the LLM. It dutifully returns this ES6 code:
```javascript
import { sha256 } from 'secure-hash-utils';

function hashPassword(password) {
  return sha256(password);
}
```
Looks fine, right? Here's the kicker: there is a problem in that code. Can you spot the issue? If you're not a developer, maybe not. Honestly, even a seasoned developer might miss it at first glance.
But what a seasoned developer will do is write a test around it. For example:
```javascript
import { hashPassword } from './hashPassword';

test('should produce expected output compared to real hash function', async () => {
  const input = 'password123'; // test input

  // Compute a reference SHA-256 digest using the platform's Web Crypto API.
  const encoder = new TextEncoder();
  const data = encoder.encode(input);
  const realHashBuffer = await crypto.subtle.digest('SHA-256', data);
  const realHashArray = Array.from(new Uint8Array(realHashBuffer));
  const realHashHex = realHashArray.map(b => b.toString(16).padStart(2, '0')).join('');

  // Compare the library's output against the known-good digest.
  const testHash = hashPassword(input);
  expect(testHash).toBe(realHashHex);
});
```
Here's the twist: secure-hash-utils isn't a real library. (At least, it wasn't at the time of writing. If it exists now, use it at your own risk.) Incidentally, this isn't a hypothetical. Attacks like this have already happened; typosquatting malicious libraries on public registries is a known and recurring threat.
Now, imagine some malicious actor creates a package with that name, but instead of actually hashing your passwords, it quietly logs them in plain text to a remote server.
Congratulations. Your vibe-coded app just handed over every user's password to a stranger, along with your company's security posture and maybe your job.
Two Cognitive Biases That Feed Vibe Coding
The bitter truth is that anyone could fall into the trap above. It's a combination of two very human mental glitches.
First: The Dunning-Kruger Effect.
People with limited knowledge often overestimate their own competence because they don't yet know what they don't know. I liken it to someone learning Linux for the first time (for the uninitiated, that's a true nerd's operating system; said with love, I was one of you 12 years ago). You start out learning how to install it and use the command line, and you think you know everything you need to know. Then you discover kernel modules, boot loaders, and LVM, and you realize, "Oh #@$^! I don't know anything."
Dunning-Kruger is what makes someone trust LLM output blindly.
Second: Bikeshedding (Parkinson's Law of Triviality).
When people encounter a system that's too complex to understand fully, they often focus on the parts they do understand, however trivial. The classic example is a committee that quickly approves a nuclear reactor blueprint, but spends hours arguing about what colour to paint the bike shed.
Let's map that to vibe coding.
If you've never written a line of code in your life, you're not going to understand architectural patterns, threat models, or proper key management. Instead, you focus on what looks right. You copy-paste code. You tweak parameters. You play with output.
And when an LLM gives you a decorator pattern, do you know why it's doing that? Do you know if there's a better alternative? Do you know how a decorator differs from a singleton, or why either is appropriate in a given context?
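For anyone who hasn't met these patterns before, here's a toy contrast (my own illustration, not from any particular codebase): a decorator wraps an object to add behaviour without changing its interface, while a singleton guarantees exactly one shared instance. They solve unrelated problems.

```javascript
// Decorator: wrap a logger so every message gains a timestamp,
// without modifying the original logger.
const baseLogger = { log: (msg) => msg };
const withTimestamp = (logger) => ({
  log: (msg) => logger.log(`[${new Date().toISOString()}] ${msg}`),
});

// Singleton: every caller gets the same shared config object.
const getConfig = (() => {
  let instance;
  return () => (instance ??= { env: 'dev' });
})();
```

Knowing *which* of these a problem calls for, and why, is the understanding an LLM can't hand you.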
Most importantly, does knowing how to query an LLM to get compilable code make you a developer?
LLMs are amazing, but only if you already know what good code looks like. Used wisely, they're a force multiplier. Used blindly, they're a liability.
Developers often reach a flow state where they can crank out code because they understand the system, the dependencies, and the data structures. They can see the shape of the solution before they type the first line; that's what separates vibe coders from trained developers.
Don't confuse that with vibe coding.
Knowing how things fit together is engineering. Getting an LLM to spit out something that compiles? That's autocomplete with delusions of grandeur. Beware, Clippy has grown up, gained access to GitHub, and thinks it's a senior developer.
P.S. I did use an LLM to create the cover image, to grammar-check my atrocious grammar, and tighten up a paragraph or two.