The "This is Fine" DevOps Stack: Why Are We Lying to Ourselves?
I’ve been staring at the data from our 2026 State of DevOps Modernization Report for days now, and I’m genuinely flabbergasted.
We’ve reached a point of collective cognitive dissonance in the industry that should probably have its own entry in a medical journal: Velocity Derangement Syndrome.
The Math Isn't Mathing
Here is the breakdown of the group using AI coding tools "very frequently" (the power users, the cutting-edge cohort):
What’s going great is speed: 61% of this cohort have a commit-to-production lead time of a day or less. That’s better than those using assistants “occasionally,” of whom 49% report a lead time of a day or less. But when we dive in, we’ll see that all of the quality metrics are “on fire” even as velocity looks good. We need to face it: just because we can deliver quickly doesn’t mean our DevOps stack is high quality. The bar is higher than that.
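For readers who want to check their own numbers: “lead time” here is simply the elapsed time from commit to production deploy. Here's a minimal sketch of how a team might compute its own “day or less” share from deploy records (the function name and sample timestamps are hypothetical, for illustration only):

```python
from datetime import datetime, timedelta

def day_or_less_share(changes):
    """Fraction of changes whose commit-to-deploy lead time is one day or less.

    `changes` is a list of (commit_time, deploy_time) datetime pairs.
    """
    if not changes:
        return 0.0
    within_a_day = sum(
        1 for commit, deploy in changes
        if deploy - commit <= timedelta(days=1)
    )
    return within_a_day / len(changes)

# Hypothetical sample: three changes, two deployed within a day of commit.
sample = [
    (datetime(2026, 1, 5, 9, 0), datetime(2026, 1, 5, 17, 0)),  # ~8 hours
    (datetime(2026, 1, 6, 9, 0), datetime(2026, 1, 9, 12, 0)),  # ~3 days
    (datetime(2026, 1, 7, 9, 0), datetime(2026, 1, 8, 8, 0)),   # ~23 hours
]
print(day_or_less_share(sample))  # 2 of 3 changes → ~0.67
```

The survey figures above are self-reported, of course; a calculation like this against real deploy logs is how you'd ground-truth them for your own team.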
Things are Truly Broken
Everywhere we looked, the quality and quality-of-life metrics were worse for very frequent users of AI coding assistants (those using them multiple times a day) than for “occasional” users (a few times a week). At this point there aren’t enough engineers using the tools less than that for a representative comparison.
[Chart: Frequent AI Coding Users vs Occasional]
For the questions asking whether something had become “more problematic” or less, there was typically a sizeable minority reporting fewer problems, but the high-speed, heavy-AI-coding cohort chose “more problematic” significantly more often than “less.”
Some very high performing teams may have solved how to use these tools optimally, but most are struggling.
Pride, Craft, and Pressure
I get why the ratings are high. This cohort is likely the most advanced. They’ve put in the sweat equity. They’ve built complex, automated systems, and there is a massive amount of pride in that craft. Deploying multiple times a day is an accomplishment.
Have we collectively decided that "fast" is a synonym for "good"?
This is an extraordinarily dangerous place to be as an industry. Engineering teams are likely caught between execs who know how “easy” it is to vibe code something and expect rapid turnarounds, and the constant threat that the next outage will bring orders to slow down and be more diligent.
In fact, the pressure to ship quickly was widely reported as a contributing factor in engineer burnout, but among the very frequent AI coding cohort it was rated as contributing “significantly” (our highest level) 40% of the time, versus just 25% of the time for occasional users. AI coding assistants are likely both a symptom and a cause.
Speed is an Input, Not the Outcome
The reality is that AI-assisted coding is a structural acceleration layered on top of toolchains that were already held together by human heroics.
Today, the best engineers are guiding multiple semi-autonomous agent developers and generating significant change. But we’re still using the same manual quality gates and fragmented pipelines.
We’ve successfully automated the code-writing process, and we’re improving our documentation and reviews. But the evidence shows that most teams have neglected the part where we ensure code doesn't break the business.
Bottom line: We need to stop patting ourselves on the back for velocity and start looking at the wreckage it’s leaving behind. A system that burns out its people isn't a "good" system; it’s just a fast way to reach a breaking point.
Simply put: If it isn't sustainable, it isn't DevOps.
As we build towards a modern AI SDLC, we'll need to avoid the failures we saw in some early adopters of agile and DevOps, where speed came at the cost of quality. All of these practices have deep quality roots in Lean. When we're grounded there, the permission to move at lightning speed will come, and more importantly it will stay.
What do you think? Are we so addicted to the "vibe" of being fast that we've forgotten what stability actually looks like?