You shipped more features this quarter than last. Your cognitive load is also at burnout level. These two facts are connected. Here's the uncomfortable math.
Your quarterly metrics look great. More stories closed. Fewer cycles per ticket. AI-assisted coding has genuinely accelerated your team.
But you can't see what happened to the engineers who shipped those features.
They used AI to compress the work. To skip the debugging. To skip the failure modes. To skip the part where you genuinely don't know how something works yet, and you sit with that discomfort until it resolves.
AI tools removed the discomfort. They also removed the learning.
So yes — velocity went up. And so did the surface area of things your team understands less deeply.
When you ship a feature using AI, the hours you save are real. So is what you didn't buy with them: the hours of struggle that would have built understanding. That's not a feeling. That's cognitive science: the testing effect, the generation effect, desirable difficulties. Your brain learns by encountering resistance, not by bypassing it.
The industry hasn't figured out how to account for this in velocity metrics. It probably never will.
Once you've shipped a few features without deeply understanding the underlying system, the surface area of your ignorance grows. You can feel it — you start avoiding certain parts of the codebase. You say "I know roughly how that works" more often. You reach for AI earlier, because the gaps are bigger.
Each AI-assisted sprint doesn't just fail to build understanding — it actively widens the gap.
This is the velocity trap: the better your tools get at producing working code, the more the working code becomes a surface you can't see beneath.
Junior engineers who joined in the last 18 months don't know any different. Senior engineers who remember building without AI are watching their craft erode in real-time and aren't sure how to name it.
You know this feeling: the quiet avoidance of certain parts of the codebase, the hedged "I know roughly how that works," the reach for AI that comes a little earlier each time.
These aren't character flaws. They're symptoms of a system that optimized for one thing (velocity) at the cost of another (understanding).
If you've been doing this for 8+ years, the velocity trap hits different. You remember what it felt like when you genuinely understood a system end-to-end. You could trace a bug to its root, propose architectural changes from first principles, mentor junior engineers by explaining not just what to do but why. That understanding took years to build. You're watching it erode faster than you can maintain it — and nobody in your performance review is measuring what you're losing.
This isn't a "give up AI" newsletter. It's a "use AI on purpose" newsletter.
The explanation requirement: Before you ship AI-generated code, write one sentence explaining why it works that way. Not what it does — why that approach was chosen. If you can't, you don't understand it yet. Read it again. Ask AI to explain the alternatives it considered and rejected. Then write your one sentence.
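If you want to make the explanation requirement mechanical rather than aspirational, one option is a commit-message check that refuses commits without a real "Why:" sentence. A minimal sketch — the `Why:` trailer and the word threshold are assumptions for illustration, not an established convention:

```python
# Hypothetical check for the explanation requirement: a commit message
# must contain a "Why:" line with an actual sentence, not just
# "Why: it works". Trailer name and threshold are illustrative.

def has_why_sentence(commit_msg: str, min_words: int = 6) -> bool:
    """Return True if some line is a 'Why:' trailer carrying a
    substantive explanation (at least min_words words)."""
    for line in commit_msg.splitlines():
        if line.lower().startswith("why:"):
            explanation = line[len("why:"):].strip()
            if len(explanation.split()) >= min_words:
                return True
    return False
```

Wired into a `commit-msg` git hook, this would reject `"Add caching"` and `"Why: it works"` but accept `"Why: LRU fits because keys follow a power-law access pattern"` — which is exactly the one sentence the practice asks for.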
Weekly no-AI sessions: One feature per week, built without AI assistance. Not as a test of willpower. As calibration — to keep your muscle memory alive and to remember what it feels like to be genuinely uncertain and then work through it.
The 20-minute rule: Before reaching for AI to debug, sit with the problem for 20 minutes. Write down what you know, what you don't know, where you'd start. Then use AI — but now you're evaluating its output, not receiving it passively.
Name the cost: When you reach for AI earlier than you used to, notice that movement. Name it. "I'm reaching for AI because I'm not confident I can figure this out myself." That's data. That's the velocity trap catching you.
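The 20-minute rule above can even be enforced by a small tool. A sketch of one possible shape — the `DebugGate` name and its API are hypothetical, not a real package: it refuses to "unlock" AI until the delay has passed and you've written down what you know, what you don't, and where you'd start.

```python
import time

AI_DELAY_SECONDS = 20 * 60  # the 20-minute rule

class DebugGate:
    """Hypothetical helper for the 20-minute rule: sit with the bug,
    take notes, and only then is asking AI 'allowed'."""

    def __init__(self, delay=AI_DELAY_SECONDS, clock=time.monotonic):
        self.delay = delay
        self.clock = clock  # injectable clock, so the gate is testable
        self.started_at = None
        self.notes = {"know": [], "dont_know": [], "first_step": None}

    def start(self):
        """Begin the sit-with-it period for a new bug."""
        self.started_at = self.clock()

    def note(self, kind, text):
        """Record a 'know' or 'dont_know' observation, or the first step."""
        if kind == "first_step":
            self.notes["first_step"] = text
        else:
            self.notes[kind].append(text)

    def may_ask_ai(self):
        """True only after the delay has passed AND all three kinds of
        notes exist -- so you arrive at the AI evaluating its output,
        not receiving it passively."""
        if self.started_at is None:
            return False
        elapsed = self.clock() - self.started_at
        has_notes = (self.notes["know"] and self.notes["dont_know"]
                     and self.notes["first_step"])
        return elapsed >= self.delay and bool(has_notes)
```

The injectable clock is the point of the design: the rule is about elapsed thinking time, not wall-clock ceremony, and making the clock a parameter keeps the gate honest and testable.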
AI coding tools are genuinely useful. They're not going away. The question isn't whether to use them — it's how to use them without surrendering the slow, difficult, essential parts of being a software engineer.
The engineers who thrive in the next 5 years won't be the ones who use AI the most. They'll be the ones who use it the most deliberately — who understand what AI provides and what it costs, who maintain the craft underneath the automation.
That's the long game. And it's not about productivity theater. It's about not eroding into a state where you can ship features but can't explain them, can't debug them, and can't design the next thing without AI generating it first.
Your brain is not a GPU. It doesn't learn by receiving processed output. It learns by processing.
If this resonated, these pages go deeper:
"The part of programming that's worth doing is the part that stretches you. If AI does the stretch for you, you've skipped the thing that made you a programmer."