Issue #35
Subject: The Context Shell — you know it, but you can't get to it

The Context Shell

You know more about your codebase than you think. The problem is you can no longer access it without AI holding your hand.

The phenomenon

Here is something that keeps showing up in the engineer stories we collect:

An engineer is debugging something. They know this system. They have been in this code for months, maybe years. They can feel that they understand it, the same way you know your childhood neighborhood: you can navigate it without thinking.

And then: something breaks. Not catastrophically. Just enough to need real attention. And instead of reaching for the debugger, they reach for AI. Not because AI is better. Because the context feels out of reach — like it is behind a glass wall. They can see what should be there. They cannot quite get to it.

This is what we are calling the context shell.

The knowledge is not gone. It is inaccessible without external scaffolding to retrieve it.

What the research says

Psychologists call this the cue-dependent retrieval problem. Learning that is encoded with certain contextual cues — your own reasoning process, the errors you encountered, the trade-offs you debated — is harder to retrieve when those cues are absent. AI removes many of the cues your brain used to navigate to the knowledge.

You did not lose the knowledge. You lost the retrieval path. These are different problems with different solutions.

A concrete example

You wrote an authentication module two years ago. You agonized over it. Session handling, edge cases, the trade-offs between JWT and session cookies, the 3am production incident that forced a redesign of the token refresh logic.

You remember this module. You could describe its high-level behavior. But if something breaks in it today, your first instinct is to paste the error into a chat window.

Not because you cannot figure it out. Because the path to the knowledge — the months of lived context that used to make it navigable — has become opaque. AI is faster at retrieving the facts. But AI does not have the context. It has the code. It does not have the history.

Two paths

With AI: Paste error. Receive suggested fix. Apply fix. Ship. Context: none needed. Understanding: zero. Time: 90 seconds.

Without AI (rebuilding the context shell): Sit with the error for 15 minutes. Ask: where have I seen something like this before? Which module owns this behavior? What would I expect to find if I traced the request? Then: open the file, follow the thread, find the failure point. Context: rebuilt. Understanding: genuine. Time: 45 minutes.

The 45-minute path is the one that keeps you sharp. The 90-second path is the one that ships the feature.

Why this is different from forgetting

Forgetting looks like this: you cannot remember how the authentication module works at all. You read it and it looks foreign.

The context shell looks different: you know this code. You wrote it. But the feeling of access — the felt sense of knowing where to look, what to expect, how it hangs together — is gone. It is like knowing a song by heart but not being able to sing it without the music playing.

This distinction matters because the solution is different. Forgetting calls for relearning the material. The context shell calls for rebuilding the retrieval path to knowledge you still have.

The second is faster and more reversible than the first. But it requires a different kind of work than just reading more documentation.

What rebuilds the retrieval path

Write about code, not just in it. The act of explaining your architectural decisions — in a design doc, a walkthrough recording, even a Slack message to a colleague — re-encodes the context with different cues. The explanation requirement is part of this: forcing yourself to say why, not just what.

No-AI debugging sprints. The next time something breaks in a system you built, give yourself 20 minutes before you reach for AI. Not to prove a point. To rebuild the retrieval path. The struggle is the point.

Read your old PRs with fresh eyes. Not to review them. To remember what it felt like to make the decisions encoded in them. The diff tells you what changed. The PR description (if you wrote one) tells you why. Together, they reconstruct the context shell around the code.

Teach it to someone else. The preparation for teaching is a forced reconstruction of context. You cannot explain a system you navigate purely through surface features. The gaps in your explanation are the gaps in your retrieval path.

"I spent three years owning our data pipeline. I could tell you every trade-off, every incident, every reason we made each decision. Eighteen months of AI-assisted work later, I opened that module and it looked like someone else's code. Not because I forgot. Because I couldn't find the path back to the context that used to make it mine." — Senior engineer, 9 years in the same codebase

The longer view

The context shell is not a character flaw. It is a side effect of how AI tooling interacts with the way human expertise works.

Expertise is not just knowing facts. It is having efficient retrieval paths to knowledge you have built up over time — patterns you recognize instantly, failure modes you sense before they happen, an intuition for where to look that comes from hundreds of hours inside a system.

When AI replaces the retrieval work, the retrieval paths atrophy even when the knowledge underneath is still there. The path closes. The knowledge remains. Access becomes the problem.

The engineers who maintain genuine expertise over the next five years will be the ones who actively maintain their retrieval paths — not just their knowledge. The difference is subtle and important.

Your codebase is still yours. You just have to rebuild the door.


Continue reading on The Clearing

Go deeper on the mechanics of expertise and how AI tooling quietly reorganizes it:

Take the AI Fatigue Quiz

Recovery Guide