A
AI Fatigue
The state of mental, emotional, and often professional exhaustion that results from sustained, high-pressure interaction with AI tools — particularly when those tools change faster than a person can adapt, when their outputs require constant vigilance to verify, or when they've fundamentally changed the nature of work in ways that feel disorienting or hollow.
AI fatigue is distinct from regular burnout in that it often comes paired with a specific dissonance: the AI is supposed to make things easier, and yet somehow things feel harder. The mismatch between expectation and reality amplifies the exhaustion.
See also: Prompt Fatigue, Automation Anxiety, Skill Atrophy
Automation Anxiety
A persistent, low-grade worry that AI tools are eroding your professional value or will eventually render your skills obsolete. Unlike fear of job loss (which is concrete and situational), automation anxiety is chronic — it hums in the background of daily work, creating hypervigilance and a constant sense of inadequacy even when your actual output is strong.
Automation anxiety often leads engineers to use more AI — trying to "keep up" — which ironically deepens the anxiety cycle. The antidote is not using less AI per se, but reconnecting with your own intellectual agency.
See also: Epistemic Abdication, Skill Atrophy
Automation Bias
The tendency to over-rely on automated or AI-generated outputs, accepting them uncritically even when errors are present or a human judgment call is warranted. Automation bias isn't laziness — it's a deeply human response to cognitive load. When we're tired, overwhelmed, or under pressure, trusting the machine feels safer than trusting ourselves.
In code review contexts, automation bias looks like approving AI-generated code faster than human-written code — even when the AI code is subtly wrong. Aviation researchers identified this problem decades before LLMs existed; it applies more than ever now.
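A minimal sketch of what "subtly wrong" can look like in practice (hypothetical code, invented for illustration rather than drawn from any real review):

    # Hypothetical AI-generated helper. It reads as idiomatic Python and
    # tends to get approved quickly, which is exactly the problem.
    def deduplicate(events: list[str]) -> list[str]:
        """Remove duplicate events, preserving their original order."""
        return list(set(events))  # Bug: set() discards ordering, quietly
                                  # contradicting the docstring's promise.

    # What a reviewer engaging critically would reach for instead:
    def deduplicate_ordered(events: list[str]) -> list[str]:
        """Remove duplicate events while actually preserving order."""
        return list(dict.fromkeys(events))  # dicts keep insertion order

The bug itself isn't hard; the danger is that polished-looking output lowers the reviewer's guard.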
See also: Epistemic Abdication, Cognitive Offloading
B
Brownout
A state of disengagement that's more subtle than full burnout. In a brownout, you're still showing up and technically functioning — writing code, attending meetings, answering messages — but your internal light is dimmed. Motivation is low, curiosity feels far away, and work that once energized you now feels like going through the motions.
Engineers in brownout are often the last to notice it happening to them. From the outside, they look "fine." Internally, they've started to wonder if they even care about software anymore. They do — they're just depleted.
See also: Disengagement Spiral, AI Fatigue
C
Cognitive Offloading
The practice of using external tools — paper, devices, AI — to reduce the mental load your brain has to carry. Healthy cognitive offloading frees up mental bandwidth for higher-order thinking. Problematic cognitive offloading happens when you begin offloading the thinking itself, not just the storage or retrieval.
Writing a shopping list is cognitive offloading — and it's great. Asking an AI to decide what you should think about a problem, then accepting its answer without engaging with it — that's where it gets complicated. The key question: are you offloading information, or judgment?
See also: Epistemic Abdication, Deep Work
Context Collapse
When AI tools flatten the rich, tacit context of software development — the team history, the architectural reasoning, the unwritten conventions — into a stripped-down input that a model can process. The model may produce syntactically correct code that is semantically wrong for your codebase, and the engineer must rebuild the context the tool discarded.
Context collapse is particularly exhausting because it's invisible to non-technical stakeholders. "But the code works, right?" Yes. But the work of re-grounding it in the actual context was invisible labor that the tool generated, not eliminated.
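A minimal sketch of the pattern, assuming a hypothetical team whose unwritten convention is soft deletion; every name and data shape here is invented for illustration:

    import datetime

    # Hypothetical in-memory store. The team's unwritten rule: records are
    # never removed, only stamped with deleted_at (audit trails and
    # "undelete" support both depend on it).
    users = {42: {"name": "Ada", "deleted_at": None}}

    # What a model that never saw that tacit convention plausibly produces:
    def remove_user(user_id: int) -> None:
        del users[user_id]  # Syntactically correct, semantically wrong here:
                            # a hard delete that silently breaks auditing.

    # What the codebase actually expects:
    def remove_user_soft(user_id: int) -> None:
        users[user_id]["deleted_at"] = datetime.datetime.now(datetime.timezone.utc)

Nothing in the diff marks the first version as wrong; only the context the tool never received does.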
Cognitive Load
The total amount of mental effort being used in working memory at any given time. Engineers have a finite cognitive budget; when it's exceeded, errors increase, decision quality drops, and exhaustion accelerates. AI tools were supposed to reduce cognitive load — and sometimes they do. But managing AI outputs, verifying hallucinations, and context-switching between AI interfaces adds its own kind of load.
Psychologist John Sweller identified three types: intrinsic (the complexity of the task itself), extraneous (how the task is presented), and germane (the effort that builds lasting understanding). Too much AI use can reduce germane load — the kind that actually makes you smarter over time.
See also: Deep Work, Cognitive Offloading
Compulsive Prompting
The habitual, near-reflexive act of opening an AI tool and typing a question the moment any uncertainty or difficulty arises — even when the difficulty is the kind that produces growth. Compulsive prompting bypasses the discomfort of not knowing, which is precisely where deep understanding is formed.
The distinction between useful prompting and compulsive prompting is often a feeling: useful prompting feels like using a tool; compulsive prompting feels like reaching for a cigarette. One is deliberate; the other is avoidance.
See also: Prompt Dependence, Productive Struggle
D
Deep Work
Cognitively demanding, distraction-free work performed in states of full concentration, pushing your skills to their limit and producing real value that's hard to replicate. Coined by computer scientist and author Cal Newport. The opposite of shallow work — which includes most AI-assisted, copy-paste, and review-and-merge tasks.
Deep work is not anti-AI. It's pro-thinking. There's a version of AI use that enables deeper focus (handling boilerplate, searching docs) and a version that crowds it out. The goal is to be intentional about which version you're doing.
See also: Flow State, Shallow Work
Disengagement Spiral
A self-reinforcing cycle where decreased motivation leads to increased AI reliance, which leads to reduced skill engagement, which leads to even less intrinsic motivation. Each turn of the spiral makes it harder to break out of — not because engineers are weak, but because the reinforcement loop is structurally strong.
Recognition is the first escape. Naming "I'm in a spiral" gives you distance from it. The practical interruption is often a small, deliberate project done entirely without AI assistance — something concrete enough to remind you what your own thinking feels like.
See also: Brownout, Skill Atrophy
E
Epistemic Abdication
The act of surrendering your own reasoning and judgment to an AI system — not just outsourcing a task, but outsourcing the act of thinking itself. Epistemic abdication is when you stop asking "what do I think?" and only ask "what does the AI say?" It's the difference between using a calculator and ceding your ability to do arithmetic.
Philosophers use "epistemic" to mean "related to knowledge and belief." Abdication means giving up a responsibility you held. Together: you've handed over the throne of your own mind. This isn't an accusation — it's a pattern that can creep in invisibly, and naming it is not blame but awareness.
See also: Automation Bias, Cognitive Offloading, Prompt Dependence
F
Flow State
A state of total absorption in a challenging task, where time distorts, self-consciousness dissolves, and effort feels effortless. Identified by psychologist Mihaly Csikszentmihalyi. Flow requires a specific balance: the challenge must be real but not crushing; the skills must be stretched but not overwhelmed.
Many engineers who use heavy AI assistance report losing their flow. Not because flow is impossible with AI, but because the conditions for it — genuine challenge, skill engagement, sustained focus — are frequently disrupted by AI-mediated workflows. Frequent context-switching, prompting, and output review break the sustained attention flow requires.
See also: Deep Work, Productive Struggle
G
Ghost Authorship
A state where an engineer is nominally the author of a codebase but cannot fully explain, defend, or own the decisions within it — because the code was substantially generated by AI without deep engagement. The engineer's name is on the PR, but the reasoning is the model's.
Ghost authorship creates a quiet professional anxiety: the fear of being "found out." In code review, in production incidents, in architecture discussions — there's a background dread that someone will ask a question you can't answer about your own code. This anxiety is a real and underreported toll of heavy AI code generation.
See also: Skill Atrophy, Ownership Anxiety
M
Muscle Memory Erosion
The loss of procedural fluency in programming tasks — the ability to write certain code, use certain APIs, or navigate certain environments fluidly, without thinking. Like a musician who stops practicing scales, engineers who consistently delegate coding tasks to AI can find that their execution has become hesitant where it was once fluid.
This isn't a judgment about using AI. It's a neural reality: skills that aren't practiced atrophy. The practical question is: which skills do you want to keep sharp? The answer is personal and strategic — but it requires being deliberate rather than passive.
See also: Skill Atrophy, Deep Work
O
Ownership Anxiety
The discomfort or low-grade dread that arises when you're responsible for code, systems, or decisions you don't fully understand — often because AI generated them. Ownership anxiety is particularly acute during incidents, reviews, and high-stakes moments. It erodes confidence over time, even in otherwise capable engineers.
The engineering culture norm of "own your code" collides hard with AI-generated work. Nobody told you it was okay to ship code you couldn't explain. And yet here you are. Ownership anxiety is the gap between that norm and your reality — and it deserves to be named, not just felt in the dark.
See also: Ghost Authorship, AI Fatigue
P
Productivity Theater
The performance of busyness and output metrics that look like productivity but don't correspond to meaningful work or genuine progress. In an AI-accelerated environment, productivity theater reaches new heights: you can generate enormous volumes of code, documentation, and tickets that give the appearance of velocity while obscuring the absence of deep thinking.
Teams caught in productivity theater often feel simultaneously exhausted and behind. The throughput numbers are high; the feeling of actually building something real is absent. This gap is one of the more demoralizing and underdiagnosed features of AI-augmented software teams.
See also: Shallow Work, Brownout
Prompt Dependence
A pattern where an engineer is no longer able to start or complete a task without first consulting an AI for direction, framing, or validation. The ability to self-initiate — to sit with a problem and begin working through it independently — has atrophied. Prompt dependence is not about frequency of AI use, but about the loss of agency that precedes it.
A useful diagnostic: open a new file or blank terminal and notice your first impulse. If the impulse is "I should ask the AI what to do first," that's worth paying attention to. The prompt was always supposed to be yours.
See also: Compulsive Prompting, Epistemic Abdication
Prompt Fatigue
The specific exhaustion that comes from the overhead of working with AI through natural language prompts — constantly translating your intent into inputs the model will understand, evaluating outputs, re-prompting when results miss the mark, managing context windows, and fact-checking hallucinations. Prompt fatigue is cognitively expensive and often invisible.
Prompt fatigue is different from AI fatigue broadly. You can have prompt fatigue with a specific model or workflow while still feeling okay about AI tools in general. It's often a signal to change tools, change workflows, or just take a break from prompting and do something with your hands.
See also: AI Fatigue, Cognitive Load
Productive Struggle
The effortful, uncomfortable, but growth-generating process of working through a problem you don't immediately know how to solve. Cognitive science consistently shows that knowledge constructed through struggle is retained longer and transfers better than knowledge received passively. AI tools that resolve struggle too quickly can inadvertently eliminate the most valuable part of the learning process.
Productive struggle is not the same as being stuck. Stuck means you're blocked with no path forward. Struggle means you're working hard at something at the edge of your ability — and that edge is exactly where growth lives. Protect time for it.
See also: Flow State, Deep Work
R
Role Displacement Anxiety
The fear — often unspoken — that your professional role is being shifted or hollowed out by AI tools, even if you still have the same title and salary. Role displacement anxiety is about identity, not economics: the sense that "software engineer" is no longer the thing you trained to be, and you're not sure what it is instead.
This is one of the least-discussed and most real dimensions of AI fatigue. Engineers aren't just tired — they're disoriented. The craft they signed up for has changed, and nobody asked them if they were okay with that. Acknowledging the grief of that change is valid and necessary, not dramatic.
See also: Automation Anxiety, AI Fatigue
S
Shallow Work
Non-cognitively demanding tasks performed while distracted — meetings, email, copy-paste coding, reviewing AI output. Cal Newport's term. Not inherently bad; necessary, even. But when a workday consists mostly of shallow work, the deep skills go unused, the craft atrophies, and the sense of meaningful contribution fades.
AI tools can both create and eliminate shallow work. They eliminate shallow work by automating it. They create shallow work by producing output that requires review without requiring you to think. The goal is to be deliberate about which kind you're doing and when.
See also: Deep Work, Productivity Theater
Skill Atrophy
The gradual erosion of a technical skill through disuse. Skills that aren't practiced don't stay static — they fade. Skill atrophy in AI-heavy engineering environments is particularly insidious because the atrophy is invisible until a situation arises that requires the skill, and then the gap is stark and humiliating.
Common engineering skills reported as atrophying: debugging from first principles, writing SQL queries, understanding compiler errors, regex fluency, reading unfamiliar codebases, and the ability to hold a complex system in working memory without tools. The skills that atrophy first are usually the ones you delegated first.
See also: Muscle Memory Erosion, Ghost Authorship
Status Quo Trap
The organizational pressure to continue using AI tools at scale — or to use them more — regardless of their actual effect on team wellbeing, code quality, or individual growth. When a team or company has bet its productivity narrative on AI, admitting that the tools are causing problems feels politically dangerous, and so the problems go unspoken and unaddressed.
Engineers who raise concerns about AI overuse often encounter the status quo trap directly: "But everyone else is using it." "We can't slow down." "This is just how things are now." These responses are not wrong, exactly — they're just incomplete. They describe what is, not what is optimal.
T
Tool Fatigue
Exhaustion from the relentless pace of new AI tools entering the ecosystem — each requiring evaluation, onboarding, workflow adjustment, and the cognitive overhead of deciding whether to adopt. Tool fatigue is partly about the number of tools and partly about the emotional labor of keeping up with a landscape that never stabilizes long enough for habits to form.
Many software engineering teams in 2025 have been asked to evaluate more new tools in 12 months than they saw in the previous five years combined. Tool fatigue is not weakness — it's a rational response to an extraordinary pace of change. The permission to not evaluate every new thing is real permission, not laziness.
See also: AI Fatigue, Prompt Fatigue
W
Willingness Gap
The distance between what an engineer is willing to express about their AI fatigue and what they're actually experiencing. The willingness gap is large in most engineering cultures because the dominant narrative is that AI tools are an unambiguous productivity good, and expressing doubt or exhaustion about them reads as resistance to progress.
The willingness gap is why AI fatigue spreads silently across teams. Every engineer thinks they're the only one who feels this way. They are not. The willingness gap is why communities like this one exist — to give people a place to say what they can't say at standup.
See also: Status Quo Trap, AI Fatigue