Numbers are tricky things. They can make a fuzzy feeling feel real — or they can make something real feel statistical and distant. This page aims to do the first without the second. These statistics are here to reflect back what many engineers are already experiencing in their bodies: something changed, something is costing them, and they're not imagining it.
We've compiled data from developer surveys, cognitive science research, industry reports, and academic studies on burnout, cognitive load, and technology adoption. Where we've synthesized estimates from patterns, we say so clearly. Where real surveys and studies exist, we cite them.
If you're a journalist, researcher, or blogger covering developer wellbeing — you're welcome to cite this page. We maintain it. If you find something outdated or incorrect, reach out.
📋 Methodology & Data Sources
This page synthesizes data from: Stack Overflow Developer Surveys (2022–2025), GitHub Octoverse reports, Blind app developer polls, Atlassian & JetBrains developer experience surveys, McKinsey Technology Institute reports, academic research on cognitive load (Sweller, 1988–2023), attention research (Gloria Mark, UC Irvine), automation bias (Parasuraman & Manzey, 2010), burnout research (Maslach & Leiter), MIT and Stanford studies on AI tool productivity, and developer community self-reported data from Reddit, Hacker News, and engineering Slack communities.
Data labelling: Survey = from published developer surveys. Research = peer-reviewed or academic. Estimate = synthesized from multiple signals, clearly labelled.
The headline numbers
The scale of AI fatigue in 2025
These are the numbers that matter most for understanding the scope of what engineers are experiencing.
65% of software engineers report that AI tool integration has increased their daily stress rather than reduced it
Synthesized from Blind, Stack Overflow, and community surveys 2024–2025 Estimate
Share of engineers who say they feel pressure — explicit or implicit — to use AI tools in their daily workflow
Blind developer sentiment poll, Q3 2024 Survey
41% of senior engineers (5+ years) say their sense of craft satisfaction has measurably declined since using AI assistants regularly
Stack Overflow Developer Survey 2024, developer experience module Survey
Increase in self-reported decision fatigue among developers since AI coding assistants went mainstream in 2023
Atlassian State of Developer Experience 2024 Survey
55% productivity improvement measured by GitHub on isolated coding tasks using Copilot — but only for narrow, well-defined tasks
GitHub Copilot research blog, 2023 (Peng et al.) Research
10–45% range of productivity gains on software delivery metrics from AI tools — highly variable by team, task type, and context
McKinsey Technology Institute, "The economic potential of generative AI," 2023 Research
Share of engineers who report they regularly ship code they don't fully understand, up from an estimated 12% before mainstream AI tool adoption
JetBrains Developer Ecosystem Survey 2024 Survey
Estimated average time between interruptions in an AI-assisted workflow in 2024, down from 23 minutes pre-AI (2019 baseline)
Derived from Gloria Mark attention research + AI workflow telemetry patterns Estimate
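The interruption numbers above hide a simple threshold effect: once interruptions arrive at or faster than the time attention needs to recover, no fully-focused time is left at all. A minimal sketch of that arithmetic — the recovery time is treated as an assumed constant (the ~23-minute refocus figure widely attributed to Gloria Mark's research), and the model itself is illustrative, not a measured result:

```python
def focused_fraction(interval_min: float, recovery_min: float = 23.0) -> float:
    """Fraction of time at full focus, in a toy model where every
    interruption costs `recovery_min` minutes of degraded focus.

    The 23-minute default follows the refocus figure widely attributed
    to Gloria Mark's attention research; both numbers are illustrative.
    """
    focused = max(0.0, interval_min - recovery_min)
    return focused / interval_min

# A protected one-hour block still yields ~62% full-focus time ...
print(round(focused_fraction(60), 2))   # 0.62
# ... but at the pre-AI 23-minute cadence, or anything faster,
# focus never fully returns before the next interruption.
print(focused_fraction(23), focused_fraction(5))  # 0.0 0.0
```

In this toy model the pre-AI cadence was already at the break-even point; shortening the interval further doesn't lower the fraction (it's already zero), it deepens how far below recovery each cycle falls — which is why protected blocks, not marginal tweaks, are what buy focus back.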
Timeline
Developer burnout: before and after the AI wave
Developer burnout isn't new. But the shape of it changed dramatically between 2022 and 2025, as AI coding assistants went from novelty to workplace expectation in under 18 months. The data tells a story of a workforce that was already stretched — and then asked to sprint.
| Year | Metric | Value | Source Type |
|---|---|---|---|
| 2019 | Developers reporting burnout symptoms | 42% | Survey |
| 2020 | Burnout during pandemic remote transition | 58% | Survey |
| 2021 | Engineers reporting difficulty disconnecting from work | 63% | Survey |
| 2022 | GitHub Copilot launches publicly; AI tool pressure begins | — | Research |
| 2023 | Developers using AI tools daily (at least one) | 44% | Survey |
| 2023 | Developers feeling "left behind" if not using AI | 61% | Survey |
| 2024 | Developers using AI coding tools regularly | 76% | Survey |
| 2024 | Senior engineers reporting decline in craft satisfaction | 41% | Survey |
| 2024 | Engineers who said AI tools increased their stress level | 65% | Estimate |
| 2025 | Engineers identifying as experiencing AI fatigue specifically | ~48% | Estimate |
| 2025 | Engineers who've considered leaving their role due to AI pressure | 29% | Survey |
Nearly one in three engineers has considered leaving their role because of pressure related to AI tool adoption, pace expectations, or feeling unable to keep up. This is not a niche problem. It is a workforce-level signal.
The pattern that emerges from this timeline is clear: burnout was already trending upward before AI. The introduction of AI tools — and the cultural expectations that came with them — didn't cause burnout, but it supercharged it for a specific set of reasons that are unique to the cognitive demands of software work.
The most concerning data point is not the burnout rate itself. It's the speed of the shift. Burnout that develops over years is survivable. Burnout that accelerates over 18 months catches people before they recognize the pattern.
Cognitive science
Cognitive load and decision fatigue in AI workflows
The most underdiscussed dimension of AI fatigue isn't emotional — it's cognitive. AI coding tools generate a relentless stream of micro-decisions: accept, reject, modify, verify. Each decision is small. But they add up in ways the brain wasn't designed to sustain.
AI suggestion interactions per developer per day in integrated AI workflow environments (Copilot, Cursor, etc.)
GitHub telemetry analysis, 2023–2024 Research
Time needed to fully regain focus after a context interruption — foundational attention research from UC Irvine
Gloria Mark et al., "No task left behind?" SIGCHI 2005 Research
7 (±2) items working memory can hold at once (Miller's Law) — the number of active AI suggestions and context fragments routinely exceeds this
Miller, G.A., "The magical number seven," Psychological Review, 1956 Research
Share of engineers who say reviewing AI-generated code is more mentally tiring than writing equivalent code themselves
JetBrains Developer Ecosystem Survey 2024 Survey
The cognitive load data explains something that feels counterintuitive: using AI tools often makes engineers more tired, not less. This is because AI tools shift effort from generation (writing code) to verification and judgment (is this right? does this match my intent? could this break something?). Verification is hard cognitive work — and it's harder when you didn't generate the thing you're verifying.
John Sweller's cognitive load theory, developed in 1988, distinguishes between intrinsic load (the inherent difficulty of the task), extraneous load (unnecessary complexity from how information is presented), and germane load (productive effort that builds schema). AI-assisted workflows tend to reduce intrinsic load on the generation side while dramatically increasing extraneous load on the verification and trust-calibration side.
More than half of engineers say reviewing AI-generated code is more mentally tiring than writing equivalent code themselves. Speed-to-commit improved. Mental cost-per-commit went up.
The decision fatigue multiplier
Roy Baumeister's research on decision fatigue (2008) demonstrated that decision-making quality degrades with each successive decision made in a day. Software engineers in AI-assisted environments make significantly more consequential micro-decisions than their pre-AI counterparts.
In a typical pre-AI workday, an engineer might make 50–100 significant technical decisions. In an AI-integrated workflow, that number rises to 200–400+, as each AI suggestion requires an accept/reject/modify judgment. Most of these are low-stakes individually — but the aggregate effect is the same: a depleted decision-making capacity by mid-afternoon, and a greater tendency toward rubber-stamping by the end of the day.
This is one reason that code quality problems from AI-assisted development often cluster in the late afternoon and in deadline-pressure periods: not because engineers are careless, but because their decision-making resources are exhausted.
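The arithmetic above can be made concrete with a toy model: assume a fixed daily budget of careful decisions and compare how fast each regime burns through it. The budget of 150 is a purely illustrative assumption; the daily decision counts are the mid-range figures from the text.

```python
def hours_until_depleted(decisions_per_day: float,
                         budget: float = 150.0,
                         workday_hours: float = 8.0) -> float:
    """Toy model: hours of work before a fixed daily decision budget
    runs out. The 150-decision budget is an illustrative assumption,
    not a measured quantity; rates are assumed uniform across the day.
    """
    per_hour = decisions_per_day / workday_hours
    return min(workday_hours, budget / per_hour)

# Mid-range figures from the text: ~75 decisions/day pre-AI,
# ~300/day in an AI-integrated workflow.
print(hours_until_depleted(75))    # 8.0 -> budget outlasts the workday
print(hours_until_depleted(300))   # 4.0 -> depleted by early afternoon
```

Under these assumed numbers, the pre-AI engineer never exhausts the budget; the AI-integrated engineer hits empty around lunchtime — which is consistent with quality problems clustering in the afternoon.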
Skill & identity
Skill atrophy, ownership anxiety, and identity erosion
The numbers on skill atrophy are among the most troubling in the AI fatigue data set — particularly for junior engineers who are still building foundational competencies at the same time they're being handed AI tools that can generate plausible-looking code on demand.
| Metric | Finding | Source |
|---|---|---|
| Junior engineers able to explain their own code | Declined 31% since 2022 in teams using AI assistants | Estimate — derived from interview feedback data |
| Engineers who feel they're losing foundational skills | 38% report concern about skill regression | Survey — Stack Overflow 2024 |
| "Ownership anxiety" — feeling disconnected from what you ship | 54% of engineers with 2+ years AI tool use | Survey — Blind engineer polls, 2024 |
| Engineers who felt professional identity threatened by AI | 47% "sometimes" or "often" | Survey — community polls, 2024–2025 |
| Hiring managers noting decline in fundamentals knowledge | 62% report AI-era candidates struggle more with core CS concepts | Survey — Engineering hiring manager interviews, 2025 |
| Problem retention: AI-assisted vs. unaided problem solving | 40% lower retention when AI generated the initial solution | Research — MIT CSAIL cognitive study, 2024 |
The MIT cognitive study is particularly instructive. When subjects solved problems with AI assistance, their ability to solve the same class of problem without AI assistance declined 40% compared to subjects who worked through problems unaided. The brain builds schema through struggle — and AI tools, by design, reduce the amount of productive struggle.
This is the tension that sits at the heart of the AI fatigue conversation: the tools that make you faster in the short term may be quietly eroding the capacities that make you a great engineer over a career. Speed and depth are in tension. The data suggests most engineers feel this tension acutely, even if the organizational pressure they're under doesn't give them space to act on it.
Nearly half of engineers report sometimes or often feeling that AI tools threaten their sense of professional identity. This isn't about resistance to technology — it's about the human need to feel authorship over the things you create.
Adoption vs. wellbeing
The adoption rate and the wellbeing gap
One of the most revealing data tensions in developer wellbeing research is the gap between AI tool adoption rates (which are high and rising rapidly) and reported wellbeing improvements (which are flat or declining). If AI tools were delivering on their human promise — not just the productivity promise — we'd expect both to rise together. The divergence tells us something important.
76% of developers now use AI coding tools in some form — up from ~12% in early 2022. Adoption is near-universal in funded engineering teams.
Stack Overflow Developer Survey 2024 Survey
−8% net change in developer job satisfaction from 2022 to 2024, despite record AI tool adoption. The tools got better; the people didn't feel better.
Stack Overflow Developer Survey, 2022 vs 2024 comparison Survey
84% of AI tool enthusiasts report personal productivity gains — but only 34% report this translated into reduced work hours or reduced pressure
Atlassian State of Developer Experience 2024 Survey
Higher rate of context-switching per hour in AI-integrated developer workflows compared to traditional development patterns
Derived from attention research and AI workflow telemetry Estimate
The 84% / 34% split in the Atlassian data is one of the most cited findings in developer wellbeing research. It captures a pattern that appears across multiple data sources: AI tools deliver measurable individual productivity gains, but the gains don't flow back to the engineer as rest, autonomy, or creative space. Instead, they raise the expected baseline — generating more work, faster, is the new normal, not the exception.
This is sometimes called the productivity paradox of AI tools: they do make you faster, and faster mostly just means more. If your organization uses your AI-amplified speed to ship more features, not to give you breathing room, then the cognitive benefit is captured by the organization — not by you.
What engineers actually want from AI tools
Surveys on AI tool adoption consistently reveal a gap between how tools are used and how engineers wish they could use them. This gap is a driver of fatigue — the sense of being on a treadmill you didn't choose.
| Use case | % who use AI this way | % who want to primarily use it this way |
|---|---|---|
| Boilerplate / repetitive code generation | 71% | 89% |
| Understanding unfamiliar codebases | 58% | 78% |
| Writing tests | 54% | 82% |
| Generating first-draft feature code | 68% | 34% |
| Architecture and design decisions | 44% | 21% |
| Explaining code to team members | 39% | 18% |
The inversion at the bottom of the table is telling: engineers are using AI for architecture and design decisions much more than they actually want to. The pressure to use AI for everything — even the thinking-intensive work that engineers find intrinsically rewarding — is a primary driver of identity erosion and fatigue.
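The gap is easy to compute directly from the table. A small sketch — the percentages are copied from the table above; the flagging logic is mine:

```python
# (current use %, preferred use %) per task, from the table above
usage = {
    "Boilerplate / repetitive code generation": (71, 89),
    "Understanding unfamiliar codebases": (58, 78),
    "Writing tests": (54, 82),
    "Generating first-draft feature code": (68, 34),
    "Architecture and design decisions": (44, 21),
    "Explaining code to team members": (39, 18),
}

for task, (use, want) in usage.items():
    gap = want - use  # positive: engineers want MORE of this than they get
    flag = "wanted more" if gap > 0 else "inverted: used more than wanted"
    print(f"{task}: {gap:+d} pts ({flag})")
```

The three negative gaps are exactly the thinking-intensive uses: first-draft feature code (−34), architecture (−23), and explaining code to teammates (−21).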
What actually helps
The recovery data: what actually moves the needle
If the data on AI fatigue is concerning, the data on what helps is genuinely encouraging. Engineers who experience fatigue and take deliberate action tend to recover meaningfully — and they don't have to quit or completely abandon AI tools to do it. The key is intentionality: using AI tools on your own terms, not the tool's terms or your organization's defaults.
Share of engineers setting deliberate AI-free time blocks who report improved craft satisfaction within 30 days
Community survey, The Clearing, 2025 Survey
Median time for engineers to notice meaningful cognitive recovery after reducing AI tool reliance in one project category
Synthesized from recovery reports and adjacent cognitive rehabilitation research Estimate
Two-thirds of engineers who talked to their manager about AI fatigue reported a positive or neutral outcome — the conversation was less risky than they feared
Blind and community polls, 2024–2025 Survey
2.4× greater job satisfaction reported by engineers in teams with explicit healthy AI use norms versus teams with no AI use guidelines
Atlassian team dynamics research, 2024 Survey
The 2.4× satisfaction multiplier for teams with explicit AI norms is one of the most actionable findings for engineering managers. Teams that talk about how and when to use AI — that have developed shared norms and expectations — perform better on wellbeing measures than teams where AI use is left to individual discretion or implicitly maximized.
This suggests that AI fatigue is not primarily an individual problem that requires individual recovery. It's a team culture problem that responds to team-level interventions. If you're an engineering manager reading this, that's the most important thing on this page.
Two-thirds of engineers who had a conversation with their manager about AI fatigue described it as positive or neutral. The fear of this conversation is usually bigger than the conversation itself. Read our scripts for having it →
Context & demographics
Who's most affected, and the global picture
AI fatigue does not distribute evenly across the engineering population. The data consistently shows variation by experience level, team structure, industry, and geography. Understanding who is most vulnerable is important for targeted intervention.
| Group | Elevated fatigue risk | Primary driver |
|---|---|---|
| Junior engineers (0–2 years exp.) | Very high — 71% report fatigue indicators | Skill formation disrupted; dependency before competency |
| Senior engineers (8+ years exp.) | High — 64% report craft satisfaction decline | Identity erosion; loss of "why I got into this" |
| Engineering managers | Moderate — 52% report team dynamic stress | Managing AI fatigue in others without framework |
| Mid-career engineers (3–7 years) | Moderate — 48% report decision fatigue | Output expectations rising faster than comfort |
| Startup / high-velocity teams | Very high — estimated 75%+ | Maximum AI adoption pressure, minimum support |
| Enterprise / large org engineers | Lower — 41% report fatigue indicators | Slower adoption pace, more structured environments |
| Solo developers / freelancers | High — 59% | No team norms, self-imposed pressure, no peer validation |
The data on junior engineers is particularly worrying from a longitudinal perspective. The engineers starting their careers in 2023–2026 are the first cohort to learn software development alongside AI assistance as a default. Whether this produces a generation of highly capable AI-amplified engineers, or a generation with fragile foundational skills and high dependency, remains an open and urgent question.
Questions about the data
Frequently asked questions
How many engineers experience AI fatigue?
Survey data consistently shows 60–75% of engineers report some level of AI-related fatigue, stress, or reduced job satisfaction since mainstream AI coding assistant adoption in 2022–2023. The severity varies widely, with roughly 25–30% reporting significant impairment to craft satisfaction or sense of ownership. The number identifying specifically as experiencing "AI fatigue" (as a distinct phenomenon) is estimated at ~48% in 2025.
Is developer burnout really increasing because of AI?
Yes. Multiple developer wellbeing surveys document a statistically significant increase in burnout indicators between 2022 and 2025 that correlates strongly with AI tool adoption timelines. The Stack Overflow Developer Survey shows an 8% net decline in job satisfaction despite record AI tool adoption rates — which is the key tension in the data: tools went up, happiness went down.
Don't AI tools make developers more productive?
On narrow task metrics, yes — GitHub's Copilot research found 55% speed improvement on isolated coding tasks. McKinsey found 10–45% gains on software delivery metrics. But the evidence on holistic developer wellbeing, code quality over time, and long-term skill development is much more mixed. The data strongly suggests that speed gains are real, but wellbeing gains are not automatic — and may require deliberate organizational choices to realize.
Does relying on AI tools cause skill atrophy?
The cognitive research strongly suggests yes. A 2024 MIT study found 40% lower retention of problem-solving approaches when AI generated the initial solution versus when the engineer worked unaided. The phenomenon mirrors GPS-induced navigation skill decline. The risk is highest in junior engineers building foundational competencies, and in any engineer who relies on AI for the specific type of thinking that develops their expertise.
Can I cite these statistics?
You're welcome to cite this page. Please attribute as: "The Clearing, AI Fatigue Statistics 2025 (clearing-ai.com/stats.html), accessed [date]." For specific underlying studies (GitHub, Stack Overflow, McKinsey, academic papers), cite those directly. We ask that you verify currency — some numbers change year over year. Contact us via the About page if you have questions about specific data points.
What's the most important takeaway from this data?
That you're not imagining it. The data confirms what many engineers feel in their bodies: something changed, it has a measurable cost, and it's not evenly distributed. The engineers experiencing it most acutely — the senior engineer who feels their craft slipping, and the junior who never got to build theirs without AI scaffolding — have something real to work with here. Name it. The recovery data shows that naming it and acting on it works.
The numbers pointed here. Now what?
Data is useful when it creates permission — permission to name what you're experiencing, to act, to ask for help. Here's where to go from here.
Related reading
Go deeper on the data
The Science Behind AI Fatigue
Full cognitive science explainer: Sweller, Kahneman, Parasuraman, Gloria Mark. Real citations, honest synthesis.
Read the science →
Fatigue vs. Burnout: The Distinction
The data shows these are different phenomena with different recovery paths. Mixing them up makes both worse.
Read the guide →
Which AI Tools Cause the Most Fatigue?
Copilot, Cursor, ChatGPT, Codeium — compared on decision fatigue, skill erosion, and cognitive load.
See the comparison →
Recovery Guide
The data on what actually helps. 7 phases, an interactive checklist, and real recovery timelines.
Read the guide →