The Jevons Paradox of the Self
When the friction of execution goes to zero, the demand on our orchestration goes to infinity
I have spent the last few weeks working later into the evening than I have in over a decade, and the strangest part is that I am euphoric about it.
I wanted to build a knowledge base for the product at my company. A structured, version-controlled repository of everything the product knows, organized into Markdown files and tracked in GitHub. A year ago this would have been a quarter-long project: documentation sprints, alignment meetings, manual synthesis across teams and data sources. Instead I sat down with an AI and started distilling. I fed it product documentation, internal wikis, support tickets. It extracted structural claims, organized them into interlinked files, and I reviewed, corrected, and governed the output. Within days I had a working knowledge architecture.
I had been building a similar system for my own intellectual work, a governed note-linking system compiled from twenty years of reading notes, lecture fragments, and philosophical drafts. The two projects started reinforcing each other. The same architectural instincts that structured the product knowledge base sharpened the personal one. The philosophical reading I was doing in the evenings kept generating insights I could bring to work in the morning. I started using the AI as a sparring partner for my essays, arguing about Heidegger and Girard after dinner, then waking up to apply the same thinking to product intelligence. I wrote personal web applications and writing tools to test whether I could. Once the loop started, I could not stop. Each completed project revealed three more that were now plausible. Each solved problem surfaced a new frontier. At no point did I arrive at a natural stopping place.
The cultural narrative tells us that artificial intelligence is a labor-saving technology. The machine that will finally deliver the post-work utopia, or at least Keynes’s long-promised fifteen-hour work week. The assumption underneath is that human desire is a fixed bucket. If it takes forty hours to fill the bucket today, and the machine can fill it in four, we will take thirty-six hours of leisure.
My experience suggests the opposite. The bucket has no bottom. And we are not about to work less.
Jevons’s coal
In 1865, the economist William Stanley Jevons published The Coal Question, in which he noticed something counterintuitive about the steam engine. James Watt’s improvements had made steam engines vastly more efficient. The reasonable expectation was that coal consumption would decline: better engines, less fuel per unit of work. What Jevons observed was the opposite. Total coal consumption in Britain was skyrocketing. Efficiency had not satisfied the demand for coal. It had unlocked demand that nobody knew existed. Steam power, suddenly cheap enough to be profitable in a thousand new applications, was being applied to industries that had never used it before. The more efficient the engine became, the more coal the economy burned.
This pattern, now called the Jevons Paradox, keeps showing up whenever a resource becomes dramatically cheaper to use. We do not conserve it. We find new, previously unthinkable uses for it, and total consumption increases. When data storage became virtually free, we did not stop buying hard drives; we started recording our entire lives in high definition. When spreadsheet software made financial calculations instantaneous, it did not eliminate the need for accountants; it spawned an entire industry of quantitative modeling that demanded vastly more calculation.
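The mechanism behind these examples can be stated compactly. The following is the modern economist’s shorthand, not Jevons’s own formulation, and it assumes constant-elasticity demand; the symbols are mine:

```latex
% Sketch of the rebound condition (assumed constant-elasticity demand).
% \eta: efficiency (useful work per ton of coal); p: price of coal;
% p/\eta: effective price of a unit of work.
W \propto \left(\frac{p}{\eta}\right)^{-\varepsilon}
\qquad\Longrightarrow\qquad
C \;=\; \frac{W}{\eta} \;\propto\; \eta^{\,\varepsilon - 1}
```

Total consumption $C$ rises with efficiency whenever the demand elasticity $\varepsilon$ exceeds one: the work unlocked by cheaper steam outruns the fuel saved per unit of work. Jevons read this condition directly off the production figures rather than writing it down.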
What I have been living through, sitting at my desk after hours building intelligence architectures that did not exist a month ago, is the Jevons Paradox applied to human ambition.
When the cost of cognitive execution drops toward zero, we do not accomplish our usual tasks and go home. We attempt things that were previously unthinkable. The machine does not replace our effort. It makes our effort so wildly productive that we cannot stop. The lever moves so easily now that the only question left is whether your arms give out.
The bottleneck shifts
In a conversation with Dwarkesh Patel in January 2025, the economist Tyler Cowen described this structural shift with characteristic directness: when AI removes the technical constraints on execution, we become the bottleneck. He was talking about institutions, about how regulatory bodies and cultural inertia would throttle the speed of AI’s impact. I have been writing about Cowen’s broader project elsewhere, in The Claim Upon the Training Data, where I examined his practice of writing for AI training indexes, and in What Won’t Cross, where I used his argument about unassisted writing as the foundation for a larger claim about cognitive formation. But the bottleneck observation is the one that has followed me home.
I have been feeling it in my body. When the tools can build, write, and synthesize at the speed of thought, the only limit remaining is how fast I can think. The bottleneck is my need to sleep. My capacity to hold the architecture in my head. My stamina for sustained orchestration across a dozen simultaneous projects. The technical constraint was a ceiling. The ceiling is gone. What remains is the floor: the biological minimum below which I stop functioning.
Both sides of the AI labor debate share the same assumption and get the same thing wrong. The utopians promising leisure and the doomsayers predicting mass unemployment both treat the demand for human effort as a fixed quantity. The machine either fills it, freeing us, or replaces it, destroying us. Neither side considers the possibility that the machine might expand it. That when execution becomes cheap, the appetite for what can be executed does not stay constant.
It explodes.
The other response
I have described one side of what happens when execution becomes frictionless: the person already biased toward building discovers that the lever now moves without resistance, and pulls it until something in the body gives. That has been my experience, and I suspect it is the experience of many readers of this essay.
There is another response, and it is already visible at scale.
In 2025, writing projects on major freelance platforms declined by roughly a third year over year. A survey of nearly seven thousand creative professionals found that a third of illustrators had lost commissions directly to generative AI. Translators were among the hardest hit. The statistics describe job loss, but the reports underneath describe a psychological toll deeper than unemployment. They describe demoralization. Artists who still have work but have stopped wanting to make it. Writers who can still write but feel something draining out of the act. The word that keeps appearing in these accounts is not “replaced.” It is “pointless.”
This is worth pausing on, because the feeling of pointlessness is not a rational calculation about labor markets. A writer whose income drops can retrain, pivot, adapt. The demoralization runs deeper. It is the experience of watching a machine produce, in seconds, a version of the work that took you years to learn. The output may be worse. It may lack the texture that makes your work yours. It does not matter. The sheer effortlessness of the machine’s production changes the way your own effort feels. What used to feel like craft now feels like friction. What used to feel like earned skill now feels like an inefficiency the market is about to correct.
The philosopher René Girard spent his career studying this structure. In Deceit, Desire, and the Novel (1965), he argued that human desire is fundamentally mimetic: we do not generate our wants from within. We absorb them from others. We want what we see others wanting, and we build what we see others building. In The Economics of Infinite Desire, I explored Girard’s framework at length and traced its consequences through Silicon Valley, Versailles, and the structure of academic life. The key insight for this essay is simpler: desire requires a model. We need to see someone doing the work, struggling with it, making it look possible and worth pursuing. The model shows us not just the desirability of the goal but the plausibility of reaching it as a human activity.
What happens when the model is a machine that does not struggle?
For builders, the machine acts as what I called a “spotter” in The Friction That Was Thinking: a training partner in a cognitive gym, someone who stands behind the bench while you press, who does not lift the weight for you but lets you attempt heavier lifts than you could manage alone. The frictionless AI makes building harder to resist, not easier to abandon.
For others, the encounter with effortless machine execution sends the opposite signal. If desire is borrowed, and the model you were imitating has been replaced by something that makes human participation look vestigial, then the desire to build does not survive the comparison. Girard would call this a thinning of the desire: a want that collapses the moment it is seriously challenged. The writer does not lose the ability to write. The writer loses the sense that writing, as a human act, still means something against the backdrop of a machine that can produce infinite text without effort or intent.
We may be witnessing the early stage of a bifurcation that will define the next decade: between those who use the machine to scale their orchestration until their biology fails, and those who look at the frictionless output and quietly stop wanting to make things at all.
Standing-reserve
There is something I have not confessed yet about the euphoria.
Somewhere in the third week of this, I noticed that I had started resenting meals. Not the food. The interruption. Twenty minutes away from the screen felt like a system failure. I caught myself optimizing my sleep schedule, not for rest, but for output: calculating the minimum hours I could function on, treating my own alertness as a production input to be managed. My attention had become, in my own thinking, a resource to allocate.
I recognized this feeling. Not from personal experience. From philosophy.
Martin Heidegger, in his 1954 essay “The Question Concerning Technology,” described a way of relating to the world that he called Bestand, usually translated as “standing-reserve.” It is the orientation in which everything, including nature, labor, and human beings, becomes a resource held in readiness for further use. The coal is not a substance with its own existence. It is a fuel waiting to be burned. The river is not a waterway. It is a hydroelectric opportunity. Even the forester walking the same path his grandfather walked is, whether he knows it or not, subordinated to the orderability of cellulose for the paper industry.
Heidegger was describing a disposition, not a technology. A way of seeing the world in which nothing is allowed to simply be what it is. Everything is revealed as available, flexible, waiting to be put to work.
The euphoria of hyper-labor is the moment when this disposition turns inward. I was not resting in my finitude. I was mobilizing myself as a resource for my own projects. The tools had become so responsive, so frictionless, that the only remaining inefficiency in the system was my own body. And I was trying to optimize it away.
The joy was real. The productivity was real. The things I was building had genuine value. And underneath all of it was a logic that has no internal stopping point, no built-in signal that says enough, no mechanism by which the person pulling the lever learns to let go.
The discipline we do not have
In previous essays I have been mapping a set of related dangers. In What Won’t Cross, I argued that the formation stage of intellectual work, the hours where a person is being shaped by the act of making something, does not transfer to the machine: the artifact crosses to the new medium, the formation does not. In The Single-Player Game, I described how external scorekeeping can quietly replace the internal game you were actually playing, how the switch from one to the other does not announce itself. In The Economics of Simplified Living, I argued that the gesture of refusal, the “AI-free” badge, the performative rejection of the technology, is itself available for mimetic imitation and therefore is not a solution.
A different danger operates even when the work is genuine, even when the formation is intact, even when the score is internal. The danger is that there is no ceiling.
When the tools were slow, execution imposed a limit. You could not build more than the day allowed. The gap between what you wanted to do and what you could do was wide enough to enforce pauses. Those pauses were the spaces in which you remembered that you were a body, that you needed to eat and sleep and see people and do nothing for a while. The gap was never just an obstacle. It was a governor on the engine.
The governor is gone. The engine has not changed.
Every discipline humans have developed for managing appetite was designed to resist the temptations of ease and excess: gluttony, sloth, indulgence, distraction. Every spiritual tradition, every philosophy of moderation, assumes the danger is that we will do too little, consume too much, surrender to comfort. We have entire civilizational architectures for resisting the pull of laziness.
We have almost nothing for resisting the pull of productive intensity.
What does that discipline even look like? I do not think it looks like refusal. I do not think it looks like rationing screen time or setting timers or performing the aesthetics of digital minimalism. Those are responses to distraction, and the problem I am describing is the opposite of distraction. The problem is a focus so total that the body becomes invisible to the mind that inhabits it.
Jevons published The Coal Question in 1865. He stated the paradox with precision: efficiency would not conserve the resource; it would accelerate its consumption. British coal production peaked in 1913 and continued at enormous scale for decades after. Nobody changed course. The paradox was identified, understood, discussed, and entirely ignored for nearly fifty years. Naming a trap has never, historically, been the same as escaping it.
I closed the laptop last night. Not because I had finished. I will not finish. The system has no endpoint, and the only thing that does is me. The coal does not know it is being burned. That is the one advantage I have, and I do not yet know whether it is enough.
---
Sources & further reading
William Stanley Jevons, The Coal Question (1865). The original observation that efficiency of resource use can increase total consumption. British coal production peaked in 1913, nearly fifty years after the warning.
Tyler Cowen, interview with Dwarkesh Patel (January 2025). The argument that humans and institutions become the bottleneck as AI removes technical constraints. Part of a broader body of work explored in The Claim Upon the Training Data and What Won’t Cross.
Eldar Maksymov, “The Jevons Paradox and Insatiable Humans: Why AI Won’t Empty the Finance Suite” (2026, SSRN preprint). An optimistic application of the Jevons Paradox to professional work, arguing that cheaper intelligence will expand demand for analysis. This essay takes the structural mechanism seriously while questioning whether expanded demand equates to human flourishing.
René Girard, Deceit, Desire, and the Novel (1965). The first systematic account of mimetic desire: we want what others want, and we mistake the imitation for autonomy. The bifurcation between hyper-labor and creative paralysis draws on Girard’s account of how desire thins when the model becomes unimitable. See also Luke Burgis, Wanting (2021), for the most accessible contemporary treatment. I explored Girard’s framework at length in The Economics of Infinite Desire.
Martin Heidegger, “The Question Concerning Technology” (1954). The concept of Bestand (standing-reserve) and the argument that the technological orientation reveals everything, including human beings, as resources held in readiness for further ordering.
Jonadas, The Friction That Was Thinking (2026). The spotter-vs-forklift distinction: the difference between a tool that removes difficulty and a tool that keeps difficulty on the person.
Jonadas, What Won’t Cross (2026). The formation-stage argument: the artifact crosses to the new medium, the person who was being formed by making it does not.
Jonadas, The Single-Player Game (2026). The argument that external scorekeeping quietly replaces the internal game, and that the switch does not announce itself.
Jonadas, The Economics of Simplified Living (2026). The argument that the gesture of refusal is itself available for mimetic imitation, and the case for honest examination over performative rejection.

