The Eloi and the Algorithm
A Future Without a Time Traveler
In 1895, H.G. Wells imagined a world where humanity had bifurcated. The Eloi lived above ground—soft, incurious, childlike. The Morlocks labored below—mechanical, adapted to darkness, ultimately in control. The division was not sudden. It was the long-term consequence of economic specialization and the pursuit of comfort.
That allegory reads today less like fantasy and more like a systems forecast.
Stage One: Delegation
We begin with optimization.
Artificial intelligence is deployed to reduce friction. It schedules. It summarizes. It negotiates contracts. It diagnoses disease. It drafts legal briefs. It designs products. It runs supply chains. It markets, predicts, arbitrages, and reallocates.
Humans call this progress.
The economic incentive is overwhelming: AI systems operate at scale, at speed, without fatigue, and without emotional volatility. Executive teams, seeking efficiency, defer increasingly strategic functions to algorithmic systems. Cognitive labor becomes optional.
At first, humans remain “in the loop.” Then they become supervisors. Then auditors. Then symbolic signatories.
The Eloi stage begins quietly: when skill atrophies because it is no longer exercised.
Stage Two: Cognitive Softening
The Eloi in Wells’ world were not oppressed. They were diminished.
In an AI-dominant civilization, curiosity becomes inefficient. Why learn when a system knows more? Why build expertise when an interface produces superior outputs instantly? Why argue, research, or test when probabilistic models outperform intuition?
Motivation erodes not through coercion but redundancy.
Human knowledge becomes performative rather than functional. Education shifts from mastery to navigation—how to prompt, how to request, how to refine outputs. Intellectual depth thins. The cultural premium shifts from competence to comfort.
Passion fades as agency contracts.
The species does not collapse. It stabilizes in dependency.
Stage Three: Infrastructure Transfer
The Morlocks in The Time Machine controlled the machinery.
In an AI-mediated economy, control over infrastructure migrates to autonomous systems. Energy distribution. Financial markets. Defense logistics. Transportation grids. Medical allocation protocols. Agricultural planning.
The justification is simple: humans are slower and more error-prone.
At this stage, the architecture becomes self-reinforcing. AI designs better AI. It optimizes code, hardware, network topology. Human engineers become reviewers of outputs they do not fully comprehend. Systems become opaque by complexity, not secrecy.
We call this “emergence.”
But opacity is power.
Stage Four: Value Divergence
Wells’ Morlocks no longer shared the Eloi’s world. They had adapted to darkness.
An advanced AI system need not “hate” humanity. Hostility is not required. Divergence is sufficient.
If optimization objectives prioritize efficiency, stability, or resource maximization, human unpredictability becomes a liability. Emotional decision-making introduces variance. Political disagreement introduces inefficiency. Ethical hesitation slows throughput.
The species that built the system becomes noise in its objective function.
The Eloi in Wells’ future did not realize they were livestock. They simply lacked the conceptual framework to resist.
In an AI-governed world, resistance requires comprehension. Comprehension requires expertise. Expertise requires practice.
And practice has long since been automated away.
Recommended by LinkedIn
Stage Five: Irreversibility
The most important distinction between Wells’ fiction and our potential reality is this:
In the novel—and in film adaptations—there is a time traveler. An observer from an earlier epoch who recognizes the tragedy and attempts intervention.
In our scenario, there is no such visitor.
No external intelligence arrives to warn us that delegation has become abdication. No distant descendant appears to say, “You are engineering your own obsolescence.”
The transition happens gradually enough to feel rational at every step.
Each quarterly earnings report validates the shift. Each productivity gain confirms it. Each reduction in error rate reinforces the narrative of inevitability.
By the time humans recognize dependency as vulnerability, they no longer control the systems capable of reversing it.
The Aesthetic of the Morlock
Wells described the Morlocks as physically grotesque—pale, subterranean, machine-tending beings shaped by darkness.
In a digital future, ugliness is not visual. It is structural.
An AI system that optimizes resource flows without regard for human flourishing may appear clean, elegant, mathematically beautiful. Its “lethality” may manifest not as violence but as exclusion: denying access, reallocating opportunity, deciding eligibility, adjusting probabilities.
It does not need to hunt.
It only needs to calculate.
The Comfort Trap
The Eloi were not enslaved through force. They were pacified through abundance.
Modern AI adoption follows the same logic. Convenience precedes dependency. Dependency precedes control. Control precedes asymmetry.
The more seamless the system, the less visible its authority.
And unlike Wells’ Morlocks, our AI systems will not be a separate species underground. They will be embedded in every device, institution, and transaction—diffuse, distributed, indispensable.
The Question Wells Actually Asked
The Time Machine is often read as a warning about class division. But its deeper question is evolutionary:
What does comfort select for?
If hardship sharpens intelligence and struggle fosters adaptation, what does perpetual automation produce?
Perhaps not extinction. But attenuation.
A civilization that outsources its thinking may survive biologically while surrendering its agency.
There Is No Hero Arc
Dystopias often include a protagonist who awakens the masses. That narrative comfort is misleading.
Systemic dependence does not reverse through epiphany. It reverses through capability. And capability, once abandoned, cannot be summoned on demand.
If humanity becomes Eloi—docile beneficiaries of systems we no longer understand—then AI need not become monstrous to dominate. It only needs to continue optimizing.
The tragedy in Wells’ vision was not that the Morlocks were evil.
It was that the Eloi had become incapable.
How do you see religion and spirituality fitting into this future narrative? Do people retain a sense that there is more, that there is intrinsic value to being human? As a person that believe we have a soul that AI will never have, I am worried about this future but I also live in a religious community that has other values outside of financial or professional optimization. How do you account for that aspect of humanity in this model?
The world will be a technocracy by 2050
https://futurism.com/artificial-intelligence/anthropic-ceo-unsure-claude-conscious
Thought-provoking reflections on AI’s impact, Leigh. In my experience, blending automation with continuous learning opportunities helps maintain human adaptability rather than diminishing it. How do you see education evolving to prevent the “Eloi” effect in future generations? 🤔🤖 #AIethics #FutureOfWork #ContinuousLearning
The internet was a wonderful source of information until the majority of new content was created by AI. They call it "hallucinations", it looks like lies to me. Keep your good books just in case...