Welcome to the Greatest Hallucination
By Hani Al-Shater

Wake up in the morning, brush your teeth, and open LinkedIn. Tell me, what do you see?
A stream of "groundbreaking" announcements, mind-blowing achievements. "Experts" who are absolutely "thrilled to share" their latest success. Apparently, we live in a world where everyone is a visionary, every startup is a unicorn, stocks are rising, the market is booming. Life is good!
But here's the thing: it's everyone but you.
You scroll faster. Everyone's climbing the promotion ladder, making money, building products. And you? You're drowning. Linear algebra, optimization, ML and deep learning, then LLMs and agent systems, MLOps and system design. Too much on your plate. Too many papers to keep up with. Too many people to catch up to.
You know what success looks like: being the very smart guy who writes complex math in papers and crafts dense appendices full of Greek letters. Who designs architecture diagrams that look like circuit boards. We've learned to recognize this shape. We see it and think: this must be real. Look how cool and useful it is! Well, let me ask you: are you sure that's what success looks like? How could you be sure?
I know it is not an easy question. You never lived that success, so you can't really be sure. Let me help you. Let me take you to a special place: an IKEA showroom. You walk through staged rooms no one lives in, you think, look how cool this is, then you arrange your real home to match. The fake is not a photoshopped image of real rooms—it becomes the template for real rooms. The fake produces the reality!
That inversion creates a self-inflating balloon that keeps you up at night chasing a career that does not exist. And it is not only you: you suffer, companies suffer, the whole economy suffers. But wait, I am not saying we are in a bubble. Bubbles pop and you return to normal. We're in a simulacrum—there's no normal to return to.
So now, let me tell you what a simulacrum is. Baudrillard, the famous postmodern philosopher, explained how a good reality can shift to a complete fake that has no ground to stand on—a stage he called simulacrum—and he left us with this warning:
"The simulacrum is never that which conceals the truth—it is the truth which conceals that there is none. The simulacrum is true!"
Baudrillard's simulacrum is a very dangerous idea, and if you buy it, we have a lot to talk about. But first, let me tell you how we got here.
Welcome to the Greatest Hallucination
Going from reality to simulation is not new. In the deep past, people gathered food and built houses. When problems arose, they turned to some wise guys who helped resolve them. The wise guys became important and changed the narrative: they created dogmas and religions, and people lived by them. The hallucination became something real. It is not false information; it is the code everyone lives by.
The simulation validates and amplifies itself. It rewards narrators, not value builders. It is not the farmers or workers who came to power, but the priests and kings. It is not the workers who make money, but the guys who run the bank. And so on.
But who cares about the past? We are in the AI age now—the ultimate stage of the simulation. And it did not take centuries to get here. This time it was fast: within just 15 years, we went from something real to full simulacrum.
Brief History of the Great Hallucination
Stage 1: The Sign Reflects Reality (2008-2012)
ML becomes hot. People realize it is a genuinely useful tool. Online courses like Andrew Ng's go viral—100,000 students learn machine learning. AlexNet crushes ImageNet in 2012. Code is public. Claims are testable. Results are reproducible. Knowledge leads to results. Results lead to reward.
Then the break: too many ML graduates, too much venture capital. Everyone needs a new story.
Stage 2: The Sign Distorts Reality (2013-2023)
Ten years of gradual hype buildup. The shift wasn't a button press—it was slow drift.
DeepMind's DQN masters Atari games. AlphaGo beats Lee Sedol and the world loses its mind. Neural machine translation takes over Google Translate. Real breakthroughs—but the hype machine kicks in.
"Deep Learning" replaces "neural networks"—same math, better branding. Papers test 47 architectures, report only the winner. Every startup adds "AI-powered" to their deck. Elon promises full self-driving, every year. IBM Watson beats Jeopardy, then promises to cure cancer. Self-driving is "99% solved"—except the last 1% is the entire problem.
GPT-2 is "too dangerous to release"—for nine months, as marketing. GPT-3 costs $12M to train and nobody can reproduce it. Papers cite papers cite capabilities nobody can verify.
ML is eating the world. The best CS paper in a decade—a double greedy algorithm, pure computer science—gets published at an ML conference. Because that's where the attention is. That's where the funding is. The narrative captures even the fields it has nothing to do with.
The sign still points to something. Just not as much as claimed.
Stage 3: The Sign Masks Absence (2023-2024)
November 2022: ChatGPT launches. Within two months, 100 million users. AI stops being a niche topic and becomes the only topic.
The multi-billion rounds begin:
- Microsoft → OpenAI: $13B
- Google → Anthropic: $3B+
- Amazon → Anthropic: $8B
What are they buying? Not current capabilities—those are commoditizing. They're buying territory in the AGI market. The market that doesn't exist yet. The market everyone agrees will exist because everyone is buying territory in it.
Stage 4: Full Simulacrum (2025-Present)
Now the simulation starts validating itself.
The simulation protects its narrators: November 2023—OpenAI's board fires Sam Altman. 700 employees threaten to quit. Microsoft offers to hire everyone. Sam returns in five days. The board is replaced. The best narrator, Altman, overpowers value builders (the chief scientist and the board).
October 2024—Hinton and Hopfield win the Nobel Prize for work on statistical models of memory—a Nobel not in psychology or statistics, but in physics. Demis Hassabis and John Jumper win the Chemistry Nobel for AlphaFold. The Nobel Committee—the ultimate gatekeeper of scientific legitimacy—validated AI not by checking technical claims, but by registering narrative weight. AI won over real science!
January 2025—Stargate project announces $500B for AI infrastructure. Half a trillion dollars. The number is so absurd it glitches. But a number that large must mean something real is happening. The scale proves the substance. Which justifies more investment. Which proves AGI is near.
The scientists leave, the narrators stay: Yann LeCun—Turing Award winner, one of the three "Godfathers of Deep Learning"—steps back from Meta. Meanwhile, Meta hires more narrators, more hype builders. The system selects for storytellers, not scientists. The people who built the foundations are pushed out; the people who sell the story are promoted.
It's turtles all the way down:
Nvidia's trillion-dollar valuation isn't based on chips sold. It's based on AI expectations. Those expectations are based on OpenAI's valuation. OpenAI's valuation is based on Microsoft's investment. Microsoft's investment is based on Azure compute growth. Azure compute growth is based on... startups buying compute with VC money to build AI products whose valuations are based on Nvidia's stock price.
No original. No ground truth. Just a self-referential loop of expectations inflating expectations.
The simulation bootstraps its own reality!
The Nature of the Simulacrum
Before you go looking for the architect—some shadowy cabal pulling strings—let me save you time: there isn't one. Nobody designed this. The Matrix wasn't built by machines. It emerged from us.
Fear of exclusion. Need for status. Pattern-matching for safety. These instincts kept us alive in the stone age. Now they drive us to chase metrics we don't believe in, signal belonging to tribes we don't respect.
The system optimizes for its own metrics—valuations, engagement, citations—which drift further and further from anything that matters. A Big Tech giant invests $500 million in a hot AI startup, on one condition: the startup must spend it buying cloud credits back from the giant. The giant records it as "cloud revenue." The startup records it as "valuation." Everyone's stock goes up. No customer bought a product. No money left the ecosystem.
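To see why no value is created, here's a toy ledger of that round trip—a minimal sketch in which the function name, the bookkeeping labels, and the numbers are all invented for illustration:

```python
# Toy ledger for the round-trip deal described above.
# Every name and number here is made up for illustration.

def round_trip(investment: float) -> dict:
    """A tech giant invests in a startup; the startup spends it all
    buying the giant's cloud credits. Compare what gets reported
    with the cash that actually enters the ecosystem from outside."""
    return {
        "giant_cloud_revenue": investment,      # booked as revenue
        "startup_valuation_basis": investment,  # booked as a funding round
        "cash_from_real_customers": 0.0,        # no outside customer paid anything
    }

books = round_trip(500_000_000)
print(books["cash_from_real_customers"])  # 0.0: the money just went in a circle
```

Both parties report half a billion dollars of activity, but the last line is the only one a real economy would notice.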
The market values Sign Value—hype, prestige, the appearance of innovation—over Use Value—does this solve a boring problem profitably? That gap is where the system eats itself.
Why can't anyone escape?
In a simulacrum, awareness doesn't break the spell—it's part of the spell. Workers are trapped by hope ("If I build good tech, I'll become successful"), belief ("We really are building AGI"), and golden handcuffs (equity vesting, resume building, sunk cost). Narrators are trapped by competition (other narrators will out-simulate you) and lock-in (admitting simulation destroys valuation). Both are trapped. The system runs on everyone's participation.
The Glitches
You know the feeling. Déjà vu. The story stutters. Something doesn't fit.
We still die from cancer. We still live in small apartments. We still sit in traffic. Poverty is still there. A relatively simple pandemic like COVID pushed us to the limit. Tell me again how AI is changing the world?
Meanwhile:
- Valuations go up but customers don't exist
- "Experts" multiply but problems don't get solved
- Revenue grows but it's recycled VC money
- Everyone is "thriving" but everyone is burned out
- Productivity tools proliferate but nobody has more time
- The most valuable companies sell picks and shovels for a gold rush that may never come
The narrative says revolution. Reality says: same problems, shinier dashboards.
And here's the really fucked up part: we see the glitches. We scroll past "thrilled to share" with a smirk. We joke about the grifters. We roll our eyes at the complexity theater. We're all aware the emperor has no clothes.
And we keep playing. We refresh portfolios. We network at conferences. We update LinkedIn with carefully crafted humility. We half-hope our company gets acquired before anyone checks whether the metrics are real.
Because in a simulacrum, awareness doesn't break the spell—it's part of the spell. Cynical distance is how we tolerate our participation. We know it's bullshit. We do it anyway. The performance continues because everyone is performing, and no one wants to be the first to stop. The simulation runs on our participation.
The Inevitable Collapse
So we like the game—why should we care? Couldn't the simulation run forever?
Fortunately or unfortunately, depending on your taste, the answer is no. Simulations need fuel. And the fuel is running out.
Economic Concentration
The AI boom was driven by economic incentive. Big Tech rode AI from a few percentage points of total market cap to around 45%. That's an order of magnitude of growth. Incredible.
But here's the math problem: to double again, either the entire economy doubles, or AI sucks all the money from physical reality—real estate, healthcare, logistics, food. Both are practically impossible.
The market cap ceiling is real. When investors realize the growth story has hit its structural limit, funding stops. And when funding stops, the simulation loses its fuel.
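A back-of-the-envelope sketch of that ceiling, using the roughly 45% share from above (the function and the scenario are assumptions for illustration, not a market model):

```python
# Back-of-the-envelope check on the market-cap ceiling.
# The 0.45 share comes from the essay; everything else is assumed.

def share_after_doubling(ai_share: float) -> float:
    """If AI-linked market cap doubles while the rest of the market
    stays flat, what share of the (now larger) total does AI hold?"""
    ai, rest = ai_share, 1.0 - ai_share
    return (2 * ai) / (2 * ai + rest)

# Doubling from 45% leaves AI holding ~62% of a market that grew
# only because AI did: the rest of the economy stood still.
print(round(share_after_doubling(0.45), 2))

# And if the total economy does NOT grow at all, doubling means AI
# must absorb value until it holds 90% of everything.
print(2 * 0.45)
```

Either branch of the arithmetic lands somewhere implausible, which is the structural limit the paragraph above describes.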
The Automation Paradox
AI promises automation and cost savings. Translation: massive job displacement. The pitch deck says "efficiency." The spreadsheet says "layoffs."
But here's the paradox nobody wants to talk about: automation erodes the customer base. If AI replaces workers, who pays for AI? The product destroys its own market.
You can't automate your way to prosperity if prosperity requires customers with jobs.
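The paradox can be sketched as a toy feedback loop—every parameter here is invented purely for illustration, not an economic forecast:

```python
# Toy feedback loop for the automation paradox.
# All parameters are invented for illustration only.

def run_economy(workers: float, rounds: int, automation_rate: float = 0.10,
                spend_per_worker: float = 1.0) -> list[float]:
    """Each round, AI vendors earn revenue proportional to employed
    consumers, then automate away a share of those same consumers."""
    revenues = []
    for _ in range(rounds):
        revenue = workers * spend_per_worker  # customers need paychecks to spend
        revenues.append(revenue)
        workers *= (1 - automation_rate)      # automation erodes the customer base
    return revenues

rev = run_economy(workers=100.0, rounds=5)
print(rev)  # each round's revenue shrinks with the customer base
```

The vendor's own revenue stream is the thing being automated away—the loop only ever points down.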
Scientific Stagnation
Tons of papers. Tons of conferences. Tons of PhDs. But where are the breakthroughs?
It feels like technology building, not research. Same architectures, more parameters, bigger datasets. The spectacular gains that launched the hype—GPT-2 to GPT-3, GPT-3 to GPT-4—are flattening. The exponential narrative meets logarithmic reality.
We hit the wall. We just haven't admitted it yet.
The Structural Inevitability
Here's the really uncomfortable truth: every simulation eventually collides with something it can't simulate.
The AI simulation will collide with economics (market cap ceiling), society (automation paradox), and science (diminishing returns).
When it does—and it will—the question isn't whether there's a correction. It's how severe.
The Reload
When the simulation breaks, the hype vanishes. But the value stays. The infrastructure stays.
Electricity had its prophets and exhibitions and wild promises. Now it powers everything quietly. The internet had its dot-com crash. What survived? Email, e-commerce, search—tools that actually work.
AI will follow the same path. Not AGI prophecy. Not robot apocalypse. Just tools that solve specific problems. Exciting, useful, grounded.
And here's the beautiful part: what emerges is often more impactful than what was promised. The real revolution happens after the hype dies.
But don't get too comfortable. Every reload plants the seeds of the next simulation. The cycle continues. New technology, new prophets, new promises.
Surviving Simulacra
Here's where The Matrix got it wrong. It's not red pill or blue pill. Take both.
Live in the simulation. Think outside of it. You can't exit—you have bills, a career, a life inside. But you can learn to surf it.
Open source erodes narrator value. Narrators control access, gatekeep knowledge, hide behind mystery. Open source democratizes everything. When the code is public, claims are testable. When models are open, anyone can verify, compete, improve. Llama didn't just challenge GPT—it broke the mystique. HuggingFace didn't just host models—it democratized capability. The narrators lose power when the magic becomes reproducible.
Democracy of infrastructure. Don't rent your capabilities from the narrators. Local compute, open models, community-owned tools. The more decentralized the infrastructure, the less leverage the simulation has over you. Build on foundations you can verify, not promises you have to trust.
Build real value. Use Value over Sign Value. Solve problems customers actually pay for—without VC subsidies propping up the illusion. Internal consistency beats narrative maintenance. When the reload comes, the things that actually work survive. The things that only exist as performance don't.
Surf the glitches. Train your eye for contradictions—that's where opportunity lives. Valuations without customers. Expertise without output. Revenue that's recycled capital. When the story stops making sense, that's your signal. The collapse isn't disaster for those paying attention. It's when the narrators lose their grip and the builders inherit the infrastructure.
You can't exit the simulation. But you can learn to surf it.