The AI Emperor Has No Code: When $1.5B Startups Fake AI and Governments Silence Creators

$450M vanishes into 'AI' that was just 700 engineers. Meanwhile, governments chose Big Tech over creators. Two groups of digital alchemists selling invisible cloth. Who's naked? Find out in EP #97

Once upon a time, in a land not so far away, there lived an Emperor who loved new technology more than anything else in the world.

One day, two groups of ambitious alchemists arrived at his court with perfect proposals.

Now, alchemists, for those who might not know, are the folks who claim they can turn ordinary metals into gold through secret methods.

In medieval times, they promised kings magical transformations. In our modern AI version, they promise investors magical returns.

I learned about digital alchemists the hard way during the dotcom boom.

A well-funded CEO once called me with an irresistible offer:

"Book me a million-dollar ad campaign.

Keep $100,000 for yourself, send me back the rest, but invoice the full million."

Easy money, right?

A year later, that CEO went to white-collar prison.

I turned down the deal because I've learned there's no such thing as easy money, no free lunch, no magical transformation of nothing into something valuable.

Except, apparently, in AI.

The first digital alchemist group claims they transmute ordinary human coding into revolutionary artificial intelligence.

The second group promises to transmute creator protections into competitive advantage, warning that protecting artists' rights dooms the kingdom to irrelevance in the global AI arms race.

Both are selling the same invisible cloth: the belief that you get something unexpected without paying the real cost.

And just like in the fairy tale, everyone is so eager to see the magic that they forget to ask the obvious question:

"Where's the actual gold?"

This is the story of how $450 million disappears into thin air, and how two groups of alchemists use the same magical formula: promise transformation, hide what's really going on, and let people's desire to believe do the rest.

Part 1: The First Alchemists - Builder.ai

The first group of alchemists called themselves Builder.ai—though they started as Engineer.ai in 2016, which should have been the first clue.

They began with an astonishing claim: they'd invented an AI assistant named "Natasha" that builds smartphone apps with 80% automation.

"As easy as ordering pizza," they promised.

The Emperor's court? Mesmerized.

Microsoft opened their treasury. The Qatar Investment Authority, SoftBank, the World Bank—they all lined up with golden coins. Over $450 million poured in at a valuation of $1.5 billion.

But here's where the story gets interesting. In 2019, the Wall Street Journal decided to peek behind the curtain. What they found wasn't revolutionary AI.

Instead, it was 700 engineers in India, manually coding every single app. The "artificial intelligence" was more like a smart toaster.

You'd think that ends the story, right? Emperor discovers the alchemists aren't telling the truth, throws them in the dungeon, recovers the gold?

Not in our AI fairy tale.

Builder.ai not only survives. It thrives for six more years. Microsoft doubles down with equity investments. The Qatar Investment Authority keeps writing checks.

Because in the age of AI, even when you catch the alchemists red-handed, the desire to believe in magic is stronger than the evidence of your own eyes.

The deception continued. Revenue was inflated by 300%. In 2024, they claimed $220 million when the real number was closer to $50 million.

When a new CEO finally looked at the books in 2025, he discovered what everyone should have known since 2019: there was no gold, there was no magic, there was no AI.

It's not "no code"—it's lots of code. Human code.


In June 2025, the creditors came calling. Viola Credit seized $37 million, leaving Builder.ai with just $5 million in restricted accounts.

The company filed for bankruptcy, owing over $100 million against assets worth less than $10 million.

But here's the most fascinating part about their creditor list—it reads like a spy novel.

They owed money to Shibumi Strategy, an Israeli intelligence firm founded by former Mossad operatives. Quinn Emanuel, one of the world's most intimidating litigation firms. Sitrick and Company, crisis communications specialists. T&M USA, corporate intelligence.

When your AI startup needs spies and crisis management experts on speed dial, you're probably not disrupting app development.

More like you're disrupting the truth.

Key Facts: Builder.ai's $450M Deception

  • Company Evolution: Engineer.ai → Builder.ai (2016-2025)

  • The Promise: "Natasha" AI assistant, 80% automation, "easy as ordering pizza"

  • The Investors: Microsoft, Qatar Investment Authority, SoftBank, World Bank - $450M+ raised, $1.5B valuation

  • The 2019 Exposure: Wall Street Journal revealed engineers in India doing manual coding

  • The Continuation: Despite exposure, company thrives for 6 more years with continued investment

  • The Revenue Fraud: 300% inflation - claimed $220M, actual $50M in 2024

  • The Collapse: Viola Credit seizes $37M, bankruptcy filing June 2025

  • The Creditor List: $100M+ in liabilities against less than $10M in assets; creditors include:

    • Shibumi Strategy (Israeli intelligence firm)

    • Quinn Emanuel (intimidating litigation specialists)

    • Sitrick and Company (crisis communications)

    • T&M USA (corporate intelligence)

    • AWS ($85M), Microsoft ($30M)

Part 2: The Second Alchemists - UK Government Policy

While the first group of alchemists was busy turning investor gold into nothing, the second group was performing an even more ambitious transformation in the courts of the United Kingdom.

This alchemy involves Getty Images versus Stability AI—a case that should be open and shut. Getty has evidence that their copyrighted images were used to train Stability's AI without permission.

We're talking about wholesale appropriation of creative work on an industrial scale. Even I know Getty has been litigating since the dotcom era; what was Stability thinking, taking on one of the most aggressive legal teams in the business?

While Getty fights for its rights, the UK government is going in a different direction.

In the broader question of AI and creators' rights, the UK government chose sides, and it's not the side you might expect.

Instead of protecting creators' rights, the government is pushing for an "opt-out" system.

Picture this: every artist, photographer, writer, and musician in the kingdom must individually track down every AI company, knock on their digital door, and beg them to please not steal their work.

Meanwhile, there's no transparency requirement. Creators have no way to know if their work has already been fed into the machine.

The government's alchemy here is particularly clever: they're transmuting theft into innovation by simply calling it something else.

"We're not enabling copyright infringement," they say, "we're fostering competitive advantage in the global AI arms race."

Still, the UK's creative community fights back, with movements like Make It Fair and celebrities like Elton John pushing the issue hard.

I wish the US was doing as much.

The House of Lords won a series of votes against the government, introducing provisions that would require AI firms to disclose what they've trained their systems on.

Five times they voted to require AI companies to reveal what copyrighted material they've used. Five times they demanded transparency. Five times they won.

Then the government struck down these provisions.

The final vote came on June 11, 2025. The transparency provisions were removed. The Data (Use and Access) Bill passed without requiring AI companies to disclose their training data.

The government's position is clear: tech companies' convenience wins over creators' rights.

After all, AI is more important than all of us, right? Who's going to live without AI?

This isn't just about Getty Images—though they have the lawyers to fight this battle.

It's about every independent creator who doesn't have a legal team on retainer. In this new system, those with lawyers get paid, those without get plundered.

The government calls this "balancing innovation with creator rights."

But when you look at the balance, one side gets everything they want, and the other side gets the right to opt out of having their life's work stolen…if they can figure out who stole it in the first place.

The UK tech secretary actually told Baroness Kidron, a leading advocate fighting for creators' rights, that creators were asking for a "privilege"—that demanding permission before AI companies take their work without payment was somehow asking for special treatment.

But that's not a privilege.

It's called property rights.

And it's clear that governments and big tech around the world are working together because they think AI is bigger than all of us.

Key Facts: UK Government Chooses Big Tech Over Creators

  • The Case: Getty Images vs. Stability AI - clear evidence of billions of copyrighted images used without permission

  • Government Position: "Opt-out" system instead of "opt-in" for creator protections. Does this apply to Getty? We’ll see.

  • The Burden: Creators must individually track down AI companies to request removal of their work

  • No Transparency: Creators have no way to know if their work was already used in training

  • Parliamentary Battle: House of Lords voted 5 times for transparency requirements

  • Government Response: Struck down transparency provisions each time

  • Final Vote: June 11, 2025 - transparency provisions removed from Data (Use and Access) Bill

  • The Message: Tech companies' convenience prioritized over creators' rights

  • The Reality: Those with lawyers get paid, those without get plundered

  • Government Rhetoric: Called creator rights a "privilege" rather than property rights

The Pattern Reveals The AI Bias of Governments

Both groups of alchemists follow the same playbook.

Builder.ai hid their army of engineers behind claims of revolutionary AI.

The UK government hides wholesale appropriation of creative work behind claims of competitive advantage.

Both demand trust without transparency. Both benefit the powerful at the expense of creators and copyrights.

And both continued thriving long after their methods were exposed. Builder.ai survived six years after the Wall Street Journal revealed their deception.

The UK government pushed through their anti-creator legislation despite five House of Lords victories demanding transparency.

The system rewards AI theater over substance.

When billion-dollar companies can fake AI capabilities for years, and governments can transmute theft into "innovation policy," what else are we missing?

What other emperors are parading around naked while we all politely admire their invisible code?

Choose Reality Over Faux Magic

In the original fairy tale, it takes a child to point out the obvious truth: the Emperor has no clothes.

In our modern version, that child's voice comes from bankruptcy courts and creator lawsuits, from auditors and journalists willing to look behind the curtain.

But what makes this story different from the fairy tale? We don't have to wait for the parade to end to see the truth.

We can choose to look behind the curtain right now, before the decisions are made, while we still have a chance to make AI better by working together instead of against creativity.

Builder.ai is bankrupt, but hundreds of other companies are making similar claims about their AI capabilities.

The UK's transparency battle is over, but similar fights are happening in courtrooms around the world.

And all of this could make AI so much better if we wake up and realize that the engine driving AI's impressive capabilities comes from our imaginations, dreams, creativity, reactions, and experience.

Not just reasoning.

So here's my question for you: When the next digital alchemist arrives at your business, promising to transmute your ordinary problems into automated solutions (sounds like most AI pitches today, doesn't it?)….

Will you ask to see the actual gold?

Will you demand transparency?

Or will you join the parade, hoping someone else will point out that the emperor has no code?

Because in the end, true progress doesn't require creators to surrender their rights.

The future belongs to those who choose reality over beautiful, invisible AI cloth.

The Emperor has no code.

The Emperor is doing things that could be done so much better when they wake up and realize that the AI magic they're selling is powered by human creativity—and that creativity deserves protection, not plunder.

Because the real alchemists aren't the ones promising to turn code into gold.

They're the creators who transform human experience, intuition, and imagination into the art, stories, and visions that make AI worth building in the first place.

Thanks for reading The AI Optimist! This post is public so feel free to share it.


RESOURCES

From $1.3B to Bust: The Rise and Collapse of Builder.ai

This AI startup claims to automate app making but actually just uses humans

Builder.ai

UK government accused of 'gaslighting' as data bill deadlock with upper house goes on (Charting Gen AI)

Victory again for tireless AI transparency campaigner. Will the government now act? (Charting Gen AI)

Matthew Clifford - UK AI Plan

Make it Fair UK Campaign

OpenAI warns copyright crackdown could doom ChatGPT

AI Optimist Playlist
