Why I Was Strangely Thrilled When Meta’s AI Stole My Book

When Meta trained AI on pirated books, many authors were outraged. Yet for reference works, sharing accurate information through AI offers a hopeful way to reach new audiences.

Published on: May 26, 2025

There’s something strange about having your book stolen—but not in the way you might expect. When Meta trained its AI on thousands of books scanned from LibGen, a notorious piracy site, many authors erupted in outrage. Mark Zuckerberg, with a net worth large enough to fund the British Library indefinitely, decided paying authors was too expensive. The writing community responded with anger, lawsuits, and calls for justice. Authors scrambled to find out whether their work had been taken without permission.

For creatives, this felt like a betrayal. Yet, there’s an unexpected side to this story. When checking if my own book was part of Meta’s dataset, I felt something unusual: relief, even a sense of validation. It was a shock. But on reflection, there’s a logic to it.

Why I Didn’t Mind My Book Being Used

My work isn’t a novel or a poem. It’s a reference book, packed with facts and curated information. For example, my upcoming book introduces readers to the biggest names in design history—a coffee table book meant to inform. I care deeply about sharing this knowledge, even if it reaches people through AI rather than traditional channels.

Here’s the crux: if AI is going to exist, it should be trained on accurate, well-researched material, not made-up nonsense. AI systems without solid data tend to hallucinate facts—fabricating citations and fictional studies with confidence. That’s worse for everyone. I’d rather have my carefully gathered facts feed AI than let it spin wild fiction disguised as truth.

Where I Draw the Line

This doesn’t mean all AI training is acceptable. There’s a fundamental difference between using reference books and using creative works like novels, paintings, songs, or films. Those are expressions of unique human creativity. When AI learns from creative works without permission, it’s not learning—it’s stealing. It’s the artistic equivalent of identity theft, copying the soul of the work without credit or compensation.

Expect tech companies to argue “fair use” and “transformative works” while their AI scrapes everything creative minds have made. They’ll say AI is “inspired by” rather than copying—just like an art student caught tracing. But if all new content is AI-generated from existing human work, the creative well will eventually dry up. AI will recycle AI, each generation further from original thought. That could mean a flood of content that feels off, like a cover of a cover performed by someone who’s lost their way.

What Creatives Should Take Away

  • Reference materials being used for AI training can be a blessing when it means accurate information reaches new audiences.
  • Creative works demand protection. Their unique expression is what gives them value and must not be treated as open data.
  • Understanding where AI crosses ethical lines helps artists defend their work and adapt to the changing landscape.

If you’re a creative wanting to learn how AI might impact your work and how to engage with it responsibly, exploring reliable courses can be a smart move. Check out AI courses for creatives to stay informed and prepared.

At the end of the day, the fight isn’t just about stolen data—it’s about respecting what makes creative work unique. AI can be a tool, but it must never become the thief.