AI Training Is Not Theft. It’s the Same Way You Learned to Create

Why accusing AI of “stealing” is missing the point entirely

Let’s get something straight, because everywhere I go I keep hearing the same idea. I see tweets and posts from writers and authors all over the world, and I have to address the hysteria.

Training is not copying. Learning is not stealing. And the current panic around AI models “stealing” from artists is dripping with hypocrisy.

Let me explain.

The Quote That Says It Best

I recently read an article by Enrique Dans titled “AI is not theft: the UK just got it right”. One paragraph in particular hit like a freight train:

“Let’s be clear: training is not copying. Copyright protects two specific acts — reproduction and distribution. An AI model does neither. It analyzes, abstracts patterns and generates new statistical combinations. To claim this process is ‘theft’ is like accusing a musician of robbery for listening to records or drawing inspiration from other songs. Apparently, it’s fine for McCartney to be ‘inspired’ by Chuck Berry, but a neural network can’t be inspired by McCartney. Hypocrisy in its purest form.”

He’s right. And I’m tired of pretending otherwise just to placate artists who should know better.
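If “analyzes, abstracts patterns and generates new statistical combinations” sounds abstract, here is a deliberately toy sketch of the idea in Python: a tiny bigram counter that “trains” on a couple of sentences by tallying which word tends to follow which, then samples new word sequences from those tallies. It is nothing like a modern neural network, and every name in it is made up for illustration, but it captures the distinction Dans is drawing: what the model keeps after training is statistics derived from the text, not a copy of the text.

```python
import random
from collections import defaultdict

def train(corpus):
    """'Training': count which word follows which across the corpus."""
    counts = defaultdict(lambda: defaultdict(int))
    for text in corpus:
        words = text.split()
        for current, nxt in zip(words, words[1:]):
            counts[current][nxt] += 1
    return counts  # a table of transition statistics, not the original sentences

def generate(counts, start, length=10):
    """Sample a new word sequence by following the learned counts."""
    word, output = start, [start]
    for _ in range(length):
        followers = counts.get(word)
        if not followers:
            break
        # Weighted random choice: frequent continuations are more likely.
        word = random.choices(list(followers), weights=list(followers.values()))[0]
        output.append(word)
    return " ".join(output)

corpus = [
    "the writer reads the greats and absorbs their rhythm",
    "the model reads the text and absorbs its patterns",
]
model = train(corpus)
print(generate(model, "the"))
```

Scale that toy up to billions of learned weights and you have the rough shape of the argument: the trained artifact is a set of parameters, and reproduction only becomes a question at the output stage, which is the input-versus-output distinction discussed below.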

We All Learn by Consuming Others’ Work

Every artist, every writer, every musician I know learned their craft the same way:

  • They read the greats.
  • They studied structure.
  • They internalized tone.
  • They mimicked style.
  • They iterated until something that felt like theirs finally emerged.

So let’s call it what it is: a training model.

No one accuses you of copyright infringement when you read Joan Didion and then write a personal essay with similar rhythm. No one calls it theft when a jazz musician riffs on Coltrane’s phrasing after listening to him for 10 years straight.

And yet, when a machine does exactly what we do — listen, analyze, abstract patterns — suddenly it’s illegal? Suddenly it’s unethical?

Come on. We are better than this…

Artists Are Confusing Output With Input

There’s a real distinction here, and we need to get clearer about it.

What AI models generate is where copyright law could reasonably come into play. If an AI model spits out an exact replica of your unpublished manuscript or song, that’s a problem.

But that rarely happens unless someone specifically asks for it. For example, someone prompting an LLM like ChatGPT:

Create a short story written as if you were Stephen King. Write it in his voice and structure.

…and then publishing it under Stephen King’s name. That is copyright infringement.

But the training process? It’s a form of ingestion. Of exposure. Of analysis.

Exactly like what you did when you spent a year obsessed with David Foster Wallace and started writing longer, footnote-heavy paragraphs.

That didn’t make you a thief. It made you a product of your influences.

So why is AI held to a different standard?

What’s Really Going On

I get it. Change is hard. Especially when it threatens your sense of uniqueness. Especially when you’ve wrapped your identity around being a creative.

AI feels scary. It writes. It paints. It composes music. And a lot of people, especially those already teetering on the edge of burnout, feel like they’re being replaced.

So instead of adapting, we lash out. We look for someone to blame. We try to paint the thing we fear as immoral.

We confuse disruption with destruction.

But just because something is disruptive doesn’t mean it’s theft.

There’s Room for Regulation — But Let’s Be Honest

Look, I’m not saying everything about generative AI is ethical or well-handled. There are real conversations to be had about consent, attribution, and compensation.

But those conversations need to start from an honest place, not from emotional panic.

Training models on publicly available work? That’s not theft. It’s just how learning works. The problem is not the training. The problem is the lack of transparency and control over outputs.

We don’t need a blanket ban on AI models reading our work.

We need:

  • Clear rules about when something generated crosses the line into replication.
  • Mechanisms to report abuse or plagiarism.
  • Guardrails for how AI is deployed, not how it learns.

That’s the nuance the loudest voices are missing.

The Hypocrisy Runs Deep

You’ve seen the tweets.

“AI is scraping my style!”

“They trained on my portfolio!”

“This is theft!”

Meanwhile, these same people went to art school, studied Picasso, copied Monet’s brushstrokes in their sketchbooks, read Virginia Woolf in college seminars, and wrote short stories after bingeing Raymond Carver.

You became who you are by absorbing the work of others.

And now you want to deny the same to the next generation of tools?

That’s not protection. That’s protectionism.

Human Writing Will Still Matter

You know what AI still can’t do?

  • Feel heartbreak.
  • Grieve a parent.
  • Wrestle with mental illness.
  • Fall in love.
  • Regret a choice.

AI can simulate these things. It can echo our stories. But it cannot feel them. It cannot live them. So human writing? Real, raw, emotional, flawed, soulful human writing? That will always have a place.

Good writing is not just about patterns. It’s about resonance.

If your work connects, no machine can touch that.

So, What Should Artists Do?

Here’s the part you might not want to hear:

Stop crying theft. Start getting better.

Improve your storytelling.

Lean deeper into what makes your work human.

Use the tools that are here, not to replace yourself, but to amplify yourself.

That’s what I do.

Every day, I use AI to:

  • Outline articles
  • Clean up clunky drafts
  • Brainstorm ideas
  • Rewrite flat sentences

But the voice? The rhythm? The blood and guts? That’s still mine.

And I wouldn’t have gotten half as much done in the last year without it.

Final Word

If you’re a writer, artist, or creative, you don’t need to fear AI.

You need to do what you’ve always done: evolve.

And if you’re so threatened by the idea that a machine can do what you do, maybe it’s time to ask yourself a hard question:

What is it about your work that makes it irreplaceable?

Answer that, and you won’t need to scream theft.

You’ll just keep showing up and doing the work.

Because no machine can do that for you.