UK Backs Off AI Copyright Plan After Artist Backlash

The UK government has dropped its “preferred option” to let AI companies train on copyrighted work by default (with an opt-out), following intense pushback from artists and industry groups.

This week’s Signals framed the UK as the clearest, highest-stakes test case for a “train now, opt out later” model. Hours later, that path broke.

According to reporting by The Guardian, the government backed down after weeks of mounting pressure from the creative sector. Technology Secretary Liz Kendall confirmed on March 18 that the government now has no preferred option, effectively resetting the entire policy debate.

This wasn’t a quiet policy adjustment. The original proposal, which would allow AI firms to use music, writing, and media unless creators actively opted out, had become one of the most controversial AI policy moves globally. Artists including Elton John and Dua Lipa publicly pushed back, arguing it flipped copyright from permission-based to default access.

Underneath that backlash was a deeper issue: the burden. An opt-out system assumes creators can track and remove their work across countless training datasets—something many argued is functionally impossible.

Now, instead of moving forward, the UK is reopening all options: keeping current law, requiring licenses, allowing unrestricted use, or revisiting a modified opt-out. At the same time, it’s exploring AI labeling, deepfake protections, and systems to help creators track usage — signals that infrastructure, not just law, is becoming part of the solution.

What looked like the most decisive move toward a “train first” framework in the West has stalled under cultural pressure. And importantly, it stalled before implementation.

Campaigners like Ed Newton-Rex are already warning this isn’t settled, just delayed. The government itself admits there’s “no consensus” on how to balance AI growth with creator rights.

The signal: the rules aren’t fully crystallizing yet; in some cases, they’re destabilizing.

And that puts the UK back into the same fragmented global map we just outlined: the European Union pushing permission-first, South Korea building payment infrastructure — and now the UK stepping out of the position it briefly held.

For AI music, this matters more than a decision would have.

Because it shows that even when governments try to lock in the rules of training data, creators can still force a reset before those rules become reality.
