AI, Inspiration, and Fair Compensation: Finding a Middle Ground in the UK’s Copyright Debate

Photo by Evangeline Shaw / Unsplash

The UK’s proposed AI copyright changes have sparked intense backlash from the creative industry, with musicians like Dua Lipa, Elton John, and Paul McCartney leading the charge against them. At the heart of the debate is the government’s plan to introduce an opt-out system, which would allow AI developers to use copyrighted content for training unless the creator explicitly prevents it.

Critics argue that this favors big tech companies and makes it nearly impossible for creatives to control how their work is used. But on the other side, AI developers insist that training on human-created works is essential for innovation—just as artists throughout history have been inspired by those who came before them.

I believe there’s a middle ground: AI should be allowed to learn from creative works, but companies must fairly compensate artists, without making the costs so high that they stifle AI development altogether.


Photo by Growtika / Unsplash

AI Is Inspired, Just Like Humans—But At a Different Scale

Much of the opposition to AI training is based on the idea that AI is simply copying human work. But as Alice Helliwell, an assistant professor of philosophy, pointed out in the article referenced below, art has always been referential.

  • Musicians take inspiration from past sounds—The Beatles were influenced by Chuck Berry, and modern artists like Billie Eilish draw from decades of pop and alternative music.
  • Painters and photographers influence each other—Picasso’s cubism was shaped by photography, just as street photographers today borrow techniques from classic painters.
  • Writers build on existing ideas—Shakespeare’s plays were often adaptations of earlier works.

AI is no different in kind. It doesn’t copy in the way piracy does; it learns patterns, styles, and structures in order to generate something new. Tools like Midjourney and ChatGPT are not designed to store or reproduce exact pieces of copyrighted content; they produce outputs shaped by patterns learned from vast amounts of data.

However, there is a key difference of scale: only a small fraction of the people inspired by an artist will ever go on to make music themselves, whereas AI allows tens of thousands of people to generate music instantly.

In the past, becoming a musician required years of training, practice, and talent. But with AI, anyone can create music effortlessly—without needing musical skill. This floods the market with AI-generated content, making it harder for human artists to stand out and monetize their work.

This is what truly worries musicians—not just that AI is inspired by their work, but that AI-generated songs could oversaturate the industry and devalue human creativity.


Photo by Alexander Grey / Unsplash

The Real Issue: Who Profits from AI?

The real frustration from artists isn’t just about AI being trained on their work—it’s about who benefits financially.

Let’s be honest: AI companies are making billions, and artists are seeing none of it. Meanwhile, the creative industry is already tough—many musicians, writers, and illustrators struggle to make a living. For them, seeing tech giants build powerful AI models using their work without compensation feels like exploitation.

That said, I believe an opt-out system still makes sense. AI training is not stealing—it’s a process of learning, just as humans learn from existing art. The hesitation around AI’s use of copyrighted material often stems from uncertainty about how the technology actually works. Instead of assuming AI is copying or replacing artists, we need to:

  • Provide clearer explanations of how AI models function—helping creatives understand that AI does not store or directly reproduce copyrighted content.
  • Develop stronger transparency measures—so artists can see what data is being used and how it influences AI outputs.
  • Use similarity-checking algorithms to prevent AI-generated content from being too close to existing works, ensuring that AI creates something new rather than imitating too closely (a rough sketch of one such check follows this list).
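
To make that third point concrete, here is a minimal sketch of what an automated similarity check could look like, assuming both generated outputs and existing works can be turned into embedding vectors. The `embed` step, the reference catalogue, and the 0.95 threshold are all assumptions for illustration, not a description of how any existing AI system enforces originality.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_too_similar(candidate: np.ndarray,
                   catalogue: list[np.ndarray],
                   threshold: float = 0.95) -> bool:
    """Flag a generated work that is near-identical to any known work.

    The 0.95 threshold is an arbitrary illustrative value, not a standard.
    """
    return any(cosine_similarity(candidate, reference) >= threshold
               for reference in catalogue)

# Usage sketch: embed() is a hypothetical function that would map a song,
# image, or text to a vector (e.g. via an off-the-shelf embedding model).
# if is_too_similar(embed(generated_song), [embed(work) for work in catalogue]):
#     flag_for_review(generated_song)
```

A check like this would only catch near-duplicates; questions of style imitation are far harder and would still need human and legal judgment.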

By implementing these safeguards, we can ensure that AI advances without unfairly exploiting artists. Instead of blocking AI entirely, the focus should be on fair compensation and responsible AI development.


Photo by Josh Calabrese / Unsplash

AI Should Pay, But Not an Exorbitant Price

I fully support the idea that AI companies should pay for the content they use—but the cost needs to be reasonable. Setting sky-high fees could slow down AI development, making innovation accessible only to the richest corporations while smaller startups fall behind.

The UK's "Make It Fair" campaign pushes for AI companies to declare their training sources and pay a license fee for copyrighted materials. This is a reasonable compromise that ensures artists are compensated without blocking AI progress.

We’ve seen similar licensing solutions work before:

  • Streaming services like Spotify and Apple Music pay artists per stream.
  • Stock image platforms like Getty Images license photos to AI companies for training.
  • YouTube’s Content ID system allows creators to claim ad revenue from videos that use their content.

A licensing model for AI training could work in the same way, ensuring fair compensation without making AI development prohibitively expensive.
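
As a rough illustration of how such a licensing pool might be divided, the sketch below splits a fixed annual license fee among rights holders in proportion to how many of their works appear in a training set, loosely mirroring how streaming royalty pools are shared. The fee, the counts, and the pro-rata rule are hypothetical assumptions for the example, not terms proposed by the “Make It Fair” campaign.

```python
def split_license_fee(fee_pool: float, works_used: dict[str, int]) -> dict[str, float]:
    """Divide a license fee among rights holders by their share of training items.

    All figures are illustrative; a real scheme would need audited usage data.
    """
    total_items = sum(works_used.values())
    return {holder: fee_pool * count / total_items
            for holder, count in works_used.items()}

if __name__ == "__main__":
    # Hypothetical £1,000,000 pool and per-rights-holder counts of licensed works
    payouts = split_license_fee(1_000_000, {
        "Label A": 40_000,
        "Label B": 25_000,
        "Independent artists": 35_000,
    })
    for holder, amount in payouts.items():
        print(f"{holder}: £{amount:,.2f}")
```

The point of the sketch is that, as with streaming, the mechanics of pro-rata payment are simple; the hard part is agreeing on the size of the pool and verifying what was actually used for training.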


Photo by Aaron Burden / Unsplash

Final Thoughts: Balancing Creativity and Innovation

The debate over AI copyright is not about whether AI should be trained on human-created works—it’s about how it should be done fairly. AI, like human artists, learns from what came before. But unlike human artists, AI is controlled by companies that profit from its output, and that’s where the ethical concerns lie.

A fair and transparent licensing system is the best way forward. AI companies should pay reasonable fees to use creative works for training, but the costs shouldn’t be so high that they block innovation or create industry monopolies.

At the same time, awareness is key. The perception that AI is “stealing” art often comes from a lack of clarity on how it actually works. If we focus on informing creatives, enforcing transparency, and using safeguards to protect originality, we can create a copyright system that benefits both artists and AI developers.

Finding this balance is crucial because AI isn’t going away—it’s only getting more advanced. The key is ensuring that as AI evolves, artists are valued, compensated, and included in the conversation.

What do you think? Should AI companies pay for training data, or does AI learning fall under fair use? Let’s discuss in the comments! 🚀

Reference: Dally, P. (2025, March 7). Why musicians like Dua Lipa, Elton John and Paul McCartney are against the UK’s AI Copyright Proposal. Northeastern Global News. https://news.northeastern.edu/2025/03/06/uk-ai-copyright-proposal/
