The House of Lords Has Sided With Artists on AI Copyright. Here’s What It Actually Means for Your Career
On Friday 6th March 2026, the House of Lords Communications and Digital Committee published HL Paper 267, a formal parliamentary report on AI, copyright and the creative industries. Baroness Keeley, the committee chair, didn’t mince her words in the accompanying press statement, saying plainly that the UK creative industries face a “clear and present danger” from generative AI.
I’ve been managing artists for over 30 years. I’ve watched the industry go through piracy, the collapse of physical sales, the streaming transition, the social media explosion. Each time, people said it was the end. Each time, things changed and most artists adapted. This feels very different. Not because AI is in any way uniquely terrifying, but because the legal framework that protects any creative work is genuinely under threat of being dismantled, all before anyone fully understands the long-term consequences.

About the Author
I’m Ron Pye, the founder of IQ Artist Management, and I have been working as a professional artist manager for over 30 years. I have traversed the piracy crisis, the collapse of physical sales, the streaming transition, and now the generative AI disruption. I hold an MA in Music Industry Studies (distinction) from the University of Liverpool, where my research focused on the intersection of technology, law, and the music industry, and a BA in Music Business and Finance.
AI copyright is not an abstract policy issue for me, it is something I deal with practically, reviewing management and publishing agreements to ensure they include explicit AI restriction clauses. I advise artists on metadata practices and monitoring the legal landscape so my clients are positioned correctly for whatever regulatory framework eventually emerges.
I have worked across management contracts, music publishing, royalties, sync licensing, and digital marketing. This gives me a multi-dimensional view of how copyright law changes actually ripple through an artist’s income streams. I have written extensively on AI and music copyright, music publishing contracts, and the rights frameworks governing independent artists in the UK.
I like to draw directly on the practical scenarios I encounter while managing artists in today’s market. If you are an independent artist trying to understand what the House of Lords report means for your career, not in theoretical terms, but in terms of what you should do this week, this is the perspective that comes from being on the ground for three decades.
And the timing matters. Just four days before this report, on Monday 2nd March 2026, the US Supreme Court declined to hear Stephen Thaler’s appeal on AI-generated copyright, effectively confirming that AI cannot hold copyright in its own name. The global direction of travel on this most complex of issues is crystallising fast.
So, what does today’s Lords report actually mean for you? The artist with tracks on Spotify, publishing rights registered with PRS, and a career you’ve built over many years?
Why This Lords Report Carries Real Weight
Not every parliamentary committee report lands with genuine force. This one does.

The House of Lords Communications and Digital Committee is the upper chamber’s specialist body on media, broadcasting and digital affairs. When it conducts a formal inquiry, that’s not a quick panel discussion; it means months of structured evidence gathering. For this particular report, the committee held seven oral evidence sessions, heard from 21 witnesses, and reviewed 29 written submissions. The witness list included Google, Meta, Microsoft, PRS for Music, UK Music, the ISM, the Society of Authors, and the Secretaries of State for both Science and Culture. These are not peripheral voices. This is the full spread of interests, tech giants and creators alike, presenting evidence to parliamentarians who then had to weigh it all and reach conclusions.
And the conclusions matter. In 2023, the creative industries contributed £124 billion to the UK economy, and that figure is projected to reach a massive £141 billion by 2030. By comparison, the entire UK AI sector generated around £12 billion in 2024. The Lords looked at those numbers and found no convincing case that weakening copyright protections would meaningfully benefit AI investment, while the harm to the creative sector is already being felt.
Tom Kiehl, CEO of UK Music, responded to the report, welcoming its findings and warning that music’s £8 billion annual contribution and 220,000 jobs are directly at risk if copyright protections are weakened.
Now, the Lords can recommend. They can’t actually legislate. The government still has to act, and as of this morning, it’s signalling that it still isn’t ready. More on that shortly. But a report of this weight creates political pressure that is genuinely difficult for any government to dismiss.
What the Lords Are Demanding
The report amounts to 38 formal conclusions and recommendations. Most are procedural. But five matter most to you as an independent artist, and here’s what they will mean in practice.
1. Rule out the opt-out model, permanently.
The government’s original ‘opt-out’ approach would have let AI companies train on your music unless you actively opted out. Only 3% of respondents to its own consultation backed this approach. The Lords are telling the government not to pause on this option but to rule it out entirely. You may remember that this is exactly what Paul McCartney, Elton John, Dua Lipa and more than 400 other artists demanded in their May 2025 open letter to Keir Starmer.
2. A final decision within 12 months.
In other words: no more working groups leading to even more consultations. The Lords want a definitive, evidence-based decision published within 12 months. Not a progress update, an actual decision.
3. A public statement: AI companies should be licensing your work now.
Before any legislation passes, the Lords want the government to publicly state that commercial AI developers should already be obtaining licences to use copyrighted works for training. That statement alone would shift the dynamics in any future licensing negotiations.
4. New legal protection for your voice, style and identity.
Under current UK law, copyright may offer recourse if an AI replicates your actual recordings without permission. But clone your vocal style? Deepfake your voice? As it stands today, there’s no specific right protecting you from that. The Lords are calling for new legislation to close that gap, partly because of Welsh rock band Holding Absence, whose vocal style was cloned by an AI project that accumulated millions of Spotify streams from listeners who believed they were hearing Holding Absence themselves.
5. Mandatory disclosure of AI training data.
There’s currently no legal obligation for any AI developer to tell you whether your music trained their models. The Lords want this made a statutory duty, enforced by a named regulatory body.
There’s also a sixth call worth noting: the report calls on the government to consider whether artists should hold an unwaivable right to payment when their work is used for AI training. This would be managed collectively, the way PRS currently handles streaming royalties. Unwaivable means you couldn’t sign it away, even under any amount of commercial pressure.
The Truth Is The Government Is Still Delaying
It’s important to be straight here: the Lords’ report is genuinely good news, but at the same time I don’t want to oversell it or get ahead of myself.
The day before it dropped, the Financial Times reported that the UK government was already planning to kick contentious AI copyright decisions down the road. Back to the drawing board, rather than into the law. And the Lords report itself, at paragraph 61, confirms as much. Both Liz Kendall (Science) and Lisa Nandy (Culture) gave evidence suggesting the government won’t be setting out its position this month, despite that being expected.
Lisa Nandy put it quite plainly at paragraph 62: “If we rush into this and get it wrong, we could make a mess. We are not going to rush into it; we are going to take the time to work through it with the working groups.”
So. On 18th March 2026, under the Data (Use and Access) Act 2025, the government must publish an economic impact assessment on AI use of copyrighted works. That will happen. What won’t happen is any decision on policy. I am extremely interested to see what it says about Option 3, the opt-out model, but I don’t expect to hear anything binding.
Baroness Keeley, to her credit, isn’t having it. Her counter from the report was: “The Government must accept that there is no solution that will satisfy all parties and that delaying in the hope of finding one risks exacerbating these issues further.” And essentially that’s the tension right now: the Lords are pushing hard, while the government is buying time.
The Scale of What Is Already Happening
Let’s forget any future hypothetical threats for a moment. The numbers from January 2026 alone should give any working musician pause for thought.

In January 2026, Deezer reported it was receiving 60,000 fully AI-generated tracks for upload every single day, accounting for 39% of daily uploads to the platform. In September 2025, that figure was reportedly around 30,000, so it has doubled in four months. Deezer also found that up to 85% of streams tied to fully AI-generated music were fraudulent, which strongly suggests that most of those uploads exist purely to game the royalty payout system. Across the whole of 2025, Deezer detected 13.4 million AI-generated tracks. 13.4 million, on one platform in one year.
So, think about what that means for your streaming revenue. Your track is competing for the same royalty pool as millions of AI uploads, a significant chunk of which exist purely to siphon payments out of an already diluted pool.
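The dilution mechanic is simple arithmetic. Under the pro-rata model most platforms use, a fixed royalty pool is divided by the total stream count, so every fraudulent stream lowers the per-stream rate for everyone. A toy sketch, with entirely made-up numbers (the pool size and stream counts below are illustrative, not Deezer’s or any platform’s real figures):

```python
# Toy pro-rata royalty model: a fixed monthly pool split equally across all
# counted streams. All numbers are illustrative, not real platform figures.

def per_stream_rate(pool_gbp: float, total_streams: int) -> float:
    """Pro-rata payout: the pool divided equally across every counted stream."""
    return pool_gbp / total_streams

pool = 10_000_000.0            # hypothetical monthly royalty pool (£)
human_streams = 2_000_000_000  # hypothetical legitimate streams
fraud_streams = 200_000_000    # hypothetical fraudulent AI-track streams (+10%)

clean_rate = per_stream_rate(pool, human_streams)
diluted_rate = per_stream_rate(pool, human_streams + fraud_streams)

# The same 100,000 monthly streams earn less once fraud joins the pool:
print(f"clean:   £{100_000 * clean_rate:.2f}")    # clean:   £500.00
print(f"diluted: £{100_000 * diluted_rate:.2f}")  # diluted: £454.55
```

In this sketch, a 10% injection of fraudulent streams knocks roughly 9% off every legitimate artist’s payout, without a single listener changing their behaviour.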
Then there is the wider financial picture. In 2024, CISAC, the global body representing authors and composers, published a study projecting a cumulative €22 billion revenue loss for music and audiovisual creators by 2028, driven by unlicensed AI content and direct market substitution. For music creators specifically, 24% of total revenues are projected to be at risk by that point. That’s nearly a quarter. Not because someone made better music, but because systems built on creators’ work are now undercutting them.
Creators know it too. A 2026 PRS for Music survey found that 79% of music creators worry that AI-generated content will compete directly with their work, up five points from 2023. And 76% believe it could negatively affect their livelihoods, up seven points year on year.
The legal debate in Westminster is about what the law should say. Out here in the actual market, it is already happening, and the effects are far-reaching.
Your Rights Right Now: What UK Law Actually Says
Let’s start with the reassurance that many are forgetting. In the UK, copyright is automatic. No forms, no fees, no registration needed. The moment you record an original piece of music, the Copyright, Designs and Patents Act 1988 applies to you, and your work is protected for 70 years after your death.
The ISM told the Lords committee (paragraph 20 of the report): “UK copyright law is absolutely clear… copying protected works to train AI models engages the restricted act of reproduction and therefore requires rightsholders’ consent and a licence under the CDPA, unless a specific statutory exception applies.” So, in theory, training AI on your music without permission is already an infringement. The law itself is not the problem here.
In practice though, no UK court has ruled on this yet. Not one (Lords report, paragraph 19). Until that changes, enforcement remains expensive and uncertain for most independent artists. And the one specific text and data mining exception the CDPA does contain, Section 29A, only covers non‑commercial research, not commercial AI training.
And there is a second gap, a rather important one I mentioned earlier. Copyright protects your actual recordings and compositions, not your style, your vocal character or your identity. An AI can produce music that sounds exactly like you, and under current UK law you have no specific legal recourse. That is precisely what the Lords want to fix.
Six Things You Can Do Right Now
Whilst the government works out what it wants to do, you do not have to sit and wait. In fact, you don’t have that luxury. The ISM has published practical guidance on protecting your music in the AI age, and most of it costs nothing but some of your time.
1. Complete your metadata on every release
Every track needs accurate, complete metadata. Songwriter(s), performer(s), ISRC code and publisher. This is the paper trail that any future licensing scheme will depend on. The ISM also notes you can include an opt out signal in your metadata, though it carries no formal legal weight yet. I would heavily emphasise that this is still worth doing.
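As a rough illustration, here is what a complete metadata entry for one track might look like. The layout below is my own sketch, not a distributor standard, and the names, codes and opt-out wording are all hypothetical; the exact fields your distributor requires will vary, so check their documentation:

```text
Track title:    Midnight Rain                  (hypothetical)
Songwriter(s):  J. Smith                       (with CAE/IPI number if registered)
Performer(s):   Jane Smith
ISRC:           GB-A1B-26-00001                (country–registrant–year–designation)
Publisher:      Self-published / [publisher]
AI opt-out:     "Rightsholder does not authorise use for AI/ML training"
```

The ISRC structure shown (two-letter country code, three-character registrant code, two-digit year, five-digit designation) is the real ISO 3901 format; the specific code here is invented. The AI opt-out line is the kind of signal the ISM describes, and as noted above it carries no formal legal weight yet, but it puts your objection on the record.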
2. Add an AI training opt out to your website
A robots.txt directive on your site tells AI crawlers not to scrape your content for training. Does it have legal force? No, not currently. But it documents your intent clearly, which matters if you ever need to demonstrate that you objected. The ISM has guidance on exactly how to set this up.
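For illustration, a minimal robots.txt blocking some of the best-known AI training crawlers might look like this. The three user-agent names below (GPTBot for OpenAI, Google-Extended for Google’s AI training, CCBot for Common Crawl) are publicly documented at the time of writing, but the list changes, so treat this as a sketch and check the ISM guidance and each company’s own documentation for current names:

```text
# Place this file at the root of your website: yourdomain.com/robots.txt
# It signals intent only: compliance is voluntary and it has no legal force.

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```

Remember that this covers only your own website. It does nothing about music already on streaming platforms, which is why the platform-terms and contract steps in this list matter just as much.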
3. Read platform terms before uploading. Every time.
Several streaming and distribution platforms have quietly added clauses granting themselves, and in some cases third parties, the right to use uploaded content for AI training. I would check any platform’s position on this before you sign up to anything new. If something isn’t clear, or you cannot find it in the terms and conditions, ask before you make any commitment.
4. Add AI restriction clauses to new contracts
Any recording, publishing, sync or management agreement you sign from here on out should contain an explicit clause restricting the use of your work for AI training. This matters most right now, before any legislation exists. Get it in writing, now.
5. Register with PRS, PPL and MCPS if you have not already
The Lords report specifically calls on the government to consider using collective management organisations to administer future AI licensing payments, much as PRS and PPL handle streaming royalties today. If you are not registered, you will be invisible to any future scheme.
6. Engage politically
Write to your MP. The Make It Fair campaign, a coalition of official bodies representing the UK creative industries, is worth following. Keep a close eye on the BPI, UK Music, the ISM and related bodies, particularly around 18 March. The Lords are pushing for a final decision within the next 12 months, and what that decision looks like will depend on how much noise creators make between now and then.
What Happens Next: The Timeline Ahead
Two dates are worth putting in your calendar.

18 March 2026. Under the Data (Use and Access) Act 2025, the government must publish an economic impact assessment and progress report on AI use of copyrighted works by this date. It will be published. What it will not be is a policy decision. Watch closely for one thing: whether it rules out Option 3, the opt-out model, outright, or leaves the door open. That single signal will tell you more than any press release.
Within 12 months. The Lords report calls on the government to reach and publish a definitive policy decision on AI and copyright within 12 months of now. Not another consultation. Not another working group update. A decision. If the Lords’ recommendation is followed, we should have clarity by early 2027 on whether the UK is heading for a licensing-plus-transparency regime, with a possible unwaivable right to remuneration for AI uses of your work.
Beyond the UK, the direction of travel is worth watching too. In October 2025, Universal Music Group and Udio settled their highly publicised copyright lawsuit and announced a new licensed AI music platform built on authorised music, launching in 2026. That is what commercial licensing in this space can and will look like. And, Australia has already committed to not introducing a broad text and data mining exception, citing the need to provide certainty to creators. Two significant signals that licensing, not exemption, is where this is heading.
The process is slow. But it is moving.
Where This Leaves You
Thirty years in this industry and I’ve learned one thing about moments of genuine change: the people who come out better on the other side are almost never the ones who waited to see what happened.
The Lords report is not a victory. Copyright protection is not secured. The government is still delaying, and there is no guarantee it lands where artists need it to. But, the direction of travel shifted last Friday. The most authoritative parliamentary body on these issues has now told the government, on the record and in public, that creators deserve protection and that the opt out model should be ruled out permanently.
What you can control right now is your own preparation. Metadata. Contracts. Registration. An opt out on your website. None of it is complicated. None of it costs much. And all of it puts you in a better position than you were in yesterday. The clear and present danger is real, but so is your ability to be ready for whatever the government finally decides to do.
Watch what happens on 18 March. And keep making the noise.