Regulating AI licensing: The missing piece in the UK’s copyright strategy

The Government’s long‑awaited response to the Copyright and AI consultation marks an important moment for creators and developers alike. In her 18 March 2026 written statement, Tech Secretary Liz Kendall confirmed that the UK will pursue an approach that seeks to balance strong copyright protection with the need to foster a competitive, sovereign AI sector. Kendall stressed that the UK “must be an AI maker, not an AI taker” and that supporting both the UK’s creative industries and its rapidly growing AI ecosystem is central to the country’s industrial strategy.

While this commitment is welcome, the Government must now turn its attention to the rapidly evolving licensing market that has emerged in the absence of detailed regulation. Regulating this market would help level the playing field for creators and, importantly, strategically position the UK between the US and EU approaches – allowing developers to use the UK as a springboard into European markets while maintaining access to high‑quality datasets. This, alongside the Government’s acknowledgment that deepfakes are a growing threat, represents a crucial opportunity for meaningful reform.

The power imbalance at the heart of AI and copyright

Today’s AI and copyright debate is a classic case of David and Goliath. While lacking any specific regulatory proposals, the Government’s announcement is a welcome step for creators who are demanding fair remuneration for their work. An opt-out approach hasn’t provided enough protection for copyright holders, many of whom are significantly smaller than the large AI companies that are seeking to train on their copyright works for free. Further clarification on measures to change this will help create a more level playing field.

The case for a regulated licensing system

In the UK, we are ideally placed to support our world-leading creative industries. Regulatory guidance from the Government would be welcome, but in the meantime a healthy licensing market has developed, which allows copyright holders to charge developers for training AI models on their work. While a licensing system is well suited to supporting the interests of both AI developers and creators, it would benefit greatly from regulatory backing; in particular, provisions should be implemented requiring AI developers to be transparent about the copyright works they have used to train their models. A regulated licensing system would establish the UK as a healthy middle ground between the stricter regime enforced by the EU AI Act and proposed by the European Commission, and the laissez-faire approach taken in the US. Shifting away from the US model and towards the EU would allow the UK to become a gateway to the EU market, as AI developers training their models in one jurisdiction could also meet the regulatory requirements needed to access the other.

How licensing improves AI model quality

One underdiscussed benefit of a licensing system is that it encourages AI developers to seek out the highest-quality data on which to train their models. In an opt-out regime, models are often trained on a high volume of subpar material scraped from the internet. If AI developers require a licence to train their models, they are incentivised to take a more targeted approach, assembling a higher-quality dataset of copyright works covered by the licence, which they may find results in a stronger, more robust (and more popular) model in the long term.

Balancing creator rights and developer needs

While the creative industries have rightly demanded a fair hearing in protecting their copyright works, AI developers must also have a seat at the table in deciding how they can viably train their models. In particular, AI developers will be looking closely at the transparency rules the UK chooses to adopt. At present, neither the UK nor the US requires developers to disclose what data their models were trained on or how they were trained, turning AI models into obscure ‘black boxes’. While developers argue that an opaque training diet is vital to retaining their competitive edge, it makes enforcing regulations nearly impossible. Transparency is important to ensure copyright holders are protected, but it must be carefully weighed against fostering a competitive AI sector in the UK. More sovereign AI development is not only good for the UK economy; it also helps reduce an over-reliance on US-developed models.

Addressing the threat of deepfakes

Deepfakes are a substantial international problem, and it is positive to see that the UK Government has identified them as a growing concern. The Government might adopt the image rights framework already in place in Guernsey, or go further and follow Denmark in implementing personality rights as part of copyright law and criminalising the unauthorised creation of deepfakes.

Closing reflections

The Government’s latest announcement signals progress, but meaningful action is now needed to ensure creators are protected while innovation continues to thrive. By regulating the licensing market, improving transparency and tackling emerging harms such as deepfakes, the UK has an opportunity to set a balanced and internationally credible standard. Done correctly, this approach will support both our creative industries and our growing AI ecosystem – strengthening the UK’s position on the global stage and fostering a fairer, more sustainable digital future.
