
AI and Copyright: Updates from the Data (Use and Access) Act 2025 progress statement

05 Jan 2026

The DUAA progress statement provides useful insights on regulation of AI, so that businesses can proactively address risk.


At a glance

  • The Data (Use and Access) Act 2025 does not change UK copyright law, but it sets out a roadmap for future reform in relation to AI and copyright.
  • On 15 December 2025, the government published the statutory progress statement required by section 137 of the Act, confirming that further reports and an economic impact assessment will be laid before Parliament by 18 March 2026.
  • The progress statement summarises insights from the responses to the AI and Copyright consultation:
    • 88% of respondents supported requiring licences in all cases.
    • Only 7% preferred no change to the law.
    • Support for statutory transparency was particularly strong in the creative sector, while technology-sector views were mixed.
  • Expert working groups are now considering transparency, technical standards, licensing and creator remuneration.

Introduction

The Data (Use and Access) Act 2025 (DUAA) provided a means for the government to take an evidence-first approach to the interaction between copyright and artificial intelligence. Rather than immediately settling the competing interests of the creative and AI sectors, the Act set requirements for two substantive outputs: an economic impact assessment and a report on the use of copyright works in AI development, both due by 18 March 2026.

On 15 December 2025, the government published the statutory progress statement required by section 137 of DUAA. This confirmed the timetable, summarised the results of the AI and Copyright consultation, and described the work now underway to explore policy options.

This article aims to identify what can be inferred from what has been said so far and what this means in practice for AI developers, rights holders, and customers.

Background: from Bill to Act

The DUAA’s passage through Parliament reveals the tension between a desire to protect rights holders and a desire to avoid constraining AI development prematurely.

  • 23 October 2024: Bill introduced into the House of Lords, broadly aimed at fostering data use and digital innovation.
  • 5 February 2025: House of Lords reviewed the Bill and added amendments that included requiring AI developers to disclose works used and comply with copyright law.
  • 11 March 2025: These amendments were removed in the Commons, replaced by a requirement for the government to perform an economic impact assessment and prepare reports on AI’s use of copyrighted works.
  • 19 June 2025: The Lords accepted the proposal for impact assessments and reports. Despite further pressure from the Lords for transparency obligations, the final Act did not include such provisions and received Royal Assent.

The result is a deliberately procedural statute. It creates no new copyright rules, but mandates a structured process to inform future reform.

The framework

The DUAA establishes three linked obligations.

  • Section 135 requires an economic impact assessment of different policy options and their effects on copyright owners and AI developers, with particular attention to smaller businesses.
  • Section 136 requires a report on the use of copyright works in the development of AI systems, considering technical standards, data access, text and data mining, transparency, licensing and enforcement, including in relation to AI developed outside the UK.
  • Section 137 requires a progress statement to Parliament, now published.

What is in the section 137 statement

Beyond confirming that the economic impact assessment and other reports will be laid before Parliament by 18 March 2026, the progress statement sheds light on other activities that will shape future law.

The statement revealed some interesting facts about the responses to the AI and Copyright consultation, building on themes discussed in this previous article. 88% of respondents supported requiring licences in all cases, 7% preferred no change, and only 3% favoured an exception with rights reservation, the position originally put forward by the government in the consultation. Perhaps most telling of the respondent profile, a mere 0.5% favoured a broad exception for text and data mining, which would be the most pro-AI position. Notably, several respondents instead sought targeted exceptions.

Regarding transparency obligations, the creative sector strongly supported statutory transparency, while the technology sector expressed more concern, often preferring voluntary or light-touch approaches to reduce burden.

Beyond the consultation, the statement provided more detail on the four working groups established, focusing on control and technical standards, information and transparency, licensing, and wider support for creatives. These groups are exploring workable and proportionate solutions.

The progress statement emphasises that it is not intended to pre-empt the report and impact assessment. However, given the low support for the consultation's preferred option and the working groups' focus on practical solutions that reduce administrative burden, some form of licensing and transparency requirement appears likely.

Practical implications and actions

For AI developers

There is currently no obligation to disclose training datasets or license all uses of copyrighted works. However, the consultation results and the focus of the working groups point towards greater transparency and facilitated licensing.

AI developers should prepare by maintaining records of their training and retrieval data sources and crawler behaviour, mapping where reliance is placed on implied permissions versus express licences, and building governance processes for opt-outs, takedowns and model updates. Developers working at the application layer should also consider how to mitigate risks inherited from the underlying foundation model.

For creators and rights holders

Licensing and statutory transparency are firmly on the policy agenda. Until reforms occur, creators remain reliant on contractual controls, access restrictions and practical self-help to preserve value and monitor use.

Creators and rights holders should implement access controls and rights-reservation signals, review website terms and licences to restrict text and data mining and set enforcement routes, and explore individual or collective licensing options.

For businesses using AI

Future changes may affect pricing, availability and risk allocation. Businesses should not wait for regulation before addressing these issues.

Depending on how critical the solution is, businesses should seek transparency, provenance information, warranties and indemnities; include audit and co-operation rights and clear incident notification obligations; and address change control on pricing in case the provider becomes required to pay licence fees for its datasets.

Closing thought

The UK has chosen an evidence-first route. While there are no new legal obligations yet, licensing and transparency are clearly central to the policy debate. Preparing now for greater scrutiny, clearer rights allocation and potential cost changes is the most effective way to manage risk and remain flexible as the framework evolves.


If you would like to discuss how this affects your business, whether as a creator, AI developer or purchaser of AI solutions, please get in touch with Jonathan Bywater.

Co-authored with the current Innovation Trainee, Jess Ivy

 
