Without a suitable licence or legal exception, that type of copying or publication may amount to copyright infringement by the developer and the user.
There has, for some time, been an exception under UK law allowing text and data mining (TDM) activity (see section 29A of the Copyright, Designs and Patents Act 1988). This is relevant for AI businesses training their models on subject matter covered by copyright. The scope of the exception is, however, limited to non-commercial research. Other exceptions also allow for limited use of materials protected by copyright, but they are unlikely to solve this issue (which would not have been in the minds of the people drafting the relevant legislation).
It is possible that TDM could be done at the research stage in reliance on the existing exception, but there would come a point when a commercial licence would be needed. Negotiations with copyright holders are unlikely to be easy at that stage if the AI technology or its output looks promising. On the other hand, difficult commercial licence negotiations are not a new concept for the life sciences sector.
The UK recently confirmed, in a meeting of a Brexit-related committee (the Specialised Committee on Intellectual Property under the UK-EU Trade and Cooperation Agreement), that it still does not plan to change the existing TDM exception. However, the minutes of that meeting record that a draft UK code of practice on copyright and AI (Code) has been prepared. The Code, the development of which is being led by the UK Intellectual Property Office (IPO), is a crucial document to watch if you are developing or using AI technology.
The development of the Code was the UK Government's response to the recommendations of the Vallance report on pro-innovation regulation of technology (including AI). The Code would be voluntary, so significant industry support would be required; if that approach fails, legislation would be the next step.
Essentially, a balancing act is required, and the overall innovation goal is to make licences more readily available to AI businesses seeking to train their generative AI models. In this sense, it would involve a rebalancing towards AI innovators and away from holders of rights in copyright material. This may come through changes in law, through copyright licensing or through new industry practice promoting adoption of the Code on a large scale.
In the life sciences sector, a voluntary code is not an unfamiliar concept: just look at the self-regulation of healthcare professional interactions and other important issues under the ABPI Code of Practice. The sector is therefore well placed to adopt and benefit from the Code. The working group developing the Code includes DeepMind (famous in the sector for AlphaFold and AlphaMissense), Innovate UK and UKRI, so we hope that the life sciences voice is being heard. It is a sector that can add much to the discussions and has much to gain from a pro-innovation environment for AI technology.
Bearing in mind this background and the interests of the sector and Government, there are many further, nuanced legal questions in this area. One case to watch carefully is the Getty Images v Stability AI litigation going through the UK High Court, which looks set to answer some of those questions. No offence to the parties, but we hope it goes all the way to trial so that the rest of us get a detailed judgment to read. Stability AI is on the Code working group, so no doubt its experience (good or bad) will help guide the Code.
At our 2024 PING Conference on the topic 'AI in Pharma - Threat or Opportunity', we will be exploring this and other issues relating to the development and use of AI solutions.