
Is it safe for your business to embrace AI?

08 Oct 2025

Legal considerations and practical tips for businesses in the pharmaceuticals and life sciences sector.


AI tools have the potential to revolutionise the Pharmaceuticals and Life Sciences sector and bring incredible benefits. Businesses in the sector are actively being encouraged to adopt and implement AI in their day-to-day operations - but to what extent should they be wary of doing so? We discuss the challenges businesses in the Pharmaceuticals and Life Sciences sector could face when adopting AI and the practical steps they can take to help overcome them.

What opportunities does AI offer for the sector?

By harnessing the analytical and processing capabilities of AI, businesses across the Pharmaceuticals and Life Sciences sector can benefit in a number of ways, including:

  • Accelerating the process of identifying compounds that could form the basis for new drugs.
  • Analysing high volumes of data to predict missing links and infer novel relationships in order to identify new purposes for existing drugs.  
  • Analysing and overlaying biomarkers in genomic data, patients' scans and other clinical data and medical literature to predict how patients will respond to treatment, enabling the development of personalised treatment plans.
  • Spotting disparities in healthcare outcomes and informing interventions.
  • Using predictive models to manage supply chain disruption and forecast demand.
  • Classifying and triaging adverse events more quickly in the context of pharmacovigilance requirements.
  • Analysing scans and other test data to flag abnormalities and speed up diagnosis.

I was delighted to speak on a panel titled AI & Digital Innovation Driving Life Sciences Leadership as part of West Midlands Life Sciences Week 2025. During the week, there were some amazing real-life examples of how AI was being harnessed by businesses, such as Myma Medical, who are using the latest AI tools to automate the intracytoplasmic sperm injection process to benefit IVF clinics.

The government has recently published its Life Sciences Sector Plan (see our previous article), in which innovation is a key focus. With the government supporting and driving technological advancement within the sector, Pharmaceuticals and Life Sciences businesses will increasingly come across different use cases for AI and AI-facilitated offerings from their partners. Whether you are considering adopting AI within your own business or procuring AI-powered services or products from third parties, it is important to be alert to the key legal and commercial issues and to understand what practical steps should be taken to mitigate risk.

What key issues should your business be thinking about?

Data Protection:

  • If you are going to use AI to process personal data (or if the AI is going to use personal data you upload to train itself), you will need to identify a lawful basis for that processing under the UK GDPR, minimise what you process, keep privacy notices up to date and, perhaps most importantly, ensure you have implemented appropriate technical and organisational measures to keep that personal data secure.
  • Where you are procuring an AI tool from a supplier which uses and/or has been trained on third party personal data, you should obtain reassurances and contractual commitments from the supplier that this has been done in compliance with all applicable data protection laws.
  • The deployment of AI within your business will likely require you to carry out a Data Protection Impact Assessment (DPIA). This is something our Data Protection team can help you with.
  • Consider where the provider of the AI tool is based. If the provider is overseas and the AI tool processes personal data, the transfer may need to be made under an international data transfer agreement (IDTA).
  • You should also bear in mind that, under data protection law, health data is special category data - so you need to ensure you satisfy one of the specific conditions for processing it, for example by obtaining explicit consent.

IP:

It is not always clear what materials an AI tool has been trained on. If it has unlawfully used third party IP to generate outputs for you, you are at risk of third party IP infringement claims when you come to use those outputs internally or to deliver goods or services to clients.

You should ensure that your staff are trained on the use of AI tools, so that they do not inadvertently upload your business's IP into any AI tool, which might then be used to generate outputs for other businesses, including your competitors.

You should review the terms and conditions of the AI tools you are using, to ensure that there are no onerous provisions, that there is clarity as to who owns the IP in any outputs of the AI and (either way) that you can freely use the outputs for the intended use.

Quality of Data: If the data uploaded to an AI tool is incomplete, inconsistent or biased, this will affect the outputs generated by the AI tool and could lead to inaccurate conclusions, biased results or 'hallucinations'. Unfortunately, many data sets are limited, compromised, incomplete or of poor quality (e.g. open-source data). You should carry out due diligence to understand what data the AI tool is using to generate outputs and what sources that data comes from. This might prompt you to choose an AI tool that gives you much greater control over what data goes into the system.

Confidentiality: Before inputting any data into an AI tool, it is important to consider whether doing so is appropriate and whether the AI tool offers sufficient protection for any confidential or commercially sensitive information. You should also consider whether such use would breach any NDA or confidentiality provisions if the information belongs to a third party. Training and an AI policy can be useful ways to provide guidance to staff on how confidential information may be used with AI.

Cybersecurity: AI presents another means by which systems can be hacked and otherwise compromised. Following recent high profile cyber attacks, it is of vital importance that you obtain reassurances from any AI supplier as to the security of its systems. In order to further protect your business, you may wish to consider taking out cybersecurity insurance.

What practical steps should your business be taking?

  • Set clear parameters for internal AI usage: Ensure you have an AI policy that clarifies to your staff how AI should be used in their work.
  • Protect yourself if partners are using AI: Consider putting in place a policy which sets out how service providers, suppliers and other partners can use AI in their work for you.  If AI is being used, ensure there are suitable contractual provisions in place to protect you against third party IP infringement claims, cybersecurity incidents and data protection breaches.
  • Be mindful of IP issues: Ensure that you own the rights (or have the necessary licences) in the materials you are inputting into (and getting out of) AI systems. You should also carefully document who develops AI tools and engineers prompts and ensure that ownership sits with your business (whether by virtue of an employment or consultancy contract or a separate assignment of rights).
  • Get leadership buy-in: Get senior leadership involved with plans for AI adoption, and ensure that they have oversight of AI-specific risks and how they are being managed.
  • Build knowledge: Train staff on both the potential and the limits of AI.
  • Test before scaling: Use pilots and audits in lower-risk areas of the business to identify issues and manage risk before wider adoption.
  • Use 'good' data: Ensure you are using good quality data for training the AI tool.  
  • Maintain human oversight: Some AI tools are unable to give reasoning for reaching a conclusion - this can make it difficult to verify their outputs and rely on them with confidence. Because of this 'black box' problem, the regulatory expectation that a business's use of AI should be as transparent and explainable as possible, and the potential for hallucinations with some AI tools, it is important to retain human oversight when using AI.
  • Keep up to date with regulatory changes: There is still uncertainty surrounding the regulation of AI in the UK, but the MHRA is focused on adapting its approach to this area. In the government's recently published Life Sciences Sector Plan, the government states that it is keen to capitalise on the MHRA's thought leadership and reputation in AI and Software as a Medical Device, and we can also expect a new framework for AI in 2026.  
  • Be aware of the sustainability considerations: As discussed in our previous article, the use of AI presents some sustainability challenges.  AI usage needs to be responsible and carefully considered in order to balance the benefits of that use with a business's sustainability policy and strategy.
  • Consider your cyber resilience: Cyber criminals can exploit weaknesses in AI models and systems to launch attacks. Test your business's cyber resilience, take steps to remedy any weaknesses and consider additional protections such as cybersecurity insurance.
  • Demonstrate good governance: Keep records of data sources, approvals and model limits to demonstrate accountability.

Our perspective

AI is already reshaping Pharmaceuticals and Life Sciences, but its adoption comes with legal, regulatory and governance challenges. Those who do their due diligence on what they are using and prepare early are best placed to innovate confidently and build trust with regulators, partners and patients. We have specialist AI lawyers who can assist with a range of AI-related matters, including the development of AI policies, data protection compliance and IP advice on using and relying on AI outputs - please reach out to a member of our team if you would like support.


If you have any views on the considerations for AI usage in the sector or would like some support with managing risks, please contact Patrick McCallum in our Pharmaceuticals and Life Sciences team. With thanks to Trainee Solicitor Jess Ivy for her contributions.  
