
Artificial Intelligence and Employment Law

Thursday, 29 June 2023.

Artificial intelligence (AI) is a reality for an increasing number of employers within the education sector. Naseem Nabi considers the potential impact of AI in the context of employment and equality law.

AI Within the Employment Relationship

AI is software that can carry out tasks a human would usually complete. Across the independent sector, there is a huge amount of focus on the extent to which AI is going to transform the way pupils learn, are taught, and are assessed.

AI also has a significant potential impact on school employees. It can be used by staff on an individual level to help them perform certain aspects of their roles. Teaching staff may use it to help with things like lesson planning and delivery; HR staff might use it to help with staff onboarding; or marketing staff might use it to create website and social media content.

At a strategic, organisational level, AI can also be used in a multitude of ways to assist with decision-making and staff management across the lifespan of an employment relationship. Within the education sector, employers can use AI to assist with various tasks, including:

  • sifting through job applications by "reading" CVs, or deciding whether to shortlist a candidate for interview (ensuring that the AI software takes into account the school's usual safer recruitment obligations);
  • conducting initial interviews with job applicants using chatbots in a video call;
  • supporting HR colleagues in handling repetitive tasks such as answering employee FAQs; and
  • tailoring training to an employee's specific needs by monitoring their activity and output against their job descriptions and comparing this against the performance of their peers.

It is also possible to use AI in more advanced ways, such as monitoring workers' tasks and performance against targets generated by algorithms, or using it to help employers make decisions as part of an HR process.

Determining your AI Policy

As an employer, you will need to determine the place of AI within your school. In the short-to-medium term, you may wish to develop a policy so that staff can be clear on the extent to which they are permitted to use AI in their day-to-day roles. For example, your policy should emphasise that staff must not input confidential information (such as pupil information) into AI tools such as ChatGPT. You may also wish to use your policy to introduce guidelines on editing ChatGPT-generated content before signing it off. This can be helpful both from a fact-checking perspective and in order to mitigate the risk of any arguments about who owns the intellectual property in ChatGPT-produced content.

The other key benefit of a policy would be to ensure that everyone can understand how the school is using AI; for example, if it is being used as a recruitment tool or to make certain decisions without human input.

Employee Engagement

Every member of staff in your school will have an interest in AI, whether at a strategic, organisational level, at an individual level, or both. Some of your staff may already be experimenting with AI personally or professionally. Others might be worried about what the advancement of AI might mean for their job security.

It is sensible to engage with staff on your draft policy before it is finalised. This will allow you to carry out a fact-gathering exercise in the first instance, as you will need to understand how staff might currently be using AI to help them in their day-to-day roles. Staff engagement also offers an opportunity to capitalise on technical know-how within your staff cohort, and to share information with those who may know less about AI or be worried about its advancement. By offering staff a chance to comment on a draft policy before it is finalised, the end result is more likely to be a policy that is fit for purpose and reflective both of your school's AI strategy and of your staff's current use of AI systems.

AI and Job Security

There is a lot of speculation about the extent to which AI has caused or will cause job losses, where it can carry out particular tasks or roles more cheaply and to the same standard as an employee. In an education context, early reports suggest AI can ease teachers' workloads when used as a planning tool. At the same time, if AI is used as a tool to deliver personalised tutoring, there are concerns that jobs could come under threat.

Given the global AI landscape and the speed at which AI software is developing, your school may well decide to explore ways to make efficiencies through the use of AI. If you conclude that jobs may be at risk as a result, then the usual collective and/or individual consultation obligations will apply.

Understanding the Risks

AI can offer benefits in terms of automating certain tasks, saving time and improving efficiency in the workplace. However, in an employment law context, the use of AI carries risks in terms of the fair treatment of employees, and also in respect of wider equality law and data protection obligations. As an employer, you should consider the following key areas of risk, and how to mitigate them:

  • The output generated by free AI software is not always factually correct. If your school uses AI, you will need to determine the extent to which you can rely on the material it produces before acting on it. It will be important for your school to understand how AI might already be used by staff, so that you can build in the necessary checks and set boundaries and expectations on its use. You can do this as part of your staff policy.
  • Staff may not understand the way an AI system works or how it has produced a particular result. It can be difficult to explain how an algorithm works, and indeed the school itself may have limited information, depending on the AI in use. If staff are affected by decisions made by AI they do not understand, this risks damaging staff trust and confidence in the school as an employer. Effective staff engagement and the provision of appropriate training and support may help mitigate this risk.
  • Research has shown that bias can be incorporated into AI algorithms. If your school makes a biased decision as a result of using AI software (for example, as part of a recruitment procedure), the school could find itself subject to a discrimination claim. It could be difficult to defend such a claim if there was limited information about the algorithm used by the AI software.
  • If AI is making automated decisions about individuals with no human input, there are specific data protection law requirements that you will have to follow. You should also consider the agreements you have in place with the providers of any AI tools if you are inputting personal information - even if you remove names, it may still be possible to identify individuals from their details.

Schools that adopt a proactive and thoughtful approach to the use of AI in the workplace are likely to be able to harness its potential while also ensuring that employees are protected and treated fairly under the law. As with any new technology, the world of AI will be subject to ongoing legal and regulatory developments. It will be important for schools, as employers, to stay abreast of these changes in order to ensure compliance and mitigate risk.


For more information on AI in the context of Employment Law, please speak to Naseem Nabi in our Employment team on 07500 702 450.
