While the OSB primarily targets companies that deliver user-generated content (UGC), such as Twitter, Facebook and video-sharing platforms, it also imposes a duty of care on organisations to protect users from harmful content.
Ofcom will be the online safety regulator and is required to prepare codes of practice to assist providers in complying with their duties of care.
The OSB will apply to 'user-to-user' services and to search engines. User-to-user services are those that do one or both of the following:
The OSB does not define 'user', but the following are identified as not being users or as otherwise falling outside the OSB:
In December 2021, the joint committee published a report recommending major changes to the draft OSB, including additional responsibilities for Ofcom and a requirement for service providers to conduct internal risk assessments and create an 'online safety policy' to which users must agree.
If accepted, these recommendations will be incorporated into the OSB before it proceeds through Parliament.
The aim of the proposed legislation is to strike a balance between internet safety and the rights to freedom of expression and privacy. As it currently stands, many questions remain unanswered, not only in the context of this tension but also because of the various gaps in the bill. What is clear is that the OSB potentially has wide-ranging application.
Universities with an online presence will need to scrutinise the online content and services they provide, consider adopting a proactive online safety policy, and put in place a process for ongoing risk assessments.
A version of this article first appeared in University Business on 27 January 2022.