Online Harm, Hate and Illegal Content: UK Adds to Online Safety Bill
The UK government has introduced its Online Safety Bill to Parliament. The Bill was amended at the last minute to add new online offences and greater obligations on technology providers to prevent “harmful but not illegal” online activity.
Ever since the UK government first published its draft Online Safety Bill (the “Bill”) in June 2021, the merits and faults of the proposed new regime have been fiercely debated on a global stage. Between that first publication and this first formal step on the legislative journey, the Bill has been strengthened by the addition of a range of further online offences, together with obligations on digital service providers to do more to prevent online harm.
The Bill will affect a wide range of digital service providers, including search engines and platforms hosting user-generated content (UGC) – but affected companies still lack the clarity and detail they need about what will actually be expected of them.
Officially introduced to the UK Parliament in March 2022, the Bill imposes a duty of care on certain online providers to take responsibility for the safety of their UK users. Digital service providers, regardless of their location, will therefore be affected if they have a significant number of UK users. The Bill goes further than legislation contemplated in other countries (and in the EU) by including measures targeting the proliferation of “harmful” content – even if that content is not necessarily illegal.
We first wrote about the Bill when it was published in 2021 – see our previous client alert.
The Bill will impose certain “duties of care” on two categories of online service providers – “user-to-user” (u2u) services and search services – to prevent the dissemination of illegal content and activity online. The first category comprises companies that allow users to upload and share their own content. Most obviously, this includes global social media companies but, on closer analysis, the category is very broad and may include any provider that hosts UGC. The second category comprises companies that enable users to search multiple websites and databases. Affected companies will have to fulfil specific mitigation and risk-assessment duties, which vary depending on the content (illegal or harmful), who is likely to access it (children or adults) and the type of service provider. To meet the duty of care, companies will need to put in place systems, terms of service and processes to ensure user safety.
These new responsibilities will essentially force digital service providers to take a bigger role in policing content, while balancing their actions against accusations of censorship and infringement of freedom of expression.
The Bill also appoints Ofcom (the existing regulator of the UK communications and broadcasting sector) as the regulatory enforcement authority, with the power to block sites and to levy GDPR-style fines amounting to the higher of £18 million or 10% of global turnover.
The original 2021 draft of the Bill was subject to much public scrutiny – from official bodies, rights groups, affected companies and campaigners alike. Following this feedback and pre-legislative scrutiny, new amendments have been proposed that, in fact, expand the Bill in a number of ways; some are laudable, and some present difficult implementation issues for digital service providers.
The UK’s plans differ from the EU’s proposals (the forthcoming Digital Services Act and Digital Markets Act, which make up the EU’s legislative duo). This is likely to add both confusion and extra regulatory compliance requirements for affected companies – including non-EU and non-UK digital service providers.
Notably, the Digital Services Act is broader in its scope and aims, capturing many online intermediary services such as network infrastructure. It addresses not only content but also the traceability of business users, researchers’ access to data, the use of personal data and the malfunctioning of services. However, the Bill goes beyond the DSA by including measures to address harmful content. And although both regimes address illegal content, the definition of illegality may vary from Member State to Member State, adding further complexity.
The Bill will be read in both Houses of Parliament before it gains Royal Assent, and so may be subject to further revisions. Even then, later regulations will determine whether particular digital service providers are regulated by the Bill, depending on their size and functionality – e.g., what is a “significant” number of users in the UK? What exactly is harmful content?
Ofcom will also have to draft and publish the mandatory codes of practice. Together, these factors are likely to significantly delay the Bill’s entry into force.