UK Publishes Draft Online Safety and Content Bill
Ever since the UK government published its “online harms” White Paper in April 2019, digital service providers have been waiting for further clarity as to what will be expected of them. The UK government has now set out its proposed legal framework to regulate online content and protect web users. Because the UK’s plans differ from the EU’s proposals, affected companies, which include non-EU and non-UK digital service providers, are likely to face both confusion and additional regulatory compliance requirements.
The UK Online Safety Bill (the “Bill”) is designed to “protect young people and clamp down on racist abuse online, while safeguarding freedom of expression” by holding companies accountable for their online content. The Bill will apply to the whole of the UK, with extra-territorial effect for any relevant global providers if they have users in the UK.
The Bill is intended to tackle content involving racism, fraud, terrorism, and child sexual exploitation and is complemented by the codes of practice on terrorism and child sexual exploitation published in December 2020 (although these are non-binding). The UK will also appoint a regulator to enforce the new rules.
The Bill establishes new duties of care for the following two categories of online service providers:
a. Category 1: Companies that allow users to upload and share their own content. Most obviously, this includes global social media companies but, on closer analysis, this category is very broad and may include any provider that hosts user-generated content and could be providing “user-to-user” services; and
b. Category 2: Companies providing search engine services that enable users to search multiple websites and databases.
The thresholds for companies falling into these groups will be set under further regulations – likely determined by reference to the number of the company’s UK users. The UK may exclude certain types of providers where risk to users is low.
The Bill has already determined that news publishers’ websites are out of scope.
As noted above, the Bill will have extra-territorial effect to the extent that services based outside the UK affect users in the UK.
a. Content duties
Regulated content is dealt with in the Bill under two headings: “illegal” content and “harmful” content. In respect of illegal content, online service providers covered by the Bill must carry out illegal content risk assessments and operate proportionate systems and processes to minimise the presence of such content on their services and to remove it swiftly once they become aware of it.
User-to-user service providers will also have duties to protect adults from harmful content. Following the backlash against the government’s December 2020 response to the White Paper, the Secretary of State for Digital, Culture, Media and Sport confirmed that the legislation also covers user-generated fraud.
Affected companies will also have to carry out additional assessments and fulfil specific duties, in respect of both illegal and harmful content, where their services are likely to be accessed by children.
b. Safeguarding duties
All companies falling within the scope of the Bill will need to balance their content duties with the duty to safeguard freedom of expression. Appropriate safeguards are expected to be set out in codes of practice issued by the UK Office of Communications (“Ofcom”), but the Bill already outlines safeguarding duties that in-scope companies will need to comply with, depending on which category of service is provided. Notably, Category 1 providers will be subject to additional duties to protect “content of democratic importance” and journalistic content.
Ofcom has been appointed as regulator to oversee the regime. The regulator will have powers to require information from providers and to investigate compliance. It will also compile a register of all regulated providers, and must prepare codes of practice to assist organisations with their compliance, subject to parliamentary and ministerial approval. Providers that follow the relevant code of practice will be treated as having complied with the corresponding duty in relation to illegal content. Providers will also be relieved to know that a failure to follow a code of practice will not, of itself, result in legal action, provided the underlying duties are met in another way.
Non-compliance with the regime could result in eye-watering fines for providers: up to the higher of £18 million or 10% of annual global turnover. By way of illustration, a provider with annual global turnover of £500 million could face a fine of up to £50 million. Notably, this exceeds the maximum fines available under the General Data Protection Regulation (the higher of €20 million or 4% of annual global turnover).
Ofcom will also have other enforcement tools at its disposal, which it is expected to exercise proportionately. It can issue “Use of Technology Notices” requiring providers to use technology to aid the removal of illegal content, pursue criminal sanctions against senior managers who fail to comply with information requests and, notably, apply for service restriction orders, which require a provider to cease aspects of its service. The criminal sanctions are drafted as a deferred power, meaning they will only come into effect following a review two years after the Bill’s enactment. The Bill also anticipates an appeals procedure.
A parliamentary joint committee will now review the Bill before it is debated by Parliament. After the review, a final bill will be prepared for parliamentary approval.
The Bill comes as the European Commission proposes its own content removal framework under the Digital Services Act (for more information on the framework, please see an excerpt from our 10 Key European Digital Regulation & Compliance Developments). The UK framework, however, goes beyond the obligations imposed by the EU Digital Services Act.
Georgia Wright, London trainee solicitor, contributed to the drafting of this alert.