European Digital Compliance: Key Digital Regulation & Compliance Developments
To help organizations stay on top of the main developments in European digital compliance, Morrison Foerster’s European Digital Regulatory Compliance team reports on key digital regulatory and compliance developments from the first quarter of 2024.
This report follows our previous updates on European digital regulation and compliance developments for 2021 (Q1, Q2, Q3, Q4), 2022 (Q1, Q2, Q3, Q4) and 2023 (Q1, Q2, Q3, Q4).
In this issue, we spotlight key developments in the digital regulatory sphere in the EU, the UK, and Germany. We provide an update on the now-finalized EU AI Act, as well as enforcement insights on the Digital Services Act and the Digital Markets Act, both of which are now fully operational. Meanwhile, the UK’s corresponding digital regulatory framework is also advancing, with updates on the Online Safety Act and the Digital Markets, Competition and Consumers Bill. With content-focused regulation largely settled, at least for now, the EU appears to be turning its attention towards digital infrastructure with a new white paper for a “Digital Networks Act”. Other key updates concern the right to repair (EU), greenwashing and cookies (UK), and online encryption (Germany).
1. EU AI Act adopted by the European Parliament
2. Digital Services Act fully in force; Update on German implementing rules
3. White paper published regarding Digital Networks Act
4. Digital Markets Act update re 7 March 2024 compliance date for gatekeepers
5. eIDAS 2.0 rules for digital identity systems and trust services close to final adoption
6. Provisional agreement on Right to Repair Directive
7. Online Safety Act – Q1 updates and the data protection aspects
8. UK Digital Markets, Competition and Consumers Bill: Even more consumer protection
9. FCA’s anti-greenwashing rule
10. Proactive cookie compliance
11. Leaked proposal to introduce mandatory end-to-end encryption for online communications
The EU Artificial Intelligence Act (“AI Act”) was adopted by the European Parliament on 13 March 2024, marking a significant milestone in the regulation of Artificial Intelligence (AI). The AI Act aims to safeguard AI use within the EU while prohibiting certain practices outright. It has an extensive scope, applying to providers, deployers, importers, and distributors of AI systems placed on the market or put into service in the EU. Its extra-territorial reach even extends to providers and deployers of AI systems established in third countries where the output of the AI system is used within the EU.
The AI Act introduces a tiered approach to regulation, mainly addressing AI systems applying prohibited practices, high-risk AI systems and general-purpose AI (GPAI), with an additional category for GPAI with systemic risk, each carrying different obligations and potential fines. For high-risk AI systems, organizations must, for example, assess and reduce risks, maintain use logs, be transparent and accurate, and ensure human oversight. Individuals will have a right to submit complaints to the relevant Member State regulators and to receive explanations about decisions based on high-risk AI systems that affect their rights. The AI Act also addresses pressing concerns surrounding copyright, mandating that GPAI providers adhere to EU copyright law and, with respect to their training data, observe opt-outs made by rights-holders under the Text and Data Mining exception of the EU Copyright Directive. The AI Act provides limited exceptions for free and open-source AI models. Penalties for non-compliance are severe, with fines of up to EUR 35 million or 7% of global annual turnover.
The AI Act is subject to a final check by lawyer-linguists, which is expected to take place in April 2024, and will also need to be formally confirmed by the Council of the European Union. It is therefore expected to be finalized and published in the EU’s Official Journal before the end of the European Parliament’s legislature in June 2024. It will then enter into force 20 days after publication in the Official Journal and provides various transition periods for specific requirements, including: six months for the ban on prohibited AI practices, 12 months for the requirements on GPAI models, and 24 months for the requirements for high-risk AI systems.
For a deeper dive into the implications of the AI Act, be sure to check out our detailed article covering multiple aspects of the AI Act.
The EU Digital Services Act (DSA) fully entered into force on 17 February 2024. While, since August 2023, the DSA had already applied to a first set of services designated as so-called Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), the rules now apply to all hosting services, online platforms and other intermediary services offered in the EU.
While DSA enforcement in Q1, 2024 has not seen the same flurry of activity as the DMA (see the DMA update below), the European Commission has not stood idly by, either. Formal non-compliance proceedings have been opened against two services designated as VLOPs. In one of these cases, the Commission is investigating, in particular, DSA infringements in connection with the protection of minors and the risk management of addictive design and harmful content. Proceedings in the second case primarily relate to breaches in the areas of risk mitigation, content moderation and the internal complaint-handling mechanism.
Other developments include the publication of guidelines recommending measures for VLOPs and VLOSEs to mitigate systemic risks online that may impact the integrity of elections, ahead of the June 2024 European Parliament elections.
In Germany, the DSA implementing rules (Digitale-Dienste-Gesetz) were approved by the Bundestag in March 2024, creating the necessary framework for German authorities – in this case, the Federal Network Agency (Bundesnetzagentur) as well as other specialized agencies – to enforce the DSA against service providers subject to German jurisdiction under the DSA.
The draft law also governs fines and penalties for violations of the DSA, allowing providers to be fined up to 6% of their annual turnover.
The European Commission is already fully engaged in enforcing the DSA against providers under its jurisdiction (i.e., VLOPs and VLOSEs). Enforcement at the Member State level regarding all other services is now expected to ramp up, including as further Member States adopt their relevant implementing laws. The German law discussed above is expected to be finalized during Q2, 2024.
In February 2024, the European Commission unveiled a comprehensive plan to bolster the innovation, security and resilience of the EU’s digital infrastructure. The plan, which lays the foundation for a future “Digital Networks Act”, includes a set of potential actions that the Commission considers crucial for the future competitiveness of Europe’s economy.
The proposed digital connectivity package particularly includes a white paper on “How to master Europe’s digital infrastructure needs”, which analyses the challenges that the EU currently faces in the rollout of future connectivity networks and presents possible scenarios to attract investments, foster innovation, increase security and achieve a true Digital Single Market. The paper highlights several key measures:
The Commission has launched a public consultation on the proposals set out in the white paper, inviting comments from stakeholders, Member States, civil society, industry and academics. The consultation is open for comments until 30 June 2024. It is expected that the Commission will present a legislative draft based on the outcome of the consultation.
As we reported previously (see our Q2, 2023 update), the Digital Markets Act (“DMA”), the EU’s digital gatekeeper legislation, became fully applicable on 2 May 2023, and potential gatekeepers had to notify their core platform services to the Commission by 3 July 2023 if they met the DMA’s quantitative thresholds.
The Commission received several notifications and has so far designated six companies as gatekeepers under the DMA, with a total of 22 core platform services now in scope of the DMA.
In March 2024, the Commission received notifications from three further companies.
The current designated gatekeepers with their 22 designated core platform services must comply with all obligations under the DMA. The DMA’s main obligations include data access and data use rules, prohibitions on self-preferencing and bundling, and interoperability obligations (see our DMA client alert for details).
The six designated gatekeepers must also publish so-called compliance reports outlining the measures they have undertaken to comply with the DMA. The public versions of these reports are available on the Commission’s website. The Commission will review the compliance reports and assess whether the designated gatekeepers indeed comply with their obligations under the DMA. For this assessment, the Commission will also consider the input of interested stakeholders as well as the results of the compliance workshops that took place in March 2024, at which the gatekeepers also had the opportunity to present their views.
On 25 March 2024, the Commission announced that it had opened non-compliance investigations against three designated gatekeepers. The investigations concern:
The Commission intends to conclude the new non-compliance proceedings within 12 months. If it were to find an infringement, the Commission has the power to impose fines of up to 10% of the respective gatekeeper’s total worldwide turnover.
As the EU edges towards the final adoption of the eIDAS 2.0 Regulation, a significant update to the 2014 Regulation on Electronic Identification and Trust Services ((EU) 910/2014, eIDAS) is on the horizon.
In February 2024, the European Parliament adopted the revised Regulation, and the EU Council followed suit in March 2024.
The adopted amendments will introduce substantial enhancements to Europe’s digital identity systems and the overall framework for trust services.
Central to the eIDAS 2.0 Regulation is the introduction of the European Digital Identity Wallet.
This tool aims to streamline how EU citizens access and use their digital identities, providing a secure environment for managing personal data and electronic attestations. Its capabilities extend to signing documents with qualified electronic signatures and sealing with electronic seals, both underpinned by robust security protocols to protect user identities. The Wallet’s design also emphasizes user privacy, featuring a privacy dashboard that allows users to track and manage their interactions with service providers transparently. This function supports not only data access but also the erasure of personal data upon user request, reinforcing the EU’s commitment to data protection and user control.
The regulation also expands the definition of “Trust Services” significantly, covering an even wider range of digital operations. These include issuing and managing certificates for electronic signatures and seals, creating and validating electronic timestamps, and ensuring the secure archiving of electronic documents.
This expansion is poised to enhance trust in digital transactions, ensuring that they meet stringent security and authenticity standards.
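For readers less familiar with the underlying mechanics, the sketch below illustrates the basic sign-and-verify primitive that sits beneath electronic signatures, using the standard Web Crypto API in TypeScript. It is purely illustrative: a qualified electronic signature under eIDAS additionally requires a qualified certificate and a qualified signature creation device, neither of which is modelled here, and the function names are our own.

```typescript
// Illustrative sketch only: the cryptographic primitive underlying electronic
// signatures. On its own this is NOT a qualified electronic signature under
// eIDAS (no qualified certificate or qualified signature creation device).

async function createSigningKeys(): Promise<CryptoKeyPair> {
  // Generate an ECDSA key pair; the private key stays non-extractable.
  return crypto.subtle.generateKey(
    { name: "ECDSA", namedCurve: "P-256" },
    false,
    ["sign", "verify"]
  );
}

async function signContent(privateKey: CryptoKey, content: string): Promise<ArrayBuffer> {
  // Produce a signature over the document bytes with the signer's private key.
  return crypto.subtle.sign(
    { name: "ECDSA", hash: "SHA-256" },
    privateKey,
    new TextEncoder().encode(content)
  );
}

async function verifySignature(
  publicKey: CryptoKey,
  signature: ArrayBuffer,
  content: string
): Promise<boolean> {
  // Anyone holding the public key can check the integrity and origin of the content.
  return crypto.subtle.verify(
    { name: "ECDSA", hash: "SHA-256" },
    publicKey,
    signature,
    new TextEncoder().encode(content)
  );
}
```

ECDSA over P-256 is used here simply because it is widely supported in browsers and Node.js; in practice, key custody and certificate issuance sit with qualified trust service providers (and, under eIDAS 2.0, the Wallet) within the regulated trust framework.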
All EU citizens and businesses operating in the digital domain should prepare for these changes, which are expected to be published in the EU’s Official Journal and enter into force in Q2, 2024 – subject to two-year transitional periods for pre-existing qualified trust services.
The new eIDAS 2.0 Regulation marks a pivotal moment in the evolution of digital identity within the EU, setting a benchmark for digital trust and security that could influence global standards in the digital age.
Our most recent update on European digital regulation and compliance developments (see Q4, 2023) discussed the Commission’s proposal (and the European Parliament’s and the Council’s proposed amendments) for a new EU Directive “on common rules promoting the repair of goods”, which will impose enhanced obligations on manufacturers and sellers of goods (including digital products) to repair defective or broken products.
Since then, the Council, the Parliament and the Commission have discussed the different proposals during trilogue negotiations, which led to a provisional agreement for the “right to repair” (“R2R”) Directive.
It is expected that Parliament and the Council will formally adopt and finalize the R2R Directive before the European Parliament elections in June 2024, so that the repairability requirements of the Directive could apply to products marketed in the EU/EEA from 2026–2027, following their implementation into Member State laws.
The first quarter of 2024 has seen a flurry of activity from Ofcom, the regulator for the UK Online Safety Act (“OSA”), as it prepares for the controversial law to come into force.
In particular, this included launching “Phase three” of its regulatory roadmap, publishing a consultation on fraud and illegal harms and setting out its research guidelines in connection with the OSA. And there are even more consultations in the pipeline, due to be published over the coming months.
While Ofcom is taking the lead on monitoring compliance with the OSA, the UK Information Commissioner’s Office (“ICO”) has also released guidance on content moderation, from a data protection perspective.
Phase 3 – Call for Evidence
In March 2024, Ofcom published a call for evidence that will inform its codes of practice and guidance on certain additional duties that the OSA imposes on in-scope businesses.
As we wrote previously, Ofcom encourages in-scope entities to engage with its consultations to help ensure that the industry’s concerns are considered when shaping the codes of practice that will inform Ofcom’s approach to compliance and enforcement.
Additional duties will apply to “categorized” services (which are intended to be the most widely used in-scope online websites and apps). While the UK Secretary of State is ultimately responsible for passing the secondary legislation that will set the threshold conditions that determine if a service falls into one of the relevant ‘categories’ established by the OSA, the secretary will take Ofcom’s position on those threshold conditions into account.
To that end, Ofcom has advised the UK government that the thresholds for determining whether an in-scope service is ‘categorised’ should be as follows:
Once the secondary legislation has been passed, Ofcom will create and publish a register of categorised services.
Consultation on Online Fraud and Illegal Harms
On 23 February 2024, Ofcom closed its consultation on protecting users from illegal harms online and published draft codes of practice and guidance indicating how in-scope online services can comply with the OSA’s new rules in relation to search results and content that is user-generated (or user-uploaded) on the service.
Focusing first on fraud, Ofcom recommended the following measures for large services with ‘medium-to-high’ risk levels:
Research Guidelines
On 15 April 2024, Ofcom published its online safety research agenda, which summarises its programme of research, outlines its research efforts to date and indicates its “areas of interest” for future research. The primary intention behind Ofcom’s research efforts is to enable it to stay on top of changes in the online world.
By publishing its research agenda, Ofcom hopes to promote transparency and collaboration with the research community.
Ofcom’s areas of interest for future research include understanding:
Further Consultations
Ofcom has announced its intention to launch a consultation in May 2024 on its draft Children’s Safety Code of Practice, which will include practical measures for in-scope businesses to implement to help ensure that children ‘have safer experiences online’. It also intends to consult on the draft risk assessment guidance relating to children’s harms, and the causes and impacts of online harm to children.
A further consultation on the use of automated detection tools to mitigate the risks of the most harmful content to children (and other illegal harms) is also planned, the results of which will feed into Ofcom’s illegal harms draft Codes of Practice.
Draft Guidance
Ofcom intends to publish draft guidance on protecting women and girls online by Spring 2025, after the codes of practice on child protection have been finalized.
Data Protection Aspects of the OSA
The ICO’s content moderation guidance is aimed at organizations carrying out content moderation to achieve compliance with the OSA; the guidance forms part of the ICO’s ongoing collaboration with Ofcom on data protection and online safety technologies.
While Ofcom enforces compliance with the specific obligations under the OSA, the ICO’s remit is ensuring that organizations continue to comply with data protection law when moderating user-generated content (not least due to the large volumes of personal information that could be processed as part of any moderation efforts).
The OSA and data protection law regimes sit alongside each other, and so compliance with the ICO’s guidance on content moderation does not necessarily mean compliance with the OSA.
We will continue to monitor any further input from the ICO on topics arising out of the OSA (such as the upcoming children’s risk assessment guidance which Ofcom is scheduled to prepare as part of its OSA roadmap).
The UK’s Digital Markets, Competition and Consumers Bill (the “Bill”) was introduced by the UK government in 2023 to more closely regulate so-called “Big Tech” companies, as well as to grant increased protection to consumers generally. For more details on the Bill’s key provisions, see our previous coverage.
Since our last update, the Department for Business and Trade (“DBT”) launched a consultation on consumer price transparency and product information for customers, which led to amendments to the Bill; and the Bill also underwent separate changes in relation to its enforcement provisions.
In response to the consultation responses, the government has proposed amending the Bill and the Price Marking Order 2004 (“PMO”). The DBT’s key recommendations include:
Some of the Bill’s competition-related provisions have also been amended, this time in favor of in-scope companies. Companies facing fines under the Bill will now be able to challenge these decisions on their merits (rather than only on procedural grounds). Also, regulators will not be able to impose conduct requirements or pro-competitive interventions on companies unless such action is proportionate and there is a strong evidence base behind the intervention. Further information on how the Bill will be enforced is available on GOV.UK.
The Bill has almost reached the finishing line in the legislative process and is expected to come into force in Autumn 2024. Once in force, organizations will need to keep a close eye on the CMA, as it is expected to consult on, and publish, guidance in relation to the Bill. Certain provisions (such as subscriptions rules) will face a delay in implementation and will come into effect in October 2025 at the earliest.
An anti-greenwashing rule intended to moderate sustainability claims about financial products and services will come into effect in the UK on 31 May 2024. It will cover both environmental and social sustainability claims that are included in any communication to clients by an FCA-authorised firm.
The rule closely follows actions by other UK regulators considering greenwashing.
The Advertising Standards Authority published specific guidance (which we discuss in our client alert) last year, as did the Competition and Markets Authority, which was shortly followed by its investigation into green claims made by fashion brands.
This mirrors similar interest at the EU level; see our coverage of the EU’s new anti-greenwashing laws.
The FCA’s rule (ESG 4.3.1R) requires that any sustainability-related claim be: (1) consistent with the sustainability characteristics of the product or service, and (2) fair, clear and not misleading.
It has been introduced into the FCA Handbook to protect consumers from greenwashing and help them make informed decisions that are aligned with their sustainability preferences. The FCA will challenge firms if it considers that they are making misleading claims about their products or services and, if appropriate, take further action.
Crucially, the rule applies to any FCA-authorised firm, regardless of whether it manages sustainability in-scope business (i.e., a UK AIF or UK UCITS).
The guidance recommends that firms ensure that sustainability-related claims are:
The rule is accompanied by proposed guidance, which the FCA has consulted on. The final version of the guidance is yet to be published.
In January 2024, the ICO released a blog post warning organizations to proactively change how they implement advertising cookies in order to comply with data protection law.
This relates to a previous set of warnings issued by the ICO in November 2023 to 53 of the UK’s top 100 websites, stating that the relevant organizations would face enforcement action if they did not make changes to their websites’ cookie banners within 30 days.
The ICO was primarily concerned that these websites did not give users a fair choice over whether they wanted to be tracked for personalized advertising.
In line with previous guidance, the ICO reiterated that organizations must provide users with a “Reject All” option for advertising cookies that is as easy to use as “Accept All”.
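To illustrate what this parity can look like in practice, here is a minimal, hypothetical sketch of a consent banner in TypeScript in which “Reject All” sits alongside “Accept All” with the same prominence and the same single click, and advertising scripts load only after an explicit opt-in. The element IDs and function names are our own and are not taken from ICO guidance.

```typescript
// Hypothetical sketch: "Reject All" is rendered and wired up exactly like
// "Accept All" (same place in the banner, same single click, no extra steps).

type ConsentChoice = "accepted" | "rejected";

function recordConsent(choice: ConsentChoice): void {
  // Persist the user's choice and remove the banner; no advertising cookies
  // are set unless the user explicitly accepted.
  localStorage.setItem("ad-consent", choice);
  document.getElementById("cookie-banner")?.remove();
  if (choice === "accepted") {
    loadAdvertisingScripts();
  }
}

function loadAdvertisingScripts(): void {
  // Placeholder: inject advertising/analytics tags here only after an opt-in.
}

function renderBanner(): void {
  const banner = document.createElement("div");
  banner.id = "cookie-banner";
  banner.innerHTML = `
    <p>We use advertising cookies. Choose an option:</p>
    <button id="accept-all">Accept All</button>
    <button id="reject-all">Reject All</button>`;
  document.body.append(banner);
  document.getElementById("accept-all")!.addEventListener("click", () => recordConsent("accepted"));
  document.getElementById("reject-all")!.addEventListener("click", () => recordConsent("rejected"));
}

// Show the banner only if the user has not yet made a choice.
if (localStorage.getItem("ad-consent") === null) {
  renderBanner();
}
```

The key design point is symmetry: rejecting tracking requires no more effort, navigation or nested menus than accepting it.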
Reporting on the outcome of the warnings issued in November 2023, the ICO described an “overwhelmingly positive response to [its] call to action”. Thirty-eight organizations (out of the 53 contacted) changed their cookie banners to be compliant, and several other organizations are now developing alternative solutions, including contextual advertising and subscription models.
In its January 2024 blog post, the ICO foreshadowed that it is preparing to write to the next tranche of websites about the same cookie banner compliance issue. In addition, the ICO noted that it is developing an AI solution to help identify websites using non-compliant cookie banners, and that it will be running a hackathon event in early 2024 to explore what that AI solution could look like in practice.
Draft legislation by Germany’s Federal Ministry for Digital and Transport was leaked in February 2024, pursuant to which providers of number-independent interpersonal communication services, such as email and messaging apps, would be obliged to implement end-to-end encryption (E2EE) by default. This legislative move aims to uniformly secure communications from sender to receiver, addressing gaps where services either do not use E2EE or apply it inconsistently.
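By way of illustration only, the sketch below shows what end-to-end encryption between two communicating parties can look like at the primitive level, using ECDH key agreement and AES-GCM via the standard Web Crypto API in TypeScript. Production messaging protocols add considerably more (key verification, forward secrecy, multi-device support), and the function names here are hypothetical.

```typescript
// Illustrative E2EE sketch: each party generates a key pair, exchanges only
// public keys, and derives the same AES key locally, so the relaying service
// never holds a key capable of decrypting the messages.

async function generateKeyPair(): Promise<CryptoKeyPair> {
  return crypto.subtle.generateKey(
    { name: "ECDH", namedCurve: "P-256" },
    false,                 // private key is never extractable or uploaded
    ["deriveKey"]
  );
}

async function deriveSharedKey(ownPrivate: CryptoKey, peerPublic: CryptoKey): Promise<CryptoKey> {
  // ECDH: both parties compute the same symmetric key from their own private
  // key and the other party's public key.
  return crypto.subtle.deriveKey(
    { name: "ECDH", public: peerPublic },
    ownPrivate,
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt", "decrypt"]
  );
}

async function encryptMessage(sharedKey: CryptoKey, plaintext: string) {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per message
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    sharedKey,
    new TextEncoder().encode(plaintext)
  );
  return { iv, ciphertext }; // this is all the relaying server ever sees
}

async function decryptMessage(sharedKey: CryptoKey, iv: Uint8Array, ciphertext: ArrayBuffer): Promise<string> {
  const plaintext = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, sharedKey, ciphertext);
  return new TextDecoder().decode(plaintext);
}
```

The point relevant to the draft law is architectural: because only the endpoints hold usable keys, the provider relaying the messages cannot read them.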
The initiative is part of broader efforts to amend the Telecommunications Telemedia Data Protection Act (TTDSG). The TTDSG governs specific privacy rules for electronic communications services and online services more generally, and implements the EU’s E-Privacy Directive. The proposed changes are influenced by the German Federal Cartel Office’s recent findings, which advocate for stronger data protection measures in messaging and video services. By mandating E2EE, the draft seeks to protect the fundamental rights relating to the secrecy of telecommunications and the confidentiality of information technology systems.
Additionally, the legislation introduces a transparency obligation for service providers. This requirement mandates that providers inform users about the encryption protocols used or, if E2EE is not implemented, explain the technical reasons why. This approach aims to enhance user awareness and trust in digital services.
The proposal also aligns with ongoing discussions at the EU level, particularly concerning the balance between privacy protections and law enforcement requirements, as seen in the Digital Markets Act and the proposed regulations on online child sexual abuse material. Notably, in its position on the planned EU CSAM Regulation, the European Parliament proposed exempting end-to-end encrypted communications from certain detection orders, highlighting the complex interplay between privacy and security.
It is expected that the German government will officially present (an updated version of) the leaked draft for public consultation as the next step towards turning it into law. If Germany advances this draft, it could set a precedent that influences future digital privacy and security standards across Europe.
We are grateful to the following members of MoFo’s European Digital Regulatory Compliance team for their contributions: Michal Pati and Jacqueline Lee, Trainee Solicitors in the London office.