European Digital Compliance: Key Digital Regulation & Compliance Developments
To help organizations stay on top of the main developments in European digital compliance, Morrison Foerster’s European Digital Regulatory Compliance team reports on key digital regulatory and compliance developments from the third quarter of 2024.
This report follows our previous updates on European digital regulation and compliance developments for 2021 (Q1, Q2, Q3, Q4), 2022 (Q1, Q2, Q3, Q4), 2023 (Q1, Q2, Q3, Q4), and 2024 (Q1, Q2).
In this issue, we report on developments in the EU and UK, many of which follow changes in government at both the European Commission and national levels. In the EU, one of the key pieces of legislation for businesses will be the local implementation of NIS2, with many countries not yet over the line (although Germany’s law is now in motion). Youth protection and consumer protection are also recurring themes, both at the EU level and in Germany and the UK. We also provide an update on the renewed momentum behind AI legislation in the UK.
In this issue, we highlight…
1. Digital Regulatory Compliance Aspects of the Draghi Report on European Competitiveness
2. Digital Services Act: Consultation Regarding Youth Protection Guidelines
3. NIS2: Draft Implementing Regulation on Cybersecurity Risk Management Measures and Reporting Obligations
4. Digital Fairness: European Commission Report Provides a Fitness Check on EU Consumer Law
5. Competition: DG COMP Issues a Policy Brief on Generative AI and Virtual Worlds
6. Deepfakes: German Federal Council Proposes a New Criminal Offence
7. Youth Media Protection: German Federal States Respond to the European Commission’s Opinion on the Draft State Treaty
8. NIS2: Legislative Procedure for NIS2 Implementation Finally Kicking off in Germany
9. Updates from Ofcom: Online Safety Act and TikTok Enforcement
10. An Overview of AI Updates in the UK
11. Digital Markets, Competition and Consumers Act: Timelines, Regulations and Guidance
12. Product Security and Telecommunications Infrastructure Act: Security Requirements for Consumer Smart Devices
At the request of the European Commission, former Italian Prime Minister and former President of the European Central Bank Mario Draghi has published his long-awaited report on the future of European competitiveness (the “Report”). Running to almost 400 pages, the Report takes stock of the weaknesses in the EU’s industrial policy and proposes a strategy to remedy them, supported by an in-depth analysis and specific recommendations (Part A and Part B). The Report pays particular attention to the digital sector, where, according to Draghi, Europe must make concerted efforts to avoid being left behind as a result of innovation gaps.
The Report calls for partial deregulation in various sectors, highlighting that the current regulatory environment is too fragmented, leading to incoherent and inconsistent enforcement. In particular, the Report identifies digital technologies as a priority sector, emphasizing that regulations should be designed to facilitate market entry, and align with Europe’s broader goals rather than act as barriers.
In the Report, Draghi notes the growth-limiting danger of proscribing specific business practices up-front in order to avoid potential future risks, particularly in relation to AI, which is recognized as a significant growth area. Instead, the Report suggests lowering compliance costs related to data storage and processing limitations to enable the creation of large, integrated data sets essential for training AI models.
Furthermore, the Report aims to accelerate AI development across 10 strategic sectors (automotive, advanced manufacturing and robotics, energy, telecoms, agriculture, aerospace, defense, environmental forecasting, pharma, and healthcare) where EU business models would benefit most from rapid AI adoption. Draghi also calls for strategic market consolidation, notably within the telecoms sector, to foster more robust market dynamics and innovation.
The Report is likely to influence the European Commission’s political ambitions in the 2024–2029 term, although leadership will likely cherry-pick certain items of focus. In addition, it remains to be seen whether and to what degree the EU Member States will lend their support – and funding – to the Report’s recommendations. Even so, it signals a shift towards a more inward-looking agenda going forward – especially with regard to digital regulation. Where the previous focus was primarily on reining in so-called Big Tech, more emphasis may now be placed on how the digital regulatory framework can be designed to benefit and build European equivalents.
The European Commission is currently drafting guidelines on Art. 28(1) of the Digital Services Act (DSA), with the aim of ensuring robust implementation and enforcement of the DSA’s protections for children’s rights, and of providing clear, concrete guidance for companies and regulators on maintaining a high level of privacy, safety, and security for children.
On 31 July 2024, the European Commission launched a call for evidence to gather feedback on the proposed scope and approach of the guidelines, as well as on good practices and recommendations related to mitigation measures for risks that children may encounter online. All stakeholders were encouraged to participate and provide supporting scientific reports and research materials. The call for evidence was open until September 2024, and was followed by a multi-stakeholder workshop on the matter on 4 October 2024.
The Commission’s key objective is to prioritize the rights and best interests of children as part of the service design process, as well as to harmonize the approach to age verification mechanisms. Examples of the scenarios discussed within the draft guidelines include access to age-inappropriate content, algorithms fostering addictive behavior, cyberbullying, sexual harassment, and the promotion of self-harm and suicide content. Platforms will be obliged to adopt a risk-based approach, regularly carry out impact assessments, and implement mitigation measures for any potential risks.
The European Commission aims to submit a draft of the guidelines for public consultation in early 2025. The guidelines are expected to become effective in Q2 2025.
Following the adoption at the EU level of the Directive on measures for a high common level of cybersecurity across the Union (NIS2) – see our previous Q4 2022 coverage – the Commission has now published a draft Implementing Regulation (IR) on cybersecurity risk management measures and reporting obligations for digital businesses.
The IR lays down the technical and methodological requirements for the cybersecurity risk management measures referred to in Article 21(2) of NIS2, which must be implemented by a broad range of companies (including, e.g., cloud computing and data center service providers, online marketplace and social networking platform providers, and trust service providers). In addition, the IR sets out general and entity-specific cases in which an incident is considered to be “significant” as referred to in Article 23(3) of NIS2, thus triggering reporting obligations.
The IR’s detailed Annex contains the specific requirements with which in-scope entities will have to comply, sorted according to overarching topics (such as requirements for network security, risk management, incident handling, supply chain security, and basic cyber hygiene practices and security training). It is important to note that the IR calls for entities to take into account the principle of proportionality, providing certain companies with some leeway with regard to implementation (e.g., allowing smaller companies to consider alternative measures where they cannot implement certain requirements due to their size).
The European Commission’s adoption of the IR is planned for Q3 2024. Given that the IR was originally scheduled to take effect on 18 October 2024, it remains unclear whether a delay in adoption will push back the effective date. Notably, progress towards implementation at the EU Member State level has also been uneven (see Item 8 below on the legislative procedure for NIS2 implementation finally kicking off in Germany).
As reported in our Q4 2022 update, the European Commission previously launched a public consultation to evaluate the digital fairness of existing consumer laws. The consultation covered the three Directives that have only recently been amended by the EU’s Omnibus Directive, i.e., the Unfair Commercial Practices Directive 2005/29/EC, the Consumer Rights Directive 2011/83/EU, and the Unfair Contract Terms Directive 93/13/EEC.
The results of the public consultation have now been published in a “fitness check” report, which highlights a range of harmful commercial practices.
The report (published on 3 October 2024) also notes that the fragmentation of national laws, insufficient enforcement, legal uncertainty, and a lack of incentives all act as barriers for businesses seeking to adopt higher standards of protection.
While the report does not include a concrete plan for future actions that the European Commission should take, it outlines the state of the current situation and identifies areas for improvement, offering a basis for future analysis and development.
The European Commission plans to propose measures aimed at creating a Digital Fairness Act to address the harmful practices that the report has identified, but drafts are not expected before mid-2025.
The European Commission’s Directorate-General for Competition (“DG COMP”) has released a policy brief following its calls for contributions at the start of 2024, which aimed to understand the impact of generative AI and virtual worlds on competition within Europe. While not representing an official position of the Commission, the brief is a staff-prepared document intended to inform ongoing discussions and actions in these areas.
Based on “fruitful collaboration and exchange” with competition authorities in France, Hungary, Portugal, and the UK (as well as the U.S. Federal Trade Commission (FTC)), the policy brief outlines DG COMP’s mission to “ensure that citizens, small and large businesses can enjoy the benefits that competitive generative AI and virtual worlds markets can bring, in terms of price, choice, innovation and quality.” To this end, the brief explores market dynamics, emerging tendencies (e.g., towards vertical integration to access input resources and distribution channels), and potential barriers to entry in the generative AI and virtual worlds sectors.
The policy brief further describes preliminary frameworks for case analysis, including possible theories of harm and efficiency gains, and discusses potential anticompetitive concerns (for example, so-called acqui-hires) and the tools available to address them, such as antitrust enforcement, merger control (taking into account the ECJ’s recent decision in Illumina/Grail), plus the Digital Markets Act as a complementary tool.
DG COMP will continue to actively monitor the AI and virtual worlds sectors to ensure that competition is not negatively affected. Increased scrutiny is expected for investments and partnerships between large digital players and generative AI developers, from both antitrust and merger control viewpoints.
In light of the global nature of AI players’ business operations, DG COMP has also committed to strengthening coordination efforts to effectively address potential concerns, following up on a previous joint statement published with the U.S. Department of Justice, the FTC, and the UK Competition and Markets Authority.
With the growing prevalence of deepfakes, concerns about their misuse have sparked discussions about the need for stronger legal measures in Germany. The debate now centers on whether current laws are equipped to handle the complexities of these digitally manipulated media or if new, more targeted regulations are required.
On 5 July 2024, the German Federal Council adopted a legislative proposal introducing a new criminal offence, to be added as Section 201b of the German Criminal Code, criminalizing the violation of personality rights through the creation and use of deepfakes (i.e., realistic-seeming media content, such as sound, video, or image recordings, created or altered using artificial intelligence or other technological means). The proposal highlights a gap in the current legal framework, suggesting that the nuanced threats posed by deepfakes require more targeted regulation.
In a brief statement provided in response to the German Federal Council’s proposal, the German Federal Government acknowledged the potential dangers of deepfakes but expressed reservations about the immediate need for a new criminal offence. It argued that most situations involving deepfakes are already covered by existing laws, such as defamation (§ 187 German Criminal Code) and the violation of privacy and personal rights through images (§ 201a German Criminal Code), as well as other provisions dealing with, inter alia, the distribution of pornographic content. The German Federal Government also raised concerns about the legal clarity of the proposed offence and the impact of its narrower scope compared to existing laws.
The German Federal Council’s proposal has been submitted to the German Federal Parliament, where it will undergo thorough discussions to determine whether it will be implemented into law. Given the German Federal Government’s position on this topic, it seems unlikely that the proposal adopted by the German Federal Council will be passed without substantial amendments.
As reported in our Q2 2024 update, the European Commission adopted a detailed opinion (“Opinion”) on the draft Youth Media Protection State Treaty (“Treaty”) in July 2024. It found that the draft violated the country-of-origin principle of the e-Commerce Directive and conflicted with the binding youth protection rules of the DSA.
The Opinion triggered a standstill period which, after an extension, ended on 5 August 2024.
The German federal states have now published a counterstatement to the Opinion. As part of this response, the draft Treaty was also reviewed and modified in some respects; for example, to ensure compatibility with the DSA, an exception was added to the draft Treaty clarifying that intermediary services within the meaning of the DSA do not fall within the scope of the Treaty. In the counterstatement, Germany also emphasizes its view that cooperation at the national and European levels is necessary to ensure the effective protection of children.
Importantly, the counterstatement clarifies that the provisions of the revised draft Treaty are linked not to specific categories of services but to their content. Furthermore, a distinction is drawn between content from within Germany and content from abroad, with stricter requirements applying to foreign content. It is hoped that these modifications will demonstrate to the European Commission that the Treaty takes a targeted, content-based approach, addressing the Commission’s previous criticisms that (i) the draft Treaty relied on applying general and abstract measures to services, and (ii) the country-of-origin principle of the e-Commerce Directive had been violated.
The revised draft of the Treaty will likely be adopted by the German legislative branch. If, however, the European Commission continues to consider that the Treaty violates the e-Commerce Directive or the DSA, it may choose to launch an infringement procedure against Germany before the European Court of Justice. While this appears to be a drastic measure, it cannot be ruled out at this stage.
Germany is one of the many Member States to have missed the 17 October 2024 deadline for transposing NIS2 into national law. It is nonetheless making significant strides towards implementation: in Q3 2024, the Federal Cabinet approved the draft bill of the German NIS2 Implementation Act (the “Implementation Act”).
The Implementation Act comprehensively modernizes and restructures German IT security law.
In line with the NIS2 requirements, the Implementation Act extends the obligations to implement cybersecurity measures and to report cyber-attacks to an increased number of companies across a broader range of sectors. As part of this extension, the previous one-stage reporting obligation for cybersecurity incidents will be replaced by a three-stage reporting system for in-scope companies. As NIS2 requires, an initial report must be submitted within 24 hours of becoming aware of a significant incident, an update within 72 hours, and a final report within one month. The German Federal Office for Information Security (BSI) will be given new supervisory and enforcement competences and instruments.
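For illustration only, these staged deadlines can be computed mechanically from the moment an entity becomes aware of an incident. The following minimal Python sketch is a hypothetical aid for internal incident-response tooling, not a statement of the legal position; in particular, it approximates the “one month” period for the final report as 30 days.

    from datetime import datetime, timedelta

    # Illustrative sketch of the three-stage NIS2 reporting timeline.
    # Assumption: "one month" is approximated here as 30 days; the legally
    # correct computation of that period is a matter for legal analysis.
    def nis2_reporting_deadlines(awareness: datetime) -> dict[str, datetime]:
        return {
            "initial report (early warning)": awareness + timedelta(hours=24),
            "update (incident notification)": awareness + timedelta(hours=72),
            "final report (approx. one month)": awareness + timedelta(days=30),
        }

    # Example: entity becomes aware of a significant incident on 3 Feb 2025, 09:00
    for stage, deadline in nis2_reporting_deadlines(datetime(2025, 2, 3, 9, 0)).items():
        print(f"{stage}: {deadline:%d %B %Y, %H:%M}")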
Notably, however, the Implementation Act does not go beyond the minimum harmonization standards required by NIS2, which may come as a welcome relief for businesses.
The Implementation Act is expected to enter into force during Q1 2025, with the BSI estimating that approximately 29,500 companies will be in scope. To help companies determine whether they are potentially affected while the legislative process is ongoing, the BSI offers support services in the form of an NIS2 applicability assessment tool and an FAQ catalog on NIS2 issues.
The UK Office of Communications (“Ofcom”) recently published an updated timeline for the various guidance and regulations under the UK Online Safety Act (OSA), which will give the OSA its bite. This follows the closure of numerous Ofcom consultations, which we covered in our previous updates (Q4 2023, and Q1 & Q2 2024). In addition, Ofcom has issued a fine (its first under the OSA) to TikTok for its failure to provide accurate information to the regulator.
What’s Next?
Organizations should keep an eye on Ofcom’s updated timeline for the OSA’s obligations and related guidance.
Since our last update on AI, there has been some (albeit rather slow) progress in the UK regulatory landscape. The AI (Regulation) Bill, introduced in November 2023 as a private members’ bill, lapsed with Parliament’s dissolution in July 2024; although the newly elected government pledged to implement “binding regulation” for companies developing the “most powerful AI models,” no substantial progress has been reported thus far.
While we may yet see comprehensive AI legislation from the UK government in the current parliamentary session, the ICO is (perhaps more reliably) expected to release draft guidance on GenAI shortly.
We previously discussed the key provisions of the UK’s Digital Markets, Competition and Consumers Act (DMCCA) during its infancy as a bill and as it passed into law. Since our last update, the Minister for Employment Rights, Competition and Markets has issued a statement setting out the Government’s DMCCA implementation timeline, and consultations from each of the UK CMA and the Department for Business and Trade (DBT) have closed.
The CMA consultation, which closed on 18 September 2024, focused on the draft consumer enforcement rules and guidance, which detail how the CMA proposes to exercise its new powers and functions under the DMCCA. In particular, the draft guidance explains how the CMA will generally conduct direct consumer enforcement investigations, while the draft procedural rules will be legally binding on the CMA when it exercises its powers.
The DBT consultation, which closed on 10 September 2024, focused on: (i) how turnover should be estimated or calculated; and (ii) when a person is considered to have control over an enterprise. These factors are relevant both for determining whether a business has “Strategic Market Status” and for determining the statutory maximum penalties payable by non-compliant entities or individuals.
The DMCCA is expected to be implemented in a staggered fashion, with key dates to watch falling in December 2024/January 2025, April 2025, and Spring 2026.
In relation to the closed consultations, the CMA will consider the responses it received before publishing its final guidance, and the DBT is expected to publish the final version of the regulations in the next few months.
As we previously discussed in our Q1 2023 alert, Part 1 of the Product Security and Telecommunications Infrastructure Act 2022 (the “PSTI Act”) introduces security requirements for consumer “Internet of Things” or “smart” devices made available in the UK (the “Products”).
Part 1 of the PSTI Act came into force on 29 April 2024, alongside secondary legislation (together, the “PSTI Regime”) that introduces the long-awaited security requirements applying to in-scope Products.
The secondary legislation sets out baseline security requirements for certain hardware and software products, including: (i) a ban on universal default passwords (passwords must be unique per product or user-defined); (ii) requirements to publish information on how to report security issues; and (iii) requirements to publish information on the minimum period for which security updates will be provided.
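Purely as an illustrative aid (and not a compliance determination), these three requirements can be framed as a simple self-check. All names and fields in the following Python sketch are hypothetical:

    from dataclasses import dataclass

    # Hypothetical self-check against the three baseline PSTI requirements
    # summarized above. Field names are invented for illustration; this is
    # not a substitute for legal analysis of the PSTI Regime.
    @dataclass
    class ProductSecurityProfile:
        uses_universal_default_password: bool        # (i) unique or user-defined passwords required
        security_issue_reporting_info_published: bool  # (ii) how to report security issues
        update_support_period_published: bool          # (iii) minimum security update period

    def psti_gaps(profile: ProductSecurityProfile) -> list[str]:
        gaps = []
        if profile.uses_universal_default_password:
            gaps.append("Replace universal default passwords with unique or user-defined ones")
        if not profile.security_issue_reporting_info_published:
            gaps.append("Publish information on how to report security issues")
        if not profile.update_support_period_published:
            gaps.append("Publish the minimum period for which security updates are provided")
        return gaps

    # Example usage
    for gap in psti_gaps(ProductSecurityProfile(True, False, True)):
        print("Gap:", gap)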
After determining whether they fall within the scope of the PSTI Regime as a manufacturer, distributor, or importer of the Products, organizations will need to bring themselves up to speed with the minimum security requirements. This may be easier for those that already meet certain industry standards, such as ETSI EN 303 645. Other organizations can leverage existing processes (such as policies for investigating and reporting vulnerabilities, administering security updates, or amending terms to provide support periods to customers) to meet the standards of the PSTI Regime.
Those falling under the PSTI Regime and providing the Products to the EU should also consider the scope of the EU Cyber Resilience Act, which may also apply.
We are grateful to the following members of MoFo’s European Digital Regulatory Compliance team for their contributions: Safwan Akbar and Abigail Pacey, London office trainee solicitors; and Lotta Stroehlein, Berlin office research assistant.