European Digital Compliance: Key Digital Regulation & Compliance Developments
To help organizations stay on top of the main developments in European digital compliance, Morrison Foerster’s European Digital Regulatory Compliance team reports on some of the main topical digital regulatory and compliance developments that have taken place in the second quarter of 2024.
This report follows our previous updates on European digital regulation and compliance developments for 2021 (Q1, Q2, Q3, Q4), 2022 (Q1, Q2, Q3, Q4), 2023 (Q1, Q2, Q3, Q4) and 2024 (Q1).
In this issue, we highlight…
1. Finally! The EU AI Act: published and coming into force
2. Right to Repair Directive formally adopted by the EU
3. Political Advertising Regulation formally adopted by the EU
4. European Media Freedom Act formally adopted by the EU
5. Louvain-la-Neuve Declaration: promotion of common EU rules on youth protection
6. European Court of Justice upholds country-of-origin principles applying to B2B services in the EU
7. Green Claims: EU Parliament and Council adopt positions for trilogue negotiations
8. Germany’s planned “Digital Violence Act”: Ministry of Justice publishes discussion paper
9. German Youth Media Protection Treaty amendment notified to the European Commission
10. Update on NIS2 implementation in Germany
11. Germany moves quickly on Digital Services Act implementation
12. UK Digital Markets, Competition and Consumers Act published and in force
13. UK Online Safety Act: ongoing updates
14. House of Commons report on AI governance
15. Dissolution of the UK Parliament: the legislative casualties of the general election
The EU Artificial Intelligence Act (“AI Act”) has finally been enacted. As we have repeatedly reported in the past (Initial Draft Regulation, April 2021; AI Regulation in Europe, October 2022; Political Agreement on the EU AI Act, December 2023; Parliament Approves AI Act, March 2024; and Copyright Compliance under the AI Act, June 2024), the AI Act is one of the EU’s current major pieces of legislation, placing extensive obligations on providers, deployers, importers, and distributors of AI systems and general-purpose AI models.
The AI Act was formally adopted by the Council on 21 May 2024 and was published in the Official Journal of the EU as Regulation (EU) 2024/1689 on 12 July 2024, thereby paving the way for its entry into force on 1 August 2024.
Even before the AI Act has fully come into force, the EU Commission and the EU member states have already been very actively working on its implementation. In June 2024, the first high-level meeting of the AI Board took place at the Commission. Created pursuant to Articles 65 and 66 of the AI Act, the AI Board comprises one representative per Member State and has the task of assisting the Commission in ensuring consistency and coordination of implementation between national competent authorities. At that inaugural meeting, the AI Board discussed its role and organizational matters, together with the necessary first steps for the implementation of the AI Act, initial deliverables, and priorities.
The Commission is also intensively preparing the process for drawing up the code of practice for providers of general-purpose AI models. This code of practice is a central tool for compliance with the obligations under the AI Act and, pursuant to Article 56(9) of the AI Act, the Commission’s AI Office must finalize the code of practice within nine months from the date of the AI Act’s entry into force.
Following its publication in the Official Journal of the EU on 12 July 2024, the AI Act will enter into force 20 days later, on 1 August 2024. Its provisions will then start taking effect step by step over the following 6, 12, 24, and 36 months. For more details on the timeline, please see our March 2024 client alert on the EU AI Act.
The EU has been working for some time on a “Right to Repair” (R2R) Directive that will impose enhanced obligations on manufacturers and sellers of goods (including digital products) to repair defective or broken products.
In our Q1 2024 update, we discussed the EU’s preliminary agreement on the R2R Directive reached by the Council, Parliament, and the Commission during their trilogue negotiations.
Since then, members of the European Parliament voted in favor of the draft proposal and the Council also gave its final approval. Finally, the R2R Directive was published in the Official Journal of the European Union on 10 July 2024.
The published Directive creates a number of tools and incentives to make repair more attractive to consumers (as opposed to replacement). These include:
As the R2R Directive has now been published, it will enter into force on the 20th day after the publication, i.e., 30 July 2024, and Member States will then have 24 months to transpose the Directive into national law.
As reported in our Q4 2023 update, the EU institutions had reached an agreement on proposed new rules regarding political advertising in the form of a “Regulation on the Transparency and Targeting of Political Advertising” in November 2023.
Since then, the new rules were published in the EU Official Journal as Regulation (EU) 2024/900 in March 2024 and entered into force on 9 April 2024. Key provisions regarding definitions and non-discrimination are already in effect.
The new law aims to increase transparency in political advertising, both online and offline, across elections, referendums, and EU/Member State legislative procedures. It applies to advertising services as well as publishers.
Among other things, the regulation requires in-scope services to provide mechanisms for declaring involvement in political advertising, publish clear transparency notices on political ads, and disclose political advertising income in their annual financial statements. The lawmakers also introduced restrictions on specific targeting and amplification techniques in political advertising.
All substantive obligations under the new regulation will start to apply in October 2025. Also, by January 2026, EU Member States must establish and notify the European Commission of their specific rules for enforcing compliance with these obligations. These rules may include financial penalties for breaches of up to 6% of the annual income/budget of the advertiser or provider of advertising services (whichever is higher) or 6% of their annual global turnover.
The EU has taken significant steps to bolster media pluralism and independence in the EU with the European Media Freedom Act (EMFA), which is intended to uphold the EU’s status as the torchbearer of free media as a crucial pillar of democracy and a healthy market economy. See our previous reporting in our Q2 2022, Q3 2022, and Q4 2023 updates.
Since our most recent update, the EMFA was formally adopted by the European Council, published in the Official Journal as Regulation (EU) 2024/1083, and entered into force on 7 May 2024.
To fulfill its lofty ambitions, the EMFA introduces a diverse set of rules. It is set to fortify the landscape of media operations by ensuring editorial independence, protecting journalistic sources, and maintaining the autonomy of public service media.
Additionally, the EMFA will enhance media ownership transparency, defend against unjustified content removal by very large online platforms, and introduce measures for customization of media on devices, while also increasing transparency in state advertising and audience measurement, and requiring Member States to assess the impact of significant media market concentrations.
Media providers will need to comply with the majority of relevant new EMFA obligations from 8 August 2025 onwards, with select obligations (e.g., regarding devices and user interfaces) subject to longer transitional periods.
Effective and consistent application of the new framework will be ensured by the new independent European Board for Media Services (EBMS), which is scheduled to assume operations in February 2025. The EBMS will replace the current European Regulators Group for Audiovisual Media Services (ERGA), which had been set up under the Audiovisual Media Services Directive (AVMSD).
In April 2024, the European Council issued its Louvain-la-Neuve Declaration to promote a safer, more responsible, and trustworthy online environment across Europe, particularly through improvements in the field of youth protection online. While not legally binding, the declaration is intended to serve as a catalyst for future EU political initiatives and legislative actions.
A notable emphasis of the declaration lies in safeguarding children on the internet. It is based in particular on findings of pervasive harmful online content and systemic risks inherent in social media platforms, which especially impact children and adolescents who encounter these threats daily. The Council also considers that children are increasingly susceptible to manipulative and behavioral advertising tactics.
Furthermore, the declaration underscores the need to enhance cybersecurity measures by combatting phishing attacks, fortifying data traffic security, and limiting instances of online harassment and identity theft, which can potentially lead to financial fraud. Mitigating the proliferation of fake profiles, which poses a significant threat to online democratic processes, is also a key objective.
To these ends, the declaration suggests four key measures:
Currently, there is no unified European legislation governing online protection for children and adolescents.
While this declaration represents a foundational step towards such regulatory coherence, it lacks direct regulatory force.
The Council does not have powers to trigger new legislative initiatives at the EU level. It is thus unlikely that the declaration will trigger concrete changes of the law for the time being. At the same time, it is possible that Member States will reshape their own enforcement efforts regarding youth protection with one eye on the approaches highlighted by the declaration.
In the current climate of ever-increasing national regulation of information society services, recent rulings by the Court of Justice of the European Union (CJEU) have reaffirmed and strengthened the country-of-origin principle as set out in Article 3, paragraphs 1 and 2 of the e-Commerce Directive (cases C-662/22 & C-667/22, C-663/22, C-664/22 & C-666/22, and C-665/22). This principle provides that an information society service is only subject to the national regulations of the Member State in which it is established, and that other Member States cannot restrict the free movement of this provider by requiring compliance with obligations that exceed those that apply in the provider’s own Member State of establishment.
Several information society service providers established in Member States outside Italy challenged amendments to the national legal framework adopted by Italian authorities that were intended to ensure the application of the EU Platform-to-Business Regulation (“P2B Regulation”). The adopted measures require providers of online intermediation services, on pain of penalties, to be entered in a register maintained by the Italian Communications Regulatory Authority, to communicate certain information to that authority, and to pay a contribution to finance its supervisory activities.
In its rulings of 30 May 2024 in these cases, the CJEU declared the measures to be incompatible with the country-of-origin principle, applying a broad definition of its scope of application and a narrow interpretation of its exceptions.
The country-of-origin principle encompasses all legal requirements of Member States to which service providers must adhere when providing an information society service. The CJEU has determined that national obligations constitute such requirements even if:
Furthermore, the CJEU ruled that national measures adopted with the declared aim of ensuring the enforcement of an EU regulation may only be exempted from the country-of-origin principle if there is a direct link between the objective of that EU regulation and the objectives listed in Article 3, paragraph 4 of the e-Commerce Directive (e.g., safeguarding public safety and protection of minors).
These decisions follow a November 2023 judgment by the CJEU which confirmed that Member States may only deviate from the country-of-origin principle by way of case-by-case exemptions, and not by way of abstract and general legislation (case C-376/22).
These decisions confirm and strengthen the country-of-origin principle as a fundamental tenet of the EU internal market and contribute to legal certainty and clarity for information society services operating internationally. Member States seeking to regulate information society service providers established outside their jurisdiction will face significant challenges following these judgments.
We reported in our Q1 2023 update that the European Commission proposed, as part of the EU’s Green Deal, a Green Claims Directive that will oblige companies to verify their environmental claims before advertising products or services. Both the EU Parliament (see press release) and the Council (see press release) have now adopted their positions for the trilogue negotiations on this issue.
Both the EU Parliament and the Council agree with the general concept of the proposal for the Green Claims Directive, i.e., that environmental marketing claims must be pre-approved by an officially accredited third party before advertising products or services. The differences between the adopted positions only concern certain nuances.
Under the EU Parliament’s position, environmental claims based solely on carbon offsetting schemes would be prohibited, and companies would only be allowed to mention offsetting and carbon removal schemes in their ads if they have already reduced their emissions as much as possible and use these schemes for residual emissions only. According to the Council’s position on this topic, carbon offset claims would only be allowed if companies prove a net-zero target, show progress towards decarbonization, and disclose the percentage of total greenhouse gas emissions that have been offset.
Both the EU Parliament and the Council also proposed some simplifications for the verification process with respect to environmental marketing claims. Based on the EU Parliament’s position, claims would be assessed within 30 days, but simpler claims and products could benefit from quicker or easier verification. According to the Council’s position, certain types of environmental claims would be eligible for a simplified procedure for the verification process.
The adopted positions by the EU Parliament and the Council will form the basis for the trilogue negotiations with the EU Commission. Due to the recent European elections in June 2024, negotiations are expected to begin in the new legislative cycle (likely in autumn 2024).
As previously noted in our Q2 2023 update, the German Ministry of Justice is planning to introduce a new “Digital Violence Act.” In autumn 2023, two court rulings from the German Federal Court of Justice and the European Court of Justice had substantial implications for the planned legal framework of the Digital Violence Act. Consequently, the Federal Ministry of Justice organized an expert discussion in May 2024 and published a corresponding discussion paper to outline unresolved issues for its upcoming draft.
The Federal Ministry of Justice had originally planned to oblige social networks and messenger services to disclose IP addresses of perpetrators of digital violence to their victims. However, due to regulations prohibiting the retention of traffic and location data, IP addresses can only be stored for a maximum of seven days and, in some instances, they are not retained at all in practice. The Ministry is thus contemplating whether to introduce an obligation for these platforms to temporarily retain the relevant data.
Another problem is identifying perpetrators and taking action against them if they are located abroad. In this context, the Federal Court of Justice had determined that German courts lack jurisdiction when a German retailer seeks access to data held by a platform based outside of Germany. In its discussion paper, the Ministry suggests alternative approaches, but also cites considerable uncertainty regarding the jurisdiction of German courts.
Further, the discussion paper raises the issue of the use of court orders vis-à-vis online platforms to suspend the accounts of alleged perpetrators of digital violence. As social networks and online platforms are already obliged to suspend accounts under specific conditions without judicial oversight under Article 23 of the Digital Services Act, the Ministry has doubts whether German law can still address this issue.
Finally, the Ministry had planned to require that service providers appoint an authorized recipient for the service of official documents in Germany. However, following the European Court of Justice’s November 2023 ruling regarding the EU country-of-origin principle, this obligation can now only apply to providers from third countries outside the EU. That said, the discussion paper also asks whether workarounds might be possible regarding this issue.
An actual legislative draft for the proposed Digital Violence Act is still nowhere to be seen because the Ministry’s work has been significantly impacted by the recent rulings from both the Federal Court of Justice and the European Court of Justice discussed above.
Nonetheless, the Ministry seems to be committed to addressing these emerging issues and integrating the implications of the case law into its ongoing legislative work, aiming to present a draft later this year.
As previously noted in our Q4 2023 update, the German federal states have finalized discussions on an amendment to the German Youth Media Protection Treaty to introduce mandatory device-level parental controls regarding the installation of, and access to, apps.
In April 2024, the proposed amendment was formally submitted to the European Commission, as is required for all technical rules to be adopted at Member State level that would affect online services.
The notified wording remains largely unchanged compared to the most recent update. The proposed amendments aim to enhance the protection of minors and facilitate the easier implementation of parental control systems. In essence, the amendment would require providers of operating systems designated by the German State Media Authorities to implement new parental control mechanisms that comply with the prescriptive requirements of the draft. These mechanisms must allow parents to set a specific age level and, if activated, block installation of, and access to, all apps that are unsuitable for users below that age level. To facilitate this mechanism, app stores must have machine-readable age ratings for all their apps, or they will be blocked as well. Further restrictions also apply to the use of web browsers.
On 1 July 2024, the Commission adopted a detailed opinion on the draft, finding that it would violate the country-of-origin principle of the EU E-Commerce Directive, and that it would further conflict with the youth protection rules of the EU Digital Services Act (DSA). The Commission noted that the latter conclusively regulates the matter of youth protection for all services in scope of the DSA, leaving no room for separate Member State rules.
The notification of the draft amendments to the Commission triggered a standstill period which was originally set to expire on 4 July 2024 and was extended until 5 August 2024 through the issuance of the Commission’s detailed opinion. While this detailed opinion forces Germany to respond to the Commission’s findings, it does not formally bar the German federal states from moving forward with the initiative.
Upon expiry of the extended standstill period, the amendment is thus still likely to be adopted by the heads of the German federal states in October 2024, regardless of the Commission’s opinion. It can then enter into force subject to ratification by all 16 German state parliaments. The only further measure available to the Commission would then be initiating infringement proceedings against Germany, which could eventually take this matter before the European Court of Justice.
The NIS2 Directive, effective from January 2023, sets out enhanced cybersecurity obligations and expands the range of entities that fall under its purview, compared to its predecessor, the NIS Directive.
EU Member States are required to incorporate the Directive into their national legislation by October 2024 (see our Q1 2023 update).
In May 2024, the German Federal Ministry of the Interior (BMI) released a draft law (NIS2 Umsetzungsgesetz) to transpose the NIS2 Directive into German national law. Subsequently, an updated draft was shared with stakeholders in June 2024, reflecting ongoing consultations and adjustments.
The German draft law seeks to align with the NIS2 Directive by broadening the scope of entities covered. It categorizes services into three groups: critical entities, essential entities, and important entities. The draft law stipulates comprehensive risk management measures, the possibility of prohibiting the use of certain critical components, a registration requirement for operators, and an obligation to report significant safety incidents.
It also outlines the role of the Federal Office for Information Security (BSI) as the competent regulator and main point of contact, with extensive investigative powers for enforcement. The potential maximum fines for non-compliance are substantial, reaching up to €10 million or 2% of global annual turnover for critical and essential entities, and €7 million or 1.4% of global annual turnover for important entities.
The BMI’s draft was adopted by the Federal Government largely unchanged on 24 July 2024.
The government draft will now undergo discussions in parliament, which are likely to commence only after summer recess. With that, it is unlikely that Germany will meet the October 2024 deadline to transpose the NIS2 Directive into national law. Once approved by parliament, the law will be signed and will come into effect upon its publication in the Federal Law Gazette.
In Germany, the Digital Services Act (DSA) implementing rules (Digitale-Dienste-Gesetz or DDG) have been published in the Official Journal and have been in force since 14 May 2024. Previously approved by the Bundestag in March 2024, the new law contains the necessary framework for German authorities – in this case, the Federal Network Agency (Bundesnetzagentur) as well as other specialized agencies – to enforce the DSA against service providers subject to German jurisdiction under the DSA. See our previous reporting in our Q1 2024 update.
The Federal Network Agency did not lose any time in displaying its shiny new toys to the public, providing access to new tools on its website right as the DDG entered into force. Stakeholders can now find information on a host of topics, including out-of-court settlement procedures, information for providers, and data access for scientists. In addition, organizations registered in Germany can now apply for trusted flagger status through the website, and perhaps most importantly, users can now lodge complaints through the Federal Network Agency’s own complaints portal.
At the same time, the DDG is not a toothless tiger. This is proven by the new Section 33 which stipulates fines for various administrative offenses. Mainly, these offenses relate to violations of information, disclosure, verification, and access obligations under the DDG and the DSA as well as the P2B Regulation. Applicable fines can reach up to €300,000 and, in more serious cases, even allow for providers to be fined up to 6% of their annual turnover.
As further EU Member States adopt relevant implementing laws, enforcement at the Member State level regarding the respective in-scope services is now expected to ramp up. However, it remains to be seen whether national authorities will display the same ambition as the European Commission, which is already fully engaged in enforcing the DSA against providers under its jurisdiction (i.e., very large online platforms and very large search engines).
In our Q1 2024 update, we discussed the UK’s Digital Markets, Competition and Consumers (DMCC) Bill and its key provisions. Since then, the DMCC Bill received Royal Assent on 24 May 2024 and has passed into law as the DMCC Act.
The DMCC Act contains a number of changes from the DMCC Bill. These include:
The DMCC Act is expected to come into force in autumn 2024, but the precise timing will be guided by the new UK government’s priorities.
Ofcom, the regulator for the UK Online Safety Act (OSA), continues to prepare for the legislation to come into force.
Apart from closing the call for evidence intended to inform Ofcom’s codes of practice and guidance on the additional duties applicable to in-scope businesses under the OSA, Ofcom has launched a consultation on protecting children from harms online.
Before the end of 2024, Ofcom is also expected to publish an additional consultation and specific proposals on its use of “tech notices” to require service providers, in certain circumstances, to use accredited technologies to deal with some kinds of illegal content.
Phase 3 – Call for Evidence was closed on 20 May 2024
As we covered previously in our Q1 2024 update, Ofcom was seeking evidence from stakeholders to inform its codes of practice and guidance on the additional duties imposed on in-scope businesses by the OSA. Additional duties will apply to the most widely used online sites and apps, which are designated as “categorized” services. They will fall within Category 1, Category 2A, or Category 2B depending on their service type, functionality, and user numbers.
Although the Secretary of State is responsible for passing the secondary legislation setting the thresholds for the respective categories under the OSA, Ofcom’s recommendations must be taken into account. After the secondary legislation is passed, Ofcom will collect information from the regulated services and publish a register of categorized services.
Consultation on protecting children from harms online
In May 2024, Ofcom published draft codes of practice and guidance that outline how it expects online services to meet their legal responsibilities under the OSA to protect children online.
The consultation documents set out the following steps that online service providers need to take in order to protect children:
Ofcom suggests more than 40 measures in its draft Children’s Safety Codes to make user-to-user and search services safer by design. The measures include:
After the end of the consultation process in July 2024, Ofcom will assess the responses and finalize its proposals by spring 2025.
Later this year, Ofcom plans to launch a consultation on how automated tools (such as AI) can be used to proactively detect illegal and other content harmful to children (such as child sexual abuse material and content encouraging suicide and self-harm).
Additionally, the regulator aims to publish specific proposals on its use of “tech notices” to require providers, in certain circumstances, to use accredited technologies to tackle specific kinds of illegal content.
The House of Commons Science, Innovation and Technology Committee (the “Committee”) published its Governance of Artificial Intelligence (AI) Report (“Report”) in May 2024, examining domestic and international developments in AI governance and regulation since its August 2023 interim report. The Report also revisits the 12 Challenges of AI Governance identified in the interim report and suggests ways to address them.
The Report recognises the uncertainties in shaping the future of the UK’s AI governance framework, particularly in balancing AI risks and opportunities. Key Report recommendations for the new UK government to implement include:
The Committee emphasises that the issues and recommendations addressed in the Report are applicable to any future government and therefore should be considered by the new Labour government.
Notably, the new UK government’s manifesto includes some indications for the future AI regulatory landscape, such as robust regulation to prohibit the creation of sexually explicit deepfakes and to regulate companies developing “the most powerful AI models”. However, the King’s Speech on 17 July 2024 did not contain any proposals for an AI bill or UK legislation equivalent to the EU’s AI Act.
After the former UK prime minister called for a snap election, Parliament was dissolved on 30 May 2024 to pave the way for a newly elected Parliament. Before dissolution, pending bills faced their fate during the “wash-up” period, which determined whether they would be passed or would fall away. Any bills that did not survive the wash-up period could still be revived in a subsequent Parliamentary session but would require freshly drafted legislation.
Despite being close to the final legislative stages, the Data Protection and Digital Information Bill (“DPDI Bill”) did not make it past the finish line during the wash-up period. We previously covered the DPDI Bill in our Q1 2023 update; the bill sought to serve as the new UK GDPR by regulating access to, and processing of, personal information, including in areas such as automated decision-making.
Other bills that failed to survive the wash-up period include the Criminal Justice Bill, which addressed various matters such as creating an offence of causing death by dangerous cycling, and the Tobacco and Vapes Bill, which would have banned anyone born after 2009 from buying cigarettes. As we previously discussed, one digital bill that did succeed during the wash-up period was the Digital Markets, Competition and Consumers Bill.
Numerous commentators expect the DPDI Bill to return in some form with a key question being if it will look hauntingly familiar or if it will more closely align with new EU legislation such as the EU AI Act or Data Act. Lord Clement-Jones has suggested that the Labour government’s new digital bill could come in the autumn “along entirely different lines” and would include provisions on AI (although the government has expressly not proposed a separate AI Act).
We are grateful to the following member(s) of MoFo’s European Digital Regulatory Compliance team for their contributions: Michal Pati and Jacqueline Lee, London office trainee solicitors, and Nina Graw and Lotta Ströhlein, Berlin office research assistants.