European Digital Compliance: Key Digital Regulation & Compliance Developments
To help organizations stay on top of the main developments in European digital compliance, Morrison Foerster's European Digital Regulatory Compliance team reports on key digital regulatory and compliance developments that took place in the second quarter of 2023.
This report follows our previous updates on European digital regulation and compliance developments for 2021 (Q1, Q2, Q3, Q4), 2022 (Q1, Q2, Q3, Q4) and 2023 (Q1).
In this issue, we highlight further examples of the growing regulatory divergence between the EU and UK on digital issues. While the EU's Digital Markets Act and Digital Services Act are in force, the UK has started the process of implementing its own regulatory regime for digital markets with the publication of its Digital Markets, Competition and Consumers Bill, which will both affect digital providers and increase the enforcement of consumer laws generally by regulatory authorities. And, while the EU is close to agreement on a new EU Data Act and EU AI Act, the UK continues to push forward its controversial Online Safety Bill.
1. Digital Markets Act: Next steps
2. Digital Services Act: Update regarding designations as VLOP/VLSE
3. EU Recommendation on online piracy of live content
4. EU proposal for a new Regulation on Standard-Essential Patents
5. The EU Data Act: Political agreement reached
6. UK Digital Markets, Competition and Consumers Bill: Key competition and consumer aspects
7. Online Safety Bill progress: The “Trolling” offence
8. Greenwashing clampdown: ASA takes tough stance on misleading environmental claims
9. German Digital Violence Act: Key points published
10. EU AI Act: European Parliament adopts its position
On 2 May 2023, the Digital Markets Act ("DMA"), the EU's digital gatekeeper legislation, became fully applicable (see our most recent Q4 2022 update). The corresponding Implementing Regulation, which specifies the form of notifications, requests, reports and other submissions of information, as well as the opening of proceedings, became applicable at the same time.
By 3 July 2023, potential gatekeepers must have notified their core platform services to the Commission if they meet the DMA’s quantitative thresholds – meaning, if they have (i) an annual turnover of at least €7.5 billion in the EU or a market capitalization of at least €75 billion and (ii) at least 45 million monthly active end users and 10,000 yearly active business users.
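Purely as an illustration, the two-limb quantitative test above can be expressed as a simple check. This is a sketch based only on the figures quoted in this update, not a legal assessment (the DMA's full test includes further conditions); the function and parameter names are our own:

```python
# Illustrative sketch of the DMA's quantitative gatekeeper thresholds
# as summarized above. Not a legal assessment; names are our own.

def meets_dma_thresholds(
    eu_turnover_eur: float,
    market_cap_eur: float,
    monthly_end_users: int,
    yearly_business_users: int,
) -> bool:
    """Return True if a core platform service meets both limbs of the
    quantitative thresholds described in this update."""
    # Limb (i): EU annual turnover of at least EUR 7.5bn
    # OR market capitalization of at least EUR 75bn.
    financial_test = eu_turnover_eur >= 7.5e9 or market_cap_eur >= 75e9
    # Limb (ii): at least 45m monthly active end users
    # AND at least 10,000 yearly active business users.
    user_test = (
        monthly_end_users >= 45_000_000
        and yearly_business_users >= 10_000
    )
    return financial_test and user_test
```

For example, a provider with EUR 8bn of EU turnover, 50 million monthly end users and 12,000 yearly business users would meet both limbs and fall within the notification duty.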
Once the Commission receives a notification, it will assess whether the respective business meets the thresholds and, if so, designate the company as a gatekeeper (by 6 September 2023).
Designated companies will then have six months to ensure that they comply with a list of obligations and prohibitions under the DMA, at the latest by 6 March 2024. These main obligations include data access and data use rules, prohibitions on self-preferencing and bundling, and interoperability obligations (see our DMA client alert for details).
Subsequently, the designated gatekeepers must issue a report demonstrating their effective compliance. The Commission is currently preparing a draft template for the compliance reports to ensure that they include all of the relevant information needed by the Commission to assess the effective compliance of designated gatekeepers with the DMA.
Other, parallel legislation increasingly uses gatekeeper status under the DMA and/or the corresponding obligations as a reference point. At the EU level, this first became apparent with the proposed Data Act (see The EU Data Act: Political agreement reached), which suggests excluding DMA gatekeepers from certain data sharing options.
In Germany, the proposed implementation for the EU Representative Actions Directive (see our Q1, 2023 update) and the recently adopted implementation of the EU Whistleblower Protection Directive both include references to the DMA (see our alert). This means that violations of the DMA may be subject to legal action by consumer protection associations in the future and that it will become easier for employees to report such violations anonymously.
The EU is getting ready to enforce its new content moderation rules under the Digital Services Act (“DSA” – see our Q4, 2022 update).
The Commission designated a first set of 17 “Very Large Online Platforms” (“VLOPs”) and two “Very Large Online Search Engines” (“VLOSEs”) under the DSA on 15 April (see Commission’s press release). These services were so designated because they manifestly crossed the threshold of 45 million monthly active users, based on the user numbers that all online platforms in scope of the DSA had to publish by February 2023.
VLOPs and VLOSEs are subject to the highest tier of DSA obligations (see our previous DSA client alert for more detail on the obligations). They will also be under the direct supervision of the Commission, whereas the DSA will be enforced against all other in-scope services by the EU Member States.
Companies designated as VLOPs and VLOSEs must comply with their DSA obligations within four months after their designation – which, judging from the Commission’s press release, may be by late August 2023. The Commission has been inviting VLOPs and VLOSEs to participate in “stress tests” of their content moderation systems before the DSA takes effect, to ensure that platforms are ready to comply.
For all other services in-scope of the DSA, the new rules will apply from February 2024. EU Member States are currently in the process of appointing their enforcement authorities and, in particular, their “Digital Services Coordinators”. Relevant implementing laws will also establish procedural rules and the impact of the DSA on existing Member State laws in the areas now covered by the DSA (see our Q1, 2023 update on the German implementation efforts).
The European Commission is taking (small) steps to combat online piracy of sports and other live events, such as concerts and theatre performances. The aim is to strengthen the competitiveness of the EU sport and creative industries. But, so far, the Commission has only issued a Recommendation, rather than any more formal legislative proposals.
In May 2023, the Commission published a non-binding Recommendation on combating online piracy of sports and other live events. Based on a prior call for evidence and stakeholder meetings, the Commission has examined the issues surrounding unauthorized online retransmissions of sports and other live event broadcasting.
To the disappointment of some in the industry, the Recommendation introduces neither any new legal instruments for rights-holders nor any additional obligations for intermediary service providers. Instead, it encourages stakeholders to make use of existing legal tools.
The Commission will closely monitor the measures taken under the Recommendation and intends to establish clear key performance indicators. The Recommendation's impact and the potential need for additional measures will be re-assessed by November 2025.
Rights-holders have already expressed their dissatisfaction with the non-binding nature of the Commission’s current approach and are likely to step up lobbying for more effective legislative action.
In April 2023, the European Commission proposed a new EU Regulation on Standard-Essential Patents (“SEPs”), with the goal of making the SEP licensing system more transparent, predictable and efficient. This proposal brings significant changes to patent holders, patent applicants of SEPs, and implementers of SEPs in the EU.
The Commission suggests setting up a competence center at the European Union Intellectual Property Office (“EUIPO”) that will have numerous new and far-reaching tasks and competences. To that end, the draft Regulation provides for:
The EUIPO will administer an electronic register and an electronic database containing detailed information on SEPs in force in one or more Member States, including essentiality check results, opinions, reports, available case law and results of studies specific to SEPs.
SEP holders will be subject to notification duties (e.g., notifying the EUIPO of their SEPs in force or the aggregate royalty for the SEPs). According to the proposal, an SEP that is not registered at the EUIPO within a given time limit may not be enforced before national courts or the Unified Patent Court (“UPC”) in relation to the implementation of the standard for which a registration is required.
The EUIPO shall also annually select, for essentiality checks, a sample of registered SEPs from different patent families of each SEP holder with regard to each specific standard in the register.
The proposal also sets out a mandatory – but non-binding – FRAND determination procedure at the EUIPO prior to any initiation of an SEP infringement claim before a competent court of a Member State or the UPC. Within nine months, the EUIPO shall issue a decision on FRAND determination in the respective case.
The Commission's proposal is extremely ambitious and far-reaching. It can be expected to meet resistance in the legislative process and may therefore be subject to numerous amendments. In particular, the comprehensive legal monopoly that the EUIPO would receive is a major point of criticism. Against the background of the upcoming European elections in June 2024, the proposal is not expected to be implemented before 2025/2026.
We previously wrote about the EU Data Act (the “Act”), which was first proposed in February 2022 as part of the EU Commission’s strategy for data and on which the European Parliament and the Council of the EU have now reached political agreement.
What’s New?
The European Parliament and the Council of the EU reached political agreement on the Act on 28 June 2023.
The Act intends to facilitate access to industrial data to "ensure that the benefits of the digital revolution are shared by everyone". In pursuit of this objective, the Act introduces, among other things, a range of provisions on access to and sharing of such data.
Read more about the scope and impact of the Act in our previous article, with discussion of the potential UK impact.
This Act still needs to receive formal approval from the European Parliament and the EU Council (but it is unlikely that there will be any significant changes made to the Act at this point). After adoption, the Act will enter into force 20 days after it has been published in the Official Journal of the European Union. It will then become directly applicable to everyone in scope 20 months after it has entered into force.
Following the introduction of the EU’s DMA and in line with the increasing trend toward regulating big tech players, the UK government has introduced the Digital Markets, Competition and Consumers Bill (the “Bill”). The Bill introduces regulation specific to big tech players and separately overhauls the UK’s existing consumer protection regime.
Expected to come into force in the second half of 2024, the Bill is the UK government’s response to criticisms that existing regulatory frameworks: (i) do not provide a sufficient toolkit to regulate big tech players from a competition law perspective, and (ii) do not adequately protect consumers from unfair practices in general.
The UK Competition and Markets Authority (“CMA”) will be given new powers (in line with its existing competition powers) to take action against breaches of consumer protection law. The Bill will also amend existing merger control regulations and increase the CMA’s investigative and enforcement powers more generally.
These proposed reforms have followed a series of studies and reports detailing concerns around the concentration of market power among big tech players. The CMA has long lamented its inability to directly and effectively enforce consumer protection laws in the same way as it can in relation to competition rules. While the digital markets aspects of the Bill are targeted solely at big tech players, the wider consumer protection aspects impact a broad range of businesses operating in the UK.
Consumer behaviours are adapting to the constant evolution of technology, with consumers increasingly taking advantage of the ever-expanding digital world to purchase the majority of their goods and services online. The consumer-focussed sections of the Bill aim to bolster enforcement of consumer protection law through a number of new provisions.
The CMA is expected to publish guidelines on interpretation and enforcement once the legislation is in place – while, in the meantime, it continues to focus on the digital markets sector as a key priority, using its existing powers to address the policy objectives behind the Bill.
In terms of the Bill’s progress through Parliament, various stakeholders are currently submitting evidence regarding the Bill’s provisions to be considered in Committee debates. A number of big tech players are (unsurprisingly) lobbying to curb the CMA’s “very open-ended powers” while the CMA defends its position that the legislation is essential to safeguard the competitiveness of UK markets in the future by ensuring that the Bill enables “current incumbents [… to be] knocked off their perch by tomorrow’s innovators”.
The UK’s Online Safety Bill (“OSB”) is progressing through the legislative process, still undergoing a number of amendments as it inches toward receiving Royal Assent, with many previously controversial provisions left intact and new protections introduced.
See more about the OSB in our previous client alerts on the first draft of the Bill in 2021, its first introduction in 2022 and our most recent update in March 2023.
With the OSB now having reached the committee stage of the parliamentary process, the government is still grappling with wave after wave of amendments. One recent amendment includes the introduction of a new criminal offence targeting individuals who publish online content that encourages serious self-harm – but this may well change, as further amendments are anticipated during the final days of the committee stage. Separately, Ofcom has engaged in industry dialogue to better inform the exercise of its anticipated powers before they come into effect.
The UK Ministry of Justice has recently introduced to the OSB a new offence of encouraging or assisting serious self-harm, with perpetrators facing a maximum of five years' imprisonment upon conviction. The government recognized that there was a gap in the law in relation to the encouragement of non-fatal self‑harm, and consulted extensively with interest groups and campaign bodies to draft a new, targeted offence. This follows the Law Commission's 2021 review, which warned of the serious effects of targeted "trolling" (internet slang for the dissemination of inflammatory messages to provoke an emotional response).
The new offence would make it a crime to encourage someone to cause serious self-harm, irrespective of whether the victim actually goes on to injure themselves, and regardless of whether they were directly targeted by the perpetrator or not. This means that “trolling” should be squarely caught within this offence, because the perpetrator need not know who they are targeting. These amendments have now been accepted in the House of Lords, with members of the UK Parliament making clear that these new offences will criminalize the most damaging communications, without compromising the freedom of expression.
It was originally anticipated that the OSB would receive Royal Assent and become an Act of Parliament in April 2023. However, the timeline for reaching this milestone has been revised to Autumn 2023. There is still scope for further changes to be introduced to the OSB, particularly after the penultimate stage where each House of Parliament considers the other's amendments. As emphasized by the House of Lords, the challenge of legislating for a fast-moving industry that is fuelled by new technologies will remain an increasingly important, yet difficult, obstacle to overcome.
Separately, Ofcom, the regulator that will ultimately be responsible for the OSB, recently revised its “roadmap to regulation” (the first iteration of which was published in July 2022) to provide updated timeframes for its future consultations and the publication of its guidance. Additionally, to prepare for its expected powers under the Bill (an overview of which can be found in our previous article), Ofcom has been developing its understanding of user-generated fraud and illegal harms by engaging in discussions with tech platforms and carrying out further research to inform its policy and attitude to enforcement in these areas. Industry respondents have highlighted the difficulty of creating moderation policies that are narrowly and specifically construed, but that remain wide enough to be applicable worldwide, and also voiced concerns over the particular challenges that are posed by regulating livestreamed and ephemeral content. The detailed results of this consultation, together with drafts of codes of practice, enforcement guidelines and further guidance that have been prepared during the OSB’s legislative journey, are expected to be published shortly after the commencement of Ofcom’s powers.
The UK’s Advertising Standards Authority (“ASA”) has put its greenwashing guidance into practice by releasing rulings against three multinational companies for misleading online consumers about the true environmental impact of their respective initiatives. At the same time, the ASA has issued an update to its guidance about greenwashing. The ASA continues to monitor the digital and online advertising space, and we don’t expect the rate of these rulings to slow down given the ASA’s focus on cracking down on the greenwashing phenomenon. This mirrors similar increased regulatory efforts related to greenwashing by the EU.
In the UK, the ASA is the independent advertising regulator that makes sure that ads across UK (online and offline) media stick to the Advertising Codes. The Committee of Advertising Practice (“CAP”) is the sister organization of the ASA and is responsible for writing the Advertising Codes.
In March 2023, we wrote about the new advertising guidance ("Guidance") introduced by the ASA and CAP, aimed at clamping down on companies that use "greenwashing" in their adverts (see more in our previous client alert).
Since then, several companies have been on the receiving end of this crackdown against “greenwashing” as the ASA establishes its more rigorous stance on the Guidance.
The ASA continues to focus on claims that exaggerate the positive environmental impact of a business. Even if a business invests in lower carbon alternatives and commits to sustainability targets, the ASA will take a hard-line approach if that business still contributes overwhelmingly to greenhouse gas emissions. The ASA assesses environmental claims with a balanced view of a company's overall operations and investments.
These rulings and updated guidance demonstrate that it’s not enough to have only minor green initiatives as evidence for environmental claims where a significant proportion of business operations completely undercut those claims. This is in line with recent legislative proposals at the EU level related to greenwashing and dealing with requirements for green claims (see our previous client alert and update).
The ASA will continue to monitor environmental claims for the next three months to assess the impact of the Guidance. If the monitoring reveals that businesses continue to make poorly substantiated environmental claims in their advertising, then the ASA will request a review by CAP to provide further guidance on how to substantiate these claims, which will include detail from expert insights and policy developments.
At the EU level, the legislative process regarding the current proposals will continue, but enforcement against greenwashing in EU Member States will also continue on the basis of the current rules on misleading adverts (see examples in our previous client alert).
Watch this space for the latest developments at EU level and more rulings that are likely to be produced as the ASA remains vigilant against misleading environmental claims, especially in digital and online advertising.
In April 2023, the German Ministry of Justice published key points for its upcoming legislative draft for a new “Digital Violence Act” or “Gesetz gegen digitale Gewalt”. This aims to enhance existing content moderation regimes with specific provisions to help victims of “digital violence” with the private enforcement of their rights.
According to the key points, the German government intends to ensure that anyone who experiences “digital violence” (i.e., an infringement of personal rights in the digital space) can effectively defend their rights in court. It is based on the finding that existing law does not adequately meet this requirement. For example, there is currently no method to quickly acquire information about the identity of the perpetrator or author of illegal content, and existing laws also lack an effective instrument for protection against repeat or notorious infringers.
While the exact scope of the proposed rules is not yet entirely clear, they will likely apply to all providers of online services on which user-generated or similar third-party content is available. They will also likely capture any violation of a victim's statutory rights, including, most notably, the personality rights of natural persons or businesses. The rules could also apply to violations of intellectual property rights.
In terms of relevant obligations, the key points paper suggests three key enhancements to private enforcement by victims of digital violence.
The Ministry of Justice held a public consultation on the key points for the upcoming legislative proposal and made the comments that it received available on its website.
Considering these comments, the Ministry of Justice will now finalize this proposal, which is expected to be published in the second half of 2023. The Digital Violence Act could then be adopted and enter into force during the first half of 2024.
In June 2023, the European Parliament finally adopted its position on the draft EU AI Act, taking the EU a major step toward establishing the first-ever regulations for AI systems (see our Q1, 2023 update). This clears the way for trilogue negotiations between the Parliament and the other EU institutions, aimed at finalizing the AI Act as soon as possible.
The EU AI Act aims to set global standards for AI technology (see our previous client alert). The EU aims to strike a balance between protecting citizens and promoting AI innovation and deployment in Europe. It covers a wide range of applications, from chatbots to surgical procedures and fraud detection. The AI Act will apply to providers, developers, and users of AI systems in the EU, as well as, under certain circumstances, companies outside the EU. It categorizes AI applications based on risk levels. In this context, the Parliament suggests a significant expansion of the list of prohibited systems, which shall include real-time facial recognition in certain contexts, as well as social scoring systems. It also suggests classifying additional AI applications as high-risk, such as systems that can be used to influence elections and those deployed by large social media platforms, which would face strict restrictions.
The Parliament specifically addresses generative AI, a topic not mentioned in the Commission proposal or the Council’s position. In this context, the Parliament suggests transparency requirements, demanding the disclosure of AI-generated content and the publication of summaries of copyrighted data used for training.
If AI systems fall into the high-risk or prohibited categories, violating the obligations of the AI Act can lead to significant fines. Penalties can reach up to €40 million or 7% of a company’s global annual turnover. The AI Act also mandates the creation of regulatory sandboxes for testing AI systems, provides citizens with the right to file complaints, and establishes an EU AI Office for monitoring enforcement.
Following the adoption of the Parliament’s position, Members of the EU Parliament will now come together with the European Commission and the EU Council – each with their own AI Act proposal – for trilogue negotiations in order to agree on the final text of the regulation.
Due to the timeline of the legislative process, only the Parliament’s text has been able to respond to the recent emergence of publicly available generative AI applications, such as text-to-image generators. In this context, creatives and content creators are arguing for a legislative intervention, arguing that these generative AI systems are trained with their content available online and are generating creative output that can rival the creator’s content. As a last-minute addition, the Parliament included a copyright-conscious provision in its version of the draft AI Act, but limited it to a transparency obligation. Under the proposed Art. 28b(4)(c), providers of AI models used in such a generative AI application must document and make publicly available a “sufficiently detailed summary of the use of training data protected under copyright law”.
Axel Voss, Member of the EU Parliament and rapporteur on the AI Act for the Parliament's Committee on Legal Affairs ("JURI"), expressed optimism at a Copyright Policy Conference held in Berlin on 26 June 2023 that the provision will survive the trilogue negotiations and thus be part of the final AI Act. Mr. Voss also emphasized that the upcoming Spanish EU Council presidency, under whose leadership the trilogue will be held (and potentially even concluded), has expressed an open mind on copyright law issues in AI.
While it is unlikely that further copyright provisions will be added to the AI Act this late in the game, this could lead to specifications on the transparency obligation – and AI copyright regulation could be the next big challenge for EU lawmakers.
We are grateful to the following members of MoFo's European Digital Regulatory Compliance team for their contributions: Brittnie Moss-Jeremiah and Edie Essex Barrett, London office Trainee Solicitors; and Susan Bischoff and Tim Stripling, Berlin office Research Assistants.