European Digital Compliance: Key Digital Regulation & Compliance Developments
To help organisations stay on top of the main developments in European digital compliance, Morrison & Foerster’s European Digital Regulatory Compliance team reports on some of the most significant digital regulatory and compliance developments that took place in the third quarter of 2021.
In this issue, we look at some familiar areas that produce regular developments for businesses operating online in Europe, including child online protection, intermediary liability for online harms, and the implementation of user bans on social media. We consider whether a German law will require changes across Europe in how consumers must be allowed to terminate online contracts. We also have updates on restrictions on digital advertising, the implementation of digital services taxes, and new requirements for the use and transfer of cyber-surveillance software. Finally, we look at how the UK’s digital regulatory policies are developing post-Brexit, and how (and when) the UK will replicate the existing CE marking/labelling rules in place in the EU.
The protection of children online is a constant focus for regulators and lawmakers across Europe - in terms of legislative action, compliance efforts and enforcement. While we only focus on developments in Germany and the UK in this section, regulators across Europe (including in France, Ireland, Italy and Norway) are also paying close attention to children’s online experiences.
The UK Age Appropriate Design Code (referred to as the Children’s Code) came into effect in September 2021, echoing other child-focused frameworks such as the U.S. Children’s Online Privacy Protection Act.
The Children’s Code is overseen by the UK Information Commissioner’s Office (ICO) and aims to protect children from online harms and empower them online. The Children’s Code is a statutory data protection code of practice containing 15 standards that apply to qualifying information society services, e.g., providers of online apps, programmes, games, and connected devices. The 15 standards address how organisations should build data privacy safeguards into their online services so that they are appropriate for use by children (such as making sure that parental controls or geolocation tracking are explained to children in an age-appropriate way). Notably, these services need not be directly addressed to children: the Code will apply if such services are likely to be accessed by children in the UK.
While it is not new law, the Children’s Code will be used by the ICO as a key benchmark against which to check organisations’ compliance with the wider data privacy regime in the UK, meaning that any breach of the Children’s Code could trigger enforcement action by the ICO. As such, any organisation whose services in, or accessible from, the UK could fall within the remit of the Children’s Code should carefully review the 15 standards and either implement appropriate measures to comply with the Code and the underlying data privacy regime or be prepared to demonstrate why the Code does not apply to its services.
The drive to protect children’s privacy in the digital world has also spilled over into other areas, such as advertising regulation by the UK Advertising Standards Authority (ASA). In July 2021, the ASA published its findings from an investigation into the distribution of ads for alcohol, gambling, and high fat, salt, and sugar (HFSS) products on websites and other prominent global online platforms and streaming channels. Although 75% of the audience of these websites were adults, the ASA called on marketers to design their targeting tools to be more age appropriate, in order to minimize children’s exposure to HFSS ads, as well as gambling and alcohol ads.
Many social media platforms are taking steps to apply age-restrictive changes to their online content. In the UK, these efforts have a high profile because breaches of the Children’s Code may trigger the same level of enforcement and fines as breaches of the UK GDPR. As such, any complaint under the Children’s Code could draw negative attention to an organisation, and potentially even a wider ICO audit of the organisation’s approach to data protection. Nonetheless, with increased use of the internet by children, especially against the backdrop of the pandemic, the ICO has made clear that it seeks to protect children within the digital world, rather than denying them access altogether. Compliance with the Children’s Code may just be the beginning of more robust standards required of organisations in the UK when it comes to handling children’s privacy.
The German Federal Court of Justice has ruled in three parallel cases on how influencers must clearly identify advertising on their social media channels. Germany’s Unfair Competition Act and the EU Audiovisual Media Services Directive prohibit surreptitious advertising in the interest of transparency, and the law only allows product placement if the influencer identifies it as such. The court accordingly found that Instagram “tap tags” must be identified as advertising unless the entire Instagram account is identified as commercial advertising or the influencer is not paid for that particular post.
In August 2021, a study commissioned by the German media authorities highlighted the risks of influencer marketing targeted at children. The study found that children are heavily exposed to influencer marketing and highly susceptible to (covert) incentives to buy products or use services recommended by the influencers that they follow. The study also found that there is often a lack of clarity regarding the advertising character of certain influencer communications. Based on the study, the German media authorities have announced that they will expand their enforcement of existing rules on advertising towards children on relevant social media services.
The German Federal States continue to work on the revision of online youth protection rules. In July 2021, they presented a new concept paper outlining their current plans to amend the Youth Protection State Treaty, which currently regulates how providers of broadcasting and online services must protect minors from content that is inappropriate for their respective age groups.
Based on the concept paper, the new rules will put a greater emphasis on technical protection measures: providers of operating systems will have to implement a “child protection mode” that, when activated, provides centralized age information to all apps installed on a minor’s devices. Apps will have to ingest this information and activate their existing youth protection features in accordance with the centralized age settings. This will allow parents to activate the youth protection features of all of these apps with only one click.
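To make the envisaged architecture concrete, below is a minimal sketch (in TypeScript) of how an app might consume such a centralized age signal. The concept paper does not define any API; the interface and names here (such as `OsChildProtection` and `getAgeBracket`) are purely hypothetical illustrations of the described mechanism.

```typescript
// Hypothetical sketch only: the concept paper does not define an API,
// and the interface and names below are invented for illustration.

type AgeBracket = 0 | 6 | 12 | 16 | 18; // brackets in the style of German age ratings

interface OsChildProtection {
  // The age bracket configured by a parent on the device, or null if
  // the "child protection mode" is not activated.
  getAgeBracket(): AgeBracket | null;
}

interface YouthProtectionFeatures {
  setMaxContentRating(rating: AgeBracket): void;
  useOwnDefaults(): void;
}

// Apps would have to ingest the centralized age setting and activate
// their existing youth protection features accordingly.
function applyCentralizedAgeSetting(os: OsChildProtection, app: YouthProtectionFeatures): void {
  const bracket = os.getAgeBracket();
  if (bracket === null) {
    app.useOwnDefaults(); // protection mode off: nothing is imposed
  } else {
    app.setMaxContentRating(bracket); // one parental setting, applied by every app
  }
}
```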
A legislative draft implementing the concept paper is forthcoming. The new rules will apply to all devices and apps offered in Germany, regardless of where the respective provider is based – so it’s possible that the new rules may set standards even beyond German borders.
As part of the new “Fair Consumer Contracts Act” (which also limits automatic subscription renewals in German consumer contracts to one month at a time), Germany has adopted new requirements that will require businesses to offer a specific two-click termination mechanism for consumer subscriptions. Significant implementation effort is expected for affected providers.
“Websites through which consumers can enter into paid subscriptions” will have to prominently feature a button labeled “Terminate contracts here” (Verträge hier kündigen) that leads to a form where identifying information (e.g., name, contract/customer number etc.) as well as an intended end date can be entered and which can be submitted with a second button labeled “terminate now” (jetzt kündigen). As long as sufficient data is entered, the submission of the form will be a valid termination, which cannot be made dependent on further steps such as logins or second factor (e.g., email, app) confirmations.
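As a rough illustration of the mandated flow, here is a minimal sketch in TypeScript. The statute prescribes the two button labels and a form for identifying information, but the field names, markup, and handler below are assumptions for illustration only, not requirements taken from the statute’s text.

```typescript
// Illustrative sketch only: the two button labels are statutory, but the
// field names, markup, and handler below are assumptions.

interface TerminationRequest {
  name: string;
  contractOrCustomerNumber: string;
  intendedEndDate?: string; // if omitted, the earliest possible date should apply
}

function renderTerminationForm(
  container: HTMLElement,
  onTerminate: (request: TerminationRequest) => void
): void {
  // Step 1: a prominently placed "Verträge hier kündigen" button elsewhere
  // on the site must lead directly here - no login or second factor allowed.
  container.innerHTML = `
    <form id="termination-form">
      <input name="name" placeholder="Name" required />
      <input name="contractOrCustomerNumber" placeholder="Vertrags-/Kundennummer" required />
      <input name="intendedEndDate" type="date" />
      <button type="submit">jetzt kündigen</button> <!-- step 2 -->
    </form>`;
  const form = container.querySelector<HTMLFormElement>("#termination-form")!;
  form.addEventListener("submit", (event) => {
    event.preventDefault();
    const data = new FormData(form);
    // As long as sufficient identifying data is entered, this submission
    // is itself a valid termination.
    onTerminate({
      name: String(data.get("name")),
      contractOrCustomerNumber: String(data.get("contractOrCustomerNumber")),
      intendedEndDate: data.get("intendedEndDate") ? String(data.get("intendedEndDate")) : undefined,
    });
  });
}
```

Note that the sketch deliberately performs no login or confirmation step before accepting the submission, reflecting the prohibition on making the termination dependent on such steps.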
Open questions remain - particularly as to the scope of this “two-click termination” requirement. Importantly, the limits of the term “website” are not clear (i.e., are app-based shops or even device-integrated shops such as on smart TVs included?). And it’s not clear what will make a subscription “paid,” i.e., does the newly introduced EU-wide concept of “personal data as payment” apply, meaning that the termination button would have to be implemented for many “free” services as well?
The “termination button” will have to be implemented on relevant “websites” by July 2022. Failing to (correctly) implement it will invalidate any minimum subscription terms or termination notice periods in contracts for which such button should have been made available, and may also open providers to cease and desist claims by competitors as well as consumer and fair competition associations.
Germany and the EU are in conflict about Germany’s new national law imposing additional rules on media intermediaries and third party media services.
In November 2020, Germany enacted its new Media State Treaty (MST). Among other things, the MST regulates how search engines, app stores and other “media intermediaries” must treat third party media services to which they provide access - including by imposing transparency and non-discrimination obligations on affected providers.
Crucially, these rules are set to apply not only to providers of media intermediaries who are (or who are deemed to be) established in Germany under the EU’s country-of-origin principle, but also to any such services offered on the German market. This approach triggered criticism by the European Commission when the MST was first adopted, but the Commission ultimately did not interfere.
Now, this conflict has broken open again as a result of the adoption by German State Media Authorities of statutes to further specify the MST’s media intermediary rules (as mandated by the MST). These statutes are necessary because the actual MST rules are rather vague and thus lack clarity for both regulators and affected providers.
The statutes were originally set to enter into force on 1 September 2021. However, when responding to the notification to it of these statutes, the Commission issued a “detailed opinion” arguing that the statutes violate EU law, particularly because they create barriers to free trade between member states by imposing additional rules on media intermediaries based in other EU member states. The Commission asked Germany to remedy these concerns, otherwise threatening to open infringement proceedings under Art. 258 TFEU.
On that basis, Germany initiated negotiations with the Commission on how to address these concerns by amending the statutes, thus further delaying their entry into force. This case is currently not expected to be concluded before November 2021. So, providers offering media intermediary services in Germany will be able to enjoy an extended grace period before proper enforcement of the respective rules finally begins. Nonetheless, as the MST regulators have already demonstrated, they are not afraid to step in against suspected infringements even without being able to rely on the controversial statutes.
In August 2021, Green lawmakers in the EU, alongside the Tracking-Free Ads Coalition, began an initiative to amend the EU’s proposed Digital Markets Act in relation to targeted digital advertising.
Targeted advertising is a key practice through which many so-called “Big Tech” companies personalise advertising to their consumers by combining personal data across multiple platforms.
The proposed amendment would have the effect of prohibiting the combination of consumers’ personal data in this context, on the grounds of respecting consumers’ rights, and increasing compliance with EU data protection and privacy law. While it does not contain a blanket ban on all targeted advertising, the proposed amendment seeks to increase the obligations of platforms not to use personal data without consumers’ explicit consent. Under the proposed amendment, non-compliance could result in companies being fined up to 10% of their total worldwide annual turnover.
Notably, the obligations under the Digital Markets Act apply only to those companies that qualify as “gatekeepers”, i.e., companies with large, systemic online market platforms. Supporters of the proposed ban believe that the amendment would increase the ability of small and medium-sized businesses (SMEs) to compete fairly in the digital world. While the proposals are still under review by EU lawmakers, the amendment’s critics (which include over 1,800 SMEs and certain MEPs) suggest that a total ban could increase the cost of advertising for SMEs.
The EU has broadened the scope of its export control regime to add new due diligence, reporting and potential authorisation requirements that affect EU companies with digital cross-border offerings relating to surveillance technology. The amendments are included in the EU’s recast of the EU Dual-Use Regulation.
One of the key drivers for effective export control regulations is the protection of human rights. To further this goal and protect against human rights violations more generally, the EU now obliges exporters to ensure that they export so-called “cyber-surveillance items” only if certain checks have been conducted or – if needed – an authorisation has been obtained. A new duty of care obliges exporters to analyse carefully whether there is an intent to use the items in connection with internal repression or human rights violations.
“Cyber-surveillance items” are newly defined as items “specially designed to enable the covert surveillance by monitoring, extracting, collecting or analysing data” from IT systems. This applies for example to intrusion software or certain surveillance software.
Effective 9 September 2021, if an exporter becomes aware (based on its due diligence findings) that any cyber-surveillance items are intended for use in connection with internal repression or human rights violations, it must report this to the competent authority. Under certain conditions, an export control licence is required.
Digital companies should adapt their internal compliance programmes to these additional restrictions to comply with the enhanced duty of care.
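A compliance programme might encode this screening step roughly as in the following minimal sketch, which assumes a simplified two-question test; the recast Dual-Use Regulation itself defines the operative criteria, and all names below are illustrative.

```typescript
// Minimal sketch of the screening logic described above, assuming a
// simplified two-question test; the recast Dual-Use Regulation itself
// defines the operative criteria, and all names here are illustrative.

interface ExportScreening {
  isCyberSurveillanceItem: boolean;     // "specially designed to enable covert surveillance..."?
  dueDiligenceFlagsMisuseRisk: boolean; // intended use for internal repression / human rights violations?
}

type ScreeningOutcome = "proceed" | "report-and-await-authority-decision";

function screenExport(item: ExportScreening): ScreeningOutcome {
  if (!item.isCyberSurveillanceItem) {
    return "proceed"; // other export control rules may still apply
  }
  if (item.dueDiligenceFlagsMisuseRisk) {
    // Duty to report findings to the competent authority, which may then
    // require an export authorisation.
    return "report-and-await-authority-decision";
  }
  return "proceed";
}
```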
In July 2021, the OECD/G20 Inclusive Framework on Base Erosion and Profit Shifting (IF) agreed a two-pillar solution to address the tax challenges arising from the digitalisation of the global economy and ensure that multinationals pay a fair share of tax wherever they operate.
Under current proposals, pillar one would apply to approximately 100 of the largest and most profitable multinationals with turnover above €20 billion and profitability above 10%. Between 20% and 30% of residual profit (profit in excess of 10% of revenue) will be subject to tax in market jurisdictions in which these multinationals have at least €1 million in revenue. It is expected that compliance and management, including tax filings, will be through a single entity.
The commitment under pillar two is to a global minimum tax of at least 15% (applicable to any company with over €750 million of annual revenue) applied on a country-by-country basis.
Despite the progress being made on the two-pillar solution, around half of European countries, including the UK, have unilaterally implemented a digital services tax (DST). The UK currently applies a 2% levy on revenue of groups that provide a digital services activity and where these revenues exceed £500 million. The UK has committed to remove its DST once a comprehensive global solution is in place.
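To make the arithmetic concrete, the sketch below works through the thresholds quoted above as a hypothetical example; the actual rules involve detailed definitions of in-scope revenue and profit (and, for the UK DST, an allowance) that are simplified away here.

```typescript
// Hypothetical worked example of the thresholds quoted above; detailed
// definitions of revenue and profit are simplified away.

function pillarOneReallocatedProfit(revenue: number, profit: number, shareToMarkets = 0.25): number {
  // Pillar one only bites above EUR 20bn turnover and 10% profitability.
  if (revenue <= 20e9 || profit / revenue <= 0.10) return 0;
  const residualProfit = profit - 0.10 * revenue; // profit in excess of 10% of revenue
  return shareToMarkets * residualProfit;         // 20-30% allocated to market jurisdictions
}

function ukDigitalServicesTax(inScopeRevenue: number): number {
  // 2% levy where in-scope group revenues exceed GBP 500 million.
  return inScopeRevenue > 500e6 ? 0.02 * inScopeRevenue : 0;
}

// e.g., EUR 25bn revenue and EUR 5bn profit: residual profit is EUR 2.5bn,
// of which EUR 625m (at a 25% share) would be taxable in market jurisdictions.
console.log(pillarOneReallocatedProfit(25e9, 5e9)); // 625000000
```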
Finalisation of the proposals, along with a detailed implementation plan, is scheduled for October 2021, with a view to implementing the changes in 2023.
On 2 September 2021, the European Court of Justice (CJEU) decided that zero rating tariff options are generally incompatible with EU net neutrality rules.
“Zero rating” describes a practice where internet access providers do not count the data traffic caused by certain applications towards the customer’s contractually agreed data cap.
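In accounting terms, the practice looks roughly like the following toy sketch (purely illustrative; real billing systems are far more involved).

```typescript
// Toy illustration of the accounting effect: traffic from zero-rated
// partner apps is simply not counted against the customer's data cap.

function remainingAllowanceMB(
  capMB: number,
  usageByAppMB: Map<string, number>,
  zeroRatedApps: Set<string>
): number {
  let countedMB = 0;
  for (const [app, mb] of usageByAppMB) {
    if (!zeroRatedApps.has(app)) countedMB += mb; // zero-rated traffic is ignored
  }
  return Math.max(0, capMB - countedMB);
}

// e.g., with a 10 GB cap, 6 GB of zero-rated streaming leaves the cap untouched:
const usage = new Map([["partner-streaming", 6_000], ["other", 1_500]]);
console.log(remainingAllowanceMB(10_000, usage, new Set(["partner-streaming"]))); // 8500
```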
The CJEU’s decision was somewhat unexpected. German courts had actually only asked the CJEU whether certain elements of two large global telecom companies’ zero rating tariff options comply with EU net neutrality rules (e.g., a limitation of the video streaming bandwidth or an exclusion of mobile tethering).
German courts had assumed that zero rating business models are legal, based on guidance from BEREC, the joint body of the EU telecoms regulators, and previous decisions by the German telecoms regulator, BNetzA. The CJEU has now ruled that this underlying assumption was false and that any zero rating tariff option violates EU law, thus essentially declaring the German courts’ specific questions moot.
In practice, these CJEU decisions will most likely require telcos offering services in the EU to eventually revamp (or even discontinue) their zero rating programs. Under EU law, telecoms regulators across the EU must now ensure that EU net neutrality rules will be applied in light of the CJEU’s interpretation. While it is unclear whether that will actually trigger regulators to revisit their original assessment of existing zero rating programs, at least the German regulator expects that existing zero rating programs will not be able to continue in their current form (according to a press report (in German)). In any case, changes to existing zero rating programs will most likely also affect content providers participating in these programs.
In July 2021, the German Federal Court of Justice issued a decision on the legitimacy of processes used by social media companies to implement user account bans and content moderation.
The court decided to reinstate the account of a Facebook user after Facebook had banned the user for violation of its community guidelines. Facebook’s community guidelines prohibit “hate speech”. They allow the platform to delete content flagged as hate speech and to block users disseminating such content.
Facebook had deleted posts alleging that “Islamist immigrants” are free to murder with impunity in Germany without action by Germany's main domestic security agency. “Migrants can murder and rape here and nobody cares! It's about time the Office for the Protection of the Constitution sorted this out”, one post read.
In its decision, the Federal Court of Justice emphasized the important role in society played by social media platforms (and especially Facebook), and ruled that Facebook must follow a four-step process when enforcing its guidelines. The court put emphasis on the need for a social media platform to: inform the user about the removal of content (at least promptly after the fact) and about any intended account suspension in advance; state the reasons for the measure; give the user an opportunity to respond; and then take a fresh decision in light of that response.
In some respects, this puts social media companies like Facebook in a difficult position, between a rock and a hard place: they are sanctioned for hosting or enabling hate speech (which many commentators criticize Facebook for doing) and then sanctioned (as in this decision) for removing that hate speech when it’s flagged to them.
Facebook welcomed the court’s decision to focus on the process for removal, not on the right to remove. It confirmed that: “We have zero tolerance for hate speech, and we’re committed to removing it from Facebook.” It is reportedly reviewing the judgment to ensure that it can continue to effectively remove hate speech in Germany.
This decision focuses on user rights in the on-going debate over online content moderation, hate speech and fake news. In a similar vein, the German parliament recently amended the Network Enforcement Act to include a complaint procedure for deletions based on that Act.
A similar law in France (Loi contre les contenus haineux sur internet) was largely voided last year by the French constitutional court as infringing on free speech. The UK introduced a draft Online Safety Bill in May 2021, which also includes the obligation to establish a complaints procedure for affected users.
In the clash between U.S. tech companies (and their understanding of free speech) and European governments (and their desire to regulate harmful speech), the decisions in Germany and France mark a point for U.S.-style free speech, albeit a narrow one.
A case concerning the current German data retention regime is progressing: the parties travelled to the European Court of Justice (CJEU) in Luxembourg for oral hearings in September 2021.
The current German regime is the country’s second attempt at implementing data retention, after the first one was declared unconstitutional by the Federal Constitutional Court in 2010. And, to date, this second attempt has been similarly unsuccessful: even before its entry into force in 2016, several parties had initiated injunctive proceedings requesting a moratorium on their obligation to comply with the new rules.
German regional courts granted these requests, ruling that the new German rules violate the requirements that the CJEU had established in the meantime. Ever since, enforcement of the German regime has effectively been suspended, although the respective rules legally remain in force.
Eventually, the Federal Administrative Court referred these cases to the CJEU, which is now expected to issue its final decision in February 2022 at the earliest.
This CJEU ruling is also expected to inform the fate of other existing data retention regimes across the EU as well as the EU-level discussions on a potential way forward. To that end, the European Commission has recently launched a non-public consultation that presents possible legislative and non-legislative solutions for data retention on the basis of CJEU case law.
The Commission sets out three potential policy solutions, i.e., (1) having Member States implement their own national data retention initiatives without any EU-level involvement, (2) having the Commission publish recommendations or a guidance document to assist Member States in aligning national and EU law, and (3) adopting a new EU-level legislative proposal on data retention. These proposals will now be discussed within the European Council.
The deadline for implementation of the EU Copyright Directive passed in June 2021, but most EU member states did not transpose the directive into national law before the deadline. So, after a long slow process to get to this point, the implementation of some of the more controversial provisions – including Article 17 which affects online content-sharing service providers’ liability for copyright-infringing content – are still not fully in force across the EU.
The EU Commission is trying to speed things up and has formally requested 23 member states (including Spain, Italy, France and Austria) to provide information on their implementation process by end of September 2021. This notice opens formal infringement proceedings against those member states.
Some countries, such as Austria, have published implementation drafts. Germany’s implementation of Article 17 entered into force on 1 August 2021. Nevertheless, in general, stakeholder dialogues are continuing in most countries. There is some hope that upcoming implementation drafts may at least display a more harmonized approach than the differing approaches of Germany, France and other countries.
In June 2021, the Commission published its long-awaited guidance on the interpretation and implementation of Article 17. The guidance gives a more precise outline of the requirements and obligations to be met by online intermediaries and rights holders, and so could well serve as a blueprint for the outstanding national implementations.
However, following the European Court of Justice (CJEU) decision in the YouTube & Peterson and Cyando cases (CJEU, C-682/18, C-683/18), the Commission has already begun to consider whether it should amend its guidance. In that judgment, the CJEU determined the criteria that an online service provider has to satisfy to avoid liability under the pre-Article 17 legal regime. These criteria are less far-reaching than the obligations imposed by Article 17. So, while it is still an open question whether and how this decision will influence the implementation of Article 17, the criteria remain relevant in any case for online intermediaries that do not fall under Article 17.
The European Court of Justice (CJEU) has revisited the question of whether the supply of software constitutes a sale of goods or the provision of services. The answer to this question is important because of the legal consequences under different European laws of selling goods or providing services.
For such an apparently simple question, there has surprisingly never been a comprehensive answer that applies uniformly across all the different legal regimes.
The most recent CJEU decision (interestingly, in this post-Brexit world, on a question referred to it by the UK Supreme Court) arose in the context of the Commercial Agents Directive – which imposes obligations to compensate commercial agents for the sale of goods made after the termination of an agency relationship.
The CJEU ruled that, where it is supplied via an agent on the basis of a perpetual licence, software does constitute goods, and the Commercial Agents Directive does apply. One of the deciding factors for the CJEU was that the term “goods” means a product that has a monetary value and can be part of a commercial transaction; because software meets this threshold, it should be classified as a good, regardless of whether or not it is in electronic form. From a commercial perspective, it would also undermine the protection of the Commercial Agents Directive if commercial agents supplying software were excluded from its remit.
The UK Supreme Court now needs to implement the CJEU’s decision by formulating its own judgment. If the CJEU’s ruling is followed, companies with agents appointed in the UK for the resale or marketing of software may find themselves triggering the UK Commercial Agents Regulations – including the obligation to compensate or indemnify agents for post-termination sales. Regardless, until the matter is definitively addressed by the UK Parliament, this will likely not be the last that we hear on it in the English courts, particularly as UK consumer legislation and policy differentiates between goods, services, and digital content.
The UK has given product manufacturers, importers and sellers extra time to implement post-Brexit rules on product labelling. This means that existing CE certifications will continue to be sufficient for entering the UK market in lieu of applying the new UKCA (UK Conformity Assessed) markings.
But, from 1 January 2023, a wide range of products (including most electronic and digital hardware) sold in the UK will need to bear the new UKCA marking (and to have been through the assessment process to allow the UKCA mark to be affixed). Of course, the corollary is that after that date affected products sold in both the EU and UK will need to comply with both the EU’s CE marking scheme and the new UKCA marking.
A CE conformity assessment marking is evidence of a manufacturer’s self-assessment that its product meets all of the specified essential safety requirements set out in certain EU directives. A wide range of products must bear the CE marking if they are put on the market in the EU. After Brexit, the UK announced the UKCA as a UK equivalent to the CE mark. The original timetable was that, from 1 January 2021, the UKCA (and not the CE marking) would be the marking for goods placed on the market in Great Britain that were previously subject to the CE marking. But to give manufacturers time to adjust, the UK said that manufacturers could continue to use the CE marking until 1 January 2022.
However, because many companies will not be ready to move to UKCA markings by the end of 2021, the UK has announced that the CE marking will now be accepted for a further year, until 1 January 2023. So companies can choose to comply with either or both of the CE and UKCA regimes until that date.
The UKCA marking will need to be used when placing goods on the UK market from 1 January 2023, unless there is a further extension.
Northern Ireland is covered by different rules. Under the Northern Ireland Protocol that came into force on 1 January 2021, Northern Ireland will align with relevant EU rules relating to the placing on the market of manufactured goods.
One effect of Brexit is the greater freedom it gives successive UK governments to shape policy to encourage investment and the growth of different sectors of the UK economy. The downside for affected global businesses arises if policy changes lead to different operational rules across the main European markets.
Given the importance of the tech sector to the UK, it is perhaps no surprise that the UK government will prioritize the digital sector for new policy-making and legislative initiatives. As we have previously reported (Q1 2021, UK’s New Digital Task Force targets Big Tech), the UK has already set up a Digital Markets Unit (DMU) with a focus on digital companies with “strategic market status” (SMS).
Now, the UK is consulting on a new regime for digital markets that it claims will be “pro-competition” – although whether the UK interprets that phrase in the same way as possibly affected businesses will be open to question.
The UK is going to look at a range of topics before formulating its next legislative and policy steps, including the role and statutory remit of the DMU and the design of an enforceable code of conduct for SMS firms.
The UK has at least expressly stated that the statutory duty of the DMU should be to promote competition in digital markets for the benefit of consumers. While there is also a mention of supporting innovation when promoting competition, it seems unlikely that innovation will be expressly included in the DMU’s remit.
Large technology and digital businesses will want to keep a close eye on any proposals for an enforceable code of conduct that will set out how any tech/digital business with strategic market status is expected to behave. In particular, if any UK code of conduct conflicts with the equivalent EU rules, that may raise operational problems as the UK and EU rules diverge.
Separately, the UK government also published in September 2021 a policy paper setting out a five-point plan to deliver its vision “for the UK to be a global leader in digital trade, with a network of international agreements that drive productivity, jobs, and growth across the UK.” The key areas for future policy and legislative development are: opening digital markets; developing a “balanced approach” by seeking to prevent unjustified barriers to data flows; championing consumer benefits and necessary business safeguards in digital trade, including by advancing digital consumer rights; developing digital trading systems with trading partners, including by promoting customs and border processes that are “digital by default”; and collaborating with the WTO to promote free, fair and inclusive global governance for digital trade.
We are grateful to the following members of MoFo’s European Digital Regulatory Compliance team for their contributions: Felix Helmstadter, Patricia Ernst, Jannis Werner and Dominik Arncken, and trainee solicitor Sakshi Rai.