In the last eight months, regulators in Europe have launched action against Meta in four different policy areas. We argue this pattern is the predictable and continued result of perverse incentives in the regulation of the digital economy.
Following the rollout of Meta’s “pay for privacy” policy, regulators in Europe have taken action in four different policy areas: privacy, consumer protection, competition and online safety. Despite growing criticism of the regulatory burden in the EU, the bloc appears to be embracing the confluence of its new and existing regulatory frameworks.
The administrative weight of compliance alongside the uncertainty and complexity of the laws’ overlaps are criticised as having hindered innovation in Europe. This burden, according to Draghi, is to blame for the bloc’s failure to compete globally in tech markets, even though it largely emerged after many big tech platforms’ rise to dominance.
Though the EC accepts shortcomings in its existing digital rulebook, it proposes the need for more, not less, regulation to respond to ongoing consumer harm. The proposed Digital Fairness Act reflects an anxiety around the durability of a digital rulebook that hasn’t even been fully tested against the present, let alone future, dynamics of digital markets.
Without speaking to the need for more (or less) regulation, the situation with Meta can be understood as an expected outcome of the perverse incentives and resulting malicious compliance in the regulation of the digital economy. The success or failure of the EU’s policymaking is likely to be defined by its ability to continue legislating at pace with technological change, effectively defend and extend the application of existing laws or fundamentally restructure the markets and business models of those operating under it.
Despite growing criticism of regulatory burden, the EU appears to be embracing the overlaps in its digital rulebook
Since the adoption of the General Data Protection Regulation (GDPR) in 2016, the EU has developed digital regulation at a pace unmatched globally. This rapidly growing rulebook includes headline legislative packages such as the Digital Services Act (DSA), Digital Markets Act (DMA), Data Act and AI Act, among other, smaller pieces of legislation, alongside numerous directives and added guidance. Following this particularly active period of policymaking under the prior EC, the new College of Commissioners was introduced to lead a period of implementation and enforcement of regulation in the tech sector, as opposed to continued legislation. As this enforcement has got underway in earnest, however, a perhaps predictable friction that could destabilise this mission has become increasingly apparent, not only between regulators and industry but among regulators themselves.
Regulators have not shied away from “double-dipping” or targeting the same platform conduct repeatedly under multiple regulatory frameworks. From the perspective of industry, these cases exemplify a serious cause for concern in the EU’s allegedly overlapping, overwritten and overly burdensome rulebook for digital services that is hampering economic growth in the bloc. By the EC’s own assessment, these instances of intersecting enforcement are proof of existing regulation’s shortcomings. However, the EC sees these issues as solved by further regulation, such as through a proposed Digital Fairness Act, to further tighten and codify the terms for platform conduct. Having considered whether this case reflects a convoluted framework in need of simplification or an incomplete regulatory project straining to keep pace with technological innovation, we instead find a rulebook that is simply reflective of the cross-sectoral realities of digital technology as well as the perverse incentives of platforms conducting performative compliance.
Meta’s “pay or consent” policy has acted as a test case for the convergence of the bloc’s new and existing rules
Since at least 2018, Meta’s (then Facebook’s) use of user data to drive its personalised advertising system has been a primary target of EU regulatory scrutiny, first under the jurisdiction of the GDPR and relevant competent privacy authorities, including the Irish Data Protection Commission (DPC) and the European Data Protection Board (EDPB). Both bodies have engaged extensively with Meta with regard to changes in the legal justification under which the platform can process personal data, cycling through multiple GDPR-based permissions and, along the way, inciting tension among European data protection authorities (DPAs) over the intent of the law altogether. The EDPB’s urgent binding decision in October 2023, which barred Meta from claiming the legal bases of contract and legitimate interest under the GDPR, resulted in the platform introducing its now infamous “pay or consent” subscription plan for ad-free service in the EU and set off a rapid succession of regulator and third sector responses across a range of policy areas (see Figure 1).
From March 2024, regulators, including the EC and the Consumer Protection Cooperation (CPC) Network, have launched actions related to Meta’s “pay for privacy” policy under the DSA, DMA, Unfair Commercial Practices Directive and the Unfair Contract Terms Directive, in addition to the ongoing investigations under the GDPR. These actions were complemented by a series of complaints filed by BEUC, the European Consumer Organisation, on behalf of its national members, which have also filed complaints with their national authorities. Though the outcomes of these actions all remain pending, Meta has perhaps already demonstrated an effort to respond to the concerns of regulators and achieve compliance by announcing an amended plan to offer “less personalised” advertising for EU users based on a more limited set of user data.
Arguments around regulatory burden tend not to speak to all of the conditions that created the platform economy
In its November 2024 announcement on less personalised ads, Meta repeatedly referenced the influence of regulator interventions on its business decisions. The firm’s announcement argued that the various actions by regulators launched since March “go beyond what is written in the law”, suggesting the application of the EU’s rulebook has been uncertain at best and unfounded at worst. Though this confusion could reflect a learning curve in the still early days of implementation, industry and, to some extent, even the EC suggest that the complex regulatory system the EU has created includes inconsistencies and was designed to overlap in ways that have harmed growth.
The EC-commissioned Draghi Report on the future of European competitiveness cautioned against the “administrative and compliance burdens and legal uncertainties” of the overlaps among the DSA, DMA and AI Act, echoing problems that have already occurred in the context of the GDPR. The report warns that, due to this failed harmonisation and overwhelming complexity, the EU could again lose out on major economic gains from emerging tech markets, including AI and other data-based industries. As the bloc emphasises its desire to achieve greater global competitiveness and tech sovereignty, the actions of the EC, CPC Network and national regulators outlined here appear to exemplify the sort of burden identified in the Draghi Report as a barrier to a homegrown tech ecosystem. In this theory of the case, the ability of the EU to compete in the global economy is dependent on the simplification and unification of regulation in a Digital Single Market. While a compelling problem statement, the premise of regulatory burden fails to reckon with the reality that many of the market dynamics of the platform economy developed well before the DSA, DMA and AI Act were even introduced. Though an argument on regulatory burden could explain some challenges faced by new entrants and limited subsequent investment from existing big tech firms, these concerns do little to grapple honestly with the conditions that allowed these firms to flourish in the first place, including tax and industrial policy as well as macroeconomic conditions.
Additional regulation before implementing the existing rulebook could prove premature
Though the EC agrees generally with the premise that the bloc’s digital rulebook is insufficient, it reaches a distinctly different conclusion, even compared to the Draghi Report, as to how to rectify it. As previewed in EC President Ursula von der Leyen’s letters to her incoming College, the EC views the replicated work of EU and national regulators in this case as evidence of persistent holes in the broader regulatory framework for digital services. Its recent digital fairness fitness check detailed continued complaints of consumer harm that it claims will go unaddressed even amid the potential for replicated investigations as underway in the Meta case. To address this harm, further regulation in the form of a Digital Fairness Act would update the Unfair Commercial Practices Directive, Consumer Rights Directive and Unfair Contract Terms Directive for the digital age. While the details of who would enforce such a law, and how, remain undetermined, the act would aim to update consumer protection law rather than supersede the DSA or DMA, better equipping regulators to respond to uniquely digital manifestations of unfair trading practices such as poor transparency and extortionate contract terms.
The fitness check also predicts a number of forthcoming innovations that would present further need for additional regulation. Not unlike the anxiety expressed around the longevity of the AI Act in light of the technology’s development, the instinct to respond to emerging use of dynamic pricing and automated contract tools with additional legislation reflects an apparent fear that existing law fails to achieve true technological neutrality in the digital age. Through a Digital Fairness Act or even across the already intersecting provisions of the GDPR, DSA, DMA and consumer law, the intent of the law and therefore the bad outcomes to be prevented largely remain the same: poor transparency, limited choice and unfair terms must be outlawed in order to protect the rights of consumers online. However, in the EC’s estimation, the development of new technologies – including AI, blockchain, quantum computing or some innovation not yet known – requires additions, clarifications and specifications to consumer protection only possible through new legislation. This urge to update, however, is driving forward additional legislation before existing laws can even be tested through full implementation and robust enforcement. While the EC in other contexts is arguing for the need to develop a lighter regulatory touch to support tech sovereignty, this drive to legislate is already discrediting the effectiveness of existing regulation in policing the conduct of existing firms and shaping the practices of new entrants.
The Meta investigations are the product of a predictable and accelerated cycle of malicious compliance
The Meta case can also be understood as a predictable and often repeated outcome of a clash of incentives between industry and regulators. As first discussed in the context of the GDPR, regulation of the digital economy has emerged as a regular site of ‘malicious compliance’ – responses to regulatory obligations that meet the letter but not the intent of the law. In the case of the GDPR, the often-cited malicious privacy compliance practices of big tech include repeatedly requiring consumers to manage their consent for data collection across regulated sites and building out a professionalised privacy compliance industry, including the hiring of in-house staff and consultancies. Through these practices, tech firms achieved compliance while offloading the responsibility of privacy protections onto individual consumers and raising the barriers to entry in some digital markets, particularly for smaller firms less able to invest in robust and technocratic privacy departments.
Though malicious compliance is possible in the regulation of any industry, the speed of technological development alongside the considerable market power of some players in the digital economy has unsurprisingly made the practice common in the still early years of enforcement activity in the EU. Similarly, the wide-ranging application of platform tools means that acts of malicious compliance are capable of undermining public policy objectives outside of a given regulator’s jurisdiction. Nonetheless, as the EU’s digital rulebook has grown, instances of apparent malicious compliance have continued to crop up, but regulators appear to have better anticipated these outcomes and to be more prepared to respond with force. While the introduction of Meta’s “pay or consent” model for privacy was devised to respond to obligations under the GDPR, the plan could violate the principles of consumer protection and act as a competitive advantage for the platform. Though Meta claims it responded directly to the letter of the GDPR, the move would have undermined the broader public policy objective of the legislation in bolstering the rights of consumers online. With the passage of the DMA and the ready enforcement of existing consumer protection law alongside it, the EU has been able to target the malice within the platform’s compliance through specific obligations related to obtaining consent for data combination, upholding high standards of transparency and prohibiting undue pressure in the completion of a contract.
This is not to say the EU’s strategy of leveraging these frameworks in this way should be considered broadly a success. Even in the specific context of delivering on the public policy objectives set out in these frameworks, it is unclear to what extent the EU will be able to keep pace with and match the force of changes in digital markets. With the pace of technological innovation on course to continue to accelerate, the ability of the bloc to legislate as quickly as new use cases emerge seems unlikely. While developing case law around the application of these frameworks could help make the enforcement tools included within them more flexible, the EU will only get the chance to do so if it can successfully defend these frameworks in legal challenges brought by platforms. And yet, in the absence of a structural renegotiation of digital markets, big tech firms will retain the incentive to subvert regulatory obligations in order to protect their business models and market power. These questions – whether the EU can continue to legislate at pace with technological innovation, whether it can defend and strengthen its laws in court, and whether these laws as enforced will actually reshape the digital economy – are likely to define the success or failure of this era in regulating the digital economy in the long term.