At the end of April 2025, the European Commission closed its investigation into Meta’s Terms and Conditions and found the company responsible for breaching the Digital Markets Act. The paper analyses the impact of this decision on the European approach to the digital market and suggests the opening of a new era: the Commission’s commitment to preventing data accumulation and aggregation by gatekeepers is a far cry from the regulator’s neutrality, which has been a pillar of European regulation since the beginning of globalisation. The new awareness of the limits of the power of some major digital market players is then read as a prelude to a new legal discourse, in which the seed of an unprecedented European constitutional approach seems to be emerging.
1. The decision about Meta and its impact on the European political project
At the end of April 2025, the European Commission published the results of an investigation into Meta’s privacy policy and ordered the company to pay €200 million for a breach of Article 5 of the Digital Markets Act (DMA)[1].
Although the DMA is a regulation specifically dedicated to competition in digital markets, academic interest in the procedure against Meta extends far beyond the field of Competition Law.
This paper analyses the juridical impact of the DMA, showing how it has produced a serious «identity crisis»[2] for the European Union; the DMA opened a new era in which Article 102 TFEU assumes a differentiated meaning for certain large digital companies, known as gatekeepers, which are subject to antitrust obligations and prohibitions that apply preventively. That perspective challenges some long-standing notions that have shaped the European economy since the beginning of globalization and, with it, the European political project.
In fact, according to Article 3 of the DMA, gatekeepers are companies that have a significant impact on the internal market, provide a core platform service that is an essential gateway for business users to reach end-users, and enjoy an entrenched and durable position in their operations, or it is foreseeable that they will soon enjoy such a position.
The power of gatekeepers, thus, is not only about altering competition with other market actors; their dominant position also implies a strong influence over the personalities of their users, thanks to their ability to allow or prevent access to the goods and services they govern. That extends the legal discourse on the regulation of digital markets to a field directly related to fundamental rights and constitutional law: users can be nudged into giving their consent to data processing because digital services have become a relevant part of individuals’ lives, exposing their personalities to an exploitation and commodification that were difficult to imagine only a few years ago[3].
The DMA attempts to address this issue through Article 5, which prescribes that gatekeepers must obtain users’ consent before combining their data across different services and must provide an equivalent alternative to users who decline. The Commission’s investigation into Meta focused specifically on the company’s compliance with Article 5.
In the following pages, the Meta case is analysed with the goal of highlighting how current European regulation of digital markets is emerging as a new frontier for legal protection in the digital world, reconnecting people’s personal data and digital activities to a comprehensive notion of human personality that cannot be fragmented and reduced to specific qualifications such as the consumer, the user or the data subject.
Paragraph two illustrates how, a decade ago, two national antitrust authorities, the Italian and the German, came to regard the accumulation and aggregation of personal data by some Big Tech companies as a peril not only for the free European market, but also for the personality of users, who could be asked for consent to data processing without any real protection against the sharing of data between services.
Paragraph three, then, considers how this new awareness paved the way for a radical change in European regulation. After the European Court of Justice confirmed what had emerged at the national level, the European institutions decided to tackle the risk of exploiting personal data that can alter competition while influencing human behavior.
This process led to the definition of the 2019 European Digital Strategy and, specifically, has a relevant implementation in the Digital Markets Act.
In the same paragraph, it is then considered how the investigation into Meta represents a significant shift of approach for EU Law. The Commission considered that Meta’s privacy policy potentially hindered one of the DMA’s ultimate goals, as it allegedly encouraged users to consent to data interoperability, making it difficult to achieve a genuine transformation of the digital market.
As a conclusion, paragraph four emphasizes that the investigation into Meta is one of the main pieces of evidence indicating that the European political project is currently abandoning the logic of Globalization in favor of what appears to be a proper constitutional approach.
In the Commission’s decision, there is no trace of the optimistic view of the free action of market players that grounded European Competition Law. On the contrary, the shaping power of gatekeepers extends well beyond market boundaries and has a direct impact on the core of the juridical protection of human personality, which, through the prescriptions of the DMA, is elevated to the new main goal of European Law.
That pushes the European political project in a direction so far unexplored: political integration is no longer considered a natural consequence of economic freedoms, but it is now clearly centered on the personal rights protected by the European Charter of Fundamental Rights and the Constitutions of Member States.
2. When it all began. The decisions of the Italian AGCM and the German Bundeskartellamt against Facebook
Long before the Digital Markets Act, some national antitrust authorities noticed that traditional European competition law was not performing adequately in the digital economy. In the mid-2010s, the proliferation of proceedings for abuse of dominance against tech companies began to emerge as a symptom of a larger issue involving the market for personal data[4].
The question wasn’t simply that the competition in acquisition, processing, and reutilization of personal data was increasing significantly. It was also apparent that companies with a leading market position had a vast amount of personal data at their disposal, which made it easier for them to acquire a strategic advantage that was altering the market.
In fact, digital platforms that process billions of data points every day can develop goods and services that, through personalization and targeted advertising, consolidate their market position while keeping new players out[5].
In 2014, Facebook’s (now Meta) acquisition of WhatsApp presented a significant opportunity to examine the application of competition law in digital markets. During the European Commission’s merger review, the interoperability of data between Facebook and WhatsApp was considered a potential violation of European competition law. The exploitation of information exchanged by users on WhatsApp appears to have enabled Facebook to improve its social network more effectively than its competitors.
The Commission cleared the merger only after Facebook explicitly stated that it was technically impossible to synchronize users’ accounts in order to share data between the different services[6].
However, some national authorities began paying particular attention to how consent for interoperability was requested from users.
In 2017, the Italian Competition and Market Authority (AGCM) revisited some decisions from the early 2000s, in which it had qualified as misleading advertising the promotion of services as free of charge when consent to the processing of personal data was actually required[7].
Regarding the merger between WhatsApp and Facebook, the AGCM sanctioned the companies for engaging in aggressive commercial practices related to data sharing between the two companies. The authority found that the consent to data processing was requested in a manner that suggested refusal would affect the proper functioning of digital services. In fact, there was a pre-selection of consent for data sharing that the AGCM considered as preventing users from exercising an informed and active choice. Additionally, the information provided at the time of subscription was too vague to ensure authentic consumer awareness[8].
During the same time, the German Antitrust Authority (Bundeskartellamt) sought new ways to prevent the distortions of competition that could result from the accumulation of personal data in the hands of some Big Tech companies. The enforcement of the GDPR provided the authority with the opportunity to develop an unprecedented combination of antitrust law and privacy protection, resulting in a new perspective on digital markets.
According to the Bundeskartellamt, the leading market position of some Big Tech companies could easily result in the acceptance of a privacy policy that did not comply with the GDPR: users were inclined to undervalue their data and give their consent without considering the consequences, simply because they did not want to lose access to digital services that dominate the market and that can have a significant impact on their social and economic lives.
In the Facebook-WhatsApp merger, the Authority found that users were compelled to accept Facebook’s privacy policy, specifically regarding data interoperability, because they didn’t want to lose access to two of the most popular digital services. As a result, the Bundeskartellamt concluded that Facebook’s terms of service breached the GDPR and that the company’s dominant position contributed to this violation.
Facebook was then found to have abused its dominance and was required to introduce additional consent requirements for data interoperability. The Authority also stated that refusing consent to data exchange must not affect the normal functioning of the digital services[9].
After this decision, Facebook, rebranded as Meta, appealed to the German courts, which referred the matter to the European Court of Justice (ECJ).
In 2023, the ECJ broadly confirmed the approach of the German authority and settled two points that had a significant impact on European regulation.
Firstly, the ECJ clearly stated that it is not sufficient to ask for consent; it is also necessary to provide an alternative for users who don’t want to give their consent.
Secondly, the Court held that the alternative offered must be effectively equivalent and, even if less personalized, must not entail a reduction in the service’s functionality[10].
This decision completely changed the perspective on digital regulation. Considering the impact of digital technologies on social and professional life, it would be unrealistic to ask people to opt out of social media or digital platforms: only a few will value their data highly enough to make such a radical choice, while the vast majority of users will prefer to put up with a privacy policy that could potentially harm their rights and free will[11]. Therefore, the European Court’s ruling that a refusal of consent cannot result in the renunciation of digital services represented a significant achievement in safeguarding individuals’ autonomy in the digital world.
Thus, the decisions of the Italian AGCM, the German Bundeskartellamt, and the European Court of Justice paved the way for a new era of European digital regulation, grounded in the intersection of antitrust law, consumer law, and privacy protection, and resulting in a fundamentally new approach to digital markets[12].
3. The European digital strategy, the DMA, and the new gaze on digital markets
Starting in 2019, the European Union elaborated a new regulatory strategy for the digital economy. With Communication COM(2021) 118, poignantly titled 2030 Digital Compass: the European way for the Digital Decade, the Commission outlined an approach completely different from the past: competition law was no longer conceived as legislation governing a specific field, but became a primary legal basis for strengthening the protection of fundamental rights as enshrined in the Charter of Fundamental Rights of the European Union[13].
The EU Directive 2019/770, as the first implementation of the Strategy, openly addressed the risk of commodification of human personality that could be associated with the economic value of personal data[14].
Then, in a few years, five regulations (Digital Services Act, Digital Markets Act, Data Act, Data Governance Act, and AI Act) rapidly rewrote the European approach to digitalization.
The competition law for digital markets, in particular, was revolutionized. Private digital companies with a leading market position hold a power that must be tackled through comprehensive legal action, one that goes beyond the old dichotomies between public and private and has the ultimate goal of preserving the constitutional meaning of human personality in the digital society. It is no coincidence that the new regulations contain several expressions (e.g., systemic risks, gatekeepers, unacceptable risks) that denote an intrinsic danger to fundamental rights in digital environments.
The DMA primarily prescribes specific antitrust rules for a few digital companies, the gatekeepers, which can disrupt market balance due to the large amount of personal data at their disposal.
In fact, the aggregation and processing of personal data on a massive scale enable them to build ecosystems of services that can keep users within a bubble, creating significant competitive advantages through products developed using market information derived from big data[15].
One of the main remedies introduced by the DMA is Article 5, which defines restrictions on the use of personal data for commercial purposes. Its second paragraph states that gatekeepers must explicitly request users’ consent to combine their data across different services. If consent is refused, operators are obliged to provide access to an alternative which, although less personalized, must be equivalent.
The DMA has then codified the path taken by national antitrust authorities and the ECJ, making the specific consent to interoperability and the equivalent alternative the two new pillars of the relationship between gatekeepers and users.
In addition, the Regulation also introduced certain requirements for determining whether consent is valid. Recital 37 establishes that the digital experience of users denying consent should not be worse, even if, without profiling, the service could be less customized. Similarly, it is also clarified that not giving consent should not be more complicated than giving consent, and that the gatekeeper should provide an intuitive solution for giving, modifying, or withdrawing consent in an explicit, clear, and straightforward manner.
The gatekeeper must also provide explicit information that a less personalized offer will not lead to a deterioration of the basic functionality of the digital service. Finally, if consent cannot be given directly to the gatekeeper’s core platform service, end users should be able to consent through each third party using that service.
The direction indicated by the DMA, in general, is clear: the regulation aims to preserve users’ freedom of choice in a context where algorithmic interference can be pervasive. That kind of freedom should also be guaranteed across the service as a whole, and through the possibility of adjusting the level of data sharing and profiling in its various parts.
The impact of the Digital Markets Act has been considerable. The new regulation significantly helped reveal the true value of the data economy. While only a few years earlier Facebook had publicly declared that data had no economic value, after the implementation of the DMA it suddenly became clear that personal data plays a structural role in the development of digital products.
The enforcement of the DMA began with the European Commission designating six gatekeepers (Alphabet, Amazon, Apple, ByteDance, Meta, Microsoft)[16]. Among them, ByteDance openly declared that compliance with the DMA obligations could significantly affect its business, to the point of explicitly admitting that, if many users chose not to consent to the sharing and profiling of their data, its business model could be compromised at its very foundations[17].
Subsequently, tech companies began to consider it a priority to avoid a mass opt-out from data sharing. The six gatekeepers employed different strategies, with the privacy policy implemented by Meta immediately raising several concerns. Meta decided to present its users with two alternatives: on the one hand, it proposed a monthly subscription that included ad-free services and no data exchange of any kind; on the other, it created a second option that involved no monetary payment but implied data interoperability and fully targeted advertising.
That Consent or Pay scheme pushed the Commission to open an immediate investigation: such a binary system of choice didn’t offer a real alternative for users and undermined the regulation’s overall goal of preventing the accumulation of data by gatekeepers[18].
In fact, according to the Commission, Meta’s model would induce users to consent simply because they didn’t want to pay a fee for a service that, until then, had not been subject to any payment[19].
This could compromise one of the underlying objectives of the DMA: the regulation aimed to stimulate greater market openness by enhancing the freedom of choice for users, who, by not sharing their data, would encourage the market, from the bottom up, to offer digital services based on privacy by design. Meta’s Pay or Consent, on the other hand, had the specific objective of maintaining the previous situation, with users underestimating the value of their data in order to access digital services without paying.
During the confrontation with Meta, the Commission publicly affirmed that the equivalent alternative prescribed by Article 5 had to be interpreted as an alternative that is free of charge and requires reduced use of personal data[20].
This interpretation marked a new phase in European law, as it represented a significant shift in the relationship between the European regulator and the market by questioning the traditional neutrality of regulation.
The following steps in the confrontation between the Commission and Meta may reveal a legal trend in which Europe is beginning to assume what seems to be a proper constitutional role.
4. Is the new European approach marking the end of Globalization?
Following the Commission’s conclusions in November 2024, Meta made adjustments to its privacy policy, introducing a third option with no monetary fee and reducing the processing of personal data.
Nonetheless, on April 23, 2025, the Commission sanctioned Meta with a fine of €200 million for the Consent or Pay model offered to users from March to November 2024, while the three-option model was under assessment.
The core of the Commission’s argument consisted of two points: first, the alternative offered wasn’t equivalent, and second, users weren’t allowed to freely decide whether to grant data interoperability[21].
Interpreting the Commission’s final decision is not straightforward.
On the one hand, this decision represents an achievement in preventing the commodification of human beings through the exploitation of their personal data. In fact, the Commission’s approach emphasizes the risk of equating personal data with money and suggests the illegitimacy of a binary choice between consent and payment. The intent of protecting human personality is clear: if private companies normalize the exchange of data for services, the idea that personal data can become the currency of the digital world can be instilled in users’ minds without their awareness of being subjected to economic exploitation.
However, the Commission’s decision raises some concerns from a strictly legal perspective, as it goes in the opposite direction to the approach firmly pursued by European Law during the years of Globalization, when European legislation openly supported the business development of tech companies, considering their market activity a purely positive force carried out by neutral facilitators allowing people to express their personalities.
Now, in the case against Meta, the Commission chose to scrutinize a private company’s business decision, setting significant boundaries on the definition of service prices, which is at the core of a market actor’s autonomy.
The same DMA does not provide a clear legal basis for the Commission’s reasoning.
The main issue considered by the Commission, namely the equivalence of the alternative provided, isn’t addressed by the Regulation, which provides no clear guidance on the criteria for determining whether a service is equivalent.
In fact, there are only a few textual references that could help with interpretation.
Recital 36 states that «gatekeepers should enable end users to freely choose to opt-in to such data processing and sign-in practices by offering a less personalized but equivalent alternative, and without making the use of the core platform service or certain functionalities thereof conditional upon the end user’s consent».
Recital 37, then, specifies that «the less personalized alternative should not be different or of degraded quality compared to the service provided to the end users who provide consent»[22].
However, while these two provisions imply that the alternatives shouldn’t be worse, the authentic nature of the equivalence isn’t deeply explored.
Probably the strongest legal basis for the EC’s reasoning can be found in the sixth paragraph of Article 13, which requires gatekeepers not to degrade the conditions or quality of their services for users who do not wish to grant consent to data processing.
Even in light of this provision, it is difficult to argue that a paid service is inherently of poorer quality, or that a fee of a few euros per month would be a significant barrier for users[23].
In fact, the Regulation appears to treat equivalence as a matter of strict functionality. It is the service’s functionality that must not be degraded when users withhold their consent, whereas free access to digital services has not, so far, been a matter of concern for European legislation. In truth, it is quite the opposite: being grounded in Globalization, many European provisions strongly defend the right of private companies to set the prices for their services freely. The most important of them is, of course, Article 16 of the European Charter of Fundamental Rights, which guarantees the freedom to conduct a business in accordance with European and national laws. The usual interpretation of the Article allows businesses to set prices for their activities without constraint.
This is a point that the European Court of Justice specifically considered when it decided on the obligation of digital companies to provide an equivalent alternative for users who do not wish to give their consent.
In the same decision in which it confirmed the findings of the German Bundeskartellamt[24], the ECJ clarified that the dominant position of Meta created an imbalance that must be compensated with the freedom to refuse to give consent to specific data processing operations that are not necessary for the performance of the contract, without being required to completely renounce the use of the service offered by the online social network operator[25].
It is the ECJ that suggests that a balance could be created by offering an equivalent alternative, but it adds explicitly that such an alternative could be provided «if necessary for an appropriate fee»[26].
The same position has also been reinforced by the European Data Protection Board (EDPB), which, in its Opinion 8/2024, has directly addressed the question of the price for consent, admitting, at Paragraph 76, that «there is no obligation for large online platforms to always offer services free of charge»[27].
The Opinion notes, in addition, that gratuity could help enhance the freedom of choice for individuals when it is accompanied by no targeted advertising, but it considers the question primarily as a matter that can facilitate controllers in demonstrating that users’ consent was freely given.
Consequently, until the European Commission’s decision on Meta, the juridical question underlying the Consent or Pay model was treated as a matter of fairness in consent requests: data processing must be transparent, meaning that digital companies must design their services and interfaces to avoid any possibility of deception or manipulation. But when the commercial offer is clearly presented to users, the legal basis for presuming the automatic illegitimacy of a payment request appears to be lacking[28].
However, in its decision against Meta, the Commission took a different approach, finding that the stark alternative between full data processing and payment improperly incentivized users to give their consent.
This change of perspective marks a relevant shift in European legislation and the opening of a new era in which there is little room for some of the pillars of Globalization: the Commission appears to have reached a point where it is no longer possible to ask regulators simply to refrain from interfering in the business decisions of private companies.
Underlying the Commission’s conclusion, then, is an abandonment of optimism about digitalization and a new awareness of the manipulative nature of the digital world: digital markets cannot be effectively governed merely by balancing the various interests that arise from the free development of the market; they require a consistent safeguard of human personality in digital environments.
However, this awareness isn’t authentically embedded in the Digital Markets Act, which seems to be a piece of legislation that depicts a transition from the old, globalized approach to the new concern about the perils of social life strongly influenced by technology.
Realistically, proper protection of human personality will require a different kind of legislation, not merely market-oriented, but one that openly addresses the question of the permanence of rights in digital environments.
Of course, it is extremely difficult to achieve that with the current European system of competencies, which is directly linked to the ideological pillars of Globalization.
Proposing the spread of a lex mercatoria adopting the mechanisms of economic interests, Globalization created the illusion of a new universal legal language able to offer possibilities previously restricted by the traditional law of States. The opportunities for individuals, thus, have become increasingly dependent on private actors and on the services they develop for the new society grounded on a strong dematerialization of social relationships[29].
In this perspective, the protection of fundamental rights has remained broadly confined within national Constitutions, without enabling them to become a general boundary for any kind of power, public or private[30].
European legislation on markets has sought to replicate the illusion that European law and national law can operate at different levels, with a market governed by European rules and a constitutional world governed by Member States. But the transcendence of the singularities of the different legal systems has led to an open field or, as some would say, a Wild West, where the traditional instruments of the States struggle to contain the ability of technology to dematerialize the social rootedness proper to human personality, while market regulation could not be pushed to act as a tool for the strong defence of human personality[31].
The Digital Markets Act, like other Regulations derived from the European Digital Strategy (mainly the Digital Services Act and the AI Act), marks the beginning of a new path, significantly defined by the protection of fundamental rights and by the connections between the European Charter of Fundamental Rights and the Constitutions of the Member States. An effective defence of the juridical protection of the human person in the digital age, however, could arise only if the European political project dared to conceive of itself as genuinely constitutional.
- Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act). ↑
- S. Frank, E. Lewis, The European Commission’s Challenge to Consent or Pay: Demystifying the Digital Markets Act, in World Competition, 2, 2024, 427 ff. ↑
- G. Teubner, Hybrid Laws: Constitutionalizing Private Governance Networks, in R. Kagan, K. Winston (eds.), Legality and Community, Berkeley Public Policy Press, Berkeley, 2002, 311-331. ↑
- M. Botta, Sector Regulation of Digital Platforms in Europe: Uno, Nessuno, Centomila, in Journal of European Competition Law & Practice, vol. 12, No. 7, 2021, 500 ff. ↑
- W. Kerber, Digital Markets, Data and Privacy, in V. Falce, G. Ghidini, G. Olivieri (eds.), Informazione e big data tra innovazione e concorrenza, Giuffrè, Milan, 2018, 1 ff.; G. Muscolo, Big Data e concorrenza. Quale rapporto?, ivi, 173 ff. ↑
- European Commission, COMP/M.7217, Facebook/WhatsApp (3 October 2014). ↑
- AGCM, Decisions 10276, 10277, 10278 and 10279, 20 December 2001; V. Pagnanelli, Una “valutazione d’impatto” della privacy sulle Big Tech. Riflessioni a margine della sentenza n. 2632021 della sesta sezione del Consiglio di Stato, in E. Cremona, F. Laviola, V. Pagnanelli (eds.), Il valore economico dei dati personali tra diritto pubblico e diritto privato, Giappichelli, Turin, 2022, 5 ff. ↑
- AGCM, 11 May 2017, WhatsApp; G. Giannone Codiglione, I dati personali come corrispettivo della fruizione di un servizio di comunicazione elettronica e la “consumerizzazione” della privacy, in Diritto dell’informazione e dell’informatica, 2, 2017, 418 ff. ↑
- Bundeskartellamt, B6-22/16, Facebook, 6 February 2019. On the decision, see S. Frank, M. Frank, A “Facelift” to the Abuse of Dominance – The German Competition Perspective on Facebook, in Australian Journal of Competition Law, 28, 2020, 188 ff. ↑
- European Court of Justice, judgment of 4 July 2023, Case C-252/21, Meta v. Bundeskartellamt, ECLI:EU:C:2023:537, paragraph 150. ↑
- S. Rodotà, Il diritto di avere diritti, Laterza, Bari, 2012. ↑
- M. Rhoen, Beyond consent: improving data protection through consumer protection law, in Internet Policy Review, 5, 2016; G. Giannone Codiglione, I dati personali come corrispettivo della fruizione di un servizio di comunicazione elettronica e la “consumerizzazione” della privacy, in Diritto dell’informazione e dell’informatica, 2, 2017, 418 ff. ↑
- G. Pitruzzella, Big Data, Competition and Privacy: a Look from the Antitrust Perspective, in Concorrenza e Mercato, 23, 2016, 15 ff.; F. Costa-Cabral, O. Lynskey, Family Ties: The Intersection Between Data Protection and Competition in EU Law, in Common Market Law Review, 54(1), 2017, 11 ff. ↑
- Directive (EU) 2019/770 of the European Parliament and of the Council of 20 May 2019 on certain aspects concerning contracts for the supply of digital content and digital services. Significant is, e.g., Recital 24, which explicitly declares that «the protection of personal data is a fundamental right and that therefore personal data cannot be considered as a commodity». ↑
- Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act), Recital 36; A.C. Witt, The Digital Markets Act – Regulating the Wild West, in Common Market Law Review, 60, 2023, 625 ff. ↑
- The designation was made on 6 September 2023. ↑
- This was declared by ByteDance in its appeal against the designation as gatekeeper. Judgment of the General Court of 17 July 2024, Case T-1077/23, Bytedance v Commission, ECLI:EU:T:2024:478. ↑
- European Commission, Press release of March 25, 2024, No. 1689: «The Commission is concerned that the binary choice imposed by Meta’s “Consent or Pay” model may not provide a real alternative in case users do not consent, thereby not achieving the objective of preventing the accumulation of personal data by gatekeepers». ↑
- European Commission, Press release of July 1, 2024. ↑
- Speech by European Commission Executive Vice President Vestager and Commissioner Breton on the opening of investigations for non-compliance with the Digital Markets Act, March 25, 2024. ↑
- European Commission, Press release of April 23, 2025. ↑
- M. Frank, E. Lewis, The European Commission’s Challenge to Consent or Pay, cit., 439 ff. ↑
- D. Zimmer, J.F. Göhsl, Enforcement of the Digital Markets Act, 10 April 2024, https://verfassungsblog.de/enforcement-of-the-digital-markets-act/. ↑
- European Court of Justice, judgment of 4 July 2023, Case C-252/21, Meta v. Bundeskartellamt, ECLI:EU:C:2023:537. ↑
- Ibid., Par. 149, 150. ↑
- Ibid., Par. 150. ↑
- European Data Protection Board, Opinion 8/2024 on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms, 17 April 2024. ↑
- S.A. Elvy, Paying for Privacy, in Columbia Law Review, 6, 2017, 1383 ff. ↑
- G. Teubner, Breaking Frames: The Global Interplay of Legal and Social Systems, in The American Journal of Comparative Law, 1, 1997, 149 ff. ↑
- M.R. Ferrarese, Il diritto al presente. Globalizzazione e tempo delle istituzioni, Il Mulino, Bologna, 2010. ↑
- A.C. Witt, The Digital Markets Act – Regulating the Wild West, in Common Market Law Review, 60, 2023, 625 ff.; M. Ruotolo, Il potere, tra pubblico e privato. Tracce per un dialogo tra civilisti e costituzionalisti, in Costituzionalismo.it, 3, 2024; M.R. Ferrarese, Le istituzioni della globalizzazione. Diritto e diritti nella società transnazionale, Il Mulino, Bologna, 2000, 35 ff.; G. Vettori, Contratto e costituzione, in Enciclopedia del Diritto, I Tematici, I, Contratto, Milan, 2021, 266 ff. ↑