Table of Contents: 1. Introduction. – 2. The challenges posed by disinformation: a threat to EU values. – 3. The EU’s legislative response to disinformation. – 3.1. Enhancing platforms’ responsibility: the Digital Services Act. – 3.2. Regulating political advertising and restricting microtargeting: the Political Advertising Regulation. – 3.3. Protecting the media to fight disinformation: the EMFA. – 4. Assessing the EU’s legislative response to disinformation: three lessons. – 4.1. Legislation protecting democracy can be adopted through Article 114 TFEU. – 4.2. The choice for a co-regulatory approach to enforce democracy. – 4.3. Even without defining disinformation, EU legislation enriches the notion of democracy. – 5. Conclusion.
Abstract: The article addresses the challenges to European democracy posed by the phenomenon of disinformation and evaluates the European Union’s (EU) legislative response. It examines the EU’s recent legislative initiatives to combat disinformation as a case study for safeguarding democracy as an EU value under Article 2 TEU. After a brief reflection on the types of challenges that disinformation poses for the protection of EU values, the article analyses three EU legislative acts and their contribution to defending democracy from disinformation: the Digital Services Act (DSA), Regulation 2024/900 on transparency and targeting of political advertising, and the European Media Freedom Act (EMFA). The article then identifies three key lessons from the EU’s anti-disinformation strategy that are instructive for the role of the EU legislature in fighting disinformation and, more broadly, in safeguarding the value of democracy: the use of internal market legislation to protect democracy, the shift towards a co-regulatory approach, and the expansion of the content of the EU value of democracy. Ultimately, the article argues that, despite some constraints, it is a positive development that the Union is protecting democracy (also) through legislation.
Keywords: disinformation – democracy – EU legislature – platforms – political advertising – media freedom.
1. Introduction
For over a decade, the European Union integration process has faced a systemic ‘crisis of values’.[1] In response to national phenomena of rule of law backsliding,[2] the Union has deployed a variety of instruments. However, except for conditionality mechanisms,[3] legislative acts have played only a limited role. In contrast, the Union’s recent initiatives in defence of democracy – which, along with the rule of law, is one of the EU’s founding values listed in Article 2 TEU – stand out for the greater involvement of the legislature. Since the presentation of the 2020 European Democracy Action Plan (EDAP),[4] we have witnessed a ‘democratic turn’ by the Union:[5] a greater involvement of its institutions in safeguarding democracy at both the national and European levels.[6] In this context, secondary law has gradually been relied upon to establish legal obligations to uphold the value of democracy.
Specifically, considerable attention has been devoted to combating the multifaceted phenomenon of disinformation, as the EU is increasingly concerned about the threat that it poses to the integrity of our democracies. The adoption of legal acts is indeed part of the wider effort by the Union to counter disinformation, which includes a broader range of initiatives, such as the Stratcom Task Forces within the EEAS,[7] the Rapid Alert System and the initiatives to improve digital literacy,[8] financial support for civil society through EU programmes, and, not least, CFSP measures.[9] Moreover, the Union adopted soft-law tools. In 2018, a first Code of Practice on Disinformation was agreed, bringing together online platforms and tech companies that committed to a series of actions to tackle disinformation, such as disrupting advertising, demonetising providers of disinformation and reducing its findability.[10] Then, in 2022, the EU promoted the Strengthened Code of Practice on Disinformation, which involved more actors, including a wide range of stakeholders from civil society, and incorporated more detailed measures and a set of indicators to assess the levels of implementation.[11]
Against this backdrop, this article examines three recent EU legislative acts aimed at protecting democracy from disinformation. It demonstrates that each of these pieces of legislation plays a role not only in the specific areas they regulate but also as part of the wider EU efforts to safeguard the value of democracy under Article 2 TEU, emphasising the potential of using legislation for this purpose. First, the article reflects on the concept of disinformation and the peculiar challenges it presents, especially the need to address the actions of not only Member States but also private actors, which have implications for protecting the value of democracy. Second, it examines the Union’s recent attempts to regulate the matter by focusing on three key instruments, all established by regulation: the Digital Services Act (DSA),[12] Regulation 2024/900 on the transparency and targeting of political advertising (Political Advertising Regulation)[13] and the European Media Freedom Act (EMFA).[14] These instruments will be assessed from the point of view of their respective contributions to protecting democracy against disinformation. Third, the article highlights the lessons to be learned from the Union’s anti-disinformation approach that are important for protecting democracy as a value under Article 2 TEU. Specifically, it focuses on three issues: the use of internal market legislation as a way to safeguard democracy, the shift towards a co-regulatory approach, and the expansion of the notion of the EU value of democracy.
2. The challenges posed by disinformation: a threat to EU values
Nowadays, disinformation is increasingly attracting the attention of EU policymakers and legal scholars, but that was not always the case. The Union started to discuss disinformation in relation to the Russian propaganda on the annexation of Crimea. The European Council first mentioned disinformation as a source of concern in its March 2015 conclusions, stressing ‘the need to challenge Russia’s ongoing disinformation campaigns’ and calling on the High Representative ‘to prepare by June an action plan on strategic communication’ and establish ‘a communication team’.[15] Shortly thereafter, the interference at the time of Brexit and the first Trump election acted as wake-up calls for the EU and worldwide democracies, while the Cambridge Analytica scandal highlighted the harmful effects of microtargeting techniques on electoral processes.[16] In this latter case, political messages specifically tailored to voters are created based on the available data about them and displayed as personalised political ads.[17] This series of events also revealed the role that social media and, more broadly, online platforms play in spreading disinformation.
As a result, concerns about disinformation campaigns have started to grow. And indeed, in a 2016 Communication on hybrid threats, the Commission for the first time recognised that ‘(m)assive disinformation campaigns, using social media to control the political narrative or to radicalise, recruit and direct proxy actors can be vehicles for hybrid threats’.[18] Another Commission communication followed this document in 2018, this time specifically dedicated to disinformation. Disinformation is there defined as ‘verifiably false or misleading information that is created, presented and disseminated for economic gain or to intentionally deceive the public, and may cause public harm’.[19] That definition builds on the report from the High-Level Expert Group on fake news and disinformation, which was set up by the Commission early that year.[20] The report distinguished disinformation from misinformation: while the former is intentional, the latter is ‘misleading or inaccurate information shared by people who do not recognize it as such’.[21]
Although disinformation is a vague and multifaceted phenomenon, which also extends beyond fake news,[22] it is hard to overlook its challenge to EU values, particularly democracy. Indeed, disinformation poses a genuine threat to the proper functioning of the national and EU democratic systems.[23] It erodes citizens’ trust in institutions and democratic procedures. Furthermore, disinformation prevents citizens from gaining true knowledge, thereby hindering their ability to make informed decisions,[24] as was the case during the infodemic surrounding the COVID-19 pandemic.[25] This is even more evident when considering more specific phenomena that harm the formation of genuine political opinions, such as microtargeting. Microtargeting risks impairing the freedom of information of voters, as personalised news may limit access to critical information necessary for being a well-informed citizen.[26]
The above risks are far from hypothetical. In particular, recent interferences during Member States’ and European elections and the Russian propaganda concerning the invasion of Ukraine have raised the bar of attention, also showing that disinformation campaigns may come from governments.[27] More recently, the annulment of the 2024 Romanian presidential election by the Constitutional Court, due to disinformation campaigns that compromised electoral integrity, vividly demonstrated the impact of disinformation on European democracy and the need to reflect on the legal and institutional instruments at our disposal to defend it.[28] While disinformation in itself is not a historically new phenomenon, its magnitude, resonance and perils have been dramatically amplified by the possibilities offered by the digital society.[29] The communication changes brought about by the Internet and the rise of online platforms have altered and facilitated the creation and dissemination of disinformation (as well as information).[30] Moreover, literature has shown that digital populists rely heavily on unregulated, disintermediated platforms to spread propaganda, and for this reason, they oppose restrictions from authorities or private entities.[31]
The peculiarity of the challenges posed by disinformation has implications for defending the value of democracy. It requires addressing the conduct of not only Member States but also private actors. Indeed, intermediary services, including online platforms, play a major role in this respect. Online platforms exercise quasi-public functions over their online spaces: they act as ultimate arbiters of the content they host, possessing the authority to autonomously decide whether to remove it. In doing so, they make decisions that profoundly impact fundamental rights, primarily the freedom of information and expression.[32] This raises valid concerns regarding the risks associated with such a privatisation of censorship.[33]
That said, even though democracy as a value can be undermined through widespread disinformation campaigns, establishing a formal breach of Article 2 TEU is likely a difficult exercise. First, the Treaty-based and specific mechanism to enforce EU values, namely, Article 7 TEU, may address only breaches put in place by Member States.[34] Thus, both the preventive and sanctioning mechanisms envisaged by that provision are ill-suited to addressing disinformation for which responsibility lies with third countries or private actors, such as social media, political parties or even individuals. The same also applies to infringement procedures under Articles 258-260 TFEU, which relate to failure to fulfil obligations by a Member State.
Second, as the case law on judicial independence illustrates, thus far, the entire practice of enforcing the rule of law through Article 19(1) TEU has been based on a scenario where liability for violating EU values rests with Member States, not with private parties.[35] Therefore, the case law on the rule of law may be more useful for the application of Article 2 TEU in cases involving disinformation conduct for which responsibility lies with a Member State. As the recent Hungarian case shows, such a scenario is far from remote.[36] Yet, while indispensable, addressing disinformation from the Union’s own members is only a drop in the ocean. In any case, so far, Article 2 TEU has always been used in conjunction with other, more specific, provisions of the Treaties.[37] Scholars are still debating whether a similar approach could be applied to the value of democracy, particularly by relying on Article 10 TEU as a more concrete provision to enforce representative democracy and citizens’ right to participate in the EU’s democratic life.[38] The problem with these proposals is that, unlike the rule of law, and despite some recent developments,[39] democracy is a value that has been given little substance under EU law.[40]
Third, Article 2 TEU does not appear to be clear and precise enough to satisfy, on its own, the conditions for direct effect, even less to create obligations for private parties, thus hampering a successful response against disinformation.[41] Although the Court of Justice has recently shed some light on the meaning of the value of democracy under Article 2 TEU,[42] this case law does not yet amount to a full set of standards. Moreover, obligations under Article 2 TEU do not appear unconditional, but rather to be understood as results to be achieved in light of the constitutional structure of each Member State. As the Court of Justice has stressed, neither Article 2 TEU nor any other provision of EU law ‘requires Member States to adopt a particular constitutional model governing the relationships and interaction between the various branches of the State, in particular as regards the definition and delimitation of their competences’.[43]
Against this background, the peculiar challenges posed by the phenomenon of disinformation call for a different role for EU institutions in safeguarding democracy. This involves setting aside (at least for now) the judicial route to prioritise the adoption of secondary law instruments that give legal backing to the value of democracy.
3. The EU’s legislative response to disinformation
Over the past few years, the Union has adopted a set of relevant legislative acts to fight disinformation. In the EDAP, the Commission outlined several policy initiatives to promote and defend democracy against, inter alia, the challenge of disinformation. In particular, the EDAP included the commitment to two relevant legislative proposals, which were subsequently adopted in the following years: the DSA and the Political Advertising Regulation. The EMFA, by contrast, was announced by the President of the Commission in the 2021 State of the Union address and proposed one year later.[44]
Through these three instruments, the Union is pursuing different goals but also building an arsenal to combat disinformation and, consequently, protecting the integrity of our democracies.[45] The following pages will discuss each of these instruments individually, with the aim of understanding the EU legislature’s contribution to tackling disinformation.
3.1. Enhancing platforms’ responsibility: the Digital Services Act
The DSA was adopted in 2022, on the basis of Article 114 TFEU, and became fully applicable in February 2024.[46] The Commission presented it in the EDAP as a major tool in the fight against disinformation, as it provides for ‘more obligations and accountability for online platforms’.[47] Despite the few explicit references to disinformation, which are confined to its preamble, the DSA acknowledges its aim of ‘ensuring a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate, and within which fundamental rights enshrined in the Charter are effectively protected and innovation is facilitated’.[48] The notion of ‘illegal content’ is framed in very broad terms by Article 3(h) DSA, encompassing conduct in violation of EU law or the law of any Member State.
To achieve its goals, the DSA establishes a complex framework that has a threefold relevance for countering disinformation. First, the DSA makes platforms responsible for spreading disinformation through their services. To be clear, the DSA does not revolutionise platforms’ accountability related to content. It maintains, as a general rule, online platforms’ non-liability for the content hosted on their services, provided that they are unaware of the violations.[49] Furthermore, platforms are not required to monitor content and users on their services, nor are they obligated to actively seek out illegal activity.[50] That said, the adoption of the DSA marked a significant turning point in the EU legislature’s involvement in urging platforms to assume their responsibilities concerning, inter alia, the spread of disinformation. It established rules for moderating and removing illegal content, along with due diligence and transparency obligations applicable to all intermediary service providers, regardless of their place of establishment, provided they offer services to users in the Union. As such, the DSA marked a crucial step for the ‘supranational regulation of online public discourse’.[51]
The DSA adopts a ‘layered requirements’ approach depending on the size of online intermediary services and, thus, their impact and the risks they pose.[52] The first layer of obligations concerns all providers of intermediary services, followed by hosting services, online platforms,[53] and, finally, very large online platforms and search engines (VLOPs), defined as services that reach 45 million active monthly users in the EU.[54] While all intermediary services have transparency[55] and due diligence obligations concerning their content moderation activities,[56] the DSA provides for specific due diligence requirements for VLOPs. They have to carry out a yearly assessment of four categories of ‘systemic risks’ in the EU, all of which are relevant for countering disinformation.[57] Notably, systemic risks concern not only (a) the dissemination of illegal content, but also harmful online behaviours capable of negatively affecting (b) the exercise of fundamental rights, (c) civic discourse, electoral processes and public security and (d) gender-based violence, public health, minors, and physical and mental well-being.[58] Thus, while disinformation is not, as such, a systemic risk, it can clearly give rise to such a risk, particularly in the case of the negative effects on civic discourse, electoral processes, and public security included in Article 34(1)(c).[59] According to Article 35, VLOPs must then implement effective measures to mitigate systemic risks, also following the Commission’s guidelines, which include adapting content moderation procedures and algorithmic systems, such as those used to recommend content.
Although those measures do not require VLOPs to remove disinformation, most removal efforts are taking place under the 2022 Strengthened Code of Practice on Disinformation.[60] The latter has been integrated into the DSA framework as a proper Code of Conduct under Article 45 since July 2025, with the result that it has become ‘a benchmark for determining platforms’ compliance with the DSA’.[61] Finally, other relevant obligations under the DSA include the crisis response mechanism, which allows the Commission to request VLOPs to take certain actions in the event of a crisis,[62] the yearly submission to an independent audit,[63] and the establishment of an independent compliance function.[64]
Second, the DSA empowers users and interested parties to combat disinformation. Article 16 DSA requires hosting services, including online platforms, to establish a ‘notice and action’ mechanism that enables individuals or entities to notify the presence of illegal content on their service. The activation of this mechanism thus prevents platforms from claiming ignorance regarding the existence of illegal content, thereby triggering their liability. Indeed, once intermediary services become aware of illegal activity, they shall act ‘expeditiously to remove or to disable access to the illegal content’.[65] In the case of users who frequently post manifestly illegal content, providers of online platforms are compelled to suspend their services after issuing a prior warning.[66] This duty is compensated, on the side of users’ freedom of expression, by the obligation to state reasons for removing content and adopting other restrictive measures.[67] Instead, and regrettably, platforms are not required to state reasons where they refuse to take action after a notice.[68] Providers of online platforms must also enable users to contest decisions regarding the removal or labelling of content through an internal complaint‐handling system and an external out-of-court dispute resolution mechanism.[69] Furthermore, they are required to give priority and process without delay the notices submitted by ‘trusted flaggers’ such as independent fact-checking and research organisations.[70]
The third contribution of the DSA to combating disinformation is the creation of an enforcement system. In contrast with the E-Commerce directive, whose supervision and enforcement were left with Member States and self-regulatory initiatives, the DSA established a proper oversight and enforcement framework, which is shared between national authorities (which shall appoint the Digital Services Coordinators)[71] and the Commission. While in general the competent Member State is the one in which the provider has its main establishment, compliance with the special monitoring obligations for VLOPs falls under the exclusive remit of the Commission.[72] This system comes with far-reaching monitoring, investigation and enforcement powers, including the possibility to adopt interim measures and impose fines of up to 6 per cent of the annual worldwide turnover of the provider in the preceding financial year,[73] as well as penalty payments.[74]
The inclusion of a centralised enforcement mechanism against VLOPs is a crucial development: ambitious rules without proper enforcement are nothing more than a paper tiger. For the moment, the Commission has not overlooked its role as guardian of EU law. It has initiated proceedings against several platforms, including Meta, TikTok, and X, for breaching the DSA,[75] and has also referred six Member States to the Court of Justice for failing to effectively implement the DSA, particularly regarding the appointment and empowerment of the Digital Services Coordinator.[76] More recently, the Commission adopted the first non-compliance decision against a VLOP. On 5 December 2025, it imposed a fine of €120 million on X for breaching its transparency obligations under the DSA, due to X’s deceptive design of its ‘blue checkmark’, lack of transparency in its advertising repository, and inadequate access to data for vetted researchers.[77]
The Commission’s decision against X is undoubtedly ‘a landmark step for EU digital regulation’,[78] but we should not be dazzled by this initial move. In the medium to long term, a valid concern remains that the Commission’s dual role as a political actor and an administrative body could make it vulnerable to (geo)political pressure to (not) enforce the DSA.[79] In this regard, scholars have already started making proposals to reconsider the Commission’s role within the DSA enforcement framework or, at the very least, introduce safeguards to enhance its independence by drawing on the experience of neutral bodies such as Eurostat and OLAF.[80] At the same time, to make the DSA a true success story, the commitment of users, civil society, and research organisations is also crucial, as the DSA relies on their active role in reporting disinformation and illegal content. There is thus a need for a composite combination of private and public, national and European, enforcement to guarantee that the DSA will not be a paper tiger.
3.2. Regulating political advertising and restricting microtargeting: the Political Advertising Regulation
The very recent Regulation 2024/900 on the transparency and targeting of political advertising is another notable piece of EU legislation in the fight against disinformation. Political advertising has long been a crucial component of marketing in electoral campaigns.[81] Nonetheless, the advent of digital platforms has revolutionised the field. Today, political campaigns are increasingly conducted online, specifically through social media platforms. The DSA already features some general transparency requirements on advertising, which also cover political advertising. According to Article 26 DSA, online platforms must ensure that users can recognise advertisements as such, including through prominent markings. They should also provide information about the advertiser’s identity, the person who paid for the advertisement, and the criteria used to select the user as the recipient. Where applicable, options to modify those parameters should also be made available. In addition, VLOPs shall make available a repository of their advertisement practices.[82]
Regulation 2024/900 complements the DSA framework by providing harmonised rules on labelling and transparency specifically for political advertising. At the same time, it establishes strict rules on the use of microtargeting techniques in the context of political advertising that are based on the processing of personal data. The duality of aims is reflected in a double legal basis: Article 114 TFEU for the general rules on political advertising services and Article 16 TFEU for the provisions on microtargeting. The regulation is addressed to providers of political advertising services, including publishers, as well as sponsors and controllers as defined in Article 4, point 7, of the GDPR.
Throughout the regulation, the necessity for EU intervention is justified based on strong value-based language. Although the regulation states that it does not affect the content of political advertisements,[83] it is evident that ‘a key target of the proposal is the content of advertisements if it is disinformation’.[84] This, in turn, is directed at protecting democracy. The underlying idea is that if citizens cannot recognise political advertisements, particularly in the current era of disinformation, this will prevent them from exercising their democratic rights in an informed manner. In this context, transparency is regarded as ‘a legitimate public goal, in conformity with the values shared by the Union and its Member States pursuant to Article 2 of the Treaty on European Union (TEU)’ and a necessary element ‘to support open and fair political debate and political campaigns, and free and fair elections or referendums, and to counter information manipulation and interference, as well as unlawful interference, including from third countries’.[85] Indeed, the regulation conceives political advertising as a possible ‘vector of disinformation, in particular where the advertising does not disclose its political nature, comes from sponsors outside of the Union or is subject to targeting techniques or ad-delivery techniques’.[86] The preamble also reveals a specific concern regarding political advertising originating from third countries and foreign actors. As recital 19 makes clear, electoral interference from outside the EU ‘is known to pose a serious threat to democracy, which is a common value of the Union and the securing of which is of fundamental importance to the Union and its Member States’.
The proper identification of political advertising is crucial as it determines the scope of application of the harmonised transparency rules. An overly broad definition would unduly limit freedom of expression, which encompasses the more specific area of political speech.[87] It could encourage censorship and impose significant restrictions on both commercial campaigns that are not typically regarded as political and on non-commercial campaigns, including those by civil society actors. Indeed, the Commission’s proposal initially drew criticism for its overly broad definition of political advertising.[88]
Despite certain amendments, the final text of Regulation 2024/900 still retains a wide scope. It defines political advertising as concerning two types of messages: ‘by, for or on behalf of a political actor, unless it is of a purely private or a purely commercial nature’ or those which are ‘liable and designed to influence the outcome of an election or referendum’.[89] This second prong is likely to encompass ‘issue-based advertisements’ such as those promoted by NGOs.[90] When the Commission issued the regulation proposal, many NGOs voiced their concerns regarding the wide definition of political advertising, urging a clarification that the regulation applies solely to paid political content.[91] In the end, after lengthy discussion at the trilogue stage, the regulation maintains an ambiguous character as it only specifies that advertising messages are ‘normally provided for remuneration or through in-house activities or as part of a political advertising campaign’.[92] At the same time, the Regulation attempts to reassure NGOs with recital 23, which requires a ‘clear and substantial link’ between the message and its potential to influence electoral outcomes, voting behaviour or legislative or regulatory process.
Additionally, compared with the proposal, the final version of the Regulation excludes from the definition of political advertising: messages from official EU or national sources concerning the organisation of, and participation in, electoral campaigns; public communications providing official information; and messages presenting candidates in designated public spaces or in the media, where this is explicitly provided for by law and allocated free of charge.[93] It also explicitly excludes from its scope political opinions expressed in a personal capacity and, in order not to affect the freedom of the media, those subject to editorial responsibility.[94] Despite these improvements, many NGOs still remain sceptical about the broad definitions adopted by the regulation,[95] and not without reason.
The core of the regulation aims to ensure that political advertising is clearly labelled as such, allowing users to know that they are being targeted, who sponsored the advertising and for which election or referendum, and to consult transparency notices.[96] Then, all online political advertisements published in the Union or aimed at EU citizens and residents shall be included in a European repository set up by the Commission.[97] Finally, if the provider notices that information provided by the publisher or sponsor is incomplete, it shall first make contact and request corrections and, if this is not possible, remove the political advertisement.[98] At the same time, to ensure that political actors can conduct cross-border campaigns, Regulation 2024/900 excludes restrictions imposed by advertising providers that are solely based on the place of residence or establishment of the sponsor.[99] However, sponsored advertising from outside the EU is prohibited in the three months leading up to an election, reaffirming the connection between disinformation and sponsored political advertising from third countries that underpins the regulation.[100]
In addition, the regulation imposes due diligence obligations. These include transmitting, if requested, any necessary information to national authorities and interested entities such as vetted researchers, civil society organisations and journalists.[101] Furthermore, a notice and action mechanism that resembles that of the DSA is envisaged. The publisher shall have mechanisms in place to allow natural or legal persons to signal any political advertisement that does not comply with the Regulation.[102] In the case of VLOPs, they shall examine and address the notifications received without undue delay, in a diligent, non-arbitrary, and objective manner, and inform the notifier about the follow-up provided. More stringent obligations apply in the month immediately preceding an election or a referendum: political advertising publishers shall process notices about potentially unlawful or non-compliant political advertisements within 48 hours.[103]
As anticipated, the Political Advertising Regulation also marks the first binding step in addressing political microtargeting. The restrictions arise from the idea that microtargeting can be used as a means to spread disinformation. Indeed, the preamble states that targeting techniques ‘negatively impact the democratic process as it leads to fragmentation of the public debate about important societal issues, selective outreach and, ultimately, the manipulation of the electorate’ and ‘increases the risk of the spreading of information manipulation and foreign interference’. The same recital also stresses that ‘[m]isleading or surreptitious political advertising is a risk because it influences the core mechanisms that enable the functioning of our democratic society’.[104] The regulation allows such targeting techniques only if two conditions are satisfied: the controller collected data from the data subject, and the users have given consent for their collection in accordance with the GDPR.[105] In addition, political microtargeting is prohibited in relation to the most sensitive data as defined in the GDPR (such as ethnic origin, political opinion, religious belief or sexual orientation).[106] This exclusion intersects with the similar prohibition included in the DSA but has a broader scope, encompassing actors beyond online platforms.[107]
Enforcement is entrusted to different national authorities, depending on the specific rules outlined in Regulation 2024/900.[108] First, Member States shall designate competent authorities for the supervision of online intermediary services, which may include those appointed under the DSA. Second, enforcement of the microtargeting rules is delegated to the national supervisory authorities under the GDPR and the European Data Protection Supervisor (EDPS). Third, Member States shall appoint competent authorities to supervise the remaining aspects of the Regulation, which may differ from the two previously mentioned. The Regulation specifies that such authorities ‘shall structurally enjoy full independence both from the sector and from any external intervention or political pressure’.[109] Member States shall also lay down rules on sanctions, which, however, cannot exceed the maximum penalty amounts set forth in Article 25 of the Regulation.
Most of the Political Advertising Regulation has been applicable only since October 2025.[110] In this respect, the Commission failed to achieve its objective of having the regulation in place before the 2024 European Parliament elections. Moreover, the guidelines on the application of the Regulation, an opportunity for the Commission to shed further light on the responsibilities of platforms and advertising services and to address the thorny issue of the definition of political advertising, were released only on 8 October 2025, two days before the regulation became applicable.[111] Although all documents are now available, the Regulation’s implementation has a bumpy road ahead. Labelling EU rules as ‘unworkable’ and as creating ‘legal uncertainties’, several platforms, including Google and Meta, decided to stop serving political advertising in the EU ahead of the application of the Political Advertising Regulation.[112] The Commission’s late release of the guidelines is regrettable in this respect, as such guidelines could have assisted platforms in identifying political advertising, thus lessening the risk of a complete halt to serving political advertising. The platforms’ ban is likely to hit civil society organisations hardest: their advertising campaigns, where deemed political by platforms, will be restricted, and they have fewer resources than established parties for promoting their initiatives.[113] It will also adversely impact voters, who will have fewer sources to access political information. At the same time, this will not reduce the spread of disinformation and will benefit propaganda campaigns that profit from algorithmic recommendations.[114]
3.3. Protecting the media to fight disinformation: the EMFA
The emergence of digital platforms has marked a significant shift in the composition of the public sphere, resulting in traditional media losing their role as gatekeepers of information. Notably, ‘[p]ublic communication mediated by mass media is socially selective, one-way, linear, centralised, and non-transparent, but through digital media it diversifies into participatory, interactive, net-like, decentralised, and transparent communication processes’.[115] The manner in which platforms operate in sharing news and information, particularly their policies on content moderation and algorithmic recommendations, influences various aspects of media pluralism, as well as contributing to the spread of disinformation.[116] At the same time, we are witnessing an erosion of the integrity of the media and the freedom of journalism throughout the EU, and particularly in Central European Member States.[117]
To address these and other challenges, the EU adopted the EMFA in Spring 2024, which became fully applicable on 8 August 2025.[118] Thus, after years in which media freedom and pluralism did not feature among EU initiatives for enforcing EU values, despite their crucial role in safeguarding democracy,[119] the legislature has finally decided to step in. The EMFA is a horizontal instrument applicable to all media, and it encompasses a wide range of issues, thereby representing a significant shift in the EU’s involvement with media activities.
First, the EMFA obliges Member States to respect the effective editorial freedom of the media, and lays down measures for ensuring their independent functioning.[120] In particular, it forbids Member States to ‘interfere in or try to influence the editorial policies and editorial decisions of media service providers’ (Article 4(2)) and compels them to put in place safeguards ensuring that those providers are editorially and functionally independent.[121] This includes ensuring that the procedures for appointing and dismissing the head and members of the management board of public service media providers are open, transparent, effective, non-discriminatory, and capable of guaranteeing their genuine independence.[122] Any legislative, regulatory or administrative measures taken by the Member States that may affect the operation of media services must be duly justified and proportionate.[123]
Second, the EMFA establishes an independent European Board for Media Services (EBMS) to promote the effective and consistent application of the EMFA.[124] The EBMS replaces the European Regulators’ Group for Audiovisual Media Services and acts as an independent advisory body at the EU level, comprising national regulatory authorities and bodies from the media and audiovisual sector. It supports the Commission by offering technical expertise and opinions, as well as contributing to the development of guidelines. Furthermore, the EBMS promotes cooperation and exchanges among national regulatory authorities and mediates in the event of disagreements.
The EMFA is founded on the premise that media freedom and pluralism play a crucial role in safeguarding democracy and, more broadly, EU values. Independent media are understood as performing the role of ‘public watchdog’ and as ‘an indispensable factor in the process of the formation of public opinion’.[125] This becomes even more apparent when comparing the EMFA with the 2010 Audiovisual Media Services Directive, which already included provisions for enhancing media pluralism but lacked a strong value-based language.[126] The primary objectives of the EMFA are indeed to regulate the internal market for media services while simultaneously safeguarding the independence and pluralism of the media, as stated in Article 1. This should be read in light of recital 2, which acknowledges that ‘[t]he Union should help the media sector so that it can seize those opportunities within the internal market, while at the same time protecting the values that are common to the Union and to its Member States, such as the protection of fundamental rights’. A central provision in this respect is Article 3, which establishes the right of media services recipients to access ‘a plurality of editorially independent media content’ and compels Member States to respect this right and ‘ensure that framework conditions are in place in line with this Regulation to safeguard that right, to the benefit of free and democratic discourse’. The nature of the obligations for Member States is a matter of debate. Certainly, it includes a negative obligation not to interfere with the right to access public media content. However, as Cavaliere argues, it seems also to include ‘a distinctive horizontal and derivative positive dimension’ which entails ‘a right not to receive media content of such poor quality that hinders the democratic discourse’.[127]
The intention to promote compliance with EU values also emerges in several recitals of the regulation. In particular, the EMFA stresses the role played by media in upholding the fundamental right to freedom of expression and information, ‘enabling people to seek and receive diverse information, and in promoting the values of democracy, cultural diversity and social cohesion’.[128] It also openly seeks to contribute to the protection of Article 11 of the Charter.[129] Furthermore, it recognises that the media plays a crucial role in shaping public opinion and providing citizens with information relevant to active participation in democratic processes.[130] In this respect, it emphasises the importance of recipients of media services being able to identify the owners of the media and potential conflicts of interest, since this ‘is a prerequisite for forming well-informed opinions and, consequently, for actively participating in a democracy’.[131]
Turning to the relationship between media and disinformation, it is crucial to emphasise its dual nature. On the one hand, independent and reliable media that deliver accurate and thorough information to the public are crucial in breaking down disinformation narratives. On the other hand, the media can themselves be a vector of disinformation, particularly in a context of declining quality journalism and a quest for sensationalism. In some cases, they can also be actively involved in disinformation campaigns.[132]
This dual role of media is reflected in the text of the EMFA. On the one hand, the EMFA generally understands independent and quality media as ‘an antidote against disinformation and foreign information manipulation and interference’.[133] More specifically, it emphasises the harm that disinformation inflicts on the internal market for media services, particularly through the behaviours of global platforms. It considers that digitalisation has increased market failures in the internal market for media services.[134] Global platforms are seen as gateways to media content supported by a business model that encourages polarised content and disinformation. This challenges the media services market, as platforms are more competitive and, as advertising providers, divert financial resources from the media sector. In this respect, the EMFA aims to enhance the competitiveness of media services vis-à-vis digital platforms.
On the other hand, while the positive role is more apparent, the EMFA also acknowledges that disinformation can be disseminated through the media, an improvement compared to the Commission’s proposal.[135] The EMFA is concerned with ensuring that media providers engaging in disinformation do not exploit the system. It recognises that ‘the good functioning of the internal market for media services is challenged by providers, including those controlled by certain third countries, that systematically engage in disinformation’.[136] In this respect, the EMFA aims at boosting cooperation between national regulatory authorities insofar as ‘it is key to ensuring that media market players, which are often active in different media sectors, that systematically engage in disinformation or information manipulation and interference, do not benefit from the scale of the internal market for media service’.[137]
The understanding of traditional media as an antidote to disinformation is reflected in the decision to confer on them a media privilege, as established by Article 18 EMFA. While the DSA was silent on the matter, Article 18 EMFA grants traditional media a privileged status on digital platforms regarding content moderation procedures. It establishes a special framework governing how VLOPs treat content from media service providers. First, VLOPs shall provide a functionality that allows media to declare themselves as independent media that comply with the EMFA requirements. Second, VLOPs shall grant the media special guarantees, thereby creating a series of exceptions to procedures under the DSA.[138] In particular, when they intend to remove media content, VLOPs shall provide a statement of reasons and allow the media to reply within 24 hours ‘prior to such a decision to suspend or restrict visibility taking effect’. This system derogates from Article 17 DSA, whereby platforms shall provide reasons to the affected user only after the removal of the content. Furthermore, any complaints by the media shall be processed with priority and without undue delay. In the event that a VLOP repeatedly restricts or suspends the service of a media service provider, it shall engage in a meaningful and effective dialogue with the aim of finding an amicable solution. The media service provider may also inform the Board and request it to issue an opinion.
The challenge in drafting a media privilege stems from the fact that the broad and ‘functional’ definition that would, in principle, be necessary to provide protection to other watchdogs (such as NGOs, whistleblowers, and researchers) would also ‘substantially expand the range of potential beneficiaries, thereby increasing the risks of circumventions by malicious actors’.[139] When the Commission proposed the EMFA, several voices were raised against the media privilege, including NGOs advocating for digital rights, as they were concerned that it could backfire in the fight against disinformation.[140] They feared that self-identification as media service providers could be exploited by disinformation actors and state-propaganda broadcasters in backsliding Member States, thus undermining the DSA’s efforts to address and mitigate disinformation risks.[141] Clearly, the debate on the media privilege ‘reflects the inability (both for policy and for academia) to clearly define the media today and to separate the media from bad actors and propagandistic outlets who disguise and self-present as media’.[142] Yet, as Monti pointed out, in comparison with the Commission proposal,[143] the current Article 18 of the EMFA entails a series of precautions for avoiding the exploitation of media privilege by disinformation actors.[144] In particular, to benefit from the privilege, media must declare themselves as editorially independent from States, political parties, and third state entities, comply with their transparency duties under Article 6(1) EMFA[145] and be subject to ‘regulatory requirements for the exercise of editorial responsibility in one or more Member States’ and oversight by a national regulatory authority or body, or adhere to a co-regulatory or self-regulatory mechanism governing editorial standards.[146] At the same time, the media privilege is without prejudice to VLOPs’ duties to assess and mitigate systemic risks under the DSA.[147]
4. Assessing the EU’s legislative response to disinformation: three lessons
The DSA, the Political Advertising Regulation, and the EMFA are key pieces in the broader puzzle of the EU’s efforts to combat disinformation. While each instrument has its own features, benefits, and limitations, as outlined above, together they provide insights into the role the EU legislature can play in fighting disinformation and, more generally, in defending the value of democracy. At least three lessons can be identified.
4.1. Legislation protecting democracy can be adopted through Article 114 TFEU
The EU lacks explicit legislative competence to safeguard and promote its values, as Article 2 TEU cannot serve as a legal basis for adopting legal acts to ensure respect for EU values.[148] Accordingly, the EU cannot adopt an act specifically focused on protecting democracy against disinformation. And indeed, just a few years ago, scholars argued that the Union had no competence at all to legislate for protecting democracy from disinformation.[149] Even today, the use of Article 114 TFEU as the legal basis for the EMFA is contested, not only by Hungary’s recent annulment action but also in legal scholarship.[150]
Nonetheless, as the Court of Justice made clear, ‘insofar as they have a valid legal basis, acts of the European Union may also seek to ensure respect for the values of the European Union’.[151] In the case at issue, all three instruments were primarily based on Article 114 TFEU, which is notably the main legal basis for harmonisation within the EU internal market. Although Article 114 TFEU does not amount to ‘a general power to regulate the internal market’,[152] legislation under it can also pursue other, non-market objectives, provided that they are not the prevalent ones. As De Witte pointed out, ‘internal market legislation is always also ‘about something else’, and that something else may in fact be the main reason why the internal market measure was adopted’.[153]
While the use of Article 114 TFEU allows for incisive EU intervention, it comes with limits and implications, as it is available as long as EU acts ‘have as their object the establishment and functioning of the internal market’. Notably, the Court clarified that to justify recourse to that provision, the EU act must prevent obstacles to trade or remove distortions of competition.[154] Therefore, Article 114 TFEU empowers the EU legislature to adopt a wide range of measures, but it simultaneously restricts the EU’s intervention to regulating the market and its actors, such as service providers. In this respect, as Davies framed it, Article 114 TFEU is at the same time ‘too broad and too narrow’.[155] The necessity of a connection with the market thus makes Article 114 TFEU less suitable for protecting other EU values, such as the rule of law, where establishing such a link is more challenging.
In the case of the three instruments analysed, the connection between safeguarding democracy and the market was established because regulated actors are service providers: information society services, advertising services, and media services. However, it is evident that such instruments are built on the will to protect democracy and fundamental rights in the digital society. By relying on Article 114 TFEU, the EU was able to adopt legislation that establishes harmonised democracy standards applicable to the Member States, as well as private actors. As shown above, the three instruments make no secret of the fact that their aims go beyond ensuring the functioning of the internal market to protect non-market values. They all share a strong value-based language, also when compared to their predecessors (notably, in the comparison between the EMFA and the Audiovisual Media Services Directive).
Although not all provisions in the three acts are directly related to the aim of Article 114 TFEU,[156] it is submitted that none of these instruments exceeds the scope of EU competence. First, while pursuing other objectives, these instruments help improve the functioning of the internal market. They regulate digital market actors and contribute to harmonising diverging national laws across the Member States, thereby reducing fragmentation within the EU internal market. To use the words of Bayer, ‘market actors have been regulated, in an attempt to amend structures of power and incentivise market actors to build resilience against malicious information attacks’.[157]
Second, despite the Treaties drafters’ effort to confine the EU’s action through a list of competences, such competences cannot be strictly demarcated.[158] As De Witte convincingly argued, ‘Internal market law is a story of constitutional flexibility: EU constitutional law is constantly adapted and re-interpreted so as to allow the Union to achieve common policy goals within the framework of the (limited) competences granted by the European treaties’.[159] Furthermore, in this case, the broad interpretation of Article 114 TFEU is justified by a very significant non-economic objective, namely the need to protect democratic society from disinformation, thereby safeguarding one of the EU’s founding values.[160] As the Court of Justice acknowledged in relation to the rule of law conditionality regulation, ‘it is permissible for the EU legislature, where it has a legal basis for doing so, to establish, in an act of secondary legislation, other procedures relating to the values contained in Article 2 TEU’.[161]
Third, in the specific context of the digital environment, maintaining a clear separation between the internal market and political goals is even more challenging. Indeed, in this setting, ‘the political and economic aspects of the internal market are often two sides of the same coin’.[162]
Fourth, in the cases at hand, the broad interpretation of Article 114 TFEU is also justified in light of the necessity to not only defend but also promote EU values. Notably, Article 13(1) TEU mandates all EU institutions, including the legislature, to promote EU values, which are also among the Union’s objectives outlined in Article 3(1) TEU. Following the Lisbon Treaty, Article 2 TEU precedes these objectives, further emphasising that EU values should inform them. Therefore, in this context, the EU’s intervention has the merit of striking a balance between the absence of a value-related legal basis and the obligation to promote EU values.[163]
Having established that legislation combating disinformation was rightly based on Article 114 TFEU, it is important to stress that recourse to such a legal basis is inherently limited and cannot accommodate all initiatives aimed at protecting democracy, especially those unrelated to the market. Nevertheless, when available, the use of internal market legislation as a means to safeguard democracy offers several advantages. First, it expands the tools at the Union’s disposal for protecting its own values without exacerbating the federal relationship with the Member States. Since the protection of democracy is framed through a substantive policy field and bears the credentials of the legislative process, it will appear less politicised and thus less contested.[164] Second, recourse to legislation strengthens the legitimacy of the Union’s action in the context of value enforcement. As Tridimas stressed, ‘if the EU legislature has spoken, this means that the Member States have exercised a collective choice having considered the issues involved, and the outcome enjoys, such as they are, the democratic credentials of the legislative process’.[165] This is even more needed for the value of democracy, which bears a significant political core, and where the legitimacy for judicial intervention is therefore more limited. Lastly, legislative intervention can be both reactive and proactive, aiming to avoid future deterioration. Legislative efforts with a preventive character indeed have the merit of ‘foster[ing] a rule of law culture’ which, with time, could tackle the roots of the problem of democratic backsliding.[166]
4.2. The choice for a co-regulatory approach to enforce democracy
The adoption of the DSA, the EMFA, and the Political Advertising Regulation marked a shift from the traditional approach to platform liability for content distributed through their services. That traditional approach, influenced by the US experience, was grounded in platform self-regulation: online platforms and other intermediaries were seen as essential promoters of free speech in the online realm and should therefore be protected from any responsibility regarding conduct on their services. It was implemented through the development of codes of practice, such as the 2018 Code of Practice on Disinformation, which brought online platforms together to commit to voluntary measures against disinformation practices. These efforts, however, proved unsatisfactory, as they completely delegated content removal and other measures to reduce the visibility of disinformation to the platforms, carrying the risks associated with the privatisation of censorship.[167] Moreover, as De Gregorio and Pollicino pointed out, the first attempts based on pure self-regulation were ‘disappointing in terms of the vagueness of the obligations assumed by the platforms themselves and the almost complete absence of criteria for verifiability and measurability of the commitments’.[168]
The slow response of online platforms to assume responsibility for content moderation and the emergence of fragmented national initiatives introducing due diligence obligations at the level of individual Member States have, however, prompted a shift in the EU’s approach.[169] Abandoning the unproductive self-regulatory model, the Union has gradually moved towards a co-regulatory framework. The DSA and the Political Advertising Regulation establish binding rules for platforms and advertising services to tackle disinformation risks, while the EMFA limits online platforms’ ability to moderate content from traditional media. The fact that all these instruments were adopted in the form of a regulation, notably the most pervasive EU legal act, also demonstrates the EU’s change of approach. Moreover, these instruments have turned many of the commitments under the EU codes of practice into binding requirements.[170] At the same time, the Strengthened Code of Practice on Disinformation is not only more ambitious than its 2018 version,[171] but it has also become an official code of conduct under the DSA, thus acting as a means to evaluate platform compliance.[172]
The three instruments complement each other in enforcing counter-disinformation policies by regulating the various relevant actors – online intermediaries, advertising publishers, and media services –, establishing a framework of transparency, due diligence, and risk-mitigation obligations, and granting traditional media a privilege in countering disinformation.[173] By doing so, they take on the challenging task of regulating the digital sphere to curb disinformation while avoiding excessive restrictions on freedom of expression. While the DSA and the EMFA have managed to strike a good balance between the two interests, the Political Advertising Regulation suffers from a poorly drafted definition of political advertising that risks limiting the wrong actors. Indeed, freedom of expression and maintaining an open environment for civil society are essential for mitigating the effects of disinformation campaigns.
As De Gregorio and Pollicino noted, a ‘European constitutional way to address disinformation’ is emerging, one which is characterised by a mix of hard and soft law.[174] At present, the Union is seeking to strike a balance between self-regulation and binding instruments. On the one hand, the Union left behind the pure self-regulation approach. On the other hand, the EU legislature has been careful not to impose overly specific policies on platforms for the organisation of their services, granting them a wide margin of appreciation. This reflects the search for an equilibrium between establishing effective obligations for online platforms to counter disinformation and preventing their regulatory evasion – also in light of the information asymmetry between them and public authorities –, while, at the same time, safeguarding users’ safety and freedom of expression, as well as, one should not forget, the economic interests of online platforms.
4.3. Even without defining disinformation, EU legislation enriches the notion of democracy
None of the three instruments discussed includes a legislative definition of disinformation. The DSA refrains from defining the conduct that constitutes disinformation, primarily referring to the legislation of individual Member States to establish what constitutes illegal content. The Political Advertising Regulation and the EMFA adopt a stronger value-based language and include the fight against disinformation in their rationales, but they also do not define disinformation. Consequently, disinformation lacks a binding definition under EU law and appears only in soft-law instruments such as the 2018 Communication on the European Approach to online disinformation and the 2022 Code of Practice. This choice is not surprising. Given the complexity of the disinformation phenomenon, the underlying conduct of which is not always illegal,[175] and the conflicting right to freedom of expression, defining disinformation is far from straightforward.[176] The lack of a clear definition indicates the EU co-legislator’s preference for legislation that targets disinformation by harmonising procedures rather than adopting a content-based approach. Currently, the EU aims to counter disinformation not by regulating its content but by focusing on the actors behind its dissemination, such as online platforms, and by encouraging cooperation between public and private entities.[177]
Conversely, the three instruments analysed play an important role in the development of the notion of democracy under EU law. First, all instruments recognise that freedom of expression and information are essential to the democratic debate. The Court of Justice has long established that freedom of expression, as guaranteed by Article 11 of the Charter, constitutes ‘one of the essential foundations of a pluralist, democratic society, and is one of the values on which, under Article 2 TEU, the Union is founded’.[178] Under the Charter, freedom of expression has an active and a passive dimension: it includes both the freedom to hold opinions and ‘to receive and impart information and ideas without interference by public authority and regardless of frontiers’.[179] The EU legislature is now contributing to concretising such fundamental principles. Behind the efforts to regulate online intermediaries lies the idea that platforms can be used in ways that adversely shape public opinion and discourse.[180] Citizens’ right to receive adequate information, and thus to be able to form an independent political opinion and cast a ballot based on truthful information, is essential to democracy. In the context of political advertising, this implies that citizens can recognise political advertisements and are not misled by disinformation campaigns, allowing them to exercise their democratic rights knowledgeably and effectively.
Second, transparency emerges as another relevant component of the EU value of democracy. The Political Advertising Regulation expressly links the need to ensure transparency with the values under Article 2 TEU, insofar as it is necessary for making informed choices, thus enabling open and fair political debate and electoral campaigns.[181] Moreover, transparency of the online environment is among the objectives of the DSA, while the transparency of media ownership lies at the heart of the EMFA.
Third, the legislature is finally recognising the importance of media freedom both in itself and within the broader effort to defend democracy. This is most evident with the EMFA, which acknowledges media freedom and pluralism as ‘two of the main pillars of democracy and of the rule of law’.[182] The Political Advertising Regulation, which excludes editorial content from the definition of political advertising, also recognises that democracy depends on the media’s role in supporting the proper functioning of democratic processes. The media are vital for ensuring freedom of expression and access to information, fostering public debate and helping shape public opinion.[183]
Fourth, the three instruments also share a common understanding of the responsible actors. Threats to European democracy are increasingly seen as originating from third countries. As highlighted above, the connection between disinformation and sponsored political advertising from third countries lies at the heart of the Political Advertising Regulation. Although less explicit, such an understanding is also present in the EMFA.[184] This perspective aligns with other recent initiatives to protect democracy, such as the proposed Directive on transparency of interest representation carried out on behalf of third countries.[185] Yet, the concern that threats to a value under Article 2 TEU originate from outside the Union marks a key difference from the rule of law, which has been undermined by the Member States themselves. This difference may thus influence future endeavours to defend the two values, which increasingly demand different enforcement approaches.
5. Conclusion
In the digital era, the values of the Union face new and complex challenges. In particular, disinformation poses a significant threat to the value of democracy, compromising the ability of EU citizens to make informed and free political decisions. At the same time, the complex relationship between disinformation and private actors requires the Union to adopt new strategies which, as this article has argued, should prioritise the adoption of legislative tools.
The adoption of the DSA, the EMFA, and the Political Advertising Regulation is an important step for protecting democracy against disinformation. These three instruments, which complement each other, form part of a first proper EU policy to tackle disinformation. Their introduction was made possible by the fact that the EU Treaties allow greater room for passing legislative acts that protect the value of democracy than is the case with the rule of law. At the same time, this effort serves as another example of utilising internal market legislation to achieve ‘non-economic common objectives’.[186]
The current EU counter-disinformation strategy is built on a co-regulatory approach that aims to regulate powerful actors in the digital environment rather than relying on platforms’ self-regulation, which would have detrimental consequences for democracy, freedom of expression, and other fundamental rights. Although the three pieces of legislation analysed are no silver bullet for tackling disinformation, they contribute to addressing some of the challenges it poses. Clearly, the success of these legislative instruments depends on effective enforcement. The Commission has a key role to play in this respect, in light of both the enforcement powers entrusted to it by the new rules and its general role as guardian of the Treaties under Article 17 TEU. As regards the DSA and the Political Advertising Regulation, the Commission must hold its ground against online platforms and, since many of them are US-based, must not abandon that fight for geopolitical reasons. In the November 2025 Democracy Shield, the Commission reaffirmed its commitment to enforcing EU law against major platforms but did not announce any new concrete initiatives.[187] It thus remains to be seen whether this commitment will be borne out in practice or will remain merely rhetorical. In the case of the EMFA, although it is notable that the Commission has started enforcement action against Hungary, evidence shows that concrete implementation is delayed in several Member States.[188] It is therefore essential that the Commission use its power under Article 258 TFEU to initiate infringement proceedings against non-compliant Member States.
Finally, the DSA, the EMFA, and the Political Advertising Regulation all share a common understanding of the value of democracy and contribute to giving it substance. Thus, thanks to the EU’s efforts to fight disinformation, the concept of democracy under EU law is beginning to take shape also through legislation, finally complementing similar efforts from the Court of Justice. Hopefully, future legislative efforts, such as the upcoming revision of the Audiovisual Media Services Directive,[189] will continue this positive trend.
-------------------
European Papers, Vol. 11, 2026, No 1, pp. 81-111
ISSN 2499-8249 - doi: 10.15166/2499-8249/863
* Post-doc researcher and lecturer in EU Law, University of Florence, martina.coli@unifi.it.
[1] X Groussot and E Karageorgiou, ‘Solidarity and the Crisis of Values in the European Union’ (2023) 2 Nordic Journal of European Law 29. The literature on Article 2 TEU and the challenges to EU values is endless. See, inter alia, C Fasone, A Dirri and Y Guerra, EU Rule of Law Procedures at the Test Bench (Palgrave 2024); L D Spieker, EU Values Before the Court of Justice: Foundations, Potential, Risks (Oxford University Press 2023); A Jakab and D Kochenov (eds), The Enforcement of EU Law and Values: Ensuring Member States’ Compliance (Oxford University Press 2017); C Closa and D Kochenov (eds), Reinforcing Rule of Law Oversight in the European Union (Cambridge University Press 2016).
[2] See for a definition of rule of law backsliding: L Pech and KL Scheppele, ‘Illiberalism Within: Rule of Law Backsliding in the EU’ (2017) 19 Cambridge Yearbook of European Legal Studies 3.
[3] E.g. Regulation 2020/2092 of the European Parliament and of the Council of 16 December 2020 on a general regime of conditionality for the protection of the Union budget.
[4] European Commission, ‘Communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions on the European democracy action plan’, COM(2020)790.
[5] M Bonelli, ‘The European Union’s Democratic Turn’ (re:constitution Working Paper 38-2025) at www.reconstitution.eu.
[6] The most recent expression of this trend is the European Democracy Shield released in November 2025. European Commission, ‘Joint Communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: European Democracy Shield: Empowering Strong and Resilient Democracies’, JOIN(2025) 791 final.
[7] EEAS, ‘Don’t be deceived: EU acts against fake news and disinformation’ (19 September 2017) www.eeas.europa.eu. The taskforce introduced the website euvsdisinfo.eu.
[8] See European Commission, ‘Communication: Action Plan against disinformation’, JOIN(2018) 36 final.
[9] See the EU embargo on the RT television channel for spreading Russian disinformation: Council Regulation 2022/350 of 1 March 2022 amending Regulation No 833/2014 Concerning Restrictive Measures in View of Russia’s Actions Destabilising the Situation in Ukraine. See for an analysis J Bayer, ‘The European response to Russian disinformation in the context of the war in Ukraine’ (2023) 64 Hungarian Journal of Legal Studies 589 and GF Lendvai, ‘Media in War: An Overview of the European Restrictions on Russian Media’ (2023) 8 European Papers 1235.
[10] European Commission, ‘2018 Code of Practice on Disinformation’ digital-strategy.ec.europa.eu.
[11] European Commission, ‘Strengthened Code of Practice on Disinformation 2022’ at digital-strategy.ec.europa.eu.
[12] Regulation 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC.
[13] Regulation 2024/900 of the European Parliament and of the Council of 13 March 2024 on the transparency and targeting of political advertising.
[14] Regulation 2024/1083 of the European Parliament and of the Council of 11 April 2024 establishing a common framework for media services in the internal market and amending Directive 2010/13/EU.
[15] European Council Conclusions of 19-20 March 2015, para 13.
[16] V Polonski, ‘How artificial intelligence conquered democracy’ (The Conversation, 8 August 2017) at theconversation.com.
[17] FJ Zuiderveen Borgesius, J Möller, S Kruikemeier, R Ó Fathaigh, K Irion, T Dobber, B Bodo, and C de Vreese, ‘Online Political Microtargeting: Promises and Threats for Democracy’ (2018) 14 Utrecht Law Review 82.
[18] European Commission, ‘Joint Communication to the European Parliament and the Council: Joint Framework on countering hybrid threats: a European Union response’, JOIN(2016) 18 final, 2.
[19] European Commission, ‘Communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: Tackling online disinformation: a European Approach’, COM(2018) 236 final, 3–4.
[20] European Commission, ‘A multi-dimensional approach to disinformation – Report of the independent High Level Group on fake news and online disinformation’, 2018.
[21] Ibid, 10. Nonetheless, some scholars have emphasised that addressing disinformation but not misinformation is a choice that does not come without problems. See A Koltay, ‘Freedom of Expression and the Regulation of Disinformation in the European Union’ in R Krotoszynski Jr, A Koltay and C Garden (eds.) Disinformation, Misinformation, and Democracy: Legal Approaches in Comparative Context (Cambridge University Press 2025) 133.
[22] Notably, the concept of fake news fails to adequately describe the complexity of disinformation. The term is also increasingly used by politicians worldwide to characterise media they dislike, in an effort to restrict the free press. See C Wardle and H Derakhshan, ‘Information disorder: Toward an interdisciplinary framework for research and policy making’, Council of Europe report DGI(2017)09, 5. Similarly, the Report of the High Level Group on fake news and online disinformation advised against using the term ‘fake news’ as it is both ‘inadequate’ to address the complexity of disinformation, which transcends the simple true/false divide, and ‘misleading’ as it has been weaponised by politicians and parties to attack independent media. Report ‘A multi-dimensional approach to disinformation’ (n 20) 10.
[23] J. Bayer, N Bitiukova, P Bárd, J Szakács, A Alemanno, and E Uszkiewicz, ‘Disinformation and Propaganda – Impact on the Functioning of the Rule of Law in the EU and Its Member States’ (2019) Study for the LIBE Committee, European Parliament, PE 608.864.
[24] É Brown, ‘Propaganda, Misinformation, and the Epistemic Value of Democracy’ (2018) 30 Critical Review 194.
[25] E Cirone, ‘Misinformation about COVID-19: Is the European Union Well Equipped to Fight the “Infodemic”?’ in D Utrilla and A Shabbir (eds.) EU law in times of pandemic (EU Law Live Press, 2021) 177.
[26] S Eskens, N Helberger and J Moeller, ‘Challenged by news personalisation: five perspectives on the right to receive information’ (2017) 9 Journal of Media Law 259, 281.
[27] K Pentney, ‘Tinker, Tailor, Twitter, Lie: Government Disinformation and Freedom of Expression in a Post-Truth Era’ (2022) 22 Human Rights Law Review 1.
[28] Judgment of the Constitutional Court of Romania n 32 of 6 December 2024. See for an analysis R Cornea, ‘Romanian Militant Democracy in Action: Shielding Democracy from Subversion and Annulling the Elections’ (Verfassungsblog, 1 April 2025) at verfassungsblog.de.
[29] O Pollicino, ‘General Report: Freedom of Speech and the Regulation of Fake News’ in Freedom of Speech and the Regulation of Fake News (Intersentia 2023) 1.
[30] B Martens, L Aguiar, E Gomez-Herrera and F Mueller-Langer, ‘The digital transformation of news media and the rise of disinformation and fake news’ (2018) JRC Digital Economy Working Paper at publications.jrc.ec.europa.eu.
[31] G Martinico and M Monti, ‘Online Disinformation and Populist Approaches to Freedom of Expression: Between Confrontation and Mimetism’ (2024) 45 Liverpool Law Review 143.
[32] On the complex framework for protecting fundamental rights online see G De Gregorio, ‘From constitutional freedoms to the power of the platforms: protecting fundamental rights online in the algorithmic society’ (2019) 11 European Journal of Legal Studies 65 and G Frosio and C Geiger, ‘Taking fundamental rights seriously in the Digital Services Act’s platform liability regime’ (2023) 29 European Law Journal 31.
[33] M Monti, ‘The EU code of practice on disinformation and the risk of the privatisation of censorship’ in S Giusti and E Piras (eds.) Democracy and Fake News (Routledge, 2020) 214.
[34] On Article 7 TEU see, among others, W Sadurski, ‘Adding a Bite to a Bark? A Story of Article 7, the EU Enlargement, and Jörg Haider’ (2010) 16 Columbia Journal of European Law 385; L Besselink, ‘The Bite, the Bark, and the Howl: Article 7 TEU and the Rule of Law Initiatives’ in A Jakab and D Kochenov (eds), The Enforcement of EU Law and Values: Ensuring Member States’ Compliance (Oxford University Press 2017) 128 and D Kochenov, ‘Article 7 TEU: From “Nuclear Option” to “Sisyphean Procedure”?’ in U Belavusau and A Gliszczynska-Grabias (eds), Constitutionalism under Stress (Oxford University Press 2020) 161.
[35] Since the ASJP judgment, the Court of Justice has interpreted Article 19(1) TEU extensively, holding that the obligation it imposes on Member States to ensure effective judicial protection ‘in the fields covered by Union law’ encompasses the guarantee of independence for national courts, irrespective of whether EU law is being implemented in the specific case. This then paved the way for the use of infringement proceedings for violations of Article 19(1) TEU against those Member States that undermine the independence of the national judiciary. It also granted the Court jurisdiction, within the framework of preliminary ruling procedures, to address interpretative questions concerning the substance of judicial independence under Article 19(1) TEU. Case C‑64/16 Associação Sindical dos Juízes Portugueses (ASJP), EU:C:2018:117. See for a commentary: M Bonelli and M Claes, ‘Judicial Serendipity: How Portuguese Judges Came to the Rescue of the Polish Judiciary: ECJ 27 February 2018, Case C-64/16, Associação Sindical Dos Juízes Portugueses’ (2018) 14 European Constitutional Law Review 622.
[36] For an overview of the situation of State-funded disinformation in Hungary see: G Polyák, A Urbán, P Szávai and K Horváth, ‘Disinformation under the guise of democracy: lessons from Hungary’ in M Echeverría, S García Santamaría and D C Hallin (eds) State-Sponsored Disinformation Around the Globe (Routledge, 2024) 231.
[37] The pending infringement procedure concerning the Hungarian anti-LGBT law (Case C-769/22) could however be a game-changer. Alongside the violation of internal market legislation and the EU Charter, the Commission claims, for the very first time, that a Member State ‘has infringed Article 2 TEU’. See for an analysis of the different paths for the justiciability of Article 2 TEU in such a case: LS Rossi, ‘“Concretised”, “Flanked”, or “Standalone”? Some Reflections on the Application of Article 2 TEU’ (2025) 10 European Papers 1.
[38] See for proposals in this direction: J Cotter, ‘To Everything There Is a Season: Instrumentalising Article 10 TEU to Exclude Undemocratic Member State Representatives from the European Council and the Council’ (2022) 47 European Law Review 69 and LD Spieker, ‘Beyond the Rule of Law: How the Court of Justice Can Protect Conditions for Democratic Change in the Member States’, in A Södersten and E Hercock (eds), The Rule of Law in the EU: Crisis and Solutions (Sieps 2023) 87.
[39] In two judgments of November 2024 the Court started shaping the relationship between the value of democracy under Article 2 TEU and the democratic rights of EU citizens under Articles 10 TEU and 22 TFEU. Case C-808/21 Commission v Czech Republic, EU:C:2024:962, and Case C-814/21 Commission v Poland, EU:C:2024:963. See for a comment: N Vissers, ‘Join the (political) party: The CJEU’s emerging role as a guardian of democracy in Cases C-808/21 and C-814/21’ (2025) 32 Maastricht Journal of European and Comparative Law 1. See also, more recently, judgment in Case C‑181/23 Commission v Malta, EU:C:2025:283, particularly paras 88-90.
[40] As Nemitz and Ehm argued, ‘as to democracy, EU primary law does not contain the necessary consolidations and concretisations as were achieved regarding fundamental rights in the Charter of Fundamental Rights of the European Union’. P Nemitz, F Ehm, ‘Strengthening Democracy in Europe and its Resilience Against Autocracy: Daring More Democracy and a European Democracy Charter’ in S Garben, I Govaere and P Nemitz (eds), Critical Reflections on Constitutional Democracy in the European Union and its Member States (Hart Publishing 2019) 345, 349.
[41] M Brkan, ‘EU fundamental rights and democracy implications of data-driven political campaigns’ (2020) 27 Maastricht Journal of European and Comparative Law 774, 782.
[42] See footnote 39 above.
[43] Case C-430/21 RS, EU:C:2022:99, para 43.
[44] President of the European Commission, ‘State of the Union 2021’ at state-of-the-union.ec.europa.eu.
[45] More recently, this toolbox has been expanded with the adoption of the AI Act (Regulation 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence), which, however, is outside the scope of the present analysis.
[46] For full commentary on the DSA see F Hofmann and B Raue (eds), Digital Services Act (Bloomsbury 2024) and M Husovec, Principles of the Digital Services Act (Oxford University Press 2024).
[47] Communication COM(2020)790 (n 4) 21.
[48] Regulation 2022/2065 (n 12) Recital 9, emphasis added.
[49] Regulation 2022/2065 (n 12) Art 6, which confirms the previous liability exemption in Article 14 of the E-commerce directive: Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce). This is complemented by the insertion of a Good Samaritan clause in Article 7.
[50] Regulation 2022/2065 (n 12) Art 8.
[51] M Monti, ‘Towards a Federal-Type Regulation of Online Public Discourse by the EU?’ (2024) 30 European Public Law 1, 2.
[52] F G’sell, ‘The digital services act (DSA): a general assessment’ in A von Ungern-Sternberg (ed), Content Regulation in the European Union – The Digital Services Act (Schriften Des Irdt, 2023) 89.
[53] The definitions of intermediary and hosting services and online platforms are provided, respectively, in Art 3(g) and Art 3(i) Regulation 2022/2065 (n 12).
[54] Regulation 2022/2065 (n 12) Art 33. A service is designated as a VLOP by decision of the Commission. So far, the Commission has designated 25 VLOPs. The list is available on the Commission’s website at digital-strategy.ec.europa.eu.
[55] These require that platforms include in their general terms and conditions information about the restrictions they impose in the use of their services, including the policies, procedures and measures used for the purpose of content moderation (Art 14 DSA) and that they publish annual reports on their content moderation activity (Art 15 DSA).
[56] According to Art 14(1) DSA, providers ‘shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions’ and ‘with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter’.
[57] Recital 84 DSA further links systemic risks to disinformation by requiring that, when assessing such risks, VLOPs ‘pay particular attention to how their services are used to disseminate or amplify misleading or deceptive content, including disinformation’.
[58] Regulation 2022/2065 (n 12) Art 34.
[59] S Mündges and K Park, ‘But did they really? Platforms’ compliance with the Code of Practice on Disinformation in review’ (2021) 13 Internet Policy Review 1.
[60] R Ó Fathaigh, D Buijs and J van Hoboken, ‘The Regulation of Disinformation Under the Digital Services Act’ (2025) 13 Media and Communication 1.
[61] European Commission Press release ‘Commission endorses the integration of the voluntary code of practice on disinformation into the digital services act’ (13 February 2025) at ec.europa.eu.
[62] Regulation 2022/2065 (n 12) Art 36.
[63] Ibid Art 37.
[64] Ibid Art 41.
[65] Ibid Art 6(1)(b). Notably, and as also specified by recital 22 DSA, the removal of content shall take place in full observance of the fundamental rights of users of the service, particularly the right to freedom of expression and of information.
[66] Regulation 2022/2065 (n 12) Art 23. Recital 63 DSA specifies that content is manifestly illegal ‘where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded’.
[67] Regulation 2022/2065 (n 12) Art 17.
[68] P Ortolani, ‘If You Build It, They Will Come: The DSA’s “Procedure Before Substance” Approach’ (Verfassungsblog, 7 November 2022) at verfassungsblog.de.
[69] Regulation 2022/2065 (n 12) Arts 20 and 21.
[70] Ibid Art 22. To obtain the status of trusted flaggers, an applicant must demonstrate (a) particular expertise in detecting, identifying, and notifying illegal content; (b) independence from online platform providers; (c) diligence, accuracy, and objectivity in submitting notices. The status is awarded by the Digital Services Coordinator of the Member State where the applicant is established.
[71] Ibid Art 49.
[72] Ibid Art 56. This decision appears to be a response to the challenges that arose in enforcing the General Data Protection Regulation (GDPR) through the system of Data Protection Authorities. See C Busch, ‘Platform responsibility in the European Union: from the E-commerce directive to the digital services act’ in B Chakravorti and J P Trachtman (eds) Defeating Disinformation (Cambridge University Press, 2025).
[73] Regulation 2022/2065 (n 12) Arts 52(3) and 74.
[74] Ibid Arts 52(4) and 76.
[75] An overview of the proceedings activated by the Commission can be found on the dedicated website: digital-strategy.ec.europa.eu.
[76] European Commission Press release: ‘Commission decides to refer Czechia, Spain, Cyprus, Poland and Portugal to the Court of Justice of the European Union due to lack of effective implementation of the Digital Services Act’ (7 May 2025) at ec.europa.eu.
[77] European Commission Press release: ‘Commission fines X €120 million under the Digital Services Act’ (5 December 2025) at ec.europa.eu.
[78] L Comel, ‘€120 million later: the DSA enters the enforcement phase’ (Media Laws, 23 December 2025) at www.medialaws.eu.
[79] See the European Parliament debate of 21 January 2025 on the DSA enforcement against US platforms at www.europarl.europa.eu.
[80] J Harfst, T Mast and W Schulz, ‘Independence as a Desideratum: DSA Enforcement by the EU Commission’ (Verfassungsblog, 16 July 2025) at verfassungsblog.de.
[81] Kaid and Holtz-Bacha define political advertising as ‘a means through which parties and candidates present themselves to the electorate, mostly through the mass media’. C Holtz-Bacha and LL Kaid, ‘Political advertising in international perspective’ in LL Kaid and C Holtz-Bacha (eds), The Sage handbook of political advertising (Sage Publications 2006) 3.
[82] Regulation 2022/2065 (n 12) Art 39.
[83] Regulation 2024/900 (n 13) Recital 14.
[84] A Richter, ‘For Propaganda Without Disinformation: Draft EU Regulation on Political Advertising’ (EJIL: Talk!, 24 February 2023) at www.ejiltalk.org.
[85] Regulation 2024/900 (n 13) Recital 4.
[86] Ibid.
[87] There is abundant ECtHR case law in this respect. See in particular Lindon, Otchakovsky-Laurens and July v France App nos 21279/02 and 36448/02 (ECtHR, 22 October 2007) para 46 and Wingrove v. the United Kingdom App no 17419/90 (ECtHR, 25 November 1996) para 58.
[88] See J Barata, ‘Regulation of online political advertising in Europe and potential threats to freedom of expression’ (Media@LSE, 9 March 2023) at blogs.lse.ac.uk and S Lindroos-Hovinheimo, ‘The Proposed EU Regulation on Political Advertising Has Good Intentions, But Too Wide a Scope’ (European Law Blog, 23 February 2022) at www.europeanlawblog.eu.
M van Drunen, N Helberger, W Schulz, and C de Vreese, ‘The EU is going too far with political advertising!’ (DSA Observatory, 16 March 2023) at dsa-observatory.eu.
[89] Regulation 2024/900 (n 13) Art 3(2). Paragraph 4 of the same provision then specifies the notion of ‘political actor’.
[90] I Nenadic and K Bleyer-Simon, ‘Issue-based advertising’ (2021) Report by the EUI Centre for Media Pluralism and Media Freedom at hdl.handle.net.
[91] In October 2022, around 30 NGOs wrote a letter to the Czech Presidency calling for a series of amendments to the Commission proposal. The letter is available at www.liberties.eu.
[92] Regulation 2024/900 (n 13) Art 3(2).
[93] Ibid.
[94] Regulation 2024/900 (n 13) Art 1, paras 2 and 3. See also recitals 29 and 30.
[95] See the Report by Civil Liberties Union for Europe ‘European Commission Discussion Points: Online Focus Group on Transparency and Targeting of Political Advertising’ (2025) at www.liberties.eu.
[96] Regulation 2024/900 (n 13) Art 11. The following Article 12 specifies the information that transparency notices shall include. See also Commission Implementing Regulation 2025/1410 of 9 July 2025 on the format, template and technical specifications of the labels and transparency notices of political advertisements in accordance with Articles 11 and 12 of Regulation 2024/900 of the European Parliament and of the Council.
[97] Regulation 2024/900 (n 13) Art 13.
[98] Ibid Art 12(2).
[99] Ibid Art 5(1).
[100] Ibid Art 5(2).
[101] Ibid Arts 16 and 17.
[102] Ibid Art 15.
[103] Ibid Art 15(7).
[104] Ibid Recital 74.
[105] Ibid Art 18(1). Additional transparency requirements are specified in Art 19.
[106] Ibid Art 18(1)(c). Instead, it remains uncertain whether the regulation allows targeting based on sensitive data without profiling. See S Eskens, ‘The role of the Political Advertising Regulation and European Media Freedom Act in the EU’s anti-disinformation approach’ (Working Paper 31 August 2024) at ssrn.com.
[107] Regulation 2022/2065 (n 12) Art 26(3).
[108] Regulation 2024/900 (n 13) Art 22.
[109] Ibid Art 22(4).
[110] Only the definitions in Article 3 and the provision prohibiting discrimination based on the place of residence or establishment of the sponsor have applied since the entry into force of the regulation. Regulation 2024/900 (n 13) Art 30.
[111] European Commission, ‘Guidelines to support the implementation of Regulation (EU) 2024/900 on the transparency and targeting of political advertising’, C(2025) 6829 final.
[112] Google, ‘An update on political advertising in the European Union’ (14 November 2024) at blog.google and Meta, ‘Ending Political, Electoral and Social Issue Advertising in the EU in Response to Incoming European Regulation’ (25 July 2025) at about.fb.com.
[113] See in this respect the statement by LibertiesEU in response to Google’s decision at www.liberties.eu.
[114] Notably, systems of algorithmic recommendations tend to promote sensational and shocking content, as well as content that users already enjoyed and agreed with before. See L Quaritsch, ‘Political Advertising in the 2024 European elections Between Europeanisation and the protection of electoral integrity online’ (Hertie Policy Brief, 4 June 2024) at www.delorscentre.eu.
[115] S Sevignani, ‘Digital Transformations and the Ideological Formation of the Public Sphere: Hegemonic, Populist, or Popular Communication?’ (2022) 39 Theory, Culture & Society 91.
[116] I Nenadić, R Carlini and O Spassov, ‘A decade of digital transformation: Pluralism between the media and digital platforms’ in E Brogi, I Nenadić, PL Parcu (eds) Media Pluralism in the Digital Era (Routledge 2024) 17.
[117] See for recent data: T Blagojev, K Bleyer-Simon, E Brogi, R Carlini, D Da Costa Leite Borges, JE Kermer, I Nenadic, M Palmer, PL Parcu, U Reviglio, M Trevisan, S Verza, ‘Monitoring media pluralism in the European Union: results of the MPM2025’, EUI Centre for Media Pluralism and Media Freedom (CMPF) Report 2025 at cadmus.eui.eu.
[118] On 10 July 2024 Hungary challenged the validity of the EMFA through an action for annulment, which is currently pending before the Court of Justice. See pending case C-486/24 Hungary v Parliament and Council.
[119] R Mastroianni, ‘Freedom and pluralism of the media: an European value waiting to be discovered?’ (2022) Media Laws 100.
[120] In this respect, on 11 December the Commission launched the first infringement procedure for failure to comply with the EMFA by sending a letter of formal notice to Hungary. See European Commission, ‘Press release: Commission calls on Hungary to comply with European Media Freedom Act and Audiovisual Media Services Directive’ at digital-strategy.ec.europa.eu.
[121] Regulation 2024/1083 (n 14) Art 5.
[122] Ibid Art 5(2).
[123] Ibid Art 21.
[124] Ibid Arts 8 to 13.
[125] Ibid Recital 1.
[126] Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive).
[127] P Cavaliere, ‘Freedom of expression after disinformation: Towards a new paradigm for the right to receive information’ (2024) 16 Journal of Media Law 28, 37.
[128] Regulation 2024/1083 (n 14) Recital 27.
[129] Ibid Recitals 9 and 21.
[130] Ibid Recital 64.
[131] Ibid Recital 32.
[132] See for a series of examples: EU Disinfo Lab, ‘The role of “media” in producing and spreading disinformation campaigns’ (13 October 2021) at www.disinfo.eu. On the role of traditional media in the dissemination of disinformation see Y Tsfati, H G Boomgaarden, J Strömbäck, R Vliegenthart, A Damstra and E Lindgren, ‘Causes and consequences of mainstream media dissemination of fake news: literature review and synthesis’ (2020) Annals of the International Communication Association 157.
[133] Regulation 2024/1083 (n 14) Recital 14.
[134] Ibid Recital 4.
[135] Commission Proposal for a Regulation of the European Parliament and of the Council establishing a common framework for media services in the internal market (European Media Freedom Act) and amending Directive 2010/13/EU, COM/2022/457 final (hereafter ‘EMFA Proposal’).
[136] Ibid, last sentence.
[137] Regulation 2024/1083 (n 14) Recital 6. See also recital 53.
[138] On the relationship between the EMFA and the DSA see K Klafkowska-Waśniowska, ‘Taking Extra Care of the Media? Media Content Moderation under the European Media Freedom Act’ (Verfassungsblog, 16 July 2024) at verfassungsblog.de.
[139] P Cesarini, G De Gregorio and O Pollicino, ‘The Media Privilege in the European Media Freedom Act’ (2023) Media Laws 16, at www.medialaws.eu.
[140] See the Report on the workshop ‘The DSA’s capacities for free media and safe journalists online: realistic or utopian?’ held on 23 February by the DSA Observatory and the AI, Media and Democracy Lab at dsa-observatory.eu.
[141] See the ‘Policy statement on article 17 of the proposed European media freedom act’ by several NGOs and civil society organisations, January 2023 at www.liberties.eu.
[142] I Nenadić and E Brogi, ‘Why news media need article 17 of the European Media Freedom Act’ (EMFA Observatory Blog, 16 November 2023) at cmpf.eui.eu and D Borges, ‘Media ownership transparency and the European Media Freedom Act: how did the EU get there?’ (2024) Rivista italiana di informatica e diritto 259.
[143] See Article 17 of the EMFA Proposal.
[144] M Monti, ‘The missing piece in the DSA puzzle? Article 18 of the EMFA and the media privilege’ (2024) Rivista Italiana di informatica e diritto 195.
[145] Art 6(1) EMFA requires media service providers to make a range of information available to users, including the identity of their direct or indirect owners and the amount of public funding received, including from third countries. For a commentary on the provision, see D Borges, ‘Ownership transparency obligations under Article 6 of the European Media Freedom Act: opportunities and challenges’ (EMFA Observatory Blog, 15 November 2024) at cmpf.eui.eu.
[146] Regulation 2024/1083 (n 14) Art 18(1).
[147] Ibid Art 18(4), last subparagraph.
[148] Case C-899/19 P Romania v European Commission (Minority SafePack), EU:C:2022:41, para 55.
[149] See for instance A Garcia Pires, ‘Media Pluralism and Competition’ (2017) 47 European Journal of Law and Economics 255.
[150] For a critical opinion on the use of Article 114 TFEU see M Sznajder, ‘European Media Freedom Act and its implications for both merger control and media pluralism: the Polish perspective’ (2024) 49 European Law Review 66. As Hungary has challenged the EMFA through an action for annulment (n 118), the Court of Justice will soon have to rule itself on the validity of Article 114 TFEU as a legal basis in the context of democracy protection.
[151] Romania v European Commission (Minority SafePack) (n 148) para 55.
[152] Case C-376/98 Germany v European Parliament and Council (Tobacco Advertising I), EU:C:2000:544, para 83.
[153] B De Witte, ‘A competence to protect: The pursuit of non-market aims through internal market legislation’ in P Syrpis (ed) The Judiciary, the Legislature and the EU Internal Market (Cambridge University Press 2012) 25, 36.
[154] Case C-380/03 Germany v European Parliament and Council (Tobacco Advertising II), EU:C:2006:772.
[155] G Davies, ‘The Competence to Create an Internal Market: Conceptual Poverty and Unbalanced Interests’, in S Garben and I Govaere (eds), The Division of Competences in the EU Legal Order: Reflections on the Past, the Present and the Future (Hart Publishing 2017) 74.
[156] M Bonelli (n 5) 25.
[157] J Bayer, ‘The EU Policy on Disinformation: Aims and Legal Basis’ (2024) 16 Journal of Media Law 18, 27.
[158] S Garben, ‘Confronting the Competence Conundrum: Democratising the European Union through an Expansion of Its Legislative Powers’ (2015) 35 Oxford Journal of Legal Studies 55.
[159] B De Witte, ‘Internal market legislation as European public policy’ (2025) 80 Revista de Derecho Comunitario Europeo 11, 16.
[160] See for a similar finding in relation to the EMFA: E Longo, ‘Grounding media freedom in the EU: The legal basis of the EMFA’ (2025) Rivista italiana di informatica e diritto 111.
[161] Case C-157/21 Poland v European Parliament and Council, EU:C:2022:98, para 207.
[162] MZ van Drunen, N Helberger and R Ó Fathaigh, ‘The beginning of EU political advertising law: unifying democratic visions through the internal market’ (2022) 30 International Journal of Law and Information Technology 181, 194.
[163] J Larik, ‘From Speciality to a Constitutional Sense of Purpose: On the Changing Role of the Objectives of the European Union’ (2014) 63 International & Comparative Law Quarterly 935.
[164] Contra, for the argument that, due to its limited competences, the Union’s legislation might be perceived as doing too little with respect to backsliding countries and too much for other Member States, see M Bonelli (n 5) and G Polyák, ‘Too Much for Others, too Little for Us: The Draft European Media Freedom Act from a Hungarian Perspective’ (Verfassungsblog, 15 March 2023) at verfassungsblog.de.
[165] T Tridimas, ‘Wreaking the Wrongs: Balancing Rights and the Public Interest the EU Way’ (2023) 29 Columbia Journal of European Law 185, 198.
[166] M Claes, ‘Safeguarding a Rule of Law Culture in the Member States: Engaging National Actors’ (2023) 29 Columbia Journal of European Law 214.
[167] See in this respect Monti (n 33).
[168] G De Gregorio and O Pollicino, ‘The European Approach to Disinformation: Policy Perspectives’ (IEP@BU Policy Brief June 2024) 1, 6, at iep.unibocconi.eu.
[169] See in particular the German Network Enforcement Act (NetzDG) of 2017 and the French Law on the Fight Against the Manipulation of Information of 2018.
[170] S Eskens, ‘The role of the Regulation on the transparency and targeting of political advertising and European Media Freedom Act in the EU’s anti-disinformation strategy’ (2025) 58 Computer Law & Security Review 1.
[171] For a comparison of the two Codes see E Brogi and G De Gregorio, ‘From the code of practice to the code of conduct? Navigating the future challenges of disinformation regulation’ (2024) 16 Journal of Media Law 38.
[172] While offering a stronger incentive for compliance, this, of course, does not imply that platforms are, or will be, fully compliant with the Code. See for an analysis of platforms’ compliance Mündges and Park (n 59).
[173] Eskens (n 170).
[174] O Pollicino and G De Gregorio ‘The European Constitutional Way to Address Disinformation in the Age of Artificial Intelligence’ (2025) 26 German Law Journal 449.
[175] J Bayer and others, ‘The fight against disinformation and the right to freedom of expression’ (2021) Study for the LIBE Committee of the European Parliament PE 695.445.
[176] Some scholars have argued that binding definitions of disinformation would actually be counterproductive. See in this respect R Ó Fathaigh, N Helberger and N Appelman, ‘The perils of legally defining disinformation’ (2021) 10 Internet Policy Review 1.
[177] M Husovec, ‘The Digital Services Act’s Red Line: what the Commission can and cannot do about disinformation’ (2024) 16 Journal of Media Law 47 and Pollicino and De Gregorio (n 174).
[178] Joined Cases C-203/15 and C-698/15 Tele2 Sverige, EU:C:2016:970, para 93. See also Case C-163/10 Patriciello EU:C:2011:543, and Case C-507/18 NH v Associazione Avvocatura per i diritti LGBTI, EU:C:2020:289.
[179] Case C-280/21 PI v Migracijos departamentas, EU:C:2023:13, para 29.
[180] Regulation 2022/2065 (n 12) Recital 79.
[181] See supra section 3.2.
[182] Regulation 2024/1083 (n 14) Recital 2.
[183] Regulation 2024/900 (n 13) Recital 29.
[184] See in particular Regulation 2024/1083 (n 14) Recitals 4, 14 and 47.
[185] Commission Proposal for a Directive of the European Parliament and of the Council establishing harmonised requirements in the internal market on transparency of interest representation carried out on behalf of third countries and amending Directive (EU) 2019/1937, COM(2023) 637. See for a critical analysis of the proposal: F Feisel, ‘One Step Forward, Two Steps Back: The EU’s ‘Defence of Democracy’ Package’ (Verfassungsblog, 19 December 2023) at verfassungsblog.de.
[186] De Witte (n 153) 26.
[187] In the European Democracy Shield, the Commission committed that it ‘will continue to monitor and enforce obligations under the DSA and will engage with stakeholders to ensure that these obligations are upheld’. Joint Communication JOIN(2025) 791 final (n 6) 6. See for a critical comment on the Commission’s ‘much talk & little action’ approach: F Feisel, ‘The European Democracy Shield and Its Whole-of-Society Approach: From the Bottom‑Up, but Short on Concrete Action’ (Verfassungsblog, 20 November 2025) at verfassungsblog.de.
[188] RSF News, ‘EU: Without political will to enforce it, the EMFA risks becoming a dead letter’ (6 August 2025) at rsf.org and M Kozak, ‘Regulating Political Advertising: Lessons from Poland?’ (Verfassungsblog, 31 October 2023) at verfassungsblog.de.
[189] The review of the Audiovisual Media Services Directive is part of the Commission’s commitments within the European Democracy Shield. Joint Communication JOIN(2025) 791 final (n 6) 18.