
Consumer & Advertising

Digital Fairness Act: A necessary safeguard or unnecessary duplication?

The EU plans to introduce a new Digital Fairness Act in 2026 to tackle online consumer risks.

Fri 13 Feb 2026

10 min read

We are almost seven years on from the European Commission's (EC) announcement of the New Deal for Consumers in 2019, which saw the radical reform of EU consumer rights to make them more compatible with e-commerce and the digital age; the introduction of 'GDPR style' fines for EU-wide infringements; and a collective redress regime to ensure that consumers can collectively seek remedies where breaches of consumer rights arise.

This updated consumer rights regime is still bedding down. There have been year-on-year increases in cross-border investigations by the Consumer Protection Cooperation Network, increased class actions across the EU, and harder enforcement domestically (including the recent series of prosecutions by the Irish Competition and Consumer Protection Commission for pricing issues). Despite this, it seems almost certain that more regulation is coming down the line in 2026, in the form of the Digital Fairness Act (the DFA), which will specifically target perceived issues consumers face in the online environment.

In this article, we will look at some of the key proposals and query whether this further legislative intervention is a necessary safeguard or unnecessary duplication.

Background

In May 2022, the EC announced that it was conducting a 'fitness check' to evaluate whether three existing consumer protection directives ensure a high level of protection in the online environment: the Unfair Commercial Practices Directive (2005/29/EC) (the UCPD), the Consumer Rights Directive (2011/83/EU) (the CRD), and the Unfair Contract Terms Directive (93/13/EEC) (the UCTD) (together, the Directives).

As part of the ‘fitness check’, on 28 November 2022, the EC launched a public consultation to gather stakeholder views on whether amendments should be proposed to the Directives to provide better protection to consumers.

Following the public consultation, on 10 October 2024, the EC published a staff working document, a fitness check of EU consumer law on digital fairness (the Working Paper), which concluded that, while the current framework provides a foundational level of protection, gaps in enforcement, legal uncertainty, and the inadequacy of existing laws to address newer technologies such as AI and algorithmic decision-making are significant concerns. The Working Paper suggested targeted updates to ensure effective protection and legal coherence across Member States in the rapidly evolving digital landscape.

Key target areas

Although the DFA has not yet been drafted, the EC’s Working Paper identified several specific practices which are likely to be addressed in it.

The EC’s Working Paper identifies features such as infinite scroll, autoplay, pull-to-refresh and ephemeral content as contributing to digital addiction by exploiting psychological triggers, especially amongst vulnerable consumers such as children.

In the online gaming sphere, the EC pointed to the “significant lack of compliance with the CRD’s provisions regarding transparency and right of withdrawal” relating to the sale of virtual items and the use of intermediate in-app virtual currencies. The Working Paper stated that consumers are often confused about the real price of virtual items and raised concerns about the marketing practices related to virtual items, such as ‘pity timers’ that increase the odds of winning after many losses and pay-to-win models.

The Working Paper notes that no EU legislation currently addresses addictive design directly (though, as we will explore below, this is arguably not correct). The EC recommends the introduction of specific rules to limit addictive design features, particularly in services targeting children.

The Working Paper defines dark patterns or deceptive design as “commercial practices deployed through the structure, design or functionalities of digital interfaces or system architecture that can influence consumers to take decisions they would not have taken otherwise”. Examples of this practice include fake countdown timers, hidden information and the creation of false hierarchies in choice architectures. The Working Paper notably advocates for the introduction of a ‘fairness-by-design’ obligation for digital interfaces.

The EC raises concerns about the use of consumer data to tailor prices, offers or content in ways that exploit consumer vulnerabilities or reduce transparency. Examples of unfair personalisation include price discrimination based on location, device or browsing history, and targeted ads exploiting emotional states or financial stress.

The Working Paper recommends the strengthening of transparency obligations for personalised pricing and advertising.

The Working Paper found that consumers often face obstacles when trying to unsubscribe from a digital service, for example, through hidden cancellation buttons, repeated prompts to reconsider, and automatic renewals without reminders.  

The EC recommends the introduction of EU-wide rules requiring cancellation to be as easy as sign-up, with mandatory, clear technical means of cancellation (for example, cancellation buttons) and confirmation of termination.

The Working Paper found that dropshipping (where sellers forward orders to third-party suppliers) often results in misleading product descriptions, long delivery times and non-compliance with safety standards.

The EC proposes enhanced liability rules for platforms and intermediaries, requiring disclosure of product origin and supplier identity and stricter accountability for product safety and conformity.

The EC notes that social media influencers often promote products without clearly disclosing paid relationships, which misleads consumers about the nature of the endorsement.   

The Working Paper advocates for the introduction of stricter rules on the disclosure of paid partnerships and sponsored content, accountability for misleading claims made by influencers and platform obligations to monitor and enforce compliance.

Risk of overregulation?

Although stakeholder feedback is still emerging, early responses raise significant concerns. Despite the EU’s stated commitment to simplifying laws to boost competitiveness and reduce administrative burdens, aspects of what is expected to be contained in the DFA appear to contradict these goals. The proposed reforms may also be unnecessary or, at least in part, duplicative given existing legislation.

In relation to dark patterns, for example, while the term is not expressly defined in EU legislation, provisions (and, as applicable, guidance) under the DSA (related to interface design and organisation), the GDPR (related to the principle of fairness and data protection by design), and the UCPD (related to the prohibition on the use of misleading and aggressive practices) may be sufficient to address this practice and we have already seen regulators active in this space.

For example, the EC and the Irish Digital Services Coordinator, Coimisiún na Meán, are assessing whether certain in-scope providers have designed their DSA ‘notice and action’ mechanisms in a way that confuses users about the types of content they may report (i.e. illegal content or content that is in breach of providers’ terms and conditions) and dissuades them from reporting, and are therefore engaging in dark patterns in breach of Article 25(1) DSA. In addition to the DSA, from a consumer protection perspective, the EC’s 2021 UCPD Guidance explains that, if dark patterns are applied in a business-to-consumer context, the UCPD may be used to challenge the fairness of such practices, in addition to other laws such as the GDPR.

The DSA is also relevant in regulating addictive design. The recitals of the DSA clarify that, amongst the systemic risks that very large online platforms and very large online search engines must identify, assess and mitigate, they should consider whether the design, functioning or use of their services may lead to “actual or foreseeable negative effect on the protection of public health, minors and serious negative consequences to a person's physical and mental well-being”, for example by stimulating behavioural addictions. The EC has several ongoing investigations which include consideration of whether certain service features have an addictive design, leading to negative consequences for users’ well-being.

Conclusion

The above are just a few examples, but they arguably call into question the need for an entirely new legal instrument in areas which are already highly complex and, at least in part, heavily regulated.

Nevertheless, the EC seems committed to bringing forward the DFA, with a draft proposal due by the end of 2026. It remains to be seen whether the EU will seek to delineate between the obligations under the various regimes outlined above or, potentially more likely, whether online businesses will need to grapple with the interplay and heightened regulatory risk arising from a new addition to the regulatory landscape.

If you have any queries about how the upcoming Digital Fairness Act might affect your business, please contact Mairead O’Brien, Partner, Denise Daly Byrne, Partner, Consumer and Advertising, Mark Ellis, Partner or Irina Vasile, Senior Associate, Technology.
