On 13 May 2025, the European Commission (the Commission) launched a public consultation on its much anticipated guidelines on the protection of minors (the Guidelines) under the Digital Services Act (DSA).
In this article, we provide an overview of some of the key aspects of the proposed Guidelines and our take on what they mean for online platforms.
Key takeaways
Purpose and status of the Guidelines
Article 28(1) DSA requires providers of online platforms which are “accessible to minors” to implement “appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service”.
There has been considerable discussion about the scope of this broadly framed obligation since its introduction, and many have been eager to see the Commission's guidance to providers of online platforms on how to comply with it (as envisaged under Article 28(4) DSA). The publication of the draft Guidelines marks the first such guidance from the Commission and gives additional insight into its approach to Article 28 DSA.
These draft Guidelines are a markedly different instrument from the voluntary Codes of Practice previously issued by the Commission (see our previous posts on the Commission's Code of Practice on Disinformation and the Code of Conduct on Countering Illegal Hate Speech). In contrast to those voluntary, opt-in codes, the Guidelines expressly state that they will serve as a "significant and meaningful benchmark" against which the Commission will assess compliance with Article 28(1). This implies a stronger regulatory expectation that all online platforms which are "accessible to minors" (save for micro and small enterprises) will strive to align their practices with the Guidelines so as to meet their DSA obligations. The Guidelines are perhaps best understood as providing greater detail in respect of the broad but mandatory requirements set out in Article 28(1).
Nevertheless, the Guidelines are careful to maintain a degree of flexibility for the Commission and Digital Services Coordinators (DSCs) in assessing DSA compliance. Although the Guidelines state that they will limit regulatory discretion in applying Article 28(1), they explicitly note that implementing the measures they set out will not automatically guarantee compliance.
Moreover, the Guidelines note that Section 5 of Chapter III of the DSA (the Section which imposes specific obligations on very large online platforms (VLOPS) and very large online search engines (VLOSES)) may require such entities to implement measures which go further than those outlined in the Guidelines in order to fulfil their Section 5 obligations.
Those caveats aside, it seems likely that a provider whose Article 28(1) compliance is scrutinised will be in a strong position to defend its compliance posture if it has made a demonstrable and concerted effort to follow the Guidelines. However, putting in place the measures needed to demonstrate such compliance with the Guidelines is likely to have a significant impact on the operations of many online platforms within the EU.
Scope of the Guidelines - When is an online platform “accessible to minors”?
The Guidelines make clear that an online platform provider cannot place itself outside the scope of Article 28(1) DSA solely by stating in its terms of service that the service is not intended for use by individuals under a certain age.
The Guidelines reiterate the guidance provided in Recital 71 DSA, which notes that an online platform will be considered "accessible to minors" where any one of the following scenarios applies:

- its terms and conditions permit minors to use the service;
- the service is directed at, or predominantly used by, minors; or
- the provider is otherwise aware that some users of its service are minors (for example, because it already processes personal data revealing their age for other purposes).
What exactly constitutes “some users” in the third scenario remains unclear, but the draft Guidelines do identify certain elements which will indicate that an online platform is “accessible to minors”, including:
Accordingly, the Guidelines reflect the broad approach taken under the DSA (and other EU laws such as the GDPR and EU competition law) of focusing any regulatory assessment on how the platform operates in fact, rather than merely on its intended use. If an online platform is not intended for use by minors, its provider will need to be able to demonstrate that it has effective measures in place to prevent minors from gaining access; otherwise it will be expected to comply with the obligations of Article 28.
Requirement to conduct risk reviews
Perhaps one of the most notable developments under the Guidelines is the expectation that providers of online platforms accessible to minors should complete a risk review of their service (and repeat it each time a significant change is made to the service) to determine the extent to which minors use the platform and, as relevant, the adequacy of the measures implemented to protect minors on the service.[1] It is proposed that this assessment should include an examination of:
While the Commission hints at potentially issuing further materials to support providers in carrying out risk reviews, an existing child rights impact assessment tool used by the Dutch Ministry of the Interior and Kingdom Relations is identified as a helpful resource.
The Guidelines note that, for providers of VLOPS and VLOSES, this "protection of minors" risk assessment should form part of the general assessment of systemic risks already required under Article 34 DSA.
While VLOPS and VLOSES will have stood up teams to carry out DSA risk assessments, many non-VLOP/VLOSE online platforms will now have to put in place new processes and procedures for regularly assessing the risks their services pose to minors. Given the ease with which many online platforms will fall within scope of being "accessible to minors", this obligation is likely to impact a significant proportion of the industry.
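For illustration only, the sketch below shows one way a provider might structure such a risk review in code, organised around the "5C typology of risks" referred to in the draft Guidelines (see footnote [1]). The data model and field names are our own assumptions, not a format prescribed by the Guidelines.

```python
from dataclasses import dataclass, field

# The five categories come from the draft Guidelines' "5C typology of risks"
# (content, conduct, contact, consumer and cross-cutting risks); the rest of
# this structure is purely illustrative.
RISK_CATEGORIES = {"content", "conduct", "contact", "consumer", "cross-cutting"}

@dataclass
class RiskEntry:
    category: str                  # one of the 5C categories above
    description: str               # how the risk manifests on the service
    likelihood: str                # e.g. "low", "medium" or "high"
    mitigations: list[str] = field(default_factory=list)

    def __post_init__(self) -> None:
        if self.category not in RISK_CATEGORIES:
            raise ValueError(f"unknown risk category: {self.category}")

def outstanding_risks(review: list[RiskEntry]) -> list[RiskEntry]:
    """Flag high-likelihood risks with no recorded mitigations, e.g. as a
    gating check before a significant change is rolled out to the service."""
    return [r for r in review if r.likelihood == "high" and not r.mitigations]
```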
Age assurance methods
A central focus of the draft Guidelines is the discussion of age assurance measures. The Guidelines provide some further detail on what appropriate age assurance measures might look like under Article 28(1) DSA. However, rather than prescribing which technical measures must be implemented, they discuss a number of potential measures which can be used in conjunction with one another, depending on the circumstances, and focus on the overall efficacy of the measures used.
The Guidelines draw a distinction between “age verification” and “age estimation” measures. “Age verification” measures are those which rely on physical identifiers or verified documentation to determine the age of a user with sufficient certainty. “Age estimation” measures are those which allow a provider to make an informed estimation that a user is likely to be of a certain age, to fall within a certain age range, or to be over or under a certain age, generally based on behavioural indicators.
The Guidelines provide guidance in respect of the circumstances in which different types of measure may be most appropriate.
The draft Guidelines provide that more than one method should be available to verify a user's age, and that users should have access to a redress mechanism to complain about incorrect age assessments.
The following should also be considered before any age assurance measure is implemented:
Applying those considerations, the draft Guidelines explain that self-declaration would not be a sufficient age assurance measure. This is an unsurprising regulatory view and mirrors Coimisiún na Meán’s position on age assurance, as set out in Sections 10.6(f) and 12.10 of its Online Safety Code.
Interestingly, the Guidelines emphasise that the ultimate goal of age assurance measures is to ensure a high level of privacy, safety and security for minors on a service, and they recognise that these objectives could be achieved by alternative means, including the implementation of other protective measures discussed in the Guidelines (e.g. content moderation measures, default settings, etc.). Accordingly, what is appropriate in a given case will be highly dependent on the particular circumstances and nature of the relevant platform.
The Guidelines also recognise that online platforms might have only some content, sections or functions on their platform that carry a risk for minors and accordingly it may be appropriate to implement age assurance measures focused on just those parts of the service.
The stance adopted by the Guidelines regarding age assurance is broadly aligned with the evolving approach within the industry. Online platforms are increasingly utilising a variety of age assurance methods and tools to prevent underage users from accessing their services. This includes exploring the use of biometric identifiers, machine learning, and/or linking access to already verified items like credit cards or IDs. Outsourcing age verification to a third party which specialises in age assurance is also an increasingly popular practice.
The Guidelines also reference the European Data Protection Board (EDPB) statement on Age Assurance, but unfortunately they do not examine in detail the EDPB's concern that certain age assurance measures, while effective, may necessitate the excessive collection of personal data from minors. Striking the right balance between implementing robust age assurance mechanisms and appropriately minimising the personal data collected is a consistent challenge for service providers, so additional guidance on how to navigate this issue would have been particularly valuable.
EU Digital Wallet
The draft Guidelines are principles-based, which helpfully leaves the door open for new age assurance measures to be identified in the future to enable compliance with Article 28(1) DSA.
In that regard, the draft Guidelines reference the EU Digital Identity Wallet as a method of digital identification in the EU. The Wallet is scheduled to be available by the end of 2026 and, once issued, may be used by online platform providers for device-based age verification.
In the interim, the Commission has proposed an EU age verification app which will confirm whether users are over 18 and can be presented to online platform providers as proof that the user is not a minor. The Commission published the technical specifications of the age verification app on the open-source developer platform GitHub in April 2025.
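To make the device-based model more concrete, the following is a minimal sketch of how a platform's server might check a signed "over-18" attestation presented by a user's device. The token format, claim names and use of a shared secret are illustrative assumptions only; they are not taken from the Commission's published specifications, and a real deployment would rely on public-key signatures from a trusted issuer.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical shared secret for this demo. A real scheme would verify a
# public-key signature from a trusted issuer rather than use a shared key.
ISSUER_KEY = b"demo-issuer-key"

def make_demo_token(over18: bool, ttl_seconds: int = 300) -> str:
    """Issue a demo attestation: base64(payload) + '.' + hex(HMAC-SHA256)."""
    payload = json.dumps({"over18": over18, "exp": time.time() + ttl_seconds}).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_over18_attestation(token: str) -> bool:
    """Accept only unexpired, correctly signed tokens asserting the over-18
    claim. Note the data minimisation: no date of birth or identity is
    disclosed, only a yes/no answer to 'is this user an adult?'."""
    try:
        payload_b64, sig_hex = token.split(".")
        payload = base64.urlsafe_b64decode(payload_b64)
        expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, sig_hex):
            return False  # not signed by the trusted issuer
        claims = json.loads(payload)
        return claims.get("over18") is True and claims.get("exp", 0) > time.time()
    except ValueError:
        return False  # malformed token

print(verify_over18_attestation(make_demo_token(over18=True)))   # True
print(verify_over18_attestation(make_demo_token(over18=False)))  # False
```

A design of this kind speaks to the EDPB's data-minimisation concern noted above: the platform learns only that the user is an adult, not who they are or how old they are.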
Account settings
Another particularly significant element of the Guidelines is the section addressing account settings and in particular default settings. An overarching principle of the Guidelines is “privacy-, safety-, and security-by-design”, meaning platforms accessible to minors should integrate the highest standards of privacy, safety and security into the design and operation of their services. As an extension of this requirement, the draft Guidelines expect default settings on minors’ accounts to be configured in a manner that promotes privacy and security, while also discouraging excessive usage and harmful behaviour.
The Guidelines set out a list of account settings which should be applied by default on a minor's account, many of which could have a significant practical impact on the operation of online platforms. For example:
The Guidelines also expect regular tests and updates of default settings to ensure they remain effective against emerging online risks and trends.
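As a purely hypothetical illustration of this "safe by default" expectation, a platform might represent account defaults along the following lines, applying the most protective configuration whenever an account is identified as belonging to a minor. The individual setting names are our own assumptions rather than settings enumerated in the Guidelines.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccountSettings:
    profile_public: bool
    discoverable_in_search: bool
    direct_messages_from_strangers: bool
    geolocation_sharing: bool
    autoplay_enabled: bool
    push_notifications_overnight: bool

# Most protective configuration, applied by default to minors' accounts;
# any relaxation of these settings should be a deliberate, informed choice.
MINOR_DEFAULTS = AccountSettings(
    profile_public=False,
    discoverable_in_search=False,
    direct_messages_from_strangers=False,
    geolocation_sharing=False,
    autoplay_enabled=False,
    push_notifications_overnight=False,
)

def default_settings(is_minor: bool) -> AccountSettings:
    """Return safe-by-default settings for minors; the adult defaults shown
    here are arbitrary and would be set by product policy."""
    if is_minor:
        return MINOR_DEFAULTS
    return AccountSettings(True, True, True, False, True, True)
```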
Online interface design
The Guidelines highlight the importance of providers ensuring that their online interface design does not encourage overuse or compulsive behaviour. Features such as infinite scroll, automatic triggering of video content, artificial content and virtual rewards for repeated actions on the platform are viewed as encouraging negative behaviour in minors. There is a focus on increasing minors' awareness of the time they spend on online platforms and on creating "real frictions" to dissuade minors from spending more time on the platform.
Recommender systems
The draft Guidelines outline that recommender systems may pose and exacerbate risks to minors' privacy, safety and security online by amplifying content that can have a negative impact. The Guidelines state that recommender systems should generally operate in a way that prevents minors' repeated exposure to harmful content, while also giving minors the autonomy to completely "reset" their recommended feeds and to opt for a recommender system not based on profiling.
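By way of a simple sketch, the "reset" and non-profiling expectations might translate into logic such as the following. The state model, field names and fallback ordering are illustrative assumptions, not requirements drawn from the Guidelines.

```python
from dataclasses import dataclass, field

@dataclass
class MinorFeedState:
    """Per-user recommender state for a minor's account (illustrative)."""
    engagement_signals: list[str] = field(default_factory=list)
    profiling_enabled: bool = False  # off by default for minors

    def reset(self) -> None:
        # A complete "reset" of the recommended feed: discard all accumulated
        # signals so past behaviour no longer shapes recommendations.
        self.engagement_signals.clear()

def rank_items(state: MinorFeedState, items: list[dict]) -> list[dict]:
    """Rank feed items. Without profiling, fall back to a non-personalised
    ordering (here: most recent first) instead of engagement-based ranking."""
    if not state.profiling_enabled:
        return sorted(items, key=lambda i: i["posted_at"], reverse=True)
    # Hypothetical profiling-based ranking, used only if the minor opted in:
    score = lambda i: sum(tag in state.engagement_signals for tag in i["tags"])
    return sorted(items, key=score, reverse=True)
```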
Some of the suggestions made in the Guidelines for recommender systems include the following:
Other child protection measures
Beyond the measures outlined above, the draft Guidelines set out a number of further measures that may be adopted by online platform providers to comply with Article 28(1) DSA (depending on the service), such as:
Commentary
The publication of the draft Guidelines will be welcomed as it ends the wait for some further clarity on the interpretation of Article 28(1) DSA. The adoption of a principles-based approach is particularly sensible, given the broad range of services covered by the DSA. This flexibility is essential to avoid an overly prescriptive framework, ensuring that compliance measures can be tailored to the specific nature of each service and that online platforms can use their own deep expertise of their services to design the most effective means of achieving the objectives of Article 28(1) DSA.
However, it is difficult to avoid the conclusion that the Guidelines will introduce a significant layer of additional regulatory expectations (or, in effect, quasi-obligations) for non-VLOPS and non-VLOSES which are accessible to minors. Moreover, these quasi-obligations resemble the types of obligations that the EU legislature had specifically reserved for VLOPS and VLOSES: for example, the risk review is analogous to the Article 34 DSA risk assessment (as the Guidelines implicitly acknowledge), and the governance expectations are analogous to the Article 41 DSA independent compliance function obligations. That said, the Guidelines make clear that proportionality will be a core principle in applying Article 28(1) DSA, and it may be that providers can design less burdensome measures which are nonetheless acceptable to regulators because they prioritise the best interests of minors and effectively safeguard their privacy, safety and security.
Finally, it is noteworthy that there are some cross-overs between the Guidelines and Coimisiún na Meán’s Online Safety Code (which applies to video-sharing platform services whose providers are established in Ireland). For instance, there appears to be a degree of alignment in respect of age assurance measures, the need to restrict who can see a minor’s content and who can contact a minor, and the availability of controls for parents/guardians.
What’s next?
The Guidelines are open for feedback until 10 June 2025, with the final Guidelines due to be published in summer 2025.
If you would like any further information on the Commission’s Article 28(1) DSA Guidelines, please contact Dr Stephen King, Partner, Eoghan O’Keeffe, Knowledge Consultant, Jade Van Standen, Solicitor, or your usual contact on A&L Goodbody’s Technology team.
Date published: 22 May 2025
[1] When considering what risks are posed to minors, online platform providers should consider the following types of risk: content, conduct, contact, consumer and cross-cutting risks (referred to in the draft Guidelines as “5C typology of risks”).