Year XIX – no. 43
Digital Statute for Children and Adolescents (Law No. 15,211/2025) – Effective on March 17, 2026
March 5, 2026
Introduction
The risks to which children and adolescents are exposed in the digital world are significant, and cases of humiliation, threats, intimidation, and cyberbullying in Brazil are alarming. This fact can be verified in the 2nd Technical Bulletin School That Protects – Data on Bullying and Cyberbullying[1], which is part of the School That Protects Program, linked to the National System for Monitoring and Combating Violence in Schools (SNAVE).
The enactment of Law No. 14,811/2024 had already marked progress by amending the Penal Code to introduce the criminal offense of cyberbullying (Article 146-A): bullying committed through social networks, applications, online games, or other digital means, including real-time transmission, punishable by imprisonment of 2 to 4 years and a fine, if the conduct does not constitute a more serious crime.

In 2025, there was further progress on this issue with the creation of the Digital Statute for Children and Adolescents, known as the Digital ECA, established by Law No. 15,211, published on September 17, 2025.
The Digital ECA is in line with initiatives abroad for the online protection of children and adolescents, notably British legislation, especially the Online Safety Act 2023 (OSA), whose general objective is to make Internet services safer for users in the United Kingdom. Below, we draw a parallel between the OSA and the Digital ECA in order to evaluate, in practice, the strategic points of the two regulations and their implementation.
Entry into Force of the Digital ECA and the OSA
The OSA is more comprehensive than the Digital ECA, as it aims to protect both children and adults online. Both legal instruments converge on the objective of making the digital environment safer for users, albeit with distinct scopes and legislative approaches.
The OSA predates the Digital ECA (it was enacted on October 26, 2023) and follows a phased implementation roadmap, led by the Office of Communications (OFCOM) and heavily dependent on the finalization of codes of practice (non-binding recommendations) and secondary legislation. The Digital ECA will come into effect on March 17, 2026, as per Law No. 15,352/2026, and also depends on additional regulation.
The National Data Protection Authority (ANPD) has set February 13, 2026, as the deadline for certain Information and Communication Technology (ICT) platforms and companies subject to the obligations of Law No. 15,211 to submit reports describing the technical and organizational measures adopted to comply with the Digital ECA.
Scope and Regulatory Body
- Digital ECA (Brazil): The element that triggers the application of the Digital ECA is the information technology product or service that is directed at children and adolescents in Brazil, or that is likely to be accessed by them, regardless of where the service originates or where its development, manufacture, offering, marketing, or operation takes place. This covers internet applications, computer programs, operating systems, internet application stores, and electronic games: in short, any digital service that targets children or adolescents in Brazil, or is likely to be accessed by them, must comply with the Digital ECA.
- Online Safety Act (United Kingdom): The OSA does not apply to all platforms. Its scope covers search engines; services that publish or display pornography; and user-to-user (U2U) services, that is, those that allow users to publish content online or interact with each other, such as social networks, instant messaging services, marketplaces, games, discussion forums and chat rooms, and video or audio sharing platforms. The OSA has extraterritorial reach: it applies even to services provided by companies located outside the United Kingdom, provided the service has a link to the United Kingdom, such as global platforms with a significant number of UK users or platforms that direct their services to British citizens.
- Regulatory Body: On the day the Digital ECA was published, Decree No. 12,622/2025 designated the National Data Protection Authority (ANPD) as the autonomous administrative authority for the protection of children and adolescents in digital environments. On February 25, 2026, Law No. 15,352/2026 was enacted, transforming the ANPD into a regulatory agency linked to the Ministry of Justice. In the United Kingdom, OFCOM assumes the role of regulator for online safety. OFCOM may develop guidelines and codes of practice establishing how online platforms can fulfill their obligations, monitor the effectiveness of the processes platforms adopt to protect users from harm, and take the necessary measures against companies that fail to comply.
General Duties
- Prevention and Protection Duties: Broadly speaking, Article 3 of the Digital ECA establishes that providers of information technology products or services directed at children and adolescents, or likely to be accessed by them ("Providers"), must guarantee the priority protection of these users, always preserving the best interests of the child or adolescent and adopting technical measures that ensure privacy and data protection. Providers must observe duties of protection, prevention, information, and security, in line with the Consumer Protection Code (Law No. 8,078/1990) and the Statute of the Child and Adolescent (Law No. 8,069/1990). Protection against, among other risks, pornography, cyberbullying, inducement of or incitement or assistance to self-harm, suicide, self-medication and drug use, and betting or gambling must be built in from the conception of the product or service.
- Similarly, the OSA imposes a series of new obligations on the platforms and services within its scope, making their duties to prevent and mitigate risks more transparent and objective. The obligations of U2U services and of search engines differ slightly. Platforms and search engines must prevent children from accessing harmful and age-inappropriate content, and must give parents and children clear, accessible ways to report access to inappropriate or illegal content. In both cases, platforms and search providers assume active responsibility (a duty of care or duty of safety) for the content to which children may be exposed. Under the Digital ECA, Providers must likewise ensure, by default, the most protective configuration available regarding privacy and personal data protection, and must develop and adopt, by default and from the outset, settings that prevent the compulsive use of products or services by minors.
- Risk of Access to Harmful Content: The Digital ECA mandates that platforms take reasonable measures to prevent and mitigate the risk of minors being exposed to harmful content, such as sexual exploitation and abuse, physical violence, intimidation and harassment, promotion of gambling, and pornography. Age verification mechanisms are not mandatory for all Providers, only for those offering content, products, or services whose offer or access is prohibited, inappropriate, or unsuitable for minors under 18. These Providers must adopt effective age verification measures for each access (Article 9, Paragraph 1, of the Digital ECA) to prevent minors from reaching such content, and self-declaration of age is prohibited.
- Age verification: The OSA requires platforms to adopt age verification or age estimation mechanisms to prevent children from accessing harmful content, in particular Primary Priority Content (pornography and content that encourages, promotes, or provides instructions for suicide, self-harm, or eating disorders) and Priority Content such as bullying, hate speech, content that depicts or encourages violence or serious injury, content that encourages dangerous stunts and challenges, and content that encourages the ingestion, inhalation, or exposure to harmful substances. Search engines have a duty to assess the risk of children encountering content harmful to their age group, especially children with certain characteristics or belonging to specific groups, considering the risks of the algorithms used and how they organize and present search results.
- Content Rating by Age Group: Article 10 of the Digital ECA provides that Providers must deliver an experience appropriate to the minor’s age, referring to the age rating system. Platforms must, by default, clearly state the appropriate age range for the service or content at the time of access. Similarly, the OSA mandates age assurance measures to ensure that underage users have age-appropriate experiences and are protected from harmful content. In the United Kingdom, OFCOM published a series of guidelines in early 2025 to guide platforms and users on how age assurance measures and children’s access assessments can be effectively implemented.
- Data Collection for Age Verification: Internet application store and terminal operating system providers must verify the user’s age or age range (age signal) and may share such data, via API, with internet application providers. The age-range attribute or age data of the child or adolescent collected for age verification may be used only for that purpose; the creation of behavioral profiles of children and adolescents from the collection of their data, and any processing operation for another purpose, is prohibited. To download applications, children and adolescents require the consent of a parent or legal guardian, and such authorization cannot be presumed.
- Social Networks and Linked Accounts: The Digital ECA creates an unprecedented requirement that the social media accounts of children and adolescents up to 16 years old be linked to the account of one of their parents or legal guardians. This measure seeks to ensure additional supervision at the time the account or profile is created. Providers must suspend a user’s access if they determine that the account belongs to a child or adolescent in violation of the rules of the Digital ECA, and parents or guardians must present proof of age to restore the account. There is no similar rule in British legislation: the OSA does not prohibit children’s access to social networks, nor does it establish a minimum age for their use, but it does require social media companies to implement effective age verification to protect child users. OFCOM will publish a report on the use of age assurance mechanisms by platforms and regulated service providers for purposes of compliance with the OSA.
- Parental Supervision: The Digital ECA sets forth that Providers must offer accessible, easy-to-use settings and tools that support parental supervision, always in the best interests of minors. These include providing information to parents or legal guardians about the tools available; features that allow parents to view, limit, and monitor usage time, avoiding situations that encourage increased use; management of privacy options; restriction of geolocation sharing; restriction of purchases and financial transactions; and identification of the profiles of adults who contact the minor, among others. Providers must also display a clear and visible warning when parental supervision tools are active, indicating which settings or controls have been applied.
- Electronic games intended for children and adolescents, or likely to be accessed by them, must, by default, limit features for interaction with other users, so that parents or legal guardians can first consent to the minor’s access to such features.
- The OSA does not detail parental supervision tools in the same way, but it encourages companies to provide clear and accessible means of family management; in addition, OFCOM has prepared guidance for parents to help children use networks safely (https://www.ofcom.org.uk/online-safety/protecting-children/how-ofcom-is-helping-children-to-be-safer-online-a-guide-for-parents).
- Targeted advertising: The Digital ECA prohibits the use of profiling techniques and emotional analysis to target advertising to children and adolescents. The OSA, in turn, does not specifically address advertising techniques, but contains rules to prevent fraudulent advertising for each category of regulated service.
- Mandatory removal and notification of violations: The Digital ECA mandates that Providers remove content that promotes sexual exploitation or abuse, kidnapping, and enticement of minors as soon as it is detected in their products or services and report it to the competent authorities as established in regulations. Furthermore, the Digital ECA obligates services to provide mechanisms for users to report violations of the rights of children and adolescents. Based on the principle of comprehensive protection, any content that violates the rights of children and adolescents must be removed as soon as the Provider is notified by the victim, their legal representatives, the Public Prosecutor’s Office, or protection entities, regardless of a court order (article 29 of the Digital ECA). The OSA, in turn, also requires platforms to promptly remove any content that may endanger the health and safety of children. Both laws mandate that providers act proactively in removing illicit material.
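Purely for illustration of the age-signal sharing described in the data-collection item above, the sketch below shows what an age-range attribute passed from an app store to an application provider via API might look like. The Law does not define any data format or field names; everything here (the `AgeSignal` structure, the range labels, the `may_download` helper) is a hypothetical assumption, not a format prescribed by the Digital ECA.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AgeSignal:
    """Hypothetical age-range attribute an app store might expose via API.

    The Digital ECA requires only that an age or age range be shared and
    used exclusively for age verification; this structure is illustrative.
    """
    user_ref: str           # opaque reference; no behavioral profiling allowed
    age_range: str          # assumed labels: "under_13", "13_15", "16_17", "18_plus"
    guardian_consent: bool  # parental consent, required for downloads by minors

def may_download(signal: AgeSignal) -> bool:
    """Adults may download freely; minors need explicit guardian consent,
    which under the Law can never be presumed."""
    if signal.age_range == "18_plus":
        return True
    return signal.guardian_consent

print(may_download(AgeSignal("u1", "13_15", True)))   # True
print(may_download(AgeSignal("u2", "16_17", False)))  # False
```

The design point the sketch mirrors is that the receiving application sees only a coarse age range and a consent flag, never raw data that could feed a behavioral profile.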
Transparency, Accountability, and Sanctions
- Transparency Reports: The Digital ECA requires internet application providers targeting children and adolescents, with more than 1 million registered users in that age group, to prepare semi-annual reports in Portuguese, published on the Provider’s website, detailing the security measures adopted, the number of complaints received, the measures taken to identify children’s accounts on social networks, and other information provided for in the Law. The OSA provides that OFCOM will notify high-risk service providers to produce a transparency report on the service they provide. OFCOM may also request information from providers for various purposes, including verifying compliance with obligations and even obtaining information about the use of a service by a specific person. In both cases, there is a concern for transparency and accountability and a search for greater visibility into providers’ operations.
- Sanctions: The Digital ECA provides for a simple fine of up to 10% of the economic group’s revenue in Brazil in its last fiscal year or, in the absence of revenue, a fine of R$10.00 to R$1,000.00 per registered user, limited to R$50 million, per infraction. Beyond fines, other penalties are provided for, ranging from warnings to the suspension or prohibition of activities. All revenue from fines will be allocated to the National Fund for Children and Adolescents. Under the OSA, companies may be fined £18 million or 10% of their global revenue (whichever is greater) for non-compliance. OFCOM may also resort to other penalties, such as suspension of services and criminal liability of senior management for repeated or serious non-compliance. In both systems, penalties reflect the nature of the infraction and consider factors such as severity, recidivism, and social impact.
This article is for informational purposes only and does not constitute legal opinion or advice.
BRENTANI RONCOLATTO ADVOGADOS
[1] https://www.gov.br/mec/pt-br/escola-que-protege
[2] See article 11 of Ordinance No. 1,048/2025 of the Ministry of Justice and Public Security.