Published on March 24, 2026

Digital ECA: Consolidated Analysis Following the Publication of Decree No. 12,880/2026

The Digital ECA has entered into force, and with the publication of Decree No. 12,880/2026 (“Decree”), a significant portion of the applicable rules has been detailed.

This alert aims to present how this new regime is structured and what its main practical effects are.

Although there are still technical aspects that depend on supplementary regulations, particularly from the National Data Protection Authority (“ANPD”), several obligations are already in effect. This changes the starting point for companies: the question is no longer whether to wait or to act, but what needs to be done now, what requires proper documentation, and what should be monitored in upcoming regulatory developments.

Based on this, three questions guide the analysis:

  • What is the risk level of my product or service? The Decree makes it clear that the risk level is the primary factor in determining the required measures. Products or features involving gambling, casinos, loot boxes, or pornographic content fall under the highest level of requirements. In these cases, age verification with a high degree of reliability is required, in addition to the removal of existing accounts belonging to minors. There is no transition period for these measures.
  • Does my system perform age verification or just age assessment? This is the most significant distinction introduced by the Decree. For content prohibited to minors, it is not sufficient to estimate or infer age. It is necessary to confirm age with a high degree of reliability. In practice, user self-declaration no longer meets this requirement, even though it has been a widely used standard until now. The technical criteria are still to be defined by the ANPD, but the obligation to adopt a risk-based mechanism already exists.
  • Who controls the verification process and data processing? The Decree requires that data collected for the purpose of age verification or assessment be deleted immediately and irreversibly after use. If this process is carried out by third parties, this does not alter liability. The company that defines the processing remains responsible for the entire flow, including the proper deletion of the data.
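
As a purely illustrative sketch (the ANPD has not yet defined technical criteria, and all names below are hypothetical), the immediate-deletion requirement implies that verification evidence should never leave the verification step; only the outcome is retained:

```python
from dataclasses import dataclass


@dataclass
class VerificationResult:
    """Only the outcome is retained; the evidence itself is discarded."""
    is_adult: bool


def verify_age(document_birth_year: int, current_year: int = 2026) -> VerificationResult:
    # Hypothetical sketch: the birth-year evidence exists only inside this
    # function's scope and is never written to storage or logs, which is one
    # way to approximate "immediate and irreversible deletion after use".
    result = VerificationResult(is_adult=(current_year - document_birth_year >= 18))
    del document_birth_year  # evidence discarded; only the outcome survives
    return result


print(verify_age(2000).is_adult)  # True: 26 years old in 2026
```

The same principle applies when a third party performs the check: contracts should guarantee that the provider's flow retains only the outcome, since the company that defines the processing remains responsible for the entire flow.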

1. Prohibited content vs. inappropriate content

This is one of the most significant distinctions introduced by the Decree. Classifying content incorrectly can lead to two problems: the adoption of insufficient measures or restrictions that are more severe than necessary. The difference between the two regimes is as follows:

PROHIBITED CONTENT

Legal definition: Content that is prohibited by law from being made available to minors. Includes, among others: weapons, ammunition, explosives, alcoholic beverages, tobacco products, substances with addictive potential, fireworks, gambling, betting, lotteries, loot boxes, pornographic content, escort services, and dating apps for sexual purposes.

Age verification mechanism: Highly reliable age verification. Self-declaration is not sufficient.

Blocking requirement: There must be effective blocking of access, use, or consumption. For products restricted to those over 18, it is necessary to prevent minors from creating accounts and to remove existing accounts.

INAPPROPRIATE / INADEQUATE CONTENT

Legal definition: Content that is not prohibited by law but is considered inappropriate for certain age groups. The classification follows the Content Rating System. The Decree does not provide a closed list and requires a case-by-case assessment, taking risk into account.

Age verification mechanism: Age verification proportional to the risk. The technical criteria are still to be defined by the ANPD.

Blocking requirement: There is no requirement for total blocking. Access must be restricted by default, effective parental supervision must be provided, and the age rating must remain visible.

2. Age assessment vs. age verification: the key distinction

This is the most frequently misunderstood aspect of the Digital ECA. Many platforms collect the date of birth during registration and assume they are compliant, when in practice they may not be. The difference between age assessment and age verification is not merely conceptual. It determines whether the system is compliant or in violation.

AGE ASSESSMENT

Definition: A mechanism used to estimate, infer, or confirm the user’s age group. It encompasses methods of varying degrees of accuracy.
When it applies: Inappropriate or unsuitable content.
Requirement level: Defined by the provider, based on the product’s risk. The technical criteria will be further detailed by the ANPD.
Method: May be defined by the provider, provided it is proportional to the risk.
Data processing: Must adhere to the principle of data minimization.
Existing user base: There is no express obligation to reassess.

AGE VERIFICATION

Definition: A mechanism designed to confirm age with a high degree of reliability. It requires effective validation of the information.
When it applies: Content legally prohibited for minors, such as gambling, casinos, pornography, and loot boxes.
Requirement level: Higher. There is no room for solutions based solely on estimates or self-declaration.
Method: Subject to greater standardization and possible certification. The government may play a role in defining the technical parameters.
Data processing: Data used for verification must be deleted immediately and irreversibly after use, in addition to being minimized.
Existing user base: Verification must also be applied to the existing database. Accounts belonging to minors must be identified and removed immediately.
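
The distinction can be reduced to a routing rule: content prohibited for minors requires high-reliability verification, while merely inappropriate content admits risk-proportional assessment. A minimal sketch, assuming illustrative category names not taken from the Decree:

```python
# Illustrative categories only; the Decree's list of prohibited content is broader.
PROHIBITED = {"gambling", "casino", "pornography", "loot_boxes"}


def required_mechanism(content_category: str) -> str:
    """Route a content category to the minimum acceptable age mechanism."""
    if content_category in PROHIBITED:
        # High-reliability verification; self-declaration is not acceptable.
        return "age_verification"
    # Risk-proportional assessment, pending ANPD technical criteria.
    return "age_assessment"


print(required_mechanism("loot_boxes"))  # age_verification
print(required_mechanism("open_chat"))   # age_assessment
```

Misrouting in either direction is costly: treating prohibited content as merely inappropriate leaves the platform non-compliant, while the reverse imposes heavier restrictions than the Decree requires.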

3. Product risk quadrant

The risk level depends on two factors: the type of content (prohibited or inappropriate) and the degree of interaction and monetization of the product. Identify below the scenario that best describes your product. The measures indicated reflect the expected priority action.

3.1 Open chat, UGC, and direct monetization

Content prohibited for minors: CRITICAL RISK

Recommended measures:

  • Implement high-level age verification immediately
  • Remove existing accounts belonging to minors
  • Set parental consent as the default for interactions
  • Implement age verification for features involving loot boxes, unless there is a version of the product with this feature disabled by default for minors
  • Require judicial authorization for monetization of content featuring minors

Inappropriate or unsuitable content: HIGH RISK

Recommended measures:

  • Implement age verification proportional to the risk
  • Enable parental controls by default
  • Ensure active content moderation, including in chats
  • Suspend all forms of behavioral advertising targeting minors
  • Eliminate design practices that encourage compulsive use

3.2 Filtered chat and curated content

Content prohibited for minors: HIGH RISK

Recommended measures:

  • Implement age verification, even with filtered chat
  • Block prohibited content by default
  • Update consent flows to reflect monetization, interaction, and any loot boxes
  • Review terms of use, with a clear version in Portuguese

Inappropriate or unsuitable content: MODERATE RISK

Recommended measures:

  • Document mechanisms to prevent compulsive use
  • Provide accessible and functional parental controls
  • Ensure there is no advertising targeted at minors
  • Adopt more protective settings as the default

4. What has changed: Law, Decree, and market practices

To understand the impact of the Digital ECA, it is necessary to distinguish between three distinct levels:

  • what the Law establishes
  • what the Decree details or expands
  • what the market used to do that is now insufficient, prohibited, or mandatory

The comparison below follows this logic. The analysis is not exhaustive, but covers the points with the greatest practical impact.

Age Verification

  • Law (ESTABLISHES OBLIGATION): Requires the adoption of mechanisms to prevent minors from accessing prohibited or inappropriate content, without defining how this should be done.
  • Decree (EXPANDS THE DIGITAL ECA): Distinguishes between age assessment and age verification. Prohibits self-declaration for prohibited content. Requires the immediate deletion of data used in verification. Technical requirements are still pending from the ANPD.
  • Market practice (PROHIBITED): Self-declaration as the sole mechanism for age verification. This model is no longer acceptable for prohibited content.

Blocking of prohibited content

  • Law (GENERAL PRINCIPLE): Establishes the duty to protect minors, without detailing the mechanism.
  • Decree (DETAILS): Defines what constitutes prohibited content and requires blocking, hiding, or blurring by default when age verification is not available.
  • Market practice (PROHIBITED): Sexually explicit content accessible by default, with optional restriction (opt-out), rather than active protection by default (opt-in); access to restricted content without effective restrictions or based solely on self-declared age.

Loot boxes and randomization mechanisms

  • Law (ESTABLISHES REQUIREMENT): Prohibits minors from accessing loot boxes.
  • Decree (EXPANDS THE DIGITAL ECA): Requires high-level age verification for access to features with loot boxes. Allows for an alternative: making a version of the product available without loot boxes or with this feature blocked by default for minors.
  • Market practice (PROHIBITED): Loot boxes available through self-declaration or without any controls. This model is no longer permitted.

Content inappropriate for minors

  • Law (DEFINES OBLIGATION): Imposes obligations, but does not clearly define what constitutes inappropriate content.
  • Decree (EXPANDS THE DIGITAL ECA): Defines the concept and establishes conditions for its availability.
  • Market practice (PROHIBITED): Making content available without complying with the Content Rating Policy; lack of technical and organizational measures appropriate to the level of risk, starting from the product’s design; lack of effective parental supervision tools, including blocking features that can be configured by guardians.

Chat and user interaction

  • Law (DEFINES OBLIGATION): Requires technical safeguards, without specifying how.
  • Decree (DETAILS): Open chat requires active parental consent by default; moderation of interactions is mandatory; initial settings must be at the most protective level.
  • Market practice (INSUFFICIENT): Chat enabled by default for all users; parental controls available only as an option, not enabled by default; reactive moderation, performed only after a report.

Behavioral advertising

  • Law (DEFINES OBLIGATION): Prohibits advertising targeted at minors based on profiling.
  • Decree (DETAILS): Extends the prohibition to any form of behavioral advertising, including browsing history, location, and inferred interests.
  • Market practice (PROHIBITED): Ad targeting based on behavior, including for users who may be minors. This model is no longer permitted.

Dark patterns

  • Law (ESTABLISHES OBLIGATION): Prohibits manipulative practices or those that induce harmful use.
  • Decree (EXPANDS THE DIGITAL ECA): Provides examples of prohibited practices, such as creating artificial urgency (including countdowns and simulated scarcity), obstacles to cancellation or leaving the service, and rewards tied to usage time without limits. This list is illustrative.
  • Market practice (PROHIBITED): Mechanics that encourage continuous engagement, such as daily rewards, countdowns, and exit friction.

Legal representative in Brazil

  • Law (DEFINES OBLIGATION): Requires a legal representative in Brazil with the authority to receive official communications.
  • Decree (N/A): Does not add new elements.
  • Market practice (NON-COMPLIANT): Operating in Brazil without a formal representative, or with a structure lacking sufficient authority, is now considered non-compliant.

Application to existing accounts

  • Law (DEFINES OBLIGATION): Requires accounts to be brought into compliance with the new regime, with no defined deadline.
  • Decree (EXPANDS THE DIGITAL ECA): For prohibited content, age verification is also mandatory for existing accounts, and accounts identified as belonging to minors must be removed immediately. For inappropriate content, gradual compliance is permitted.
  • Market practice (NON-COMPLIANT): Unreviewed user bases, with accounts created solely through self-declaration. This scenario is no longer acceptable.

What still depends on ANPD regulations?

Some obligations set forth in the Digital ECA and the Decree are already in effect, but depend on technical definitions by the ANPD for their full implementation. This does not mean that companies can wait. The obligation already exists. What remains to be defined is how it should be operationalized.

  • Guidance on scope and application. What will be defined: who is subject to the Digital ECA, including the concept of “likely access,” as well as parameters for privacy by design and risk management. Risk of waiting: the ANPD has indicated that structured enforcement, with penalties, will begin after the guide is published; its publication serves as a trigger for enforcement, and companies without a documented compliance plan tend to be more exposed from the start.
  • Technical requirements for age verification. What will be defined: accepted methods for high-level assessment and verification; until then, each company must technically justify the solution adopted based on risk. Risk of waiting: no grace period for compliance is expected, so those without an implemented solution will need to act quickly; there is also a risk of rework if the adopted solution is not compatible with future standards.
  • Enforcement and sanctions regulations. What will be defined: adaptation of the current LGPD model to the Digital ECA, with applicable procedures, criteria, and sanctions. Risk of waiting: until these are defined, there is uncertainty regarding practical application; following publication, the trend is toward increased predictability and accelerated enforcement actions.
  • Parental supervision standards. What will be defined: minimum requirements for tools, including features, default settings, and accessibility criteria. Risk of waiting: basic or less robust solutions may require redesign within a short timeframe following regulation.
  • Accreditation of notifying entities. What will be defined: who may request content removal without a court order, under the extrajudicial notice-and-takedown model. Risk of waiting: platforms without a structured process for responding to notifications may immediately be in violation upon receiving the first requests.
  • Linking accounts of minors under 16. What will be defined: the technical model for linking accounts to guardians, including verifiable consent and supervision limits. Risk of waiting: products that do not currently account for this linkage may already require structural changes, not just policy adjustments.

What is the main risk of waiting?

The current uncertainty is technical, not legal. The obligations are already established. What remains to be defined by the ANPD are the minimum criteria, the form of proof, and the level of operational requirements. This does not suspend the need to act. In practice, platforms that reach the publication of these rules without any solution implemented—even a partial one—are unlikely to be granted an adaptation period. Compliance assessments will now consider the current situation, not just the moment of regulation.

5. The Digital ECA in the Regulatory Ecosystem

The Digital ECA does not replace existing regulations. It complements them. This means that the same conduct can be analyzed under different legal regimes at the same time. In practice, understanding this overlap is essential for accurately assessing risk.

Brazilian Civil Rights Framework for the Internet

The Supreme Federal Court has been expanding the liability regime for platforms regarding third-party content. The trend is to require action upon receipt of proper notifications, without the prior need for a court order. The Digital ECA reinforces this trend in cases involving children and adolescents. In these scenarios, faster and more structured action is expected from platforms, especially regarding the removal or restriction of content.

LGPD

The LGPD already establishes enhanced protection for children’s data. The Digital ECA expands this framework by:

  • prohibiting profiling-based advertising
  • requiring the immediate deletion of data used for age verification

In practice, the same operation may raise concerns under both the LGPD and the Digital ECA. The classification and penalty will depend on the specific case.

ECA

The Digital ECA functions as an extension of the existing framework for the protection of children and adolescents. In the absence of clarity in specific situations, interpretation tends to follow the principle of the best interests of the child. This typically leads to a more conservative application of the rules.

5.1 Cumulative Penalties

A frequently underestimated point is the possibility of multiple violations. The same conduct may give rise to liability under the Digital ECA, the LGPD, the Brazilian Civil Rights Framework for the Internet, and the ECA. This does not mean that all sanctions will be applied at the same time, but the risk of exposure increases. In practice, analyses that consider only one of these frameworks tend to underestimate the actual regulatory risk.

6. Other topics in the Decree: AI, advertising, and influencers

In addition to regulating Law No. 15,211/2025, the Decree introduces additional obligations and details rules in areas that had not yet been clearly mapped out by the market. All of these have a direct impact on product decisions.

6.1 Generative AI and conversational agents

The Decree establishes a specific regime for systems capable of generating content and interacting with users in natural language, including language models, conversational agents, and similar interfaces, when accessible to children and adolescents. These obligations are already in effect and do not depend on additional regulations from the ANPD.

Key requirements

  • Transparency: the system must clearly inform the user that they are interacting with automated content, not with a person.
  • Prohibition on behavioral manipulation: it is prohibited to exploit cognitive or age-related vulnerabilities to induce behaviors that are not in the child’s best interest.
  • Algorithmic risk assessment: the provider must assess the system’s risks to the health and safety of children and adolescents, considering the type of interaction and the potential impact.
  • Development safeguards: Active mechanisms to protect physical, mental, and psychosocial development must be implemented. The technical parameters are still to be defined by the ANPD.

6.2 Streaming and providers with editorial control

The Decree provides for a significant exemption for providers that exercise editorial control over content. This includes services where content is pre-selected and licensed, rather than user-generated. In such cases, age verification may be waived, provided certain conditions are met.

(A) Who may qualify

The exemption is likely to apply to:

  • video, music, and audiobook streaming platforms with licensed content
  • movie, series, music, and podcast services
  • game publishers
  • providers of e-books and literary content

(B) Conditions for the exemption

Exemption is not automatic. It depends on the simultaneous fulfillment of two requirements:

  • Children’s profiles or accounts: The service must offer profiles with age-appropriate content, adhering to age ratings where applicable.
  • Effective parental supervision: Control mechanisms must be in place to restrict access to inappropriate content, while respecting the child’s progressive autonomy.

(C) Cases of full exemption

Providers of journalistic and sports content, when subject to editorial control and not bound by age ratings, are exempt from age verification without the need to meet additional requirements.

(D) Point of attention

Classification as “editorial control” is not automatic and must be analyzed on a case-by-case basis. The company must:

  • assess whether the business model truly fits this definition
  • document this understanding

The ANPD may review this classification at any time. In hybrid models, with licensed content and user-generated content, the exemption does not apply to the part of the service based on UGC. In such cases, it is necessary to treat the two regimes separately.

6.3 Transparency and Accountability

The Decree introduces a new accountability mechanism, which still depends on ANPD regulations, but for which the obligation is already established.

Child Safety and Health Impact Assessment

Providers must conduct a specific impact assessment to identify and address risks associated with the use of their products by children and adolescents. This assessment must include, at a minimum:

  • identification and analysis of the risks involved
  • assessment of the likelihood of occurrence and the severity of impacts
  • definition of mitigation measures
  • continuous monitoring of the effectiveness of these measures

In addition, it will be necessary to:

  • provide a summary version of the report in clear and accessible language
  • be prepared to share the report with authorized third parties, such as academic institutions, innovation entities, or news outlets

The ANPD will still define the minimum requirements, frequency, and criteria for drafting and revising this document.

6.4 Advertising to minors: expanded prohibitions

The Digital ECA already prohibits advertising based on behavioral profiling for minors. The Decree expands this regime by including additional restrictions, focusing on technologies and practices not explicitly addressed in the law. These prohibitions apply regardless of consent or parental settings.

What is now prohibited

For underage users, the following is not permitted:

  • Behavioral profiling: Use of data such as browsing history, purchase history, location, or inferred interests to target advertising.
  • Emotional analysis: The use of inferences about emotional state to personalize or target ads.
  • Augmented reality (AR) advertising: Inserting advertising by overlaying digital elements onto the physical environment.
  • Advertising in immersive environments (XR and VR): Inserting advertising into virtual environments or immersive experiences.

6.5 Child influencers: judicial authorization and 90-day deadline

The Decree regulates a requirement that has existed in the ECA since 1990 and expressly applies it to the digital environment. The regular participation of children and adolescents in commercial activities now requires prior judicial authorization, with a direct impact on platforms that monetize or promote this type of content.

What changes

Platforms that monetize or promote content that habitually exploits the image or daily lives of children and adolescents must require prior judicial authorization, pursuant to Article 149 of the ECA. This obligation does not fall solely on parents or guardians. The platform also bears responsibility. If monetization or promotion occurs without verifying this authorization, the risk is borne by the service provider.

When it takes effect

The requirement applies to content whose monetization or promotion begins 90 days after the publication of the Decree. In practice, this means as of June 18, 2026. Content already monetized before that date is not immediately affected.

What to do now

The Ministry of Justice and Public Security (MJSP), the National Council of Justice (CNJ), and the National Council of the Public Prosecutor’s Office (CNMP) have yet to define the operational procedure for obtaining judicial authorization. In the meantime, platforms must:

  • establish an internal verification process
  • identify which creators qualify as engaging in regular activity
  • prepare mechanisms to request and validate authorization

7. Relevant insights for compliance

The points below reflect direct feedback from the public hearing held on March 2, 2026, with participation from the ANPD and the MJSP (“Hearing”). They do not alter the text of the regulation, but indicate how the regulation is likely to be applied in practice.

INSIGHT 01 — ENFORCEMENT BY THE ANPD WILL START ONLY AFTER REGULATORY CLARITY

The ANPD indicated in the Hearing that the initiation of formal proceedings, with the possibility of sanctions, should occur after the publication of the guidance document and the technical rules for age verification.

The intention is to allow companies to accurately understand what is expected in terms of compliance before enforcement actions are taken.

Practical implication: There is a window of opportunity to establish compliance, but not to delay decisions. The obligation is already in effect.

INSIGHT 02 — THE INITIAL ENFORCEMENT FOCUS WILL NOT BE ON INDIVIDUAL PLATFORMS

The ANPD indicated in the Hearing that enforcement is likely to begin with the areas of greatest impact in the digital ecosystem, such as app stores and operating systems.

The logic is to target a few entities that control the initial access layer, with a broader effect on the market. It was also noted that structured enforcement actions should occur after the publication of the guidance document and the technical rules for age verification.

Notifications issued through the end of 2025 were for mapping purposes, not for the imposition of sanctions.

This does not alter the fact that the obligations are already in force. Any prioritization in enforcement only affects the order of action, not the enforceability of the rules.

Practical implication: There is a window to establish compliance before a direct inspection, but this should not be interpreted as an opportunity to wait.

Products with high exposure to minors or without visible protective measures tend to come under regulatory scrutiny, regardless of the initial order of inspection.

INSIGHT 03 — THE AGE VERIFICATION ARCHITECTURE MAY CHANGE

The ANPD has signaled support for the use of age verification solutions based on digital identity with Zero Knowledge Proof (ZKP). In this model, the system only confirms whether the user is of legal age, without sharing the exact age or other personal data.

It was also indicated that this solution could be incorporated into Brazil’s public infrastructure, with technical feasibility and potentially reduced costs.

If this model is adopted as the standard, it is likely to replace solutions based on the collection and storage of documentary data.

Practical implication: Architectural decisions made now should take into account flexibility and compatibility with this potential standard.

Proprietary solutions that rely on data collection or rigid integrations may need to be replaced in the short term, with significant technical and contractual implications.
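
The practical difference of a ZKP-based design can be seen as an interface question: the platform receives only a yes/no assertion and never the birth date or document. The sketch below is conceptual only; it does not implement a real zero-knowledge protocol or any existing government API:

```python
from datetime import date


class IdentityProvider:
    """Stands in for a trusted issuer (e.g., a public digital-identity
    infrastructure). The platform never sees what this class holds."""

    def __init__(self, birth_date: date):
        self._birth_date = birth_date  # stays inside the issuer's boundary

    def assert_over_18(self, today: date) -> bool:
        # Only this boolean crosses to the platform, mirroring the
        # zero-knowledge property: "is of legal age", and nothing more.
        eighteenth = self._birth_date.replace(year=self._birth_date.year + 18)
        return today >= eighteenth


issuer = IdentityProvider(birth_date=date(2005, 6, 1))
print(issuer.assert_over_18(date(2026, 3, 24)))  # True: turned 18 in 2023
```

If this architecture becomes the standard, integrations should depend only on the boolean assertion, so that a documentary-verification backend can later be swapped for the public infrastructure without product changes.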

INSIGHT 04 — LOOT BOXES REQUIRE A STRUCTURED DECISION

The regulations applicable to loot boxes are stricter. Self-declaration is not sufficient to prevent access by minors.

The Decree allows for two approaches:

  • implementing age verification with a high degree of reliability
  • making a version of the product available in which the loot box feature is disabled for minors or blocked by default

Loot boxes are treated as content prohibited for minors, which raises the bar for the effectiveness of the measures adopted.

In the absence of a final technical definition by the ANPD, the choice of model requires careful evaluation.

Practical implication: The decision must be formally documented, with technical justification and evidence of effectiveness. The absence of documentation makes it difficult to demonstrate compliance should the regulatory interpretation become more restrictive.

INSIGHT 05 — HOW THE DECREE AND THE ANPD REGULATIONS WILL WORK TOGETHER

The ANPD director explained in the Hearing how the division of powers between the Executive Branch/Ministry of Justice (“MJSP”) and the ANPD will work. The scope of regulation of the Decree and future ANPD rules was also discussed.

According to him, the regulation is structured on two fronts that advance in parallel and in a coordinated manner.

In practice, there will be a separation between: (i) the establishment of general guidelines, which is the responsibility of the Executive Branch; (ii) technical and regulatory details, which are the responsibility of the ANPD.

(A) The Decree: guidelines and structural parameters

The first front is led by the MJSP and the federal Executive Branch, responsible for issuing the Decree.

The Decree establishes the guidelines of the law, that is, the central parameters that will guide its application. These definitions will serve as the basis for subsequent regulatory action.

Institutional coordination between the ANPD and the MJSP. The two bodies are in constant dialogue so that the ANPD fully understands the definitions of risk and the general rules, allowing the agency’s regulatory role in supplementing the Decree to be defined precisely.

The main concern is to ensure that the definitions of risk and general parameters established by the Executive Branch can be translated by the ANPD into clear and enforceable technical requirements.

8. Action Plan

Each item below represents a practical decision that must be made.

HORIZON 1 — IMMEDIATE

  • Content prohibited for minors: Implement high-level age verification and remove existing accounts belonging to minors.
  • Unfiltered chat or messaging between users: Set active parental consent as the default.
  • Explicit content accessible without age verification: Block, hide, or blur by default. This measure is already required.
  • Advertising based on minors’ behavior: Immediately suspend and review contracts with advertising networks.
  • Foreign provider without a representative in Brazil: Appoint a legal representative with the authority to receive summonses and subpoenas.

HORIZON 2 — FIRST 90 DAYS

  • Age verification methods: Document the adopted model, with justification based on the level of risk and an auditable record, also considering the LGPD.
  • Verification performed by third parties: Review contracts to include immediate data deletion and process audit mechanisms.
  • Relevant user base of minors: Structure a transparency report with a record of notifications and the respective measures taken.
  • Products accessible to minors: Implement parental controls at the most protective level by default and train product and marketing teams.

HORIZON 3 — CONTINUOUS MONITORING

The compliance plan must be reviewed whenever relevant events occur, in particular:

  • publication of the ANPD guidance document
  • definition of technical requirements for age verification
  • ANPD’s stance on loot boxes and access restrictions
  • accreditation of reporting entities
  • updates to the enforcement and sanctions regulations

Important

This material is for informational purposes only and does not replace specific legal analysis. Applicable obligations vary depending on the business model, target audience, and product features.