Published on April 14, 2023 - News

Ordinance enacted to prevent the dissemination of illicit, damaging, or harmful content by social media platforms

On April 12, 2023, the Ministry of Justice and Public Security published Ordinance No. 351/2023 (“Ordinance”), which sets forth administrative measures to prevent the spread of illicit, damaging, or harmful content by social media platforms.

The Ordinance is the result of recent discussions on the need to regulate digital platforms to combat fake news, misinformation and hate speech on the internet. The debate also stems from the rules on digital platforms' liability set forth in the Civil Rights Framework for the Internet (Law No. 12,965/2014 – "MCI").

What is the current legal understanding of User Generated Content (UGC)?

According to article 19 of the MCI, application providers (such as digital platforms) are only liable for UGC if they fail to comply with a court order specifically ordering its removal. The constitutionality of this article is currently being debated before the Brazilian Supreme Court ("STF"), which recently held a public hearing on the subject, with the participation of several market players.

It is noteworthy that article 19 of the MCI remains in force and that digital platforms continue to act as content intermediaries, without direct liability for third-party content.

How does the Ordinance interact with the MCI?

The Ordinance recognizes the liability exemption enshrined in the MCI, but indicates that this rule should be interpreted systematically together with the Statute of Children and Adolescents ("ECA") and the Federal Constitution, so that it would not be possible to exempt social media platforms "from liability and the obligation to prevent the spread of flagrantly illicit, damaging or harmful content, in relation to which reasonable and proportionate measures of care are expected to be adopted."

Furthermore, it is noteworthy that the Ordinance does not define what should be considered a "social media platform".

What is the content of the Ordinance?

The Ordinance considers that social media platforms are not simple content exhibitors. They should be understood as “mediators of the content displayed to each of their users, defining what will be displayed, what can be moderated, the reach of publications, the recommendation of contents and accounts”. Therefore, social media platforms should not be seen as “neutral agents in relation to the contents that move across them”.

Social media platforms are characterized as service providers, as set forth in the Consumer Protection Code ("CDC"), and must therefore guarantee the "security of services provided to the consumer". The Ordinance also considers null and void clauses "that make it impossible, exonerate or mitigate the responsibility of suppliers for vices of any nature, those that establish unfair obligations, those that place the consumer at an exaggerated disadvantage or are incompatible with good faith or equity".

In view of the above, the Ordinance assigns to the National Consumer Secretariat ("SENACON") and the National Public Security Secretariat ("SENASP") a series of duties aimed at supervising social media platforms and structuring a system for the protection of minors.

SENACON – INSPECTION AND INVESTIGATION

SENACON may initiate administrative proceedings to investigate social media platforms and hold them accountable.

Descriptive reports. SENACON may request reports from social media platforms describing the measures taken to monitor, limit and restrict potentially illegal, damaging, and harmful content that encourages attacks against the school environment or that advocates and incites crimes. It may also request information on the proactive measures taken to limit the spread of such content, the fulfillment of requests by the competent authorities, the development of protocols for crisis situations and other appropriate measures.

Assessing and Mitigating Systemic Risks. The Ordinance also provides that SENACON will be responsible for requesting that social media platforms assess and adopt measures to mitigate systemic risks arising from the functioning of their services and systems, including algorithmic systems. That is, platforms should analyze the actual or foreseeable negative effects of the spread of illicit content, such as, for instance, the risk of minors accessing age-inappropriate content or the risk of propagating content considered violent or that incites crime.

In this assessment, SENACON must request a report that considers the impact of the following factors on the systemic risk: “the design of recommendation systems and any other pertinent algorithmic system; content moderation systems; the applicable terms and policies of use and their consistent application; and the influence of malicious and intentional manipulation of the service, including inauthentic use or automated exploitation of the service, as well as potential amplification and dissemination.”

SENASP – COORDINATION AND EFFECTIVENESS

Coordination within the scope of the Safe School Operation. SENASP will be responsible for "coordinating the sharing, between social media platforms and the competent authorities, of the data that allow the identification of users or the terminal of the Internet connection of the person who made the content available". Under the Ordinance, the scope of action of SENASP appears to be restricted to the Safe School Operation, which was created by the Ministry of Justice and Public Security to monitor suspected cases of attacks on educational institutions motivated by hatred and violence.

Creation of user profiles and database. SENASP may require the adoption of measures that standardize compliance with requests from the competent authorities, as well as guide platforms to prevent the creation of new profiles from internet protocol (IP) addresses from which illegal, damaging, and harmful activities have already been identified.
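
By way of illustration only, the sketch below shows one way a platform might implement such an IP-based restriction at sign-up. The blocklist, function and variable names are hypothetical assumptions for this example and are not taken from the Ordinance.

```python
# Hypothetical sketch of an IP-based restriction at account creation.
# The flagged_ips blocklist and function names are illustrative assumptions,
# not requirements spelled out in the Ordinance.
import ipaddress

# IP addresses previously associated with illegal, damaging, or harmful activity
# (e.g., populated from notices received from the competent authorities).
flagged_ips = {"203.0.113.45", "198.51.100.7"}

def can_create_profile(client_ip: str) -> bool:
    """Return False if the sign-up request comes from a flagged IP address."""
    try:
        ip = ipaddress.ip_address(client_ip)
    except ValueError:
        return False  # reject malformed addresses outright
    return str(ip) not in flagged_ips

if __name__ == "__main__":
    print(can_create_profile("203.0.113.45"))  # False: previously flagged
    print(can_create_profile("192.0.2.10"))    # True: no record of abuse
```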

Finally, SENASP must create a database of what it deems illegal content (images, links and other illegal material, identified by a hash, among other mechanisms that assist in content identification), which will be shared with social media platforms to facilitate identification by automated systems. SENASP must also guide platforms to use, as a parameter, the removal or unavailability of content similar or identical to content whose exclusion it has already requested.
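
For illustration, the sketch below shows how hash-based matching against such a shared database could work in principle. It uses a plain SHA-256 digest of the file bytes; a real system would more likely rely on perceptual hashing and other mechanisms robust to re-encoding, and all names here are assumptions rather than anything specified in the Ordinance.

```python
# Hypothetical sketch of hash-based matching against a shared database of
# content flagged for removal. Exact SHA-256 digests are used for simplicity;
# production systems would likely use perceptual hashes as well.
import hashlib

# Hashes shared by the authority for content already ordered removed
# (illustrative value: the SHA-256 digest of the bytes b"test").
shared_hash_database = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def content_hash(data: bytes) -> str:
    """Compute the SHA-256 digest used to look content up in the shared database."""
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes) -> bool:
    """Return True if the uploaded content matches a hash in the shared database."""
    return content_hash(data) in shared_hash_database

if __name__ == "__main__":
    print(should_block(b"test"))             # True: matches the sample hash above
    print(should_block(b"harmless upload"))  # False: no match
```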

Appointment of a representative. Social media platforms must appoint a representative who will be responsible for direct communication (including by electronic means) with local federal and state government entities. This representative must be empowered to make decisions to mitigate crisis situations.

The Ordinance provides that, in the event of violation of its provisions, sanctions will be imposed within the scope of administrative or judicial proceedings, in accordance with the attributions of the competent authorities.

Our Data Protection and Technology team will monitor developments on the matter and remain at your disposal for any clarification.