Changes to Meta’s fact-checking system could have an impact on the STF

Meta, the parent company of WhatsApp, Instagram, Threads, and Facebook, has announced the discontinuation of its fact-checking program in the United States, which will be replaced by a system of “Community Notes”.

With this, in a model similar to Elon Musk’s X platform, the tech giant shifts responsibility for checking content to users, who are asked to assess posts, flag false information, and make corrections.

The change to the fact-checking protocol, which had been carried out by professional agencies since 2016, was justified by the company in an official statement on Tuesday (7).

According to Patricia Peck, CEO of Peck Advogados and a leading figure in Digital Law in Brazil for 20 years, the “return to roots” advocated by Mark Zuckerberg cannot come free of responsibility.

“In addition to demonstrating alignment with the new U.S. administration, Zuckerberg’s statement makes it clear that this same understanding is expected to resonate in other countries. Care must be taken to prevent political pressure from contradicting existing laws and compromising the sovereignty of other states,” she says.

In Brazil, for example, freedom of expression is constitutionally protected, but it must be harmonized with other principles, such as national sovereignty, privacy, and civil and criminal liability for excesses. In this regard, Peck points to the risks of increased polarization and the spread of prejudiced and criminal content.

“Furthermore, there is a risk that community notes may be artificially used to benefit or harm any political, ideological, or other stance,” she explains.

With the resumption of the trial on the Marco Civil da Internet (Brazil’s Internet Civil Framework), scheduled for the first half of 2025, the topic may be discussed by the justices of the Supreme Federal Court (STF).

“As a rule, companies must comply with current laws and Brazilian court orders, regardless of the model adopted by corporations in their home countries. If we consider the large volume of content that will no longer be proactively removed from the networks, we are likely to see an increase in judicial actions for content removal,” concludes Peck.