Meta, the parent company of WhatsApp, Instagram, Threads, and Facebook, announced the discontinuation of its fact-checking program in the United States, which will be replaced by a “Community Notes” system.
Consequently, and following the example of Elon Musk's platform X, the big tech company is shifting responsibility for content verification to users, who are expected to evaluate posts, flag false information, and make corrections.
The change to the verification protocol, which had been carried out by professional fact-checking agencies since 2016, was justified by the company in an official statement on Tuesday (January 7).
According to Patricia Peck, CEO of Peck Advogados and a reference in Digital Law in Brazil for 20 years, the “return to roots” advocated by Mark Zuckerberg cannot come stripped of all responsibility.
“In addition to demonstrating alignment with the new American administration, the statement presented by Zuckerberg makes it clear that this same understanding should resonate in other countries. Care must be taken to ensure that political pressure does not contravene existing laws and compromise the sovereignty of other States,” she states.
In Brazil, for example, there is a constitutional provision for the right to freedom of expression, but it must be balanced with other rights, such as national sovereignty, privacy, and civil and criminal liability for any excesses. In this regard, Peck points to the risks of increased polarization and the dissemination of prejudiced and criminal content.
“Furthermore, there is a risk that community notes will be used artificially to benefit or harm any political, ideological, or other positioning,” she explains.
With the Supreme Federal Court (STF) set to resume its judgment on the Brazilian Civil Rights Framework for the Internet (Marco Civil da Internet) in the first half of 2025, the topic may be taken up by the Court's justices.
“As a rule, companies must comply with existing laws and Brazilian court orders, regardless of the model adopted by corporations in their home countries. If we consider that a large volume of content that would previously have been proactively removed from the platforms will no longer be taken down, we are likely to see an increase in lawsuits seeking content removal,” concludes Peck.