Idec expresses concern about the changes announced today by Mark Zuckerberg regarding content moderation on Facebook, Instagram, and Threads. The replacement of fact-checkers with “community notes” and the reduction of moderation filters may increase the circulation of misinformation, hate speech, and harmful content on the platforms.
These changes undermine legitimate regulatory initiatives and directly affect users' daily lives, exposing them to fraud, abuse, and misleading information that can compromise everyday activities such as online shopping and searching for health information. Additionally, weakening moderation rules is particularly dangerous during election periods and reduces platform safety, especially for more vulnerable groups such as the elderly, Black people, children, and adolescents.
Idec warns that the changes announced by Meta expose the structural problem of power concentrated in the hands of corporations that act as arbiters of the digital public space and prioritize corporate interests over user safety and rights. Such abusive changes reinforce the need for more robust regulation that holds platforms accountable for the harms caused by their business models and prioritizes consumer protection in the digital environment. It is essential that users remain vigilant and that governments and organizations act to ensure a safer and more trustworthy online space.