The Meta Oversight Board, the independent group created to help Meta with content moderation decisions, has issued its response to the social media company's new hate speech policies announced in January.
The board says Meta's new policies were "hastily announced, in a departure from regular procedure," and called on the company to provide more information about its rules. The board also asked Meta to assess the impact of its new policies on vulnerable user groups, report its findings publicly, and update the board every six months.
The board says it is in discussions with Meta about its fact-checking policies in regions outside the US as well.
Just a few weeks before President Donald Trump took office, Meta CEO Mark Zuckerberg announced an overhaul of the company's content moderation policies in an effort to allow "more speech" on Facebook, Instagram, and Threads. As part of this push, Meta rolled back hate speech rules that protected immigrants and LGBTQIA+ users across its platforms.
In response to the new policies, the board says it issued 17 recommendations to Meta that, among other things, ask the company to measure the effectiveness of its new Community Notes system, clarify its revised stance on hateful ideologies, and improve how it enforces its harassment policies. The board says it also asked Meta to honor its 2021 commitment to the UN Guiding Principles on Business and Human Rights by engaging with stakeholders affected by the new policies, something the board says Meta should have done in the first place.
The Oversight Board is limited in its ability to steer Meta's broader policies. However, Meta is supposed to follow the board's decisions on individual posts, according to the company's own rules.

If Meta refers a question to the board for a policy advisory opinion, something it has done several times before, the group could gain a channel to reshape Meta's content moderation.
In decisions published on 11 cases concerning content on Meta's platforms, including anti-immigrant speech, hate speech targeting people with disabilities, and the suppression of LGBTQIA+ voices, the Oversight Board appeared to criticize some of the new content policies Zuckerberg announced earlier this year. The January policy changes did not affect the outcome of these decisions, the board said.
In two US cases involving videos of transgender women on Facebook and Instagram, the board upheld Meta's decision to leave the content up, despite user reports. However, the board recommended that Meta remove the term "transgenderism" from its Hateful Conduct policy.
The board overturned Meta's decision to leave up three Facebook posts about the anti-immigration riots in the United Kingdom during the summer of 2024. The board found that Meta acted too slowly to remove anti-Muslim and anti-immigrant content that violated the company's violence policies.