January 28, 2021 – The Facebook Oversight Board has announced its first set of decisions, covering five of the 150,000 cases appealed to it since it became operational in October 2020. The decisions concern posts that Facebook took down over the past year, and the Oversight Board prioritised cases that “have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.”
According to the blog post by the Oversight Board, “the Board overturned four of Facebook’s decisions, upheld one and issued nine policy recommendations to the company.” The cases spanned four continents: Asia, Europe, North America and South America.
The decisions addressed the following cases:
- A post made in October 2020 by a user in Myanmar, in a closed group, that criticised Muslim men’s reaction to the oppression of Uyghur Muslims in China compared to their reaction to the cartoons by the French artist. The post, written in Burmese, was taken down under Facebook’s Hate Speech Community Standard for generalising about the psychology of Muslim men.
The Oversight Board overturned Facebook’s decision. The announcement states, “The Board considered that while the first part of the post, taken on its own, might appear to make an insulting generalization about Muslims (or Muslim men), the post should be read as a whole, considering context.”
- A post made in November 2020 that targeted Azerbaijanis, describing them as nomads with no history compared to Armenians. The post was written in Russian, and the user included hashtags calling for an end to Azerbaijani aggression and vandalism. It was posted during the conflict between the two countries. Facebook took this post down under its Hate Speech Community Standard for using a slur to describe a group of people.
The Oversight Board upheld Facebook’s decision. Based on its members’ analysis and on research by independent linguists it commissioned, the Board determined that, “The context in which the term was used makes clear it was meant to dehumanize its target. As such, the Board believes that the post violated Facebook’s Community Standards.”
- In October 2020, a user in Brazil posted a photo on Instagram with a title in Portuguese about breast cancer awareness, as part of the international Pink October campaign. The post was taken down by Facebook’s automated moderation under its Community Standard on Adult Nudity and Sexual Activity. However, as soon as the Oversight Board selected this case, Facebook restored the post, determining that the removal was an error. The Oversight Board found that the case raises human rights concerns about Facebook’s automated moderation.
In this case, Facebook argued that since the post had been restored, there was no longer a disagreement between the user and Facebook, and the Oversight Board should therefore not hear the case. The Oversight Board rejected this argument, noting in its report, “The need for disagreement applies only at the moment the user exhausts Facebook’s internal appeal process. As the user and Facebook disagreed at that time, the Board can hear the case.”
It held that the incorrect removal of this post points to Facebook’s lack of proper human oversight, which raises serious human rights concerns. It further stated, “As Facebook’s rules treat male and female nipples differently, using inaccurate automation to enforce these rules disproportionately affects women’s freedom of expression. Enforcement which relies solely on automation without adequate human oversight also interferes with freedom of expression.”
- In October 2020, a user posted a quote in English saying that arguments should appeal to emotions and sentiments rather than to intellectuals. The quote was incorrectly attributed to Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany, though the post used no Nazi symbols or photos. The user, in their statement, said they were drawing a comparison between Trump’s presidency and Nazi Germany. Facebook removed the post under its Community Standard on Dangerous Individuals and Organizations.
Facebook, in its response, said that the user’s post did not make clear whether they rejected Nazi ideology, which its Community Standards require in order to keep harmful content off the platform. The Oversight Board determined that users are not informed of this requirement, as the policy does not state that a post must also specify the user’s position on the sentiment being shared. It identified this gap and recommended a policy change, while also overturning Facebook’s decision to take the post down.
- In October 2020, a user posted a video, with a caption in French, in a public group about COVID-19 information. The video said that while the French government agency that regulates health-related products had rejected two drugs for the treatment of COVID-19, it had approved and promoted another. It claimed that treatment with hydroxychloroquine, as advocated by the French professor Didier Raoult, who is currently facing a disciplinary case, is being used elsewhere in the world against COVID-19, and criticised France’s lack of an effective healthcare strategy to deal with the pandemic. Facebook took this post down under its Violence and Incitement Community Standard, which also covers misinformation.
The Oversight Board determined that the post did not encourage people to buy the drug without a prescription, which is prohibited in France, and that the user was only opposing government policy and seeking to change it. The Oversight Board overturned Facebook’s decision to take the post down, and suggested that the social media platform institute a separate policy to deal with healthcare-related misinformation.
The Oversight Board, which Mark Zuckerberg has likened to a Supreme Court, is an autonomous body that Facebook created in response to the challenges the company faces with content moderation on its platforms. Each case referred to the Board is reviewed by a five-member panel, with at least one member from the region the case pertains to, and the panel ensures a mix of gender representation. After a panel reaches its conclusion, the decision is reviewed by a majority of the Board before it is issued. The Oversight Board’s decisions are binding: the company has 7 days to restore suspended posts and 30 days to respond to the Board’s policy recommendations.
These are the first decisions the Oversight Board has issued since it became operational in October 2020. Rights activists and experts from around the world have expressed concerns about its independence from Facebook, and have questioned its silence on Facebook’s involvement in the incitement of violence in various countries, including India, Myanmar and the United States, especially during the 2020 presidential election.
However, one of the key Facebook content moderation cases, concerning the indefinite suspension of the accounts of former US President Donald Trump after he used them to incite the January 6th riots in Washington, DC, has been sent to the Oversight Board for review. In addition, a decision on another case, from India, regarding the removal of a post under Facebook’s Violence and Incitement Community Standard will be issued in the coming days.
Featured Illustration: Casey Chin; Getty Images
Hija is a Programs Manager at Media Matters for Democracy. She draws on her experience in digital rights in Pakistan to lead MMfD’s digital rights and internet governance advocacy. She tweets at @hijakamran