October 25, 2020

Facebook updates its Terms of Service to reflect plan to restrict or remove content on its platforms

On September 1, 2020, Facebook sent a pop-up notification to its users alerting them to an update to its Terms of Service, specifically section 3.2: “What you can share and do on Facebook.” Effective October 1, the change would allow Facebook to remove or restrict access to content, services, or information that it believes could expose the company to adverse legal or regulatory consequences.

The text of the latest update read, “We can also remove or restrict access to your content, services or information if we determine that doing so is reasonably necessary to avoid or mitigate adverse legal or regulatory impacts to Facebook.”

The change comes at a time when Facebook is under scrutiny over the potential role of its Public Policy Executive Ankhi Das in allowing incitement of violence against the Muslim community in India. A Facebook spokesperson told Digital Rights Monitor (DRM) that the update is a response to the Australian competition regulator’s plan to make digital platforms pay news media outlets for news content shared on those platforms. They said in an email to DRM, “We made this Terms of Service change to enable us to stop allowing publishers and people in Australia from sharing local and international news content on Facebook and Instagram.”

The proposed News Media Bargaining Code in Australia, if passed, would require Facebook and Google to pay for the news that individuals and news publishers post on their platforms. The proposal was prompted by the advertising revenue that media houses have lost to the digital platforms. Facebook has responded by threatening to restrict the sharing of Australian news content on its platform, while Google has launched a public campaign telling people that its free services are in danger in Australia.

While the change is intended to bar the sharing of news in Australia, the spokesperson said the policy applies globally and extends to Instagram as well. They further said, “This Terms of Service update is not related to content moderation,” adding, “This update doesn’t change how we moderate individual pieces of content. We remain committed to free expression and making our services available to people wherever we can. This global update provides more flexibility for us to change our offerings, including in Australia, to continue to operate and support our users in response to potential legislation or legal action, even if it’s our last choice.”

However, the update offers no clarity on how Facebook will determine which content, services, or information will be taken down or barred from its platforms. Amel Ghani, a program manager at Media Matters for Democracy, sees it as a deliberate attempt by Facebook to extend its power over the content published on its platforms. She asks, “How will Facebook decide what content is taken down? If DRM, which operates from Pakistan, posts any news specifically about Australia, would it be taken down as well? How does Facebook determine the location of the news or the publisher, considering the global operations of news outlets?”

The ambiguity of the recent change puts other content at risk of removal at the discretion of Facebook’s content moderators, if and when they decide a piece of content could expose the company to legal or regulatory action. Amel further says, “I believe this is a deliberate attempt at being vague in their terms of service. Facebook has constantly been under scrutiny for its various shortcomings, and I suspect the scope of this update will be broadened to protect Facebook from any kind of legal repercussions as and when it deems necessary.”

Facebook has repeatedly come under fire for its role in democratic disruption, as seen in the 2016 US presidential election and in the campaigns around Brexit in the UK. It has also been criticized for its role in the incitement of violence against oppressed communities around the world: the genocide of Rohingya Muslims in Myanmar, which is under investigation by a committee set up by the UN; its failure to take down calls to arms before the Kenosha shooting that left two people dead; and, as reported by the Wall Street Journal, instructions by its Public Policy Executive in India, Ankhi Das, not to take down posts by a BJP leader inciting violence against Muslims in India. Given the recent update’s ambiguity, it is fair to assume that its scope could be extended to take down content that might lead to Facebook being held accountable for its role in political instability around the world. Amel says that while the principal motive of the update may be a response to the Australian legislation, it is imperative that Facebook clearly define the scope of the policy to ensure it is not applied to take down content that falls outside its stated purpose.

Photo by Solen Feyissa on Unsplash

Written by

Hija is a Programs Manager at Media Matters for Democracy. She draws on her experience in digital rights in Pakistan to lead MMfD’s digital rights and internet governance advocacy. She tweets at @hijakamran
