Facebook’s content oversight board is an effort “too little, too late” 

ISLAMABAD: Experts have welcomed Facebook’s proposed content oversight board but remain skeptical about its efficacy.

Facebook will be setting up an independent content oversight board that will help the company “review Facebook’s most challenging content decisions – focusing on important and disputed cases.”

More details about the remit of the board have recently been shared by Facebook in a draft charter. The company intends to finalize its scope over the next six months in consultation with experts from around the world.

Digital rights and media experts have cautiously welcomed the new development, but doubt it will resolve the core issue. Issie Lapowsky wrote in her article for Wired that the sheer size of the company would render the proposed board ineffective: “no team, no matter the size or scope, could ever adequately consider every viewpoint represented on Facebook. After all, arguably Facebook’s biggest problem when it comes to content moderation decisions is not how it’s making the decisions or who’s making them, but just how many decisions there are to make on a platform of its size. In seeking to fix one unprecedented problem, Facebook has proposed an unprecedented, and perhaps impossible, solution.”

DRM exclusively talked to Richard Wingfield from Global Partners Digital, a London-based digital rights advocacy group. He welcomed the new initiative but emphasized that human rights should be at its heart. “As the first major platform to put forward a model for such oversight, there is much to be welcomed here, and many of the proposals have the potential to strengthen freedom of expression online. But it is critical that human rights are at the heart of any new Oversight Board’s processes, and that it has the necessary independence and expertise to be able to undertake its work effectively.”

Meanwhile, talking to DRM in his personal capacity, Michael J. Oghia, the Advocacy and Engagement Manager for the Global Forum for Media Development, called the initiative an effort that was too little, too late. “I believe strongly in multi-stakeholder collaboration, so taken at face value, this could be a valuable opportunity to work together on the important but highly sensitive and contentious issue of content moderation. Yet, this announcement also strikes me as too little, too late. Had Facebook established this in the late 2000s or early 2010s, I would have likely applauded it.”

He noted that the company’s track record of breaching public trust on numerous occasions made it difficult for him to applaud the initiative, adding that such “self-managed” efforts might not be effective in addressing the issue. “As a principle, one-sided, self-managed, and self-implemented actions of one company is not good enough, regardless of how powerful or dominant that company is. Who will guarantee to audiences and consumers that the board will protect their interests, and who will be an independent judge if there is a dispute between a user and Facebook? Moreover, who will hold the board accountable for their decisions given the lack of accountability that has fraught Facebook for years? Proper self-regulation is done on the level of the sector, not one company. So, while this could be a positive step moving forward, I am dubious about its overall impact – especially given Facebook’s history of prioritising profit and market power above all else. Ultimately, I will evaluate accordingly once it is established, but refuse to praise them for something they should have invested more thought into over a decade ago.”

Renowned Azeri journalist Arzu Geybulla likewise welcomed the initiative but called it an effort that came too late. She has been urging the company to set up independent country-focused teams for quite some time. “…in my personal interactions with Facebook representatives, I have encouraged and advocated for having an independent country focused teams especially when it comes to certain countries where Facebook content is strictly monitored if not controlled on regular basis (Azerbaijan is just one of those countries),” she said while talking to DRM.

She also expressed apprehension about the board’s future makeup, wondering whether it would be geographically diverse, accessible, and responsive enough, and whether it would be willing to establish relationships with major human rights organizations. “So in a nutshell, I remain skeptical. Until we see the direct outcomes of this board, I am not sure I can make any further conclusions on its independence, accuracy, and intention,” Arzu added.
