In the present report the Special Rapporteur proposes a framework for the moderation of user-generated online content that puts human rights at the very centre. He seeks to answer basic questions: What responsibilities do companies have to ensure that their platforms do not interfere with rights guaranteed under international law? What standards should they apply to content moderation? Should States regulate commercial content moderation and, if so, how? The law expects transparency and accountability from States to mitigate threats to freedom of expression. Should we expect the same of private actors? What do the processes of protection and remedy look like in the digital age?