There have been reports of a troubling pattern of uneven application of Facebook’s Terms of Service and policies, resulting in arbitrary treatment of content relating to the conflict between Israel and Palestine. In this context, the Facebook Oversight Board agreed to consider an appeal of a Facebook content moderation decision concerning an Al Jazeera news story that an ordinary user reposted without comment in May 2021; the repost was deleted while the original news article remained on the platform.

By promising to adhere to international human rights law, Facebook has raised the public’s expectation that it will align its Community Standards with human rights standards, and the Oversight Board should hold the company to that commitment. The Special Rapporteur considers that the Oversight Board should call upon Facebook to incorporate a human rights approach into its guidelines, standards, considerations, and practices through the following measures:

  1. Disclosure of designated individuals and organisations: The list of designated Dangerous Individuals and Organisations should be published, and Facebook should put in place due process standards empowering users to understand how designation assessments are made and to appeal decisions regarding the inclusion of individuals or organisations on the list.
    1. Facebook should consider adopting the model definition of incitement to terrorism advanced by the mandate of the United Nations Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism.
    2. Facebook should be guided by the six factors set out in the Rabat Plan of Action when addressing advocacy of national, racial or religious hatred that may constitute incitement to discrimination, hostility or violence.
  2. Transparency: All government requests to remove content or to suspend or delete accounts, including those made by the Israeli Ministry of Justice’s Cyber Unit, should be disclosed to the public. The disclosed data should include the basis for each request (whether an alleged violation of national law or a request for ‘voluntary’ removal), the number of requests, and the action taken in response (whether to restrict the visibility of content or, conversely, to increase the reach of pro-government messaging).
  3. Algorithmic transparency: Facebook should ensure transparency of its automation and machine learning algorithms so that users and the public can understand how the platform moderates content, in particular content relating to the conflict between Israel and Palestine. This transparency should extend to error rates as well as the classifiers used.
  4. Consultation: Facebook should undertake meaningful consultation with potentially affected groups and other stakeholders, including in Palestine, and take appropriate follow-up action to mitigate or prevent adverse human rights impacts. Facebook should also conduct ongoing review of its efforts to respect rights, including through regular human rights impact assessments, consultation with stakeholders, and frequent, accessible and effective communication with affected groups and the public, in line with UN Guiding Principles on Business and Human Rights 20–21.

Read the full public comment below.