Oversight Board Urges Meta To Address Information Gaps After Syria Case
(ANALYSIS) In October, the Oversight Board, a body making precedent-setting content moderation decisions on the social media platforms Facebook, Instagram and Threads, issued a decision calling on Meta to mitigate information asymmetries in armed conflicts.
The Oversight Board is a body that examines whether Meta’s decisions are in line with its policies, values and human rights commitments. Once users of the three platforms have exhausted Meta’s internal appeals process, they can appeal to the Oversight Board to challenge the company’s decision on a piece of content.
The October decision of the Oversight Board relates to posts concerning the situation in Syria. In late 2024, two Facebook users in Syria posted content related to Hayat Tahrir al-Sham (HTS), an organization designated as a terrorist group by the U.N. Security Council.
HTS led the offensive that overthrew the regime of Bashar al-Assad. As reported by the Oversight Board:
— In the first case, a user who, in their appeal to the Board, stated they are a journalist posted a video in Arabic to their page in November. The video showed an HTS commander’s speech encouraging rebel fighters to “attack your enemies and suffocate them.” Addressing Assad’s forces, the commander said, “You have no choice but to be killed, flee or defect.” Meta removed the content less than 15 minutes after it was posted for violating the Dangerous Organizations and Individuals policy. The post was viewed almost 5,000 times.
— In the second case, an image was posted on a public page containing a photograph of HTS leader Ahmed al-Sharaa and Arabic text of part of a speech he gave the same day. The speech encouraged HTS fighters to “not waste a single bullet except in the chests of your enemy, for Damascus awaits you.” The post was automatically removed within minutes for violating the Dangerous Organizations and Individuals policy. The day after, HTS forces took the Syrian capital, Damascus.
In both cases, beyond removing the content, Meta demoted the reach and visibility of the users’ posts. After the users appealed to Meta and the company upheld the removals, their only remaining option was to appeal to the Oversight Board.
Having considered the cases, the Oversight Board found, by majority, that removing the content was inconsistent with Meta’s human rights responsibilities.
It further found that Meta’s relevant policies must be adjusted to ensure such alignment in the future. As explained in the decision, “The public interest in receiving information that could keep people safe in a rapidly evolving conflict situation, where the regime severely limited information flows, and the low likelihood that sharing this content would lead to additional harm are of particular relevance. The Board notes that in this and any political conflict, communication is truncated, making contextual clues as to the motivations for a post less overt to outsiders. Granting a scaled newsworthiness allowance was warranted.”
You can read the rest of Ewelina Ochab’s post at Forbes.com.
Dr. Ewelina U. Ochab is a human rights advocate, author and co-founder of the Coalition for Genocide Response. She’s authored the book “Never Again: Legal Responses to a Broken Promise in the Middle East” and more than 30 UN reports. She works on the topic of genocide and persecution of ethnic and religious minorities around the world. She is on X @EwelinaUO.