Stock photo: Meta app icons, including Facebook, Messenger, Instagram and Threads, displayed on a smartphone, with the Meta Inc. logo visible in the background. (Photo: Jonathan Raa/NurPhoto via Getty Images)
In October 2025, the Oversight Board, the body that reviews content moderation decisions on Meta's social media platforms Facebook, Instagram and Threads, issued a decision calling on Meta to mitigate information asymmetries in armed conflict. The Oversight Board examines whether Meta's decisions are in line with its human rights policies, values and commitments. Users of the three platforms can appeal to the Oversight Board once they have exhausted Meta's internal appeals process for challenging a content decision.
The Oversight Board's October decision concerns posts related to the situation in Syria. In late 2024, two Facebook users in Syria posted content related to Hayat Tahrir al-Sham (HTS), an organization designated as a terrorist group by the UN Security Council. HTS led the offensive that toppled Bashar al-Assad's regime. As the Oversight Board states:
- In the first case, a user who, in his appeal to the Board, said he was a journalist posted a video in Arabic on his page in November. The video showed a speech by an HTS commander encouraging the rebels to "attack your enemies and suffocate them." Addressing Assad's forces, the commander said: "You have no choice but to kill yourself, flee or withdraw." Meta removed the content less than 15 minutes after it was posted, for violating its Dangerous Organizations and Individuals policy. The post had been viewed nearly 5,000 times.
- In the second case, an image was posted on a public page containing a photo of HTS leader Ahmed al-Sharaa and the Arabic text of part of a speech he gave on the same day. The speech encouraged HTS fighters “not to waste a single bullet except in the chests of your enemy, because Damascus awaits you.” The post was automatically removed within minutes for violating the Dangerous Organizations and Individuals policy. The following day, HTS forces captured the Syrian capital, Damascus.
Through the actions Meta took in the two cases, the social media giant reduced the posts' reach and visibility. After the users appealed to Meta, the removals were upheld, leaving them with only one remaining option: an appeal to the Oversight Board.
Having considered the cases, the Oversight Board found, by a majority, that the removal of the content was inconsistent with Meta's human rights responsibilities. It also found that Meta's relevant policies need to be adjusted to ensure this alignment in the future. As the decision explains, restoring the posts was justified by the public interest in access to information that could keep people safe in a rapidly evolving conflict, in which the regime severely restricted information flows, combined with the low probability that sharing this content would lead to additional harm.
A minority of the Board disagreed, finding that the removal of the posts was consistent with Meta's human rights responsibilities and with Board precedent. Their reasoning was that both posts amounted to orders to kill, shared without commentary and with little information relevant to the protection of civilians.
The Board further found that, by channeling official communications of a designated entity without a clear intent to engage in permitted social and political discourse, both posts violated the Dangerous Organizations and Individuals policy. It also found that both posts violated the Violence and Incitement policy, as they contained clear calls for violence.
The Board added that Meta's refusal to tell users which organizations and individuals cannot be discussed under its Dangerous Organizations and Individuals policy is particularly problematic during armed conflicts, when designated entities may act as de facto governmental authorities.
The Board noted that Meta's moderation of the Syrian conflict may have created information asymmetries that put users at risk: Meta's policies allow calls for violence against designated entities, but prohibit them against regular military personnel.
The Board overturned Meta's decisions to remove both posts, requiring that they be restored with a news reporting allowance.
The board also recommended that Meta:
- Add a lever to the Crisis Policy Protocol that allows the platform to mitigate information asymmetries its policies may create. This could include policy levers such as: suspending the ban on sharing information originating from designated entities involved in the conflict; suspending strikes or feature limits where content is removed for unclear intent; and providing guidance to users on how to share information about designated entities in permissible ways. When these policy levers are invoked, the measure should be made public.
- Consider, in consultation with affected stakeholders, how its ban on the channeling of official communications by a designated entity under the Dangerous Organizations and Individuals Policy affects access to information and the protection of civilians from violence in armed conflict.
- Report to the Board on its efforts over the past five years to assess whether and how the Community Standards on Violence and Incitement and Dangerous Organizations and Individuals should be amended to take into account International Humanitarian Law standards and set out its short-term future plans in this area.
While the Oversight Board considered only the two Syria-related cases, the recommendations it issued address information asymmetries across conflicts and are not limited to the Syrian context. It is not yet clear how these recommendations will be implemented. The recommendations to study and analyze the policy's impact on access to information and on the protection of civilians from violence in armed conflict are key to ensuring that any changes are evidence-based and responsive to the issues at stake.