Meta’s Oversight Board says removal of videos of Israel-Hamas conflict is wrong

The board highlighted that the removals were conducted solely by automated moderation tools without human review.

Meta's logo

Meta’s Oversight Board has criticised Meta for removing two videos related to the Israel-Hamas conflict from Facebook and Instagram. Following its review of the two cases, the board concluded that removing such content harms freedom of expression and access to information during the conflict.

The Oversight Board highlighted that both the removal of these videos and the rejection of the users’ appeals to restore them were carried out solely by Meta’s automated moderation tools, without human review. It added that thresholds for automated removal were lowered following the October 7th attack, increasing the likelihood that non-violating content related to the conflict would be mistakenly taken down.

The board expressed concern over the lack of human-led moderation during times of crisis, stressing that this can result in the incorrect removal of speech that may be of significant public interest. It argued that Meta should have acted promptly to allow the sharing of content aimed at condemning the violence, raising awareness, reporting news, or calling for the release of hostages while applying appropriate warning screens.

Furthermore, the Oversight Board criticised Meta for demoting the two reviewed posts once warning screens were applied, which excluded them from being recommended to other users. In the board’s view, this limited the posts’ reach and impact, even though Meta acknowledged they were intended to raise awareness.

In response to the board’s decision to overturn the removals, Meta stated that it would make no further updates to the cases, as the board did not issue any specific recommendations.

Why does it matter?

The decision sheds light on the challenges automated moderation tools face in handling content during conflict. It highlights the need for a nuanced, human-led approach to content moderation, particularly during crises when public interest and access to information are crucial. The Oversight Board’s criticism of Meta underscores the ongoing debate over the balance between preserving free expression and preventing the spread of harmful or violent content.