Meta’s tools increase the chance of removing valuable posts about Israel-Hamas war, says watchdog
DUBAI: Meta’s Oversight Board, which makes decisions about content published on the company’s platforms, published its findings on Tuesday after expedited reviews of two separate appeals from users about the removal of content relating to the war between Israel and Hamas.
The board, which completed its reviews in 12 days, expressed concerns about the removal of content that might contain evidence of human rights violations, and urged Meta to demonstrate that action was being taken to preserve such content and to respond more quickly to changing circumstances.
One appeal involved an Instagram post that showed what appeared to be the aftermath of a strike on or near Al-Shifa Hospital in Gaza City during Israel’s ground offensive. The footage showed Palestinians, including children, who had been injured or killed.
During the appeal, the creator of the post said they did not incite violence and had simply shared content showing the suffering of Palestinians, including children, and that the removal of the post displayed bias against such content.
The other case involved videos posted on Facebook of an Israeli woman begging her kidnappers not to kill her as she was taken hostage during the Oct. 7 attacks by Hamas on Israel. The creator of the post said in the appeal that the video captured real events and aimed to help “stop terror” by revealing the brutality of the incidents during which hostages were taken.
The Oversight Board overturned Meta’s decisions to remove the content in both cases.
“These decisions were very difficult to make and required long and complex discussions within the Oversight Board,” said Michael McConnell, its co-chair.
Social media platforms play a critical role during times of conflict, he added, as they are often the “only vehicles” through which to “provide information, especially when the access of journalists is limited or even banned.”
Meta told the board that during the conflict in Gaza it had temporarily lowered the thresholds used by automated tools to detect and remove content that potentially violates its rules. This reduces the risk of harmful content appearing but increases the likelihood that legitimate, valuable content will be removed from its platforms. As of Dec. 11, Meta had not restored the thresholds to pre-Oct. 7 levels, the board said.
It was also revealed that the average daily number of user appeals relating to the Middle East and North Africa region nearly tripled in the weeks following the Oct. 7 attacks.
The Oversight Board highlighted four aspects of Meta’s performance that it said affected freedom of expression.
When the company applied warning messages to posts to prevent the involuntary exposure of users to disturbing content, it also excluded those posts from being recommended to other Facebook or Instagram users, even in cases where it had determined that the intention of the posts was to raise awareness.
In the case of the post about the situation at Al-Shifa Hospital, the steps taken to remove the content and to reject an appeal from the user happened automatically, without any human intervention or review, resulting in the suppression of information about the suffering in Gaza, the board said.
In the case of the footage of the Israeli hostages, Meta said it initially removed the videos out of concern that they might be perceived as celebrating or promoting the actions of Hamas. A few days later, victims’ families started sharing the videos to condemn the attacks and raise awareness of the situation. The Israeli government and media organizations in the country similarly shared the footage.
Meta said it began to allow the sharing of content related to the taking of hostages on or around Oct. 20, but only by accounts subject to its Early Response Secondary Review or cross-check policy, which allows for additional reviews of content from specified accounts.
The relaxing of the rules on videos showing hostages was not expanded to include all users until Nov. 16, and even then only for content posted after that date.
The Oversight Board said that although Meta had explained the need to proceed with caution because of the “humanitarian risks of portrayals of the hostages,” the company’s use of this policy “highlighted concerns previously raised about unequal treatment of users.”
McConnell said: “The board focused on protecting the right to the freedom of expression of people on all sides about these horrific events, while ensuring that none of the testimonies incited violence or hatred.
“These testimonies are important not just for the speakers but for users around the world who are seeking timely and diverse information about groundbreaking events, some of which could be important evidence of potential grave violations of international human rights and humanitarian law.”
The Oversight Board also reiterated the need for Meta to “swiftly act on previously issued content-moderation guidance.”
The Oversight Board revealed on Dec. 7 that it was considering the two cases and would conduct an expedited review. Although that gave it 30 days to publish its findings, it completed its review in just 12 days.