Meta to end ban on word ‘shaheed’ after year-long review
Following a recommendation from its Oversight Board, Meta will allow the word’s use ‘unless content otherwise violates our policies’ or it is paired with ‘signals of violence’
Meta announced on Tuesday that it will end its blanket ban on the word “shaheed,” an Arabic word commonly translated into English as “martyr.” The change followed a year-long review by the social media giant’s Oversight Board, which is funded by Meta but operates independently.
The Oversight Board “recommended allowing use of the word ‘shaheed’ in all instances unless content otherwise violates our policies or is shared with one or more of three signals of violence,” according to a recommendations update posted online on Tuesday.
According to Meta, the parent company of Facebook and Instagram, the review was a result of the word accounting for more content removals on the company’s platforms than any other single word or phrase. In March, the review determined that Meta’s rules on “shaheed” failed to account for the word’s variety of meanings and resulted in the removal of content not aimed at condoning violence. The board said that it “considered concerns that the policy may be contributing to censorship of those commenting on situations like the violence seen in conflict, including in Gaza and Sudan.”
“The board welcomes Meta’s commitment to end what has effectively been a blanket ban on use of the term ‘shaheed’ when referring to designated dangerous organizations and individuals,” Paolo Carozza, a member of the board, said in a statement.
Meta and other social media companies have struggled to strike a balance between curbing hate speech and protecting free expression on their platforms, particularly in the aftermath of the Oct. 7 Hamas terrorist attacks in Israel and the resulting rise in antisemitism, including online rhetoric.
In May, Meta’s Oversight Board announced that it would review three recent posts that had previously been removed, including one, tied to the war in Gaza, that called all Israelis criminals. The board also said it would consider recommending changes to Meta’s hate speech policies.
The Israel-Hamas war also came into focus for Meta last year when the social media powerhouse issued two decisions in December, each a response to users’ appeals after posts were removed for violating Meta policies limiting the sharing of videos depicting terrorism or violent content. The decisions revealed that after Oct. 7, Meta, in response to what it described as “an exceptional surge in violent and graphic content,” lowered the bar for when it would automatically remove content that might violate the platforms’ standards on hate speech, violence, incitement and harassment. But the Oversight Board determined that the change resulted in the preemptive removal of content that users should have been allowed to post.