Facebook needs the IHRA definition

'What Facebook needs is a specific policy on antisemitism that captures all the ways it manifests. Ideally one supported by international experts, governments, and civil society'.


ONLINE hate, including antisemitism, has been rising during the pandemic, and Facebook is under increasing pressure to improve its response. This week, 125 organisations signed an open letter calling on Facebook to adopt the IHRA Working Definition of Antisemitism. The definition, adopted unanimously by the International Holocaust Remembrance Alliance (IHRA) in 2016, is increasingly being turned to by both nation states and organisations as a tool in the fight against antisemitism. Last month Spain became the thirtieth country to adopt the IHRA definition.

UN Secretary-General António Guterres highlighted the utility of the definition, explaining how IHRA’s common definition “can serve as a basis for law enforcement, as well as preventive policies” as antisemitism is addressed internationally. That is the kind of solution a global platform like Facebook needs.

Facebook has policies on hate speech, but no specific policy on antisemitism. Some forms of antisemitism are “generic hate speech” targeted at Jews. Others simply have no generic equivalent. Without specific tools to tackle it, some forms of antisemitism will always fall through the cracks.

To take one example, many antisemitic tropes claim that Jews control the banks, governments or the media and are a threat to society. These tropes can usually be traced back to the Protocols of the Elders of Zion. First produced as a newspaper serial in Russia in 1903, this book falsely presents itself as the record of a secret meeting of a Jewish cabal. Each year new editions appear with updated forewords that explain the tragedies of the past year as the result of “the Jews” and their plans. This material spreads antisemitic hate and inspires antisemitic violent extremism. Despite successful efforts by the Online Hate Prevention Institute (OHPI) and the Executive Council of Australian Jewry (ECAJ) to get such pages removed in the past, as far back as 2013, new versions have appeared and remained on the platform for years.

To report antisemitism on Facebook, the user selects “hate speech”, then “race or ethnicity”, then must choose to proceed with a formal report. This brings up a warning which says, “Before you report, does the post go against our Community Standards on hate speech?” It warns that “We only remove content that directly attacks people based on certain protected characteristics. Direct attacks include things like: Violent or dehumanising speech – For example, comparing all people of a certain race to insects or animals; Statements of inferiority, disgust or contempt – For example, suggesting that all people of a certain gender are disgusting; Calls for exclusion or segregation – For example, saying that people of a certain religion shouldn’t be allowed to vote”.

Content on Facebook which promotes the Protocols, or which uses Facebook to republish them, is a direct attack on Jewish people, yet claiming Jews are powerful and control society does not fit within any of the examples given. The IHRA working definition, by contrast, clearly says that “Making mendacious, dehumanising, demonising, or stereotypical allegations about Jews as such or the power of Jews as collective — such as, especially but not exclusively, the myth about a world Jewish conspiracy or of Jews controlling the media, economy, government or other societal institutions” is antisemitic. The easiest way for Facebook to close this and other gaps is to adopt and apply the IHRA definition. That would also ensure consistency with the expectations of a growing number of governments and organisations.

Facebook has made great strides in tackling online hate speech. According to a recently released Civil Rights Audit, in March this year 89% of hate speech was removed through the use of artificial intelligence, before anyone could see and report it. This is up from 65% a year earlier. Other platforms simply don’t have this level of automated hate removal. Data from the European Union shows that Facebook responded to reports of online hate within 24 hours 96% of the time, better than any other platform. Twitter, by comparison, responded within 24 hours only 76.6% of the time. Facebook removed 87.6% of the material reported, compared to Twitter’s 35.9%.

The problem is not that Facebook is bad at removing hate speech, or indeed particularly bad at removing antisemitism. The problem is that Facebook has some rather large blind spots. There are many forms of antisemitism that just don’t fit the policy on hate speech, creating confusion for both users and staff. Some antisemitism is rapidly removed, while other types fester for years. To improve further, what Facebook needs is a specific policy on antisemitism. A policy that captures all the ways antisemitism manifests. Ideally one supported by international experts, governments, and civil society. The IHRA Working Definition of Antisemitism is ready and waiting for adoption.

Dr Andre Oboler is CEO of the Online Hate Prevention Institute and a lecturer in Cyber Security at La Trobe University. He serves as an Expert Member of the Australian Government’s Delegation to IHRA.
