New report: Antisemitic videos spreading undetected on TikTok
The Institute for Strategic Dialogue report found that extremists use creative ways to promote hate while evading platform moderators.
Photo Illustration by Jakub Porzycki/NurPhoto via Getty Images
The celebratory, familiar sound of “Hava Nagila.” A video clip of John Travolta in “Pulp Fiction.” An upbeat indie pop song by the band Fitz and the Tantrums.
Three unrelated cultural references. Yet all have been used to propagate antisemitism on the social media platform TikTok, according to a new report from the Institute for Strategic Dialogue, a London-based think tank that studies extremism and misinformation.
The report, a copy of which was obtained by Jewish Insider, highlights the surprising ways that TikTok users advance antisemitic narratives on the video-sharing application, which was the most downloaded app in the world in 2020. Content creators use code words and numbers, duplicate accounts, misspelled hashtags and more to evade detection on the platform.
One video with nearly 25,000 views uses the so-called “Confused Travolta” meme — which shows John Travolta’s character in “Pulp Fiction” walking around and looking confused — with the caption “me in heaven looking for the 6 million,” an oblique reference to Holocaust denial. A video from one account shows a baby crying, until he is given a paper described as “heresy,” as “Hava Nagila” plays in the background. Another video, set to the lyrics of Fitz and the Tantrums’ “Out of My League,” has the words “America First” above a slideshow of fascist flags and images of Father Charles Coughlin, the isolationist, antisemitic priest who was popular in the 1930s.
“Extremists are going to find a way to share their message,” said Carla Hill, associate director of the Anti-Defamation League’s Center on Extremism. “We’ll see a video of something entirely different and during that video, they’ll hold up a sign that says something extreme” — for instance, holding up a book promoting violence or espousing extremism — “or send the viewer to an extreme website. This type of content is so difficult to moderate.”
ISD’s report relied on a sample of 1,030 videos, comprising more than eight hours of content that promotes hatred and glorifies extremism and terrorism. The sample is not meant to be representative, but rather a snapshot of how hate manifests on the platform. With no official tools available to researchers, any research into extremist content on TikTok must be done manually, with videos hand-picked and viewed by real people.
The goal of the report, said author Ciarán O’Connor, is “to really just try to answer the question of, What does hate look like on TikTok, and how is the platform used? How are the features used, the hashtags, the music, the profile, the parts of the profile themselves, as opposed to [just the] videos?”
One area of focus is the video comments section, which may not be monitored as frequently or as rigorously as the clips themselves. ISD found one video that showed a clip of an ultra-Orthodox Jewish man dancing the hora, with the caption “These people sure make great dancers! For more make sure to look up ‘dancing Israelis,’” a reference to a widely debunked conspiracy theory about the September 11 terrorist attacks.
“They’re going to need a shower after all that dancing, I hear 1930’s Germany has some showers that are out of this world,” reads one comment on the video, referring to Nazi gas chambers. In response, the video’s creator wrote, “I’ve heard the showers are to die for.”
What makes TikTok so successful is its algorithm, which quickly learns the topics and types of videos users enjoy. But that also makes it hard for someone who is fed more mainstream videos to see the dangerous content on TikTok. “TikTok is a bit like a walled garden. You can go down different sides of TikTok without ever seeing other segments of the site,” O’Connor explained.
TikTok’s audience skews younger than most social media platforms. More than half of users in the U.S. are under 30. “Young minds are easier to influence,” said Hill. “Adults have their own life lessons that they can apply when they see things and interpret whether or not it is believable. It’s still a problem, but at least an adult has the ability to understand what’s being said could or could not be true.”
This demographic presents a particular challenge in the realm of antisemitism. Recent studies show that American young adults know very little about the Holocaust. A September 2020 survey commissioned by the Conference on Jewish Material Claims Against Germany found that 63% of millennials and Generation Z did not know that six million Jews were killed in the Holocaust, and 10% did not recall ever hearing the word “Holocaust” before. ISD found that antisemitism on TikTok manifested most frequently as Holocaust denial.
In recent years, major social media platforms including Facebook, Twitter and YouTube have ramped up efforts to fight misinformation and extremism on their platforms. But they have all faced challenges in scaling their solutions to the problem.
“Every time a platform finds a way to stop something, [extremists] invent a new way to get around it,” Hill told Jewish Insider. “This is going to be a never-ending project for all platforms to try to minimize the amount of content on their platform.”
Some advocates argue that, despite these efforts, platforms are still not doing enough to remove violent and hateful content.
Since TikTok, which was developed in China, became available internationally in 2018, it has been downloaded 130 million times in America. While the app has put out detailed guidance on policies against hate speech, extremism and misinformation, enforcement remains uneven.
“It still is so young,” said O’Connor. “It has shown signs that it’s grown up, and they’ve learned the lessons that other platforms faced in tackling these problems. Their policies are quite robust. But this research shows, I think, that they still have a way to go.”