TikTok might be the latest social media platform to capture the attention of young people everywhere, but reports of anti-Semitic and racist content have raised questions about its safety.
The app, which is owned by China-based ByteDance Ltd., is a video-sharing and creative platform that allows users to upload 60-second videos to a network of followers. Today, it is the 7th most downloaded app of the decade and boasts more than 1.7 billion users - 800 million of whom log in at least once a month.
Most of the content on the platform is innocent, showing people lip-syncing and dancing to popular songs or young people simply telling jokes to the camera. However, there have been rising concerns about accounts that promote violence and anti-Semitism on the platform and about the company’s reluctance to enforce its own terms of service by monitoring the content online.
Professor Gabriel Weimann, author of ‘Spreading Hate on TikTok’, outlines some of the ways that the company fails to address the problems. He writes that the platform has attracted the attention of far-right extremists, who often post content that is anti-Semitic. These include, but are not limited to, theories that Jews caused 9/11 or run some ‘global conspiracy’ to control the world’s governments and media institutions.
The study, which was conducted between February and May of 2020, highlighted hundreds of examples of far-right content: anti-Semitic posts, videos of speeches by Hitler, and promotion of the KKK. These included 43 anti-Semitic messages and 14 instances of the Nazi salute.
Most of the time, children are unaware of the nature of the content they are exposed to. With 41% of TikTok’s user base under the age of 24 - and most of that group under the age of 16 - many users lack the social or historical context to recognize the material they see.
“You don’t expect a platform like that to be invaded by evil forces,” Professor Weimann said in an exclusive interview with CTech. “When I say evil forces, we focus on anti-Semitism, but there are instances of pedophiles, sexual material, or pornographic material.”
According to TikTok’s own policy, users may not post “any material that is racist or discriminatory, including discrimination on the basis of someone’s race, religion, age, gender, disability, or sexuality.” However, this policy has been largely ignored, and it appears that the company’s terms of service were rushed out to adapt quickly to its sudden rise in users.
Rick Eaton, Senior Researcher at the Simon Wiesenthal Center, discussed how TikTok has had to play catch up with ‘legacy’ platforms like Facebook and Twitter: “I think that the biggest thing is that the tech companies, or the more traditional companies like Facebook, Google, and Twitter, have had a lot more time to deal with the situation and to go through these various stages of being used by anti-Semites and terrorists. They’ve had a little bit more time to go assess their policies and put into place various measures to deal with them… if all of a sudden you go from 100,000 users to 800 million, there’s going to be some growth issues there.”
As far-right violence appears across social media platforms, questions often arise about the role of government and what part should be played by policymakers to curb some of the behavior seen online. In the U.S., it is made more complicated by First Amendment protections for free speech partnered with a president who is liberal with his own facts on Twitter.
“Many people, unfortunately, don’t understand the definition of censorship,” Eaton explains. “The only definition of censorship is when the government tells you you can’t do something.”
It seems that tech companies are often unlikely to assume responsibility for the content that appears on their platforms. Under Section 230 of the Communications Decency Act, social media companies can technically identify as ‘platforms’ rather than publishers, meaning they bear no legal responsibility for the material found on their sites.
So where does this leave us? Since TikTok is Chinese-based and not American or European, it doesn’t answer to the same rules as other companies in Western democracies. For now, third-party institutions need to fill in those gaps and make sure companies are acting responsibly by moderating the content posted by their users.
“Facebook has presented the best model because from the very beginning of that company, they understood the potential for pedophiles and others,” said Rabbi Cooper from the Simon Wiesenthal Center. “Insofar as TikTok, in some ways it is reminiscent of new companies coming in and making an impact.”
Whatever happens, TikTok is an appealing platform to billions of young and impressionable minds across the world. Professor Weimann urges TikTok to ‘play by its own rules’ by implementing the policies it already has. If not, the consequences could be deadly. A study by Tel Aviv University’s Kantor Center
published in April found that 2019 witnessed a rise of 18% in major violent cases compared to 2018 (456 cases in 2019 compared to 387 in 2018). Those incidents resulted in the killing of seven Jews. At least 53 synagogues and 28 community centers and schools were attacked in 2019. Online manifestations of hate have seen a major surge since the outbreak of the coronavirus pandemic, with Jews being accused of creating or spreading the virus or benefiting from it.
“These are the people who will be molesting women, or the people who, in the future, will use aggression as a way to solve their conflicts,” Weimann concludes.
TikTok did not respond to our request for comment on this story.