TikTok, known for short videos of people lip-syncing and dancing to their favorite songs, is grappling with a problem familiar to other social networks: hate speech.
On Thursday, TikTok said it removed more than 380,000 videos in the US this year for violating its rules against hate speech. The company also banned more than 1,300 accounts for hateful content or behavior, and took down more than 64,000 hateful comments.
TikTok defines hate speech as “content that intends to or does attack, threaten, incite violence against, or dehumanize an individual or group of individuals on the basis of protected attributes like race, religion, gender, gender identity, national origin, and more.”
The short-form video app, which is used by 100 million Americans, is filled with content that promotes white supremacist groups and antisemitism, according to a report released in August by the Anti-Defamation League. Civil rights activists have urged social networks, including Facebook, to do more to pull down hateful content on their sites.
TikTok has other problems to worry about outside of hate speech. It’s been targeted by the Trump administration because the video app is owned by Chinese tech company ByteDance. The president has signed two executive orders that impact TikTok’s future. One order would bar any US transactions with ByteDance and its subsidiaries, which means that TikTok would effectively be banned in this country if it’s not sold to another company by Sept. 20. Another executive order, signed last week, ordered ByteDance to sell TikTok’s US operations within 90 days. Both orders cite national security concerns about TikTok. The company has said that it wouldn’t turn over US user data to the Chinese government even if it were asked to do so.
Microsoft, Oracle and Twitter have been in talks with ByteDance about purchasing TikTok's US operations. But acquiring a social media company also means the buyer would have to deal with content moderation issues. In an interview with Wired, Microsoft co-founder Bill Gates described a potential deal with TikTok as a "poisoned chalice," adding that "being big in the social media business is no simple game."
Eric Han, who oversees safety at TikTok in the US, said in a blog post that the company wants to improve its hate speech policies and how it tackles this problem. He outlined five ways TikTok is trying to address hate speech. The company consults with experts before making changes to its hate speech policies, and takes action against this type of content by banning accounts or excluding offensive content from search results. When users search for "heil Hitler" on the app, for example, the company doesn't show results and instead redirects users to its rules against hate speech. The company also trains content moderators to understand different cultures; some groups, for instance, have reclaimed ethnic slurs to fight back against people who try to oppress them with derogatory terms. Finally, TikTok said it's trying to increase transparency about content moderation and invest further in tackling hate speech.
“We recognize the perhaps insurmountable challenge to completely eliminate hate on TikTok — but that won’t stop us from trying,” Han said.