TikTok Changes Content Rules
TikTok on Wednesday released detailed rules about the videos it permits and prohibits, seeking to respond to concerns about whether its policies adequately protect users.
The updated guidelines outline a total of 10 categories of videos that aren’t allowed on the Chinese-owned social media app, including those that glorify terrorism, show illegal drug use, feature violent, graphic or dangerous content or seek to peddle misinformation that’s designed to deceive the public in an election, the company said.
TikTok previously banned a wide range of short-form videos that could cause users harm. However, the huge popularity of the app has raised questions about how exactly the company enforced those rules. TikTok faces unique challenges because of its Chinese ties, which have fueled suspicions that it manages content in a way that satisfies Beijing’s censors. ByteDance, the company behind TikTok, has been accused of removing videos for political reasons; the company has denied those allegations.
“We’re committed to being transparent about the ways we maintain the app experience users expect while providing the protections they deserve,” Lavanya Mahendran and Nasser Alsherif, who lead global trust and safety at TikTok, wrote in a blog post.
Company executives said that the new rules would be applied globally but that they would also “localize and implement” them “in accordance with local laws and norms,” raising questions about how they ultimately might be enforced around the world.
TikTok has been downloaded more than 1.5 billion times. It has proved popular among young people, who post short videos of themselves and their friends.
The U.S. Army and Navy recently banned TikTok from government-issued devices, and the Pentagon has directed its approximately 23,000 employees to uninstall it from their phones.
In October, TikTok retained two former members of Congress to help create a special committee of “outside experts” to “advise on and review content moderation policies covering a wide range of topics, including child safety, hate speech, misinformation, bullying, and other potential issues,” it said in a blog post.
TikTok’s new rules bar support for a wide array of terrorism and hate groups, including their logos and portraits, but exempt educational and satirical depictions, the company said.
The app also bans content depicting “drinking liquids or eating substances not meant for consumption” or that could mislead “community members about elections or other civic processes.” Content that depicts sexual arousal or nudity is also banned, but the company makes exceptions for educational or artistic purposes, such as videos “discussing or showing mastectomy scars.”
Slurs are prohibited, though TikTok said it “may give exceptions” when they’re used self-referentially or in lyrics. But music that promotes “hateful ideologies” is banned, according to the new rules.