TikTok has recently come under scrutiny over child safety in the US and other countries, largely because of its large user base of young people and the presence of inappropriate content on the platform. In response, TikTok, owned by China’s ByteDance, has announced new measures to give parents more control over what their teenagers can see on the app. These measures take the form of content filtering controls added to the “Family Pairing” feature, allowing parents to filter out videos that contain specific words or hashtags while still letting their children use the app.
Family Pairing, introduced by TikTok in 2020, lets parents link their own accounts to their children’s accounts and remotely manage certain features, such as disabling direct messages, setting screen time limits, and enabling a “restricted content” mode. Over the past year, TikTok has also added a tool that automatically filters out videos containing certain words or hashtags from users’ For You or Following feeds.
The new content filtering controls essentially combine these two features, giving parents the ability to remotely filter out videos containing specific words or hashtags from their children’s accounts. TikTok says the tool is being added to Family Pairing to help caregivers reduce the likelihood that their teenagers encounter content they may find jarring or inappropriate.
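To make the mechanism concrete, the sketch below shows one simple way keyword- and hashtag-based feed filtering can work in principle. It is a hypothetical illustration only: the Video class, the function names, and the matching rules are assumptions made for this example, not TikTok’s actual implementation or API.

```python
# Hypothetical sketch of keyword/hashtag-based feed filtering (not TikTok's API).
from dataclasses import dataclass, field


@dataclass
class Video:
    caption: str
    hashtags: set[str] = field(default_factory=set)


def build_blocklist(terms: list[str]) -> set[str]:
    """Normalize caregiver-supplied filter terms: lowercase, strip leading '#'."""
    return {t.lower().lstrip("#") for t in terms}


def is_filtered(video: Video, blocklist: set[str]) -> bool:
    """Return True if the caption or any hashtag matches a blocked term."""
    caption = video.caption.lower()
    tags = {tag.lower().lstrip("#") for tag in video.hashtags}
    return any(term in caption for term in blocklist) or bool(tags & blocklist)


def filter_feed(feed: list[Video], blocklist: set[str]) -> list[Video]:
    """Keep only videos that do not match the blocklist."""
    return [v for v in feed if not is_filtered(v, blocklist)]


# Example usage with made-up data
feed = [
    Video("New recipe tonight!", {"#cooking"}),
    Video("Scary movie reactions", {"#horror"}),
]
print(filter_feed(feed, build_blocklist(["#horror", "violence"])))
```

In this sketch, terms are normalized to lowercase and matched against both captions and hashtags, one plausible design choice; a production system would likely handle misspellings and deliberately altered hashtags as well.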
It’s important to note that children will be informed when their parents apply these filters, and they will have the option to opt out. TikTok believes this transparency can also open up conversations about online boundaries and safety between parents and their children. The company emphasizes that it wants to respect young people’s right to participate on the platform while also addressing concerns about their safety.
In addition to these new controls, TikTok has announced its plans to establish a global Youth Council later this year. The purpose of this council is to listen to the experiences of young people who directly use the platform, allowing TikTok to better understand their needs and make appropriate changes to ensure the safest possible experience for its community.
TikTok has faced criticism in the past for exposing children to content that promotes self-harm, eating disorders, and other inappropriate material, some of it posted under altered hashtags that evade moderation. The company faces new content regulations in the UK under the Online Safety Bill, while US lawmakers are working on the Kids Online Safety Act, which would require social media platforms like TikTok to implement online safeguards for children. TikTok was also recently banned in Montana, though the company is challenging the ban, arguing that it violates the First Amendment and other laws.
These efforts by TikTok to enhance child safety measures and provide parents with greater control over their children’s online experience are seen as positive steps in addressing the concerns surrounding the platform. By implementing content filtering tools and actively involving young people in decision-making through the Youth Council, TikTok aims to create a safer and more responsible platform for its users.
It’s worth mentioning that content on TikTok is constantly evolving, and the company says it is committed to adopting new safety measures and complying with new regulations to protect its users. As the platform continues to grow in popularity, it becomes increasingly important to mitigate potential risks, especially for young users who may be more vulnerable. By working toward a safer and more controlled environment, TikTok aims to provide a positive and enjoyable experience for all of its users while maintaining the safeguards needed to protect children.