To further protect minors on its platform, TikTok will allow creators to restrict their suggestive content to viewers over 18. It is a measure that starts from a good intention, but one that could open the door to abuse…
Like any social network, TikTok has its share of excesses, particularly with regard to protecting its young audience, among whom it is extremely popular. The platform regularly finds itself at the heart of scandals, notably over the exposure of minors to dangerous, violent, and sexual content, exposure that has even led to the death of some users. Children and adolescents are thus massively exposed to age-inappropriate content, because they are neither aware of the risks they run nor emotionally mature enough to handle them. Fortunately, there are tools that allow parents to protect their children when they browse the Chinese social network (see our practical guide).
TikTok has implemented, and continues to improve, algorithms that identify sexually oriented videos and hate speech. It has just restricted livestreams to viewers over 18 and raised the minimum age for hosting them to the age of majority, following many incidents in which adult viewers paid teenagers to strip naked online. The platform announced in a press release that it is extending this audience control feature by allowing creators to restrict their videos to viewers over 18. The restriction may not be effective, given that it relies on the age declared on user profiles, and it could well open up new perspectives, particularly financial ones, for the Chinese social network.
TikTok: an ambivalent audience control function
The audience control function is apparently part of TikTok’s ongoing effort to moderate the platform’s content and thus better protect minors. As a reminder, TikTok’s community guidelines prohibit nudity, sexual activity, and sexually explicit content, and sexually suggestive content cannot appear in the recommendations of the “For You” feed. Now, creators will be able to restrict teens’ access to “borderline” suggestive content that doesn’t directly violate community guidelines and therefore escapes the algorithms. “By identifying sexually suggestive or explicit content, we’ve stopped teen accounts from viewing over 1 million overtly suggestive videos in the last 30 days alone,” explains the Chinese giant, while admitting that it has difficulty detecting “borderline” content (references to and glorification of suicide and eating disorders, promotion of incest, self-harm, etc.).
The feature was tested in early 2022 and should be available worldwide in the coming months. Even if it certainly starts from a good intention, this tool could nevertheless open the door to other problems and lead to abuse. In particular, it could make TikTok a little more permissive toward restricted content and attract a new audience and a new type of content, much like OnlyFans and Mym (Me Your More), which allow users to subscribe to creators and influencers in exchange for exclusive content. On those platforms, sexual content ended up dominating, because that is what pays the most. The door is now open to a very lucrative business built on the monetization of suggestive content that skirts the limits.
“Our goal has always been to make sure our community, especially the teens on our platform, have a safe, positive, and joyful experience when they come to TikTok,” writes the company. “We’ve already taken significant steps to help ensure their feeds are filled with appropriate content for them, and these enhancements mark an important next step in achieving that goal.” Still, as long as the social network does not effectively verify users’ ages at registration, it will be very difficult to truly protect minors: even though you must be 13 years old to sign up, many users lie about their true age.