Is Instagram deliberately feeding child pornography? The social network’s algorithm recommends videos of sex and sexualized children to users following only children, all interspersed with advertising.


The troubles continue for Meta. While the paid subscription for Facebook and Instagram is being challenged in court by consumer associations, and the social network is accused by several American states of seeking to "covet and attract" under-13s onto its platform, even though the platform is supposed to be off-limits to users of that age, Mark Zuckerberg's firm is once again in a bad position. This time, it is the Reels recommendation algorithm that poses a problem. Intended to stimulate user engagement, the system is trained to serve people more of the content that interests them. However, according to an investigation by the Wall Street Journal, Instagram reportedly recommends "overtly sexual" adult content and videos sexualizing children to users who follow only children or adolescents.

Instagram: sexual content alternating with advertisements

The American daily first noticed that many of the followers of such accounts were middle-aged men who had "demonstrated an interest in sexualized content, both around children and adults". It then ran its own tests, creating accounts that followed only young gymnasts, cheerleaders, and other tween and teen influencers. The Reels algorithm was quick to recommend a mix of adult pornography and content sexualizing children, all interspersed with advertisements for major American brands such as Walmart, Disney, and Pizza Hut, in order to generate maximum revenue.

In a feed of Instagram-recommended Reels, an ad for the dating app Bumble appeared between a video of someone caressing the face of a life-size latex doll and another of a young girl, her face digitally masked, lifting her shirt to expose her stomach. In another case, an ad for Pizza Hut followed a video of a man lying on a bed with his arm around a 10-year-old girl. Journalists also observed content from one adult creator exposing her crotch and another making back-and-forth movements. And these were not isolated tests: the Canadian Centre for Child Protection, a child-protection organization, ran similar tests in mid-November and likewise concluded that the social network recommended content showing "adults and children taking sexual poses".

Instagram and child pornography: a story that is not new

The consequences of these revelations were not long in coming: several advertisers decided to pull their ads from the platform, including the dating app Bumble, Match Group (Tinder's parent company), and Disney. For its part, Meta maintains that the newspaper's tests do not reflect reality. "Our systems are effective at reducing harmful content, and we have invested billions in safety, security and brand suitability solutions," said Samantha Stetson, Meta's vice president of advertising industry relations. The company said it was investigating the issue and was willing to pay for security audits by an external firm to check how often ads appear alongside content deemed objectionable.

However, the problem has been known for several years now. In 2018, The Rat King, a YouTuber specializing in documenting Internet abuses, had already pointed out that Instagram was a hunting ground for pedophiles and a hub for the circulation of child pornography. Likewise, the Wall Street Journal revealed last June that the social network was the main platform used by pedocriminal networks to promote and sell child pornography, with some users relying on explicit hashtags such as #pedowhore or #preteensex to carry out their searches.

The worst part is that Meta is reportedly unwilling to make the necessary changes, because this type of content generates user engagement, and therefore money. According to company documents reviewed by the American daily, the firm's safety managers are generally not allowed to make changes to the algorithm that could measurably reduce the number of daily active users on the platform. There are priorities!
