Meta announced new measures for Instagram on Thursday to protect young people from blackmail involving nude photos, as its platforms face growing scrutiny in Europe and the United States over the protection of minors online.
Meta will introduce a “nudity checker” on Instagram, enabled by default on minors’ accounts, which will automatically detect images containing nudity received through the app’s messaging system and blur them.
Avoiding unwanted exposure to nude photos
“This way, the recipient is not exposed to intimate content against their will and can choose whether or not to view the image,” Capucine Tuffier, head of child protection at Meta France, told AFP.
Awareness messages about blackmail involving sexual photos, also known as “sextortion”, will be sent at the same time to both the sender and the recipient of the images, reminding them that such sensitive content can be screenshotted and forwarded by malicious people.
“This is to reduce the creation and sharing of this type of image,” Ms. Tuffier summarizes.
Detecting potentially criminal accounts
Furthermore, when an account has been identified by Meta’s artificial intelligence tools as a potential source of this type of blackmail, its interactions with minor users will be heavily restricted.
Such a potentially “criminal” account will be unable to send private messages to a minor’s account, will not have access to a minor’s complete list of followers (minors’ accounts being hidden from it), and minors’ accounts will no longer appear for it in the search bar, Capucine Tuffier explains.
Meta will also warn a young user if they have come into contact with a potential blackmailer. The minor will then be directed to a dedicated “Stop Sextortion” site and given access to a telephone crisis line run in partnership with support organizations.
Better protecting the mental health of young people
These new measures will be tested from May in a handful of countries in Central and Latin America before a global rollout in the coming months.
Meta, accused in the United States and France of harming the mental health of adolescents, had already announced in January a first round of measures to better protect young users.
Among these, a minor user now needs explicit permission from their parents to switch their account from private to public, to access more so-called “sensitive” content, or to receive messages from people they are not already following on the platform.
The European Commission has launched separate investigations into Meta, Snap (Snapchat), TikTok and YouTube over the measures they have implemented to protect the “physical and mental health” of minors.