Apple spoke again about the “CSAM” system that it had to cancel

Apple, one of the largest technology companies in the world, has spoken again about the “CSAM” system it was forced to cancel.

Apple made a big impact last year with the features it prepared to increase child safety. To recap, three new features developed in partnership with experts were introduced. First, the built-in Messages app in iOS 15, iPadOS 15, watchOS 8 and macOS Monterey gained the ability to hide sensitive media content such as nude photos. The most important part of the new infrastructure, however, was a plan to curb the spread of content classified as Child Sexual Abuse Material (CSAM). To that end, the tech giant announced that it would begin device-based photo scanning with iOS 15 and iPadOS 15. The scan would run on an algorithm trained with a special database, so that photos containing child abuse material could be detected directly on iPhones and iPads or in iCloud backups. The company later canceled the system, which had drawn broad backlash.

An Apple spokesperson said in a statement to WIRED: “We have decided not to move forward with the CSAM detection system we previously proposed for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates and other companies to help protect young people, preserve their right to privacy and make the internet a safer place for children and for all of us.”
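For illustration only, the Swift sketch below shows the general idea behind matching photos on the device against a database of known fingerprints. The database contents, function names and the use of SHA-256 are assumptions made for this example; Apple’s actual proposal relied on a perceptual “NeuralHash” and additional cryptographic safeguards that are not reproduced here.

```swift
import Foundation
import CryptoKit

// Hypothetical fingerprint database. In Apple's proposal these would have been
// NeuralHash values of known CSAM supplied by child-safety organizations;
// the value below is a placeholder.
let knownFingerprints: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a fingerprint for a photo's raw bytes. SHA-256 is used here only as
// a stand-in: the proposed system used a perceptual hash that tolerates
// resizing and re-encoding, which a cryptographic hash does not.
func fingerprint(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// On-device check, run before a photo would be synced to iCloud Photos.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: photoData))
}

// Usage with placeholder data.
let sample = Data("example photo bytes".utf8)
print(matchesKnownDatabase(sample)) // prints "false" for this placeholder
```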

There’s another recent statement on this subject, this time from Erik Neuenschwander, Apple’s director of user privacy and child safety: “Scanning every user’s privately stored iCloud data would create new risks for data thieves to exploit, and it would also open a slippery slope with unintended consequences. Scanning for one type of content opens the door to bulk surveillance and could create a desire to search other encrypted messaging systems. After collaborating with a number of privacy and security researchers, digital rights groups and child safety advocates, Apple therefore concluded that it would not continue developing the CSAM scanning mechanism, even one built specifically to protect privacy.”
