Apple cancels the child-focused ‘CSAM’ system that drew backlash

It had already been reported that Apple would postpone the launch of its new child-safety system; now that postponement appears to have turned into a full cancellation.

Last year, Apple made a big impact with the innovations it prepared to increase child safety. To recap, three new features, developed in partnership with experts, were introduced. First, with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey, the built-in Messages app gained the ability to hide explicit media content (nude photos, etc.) using machine learning. The most important detail of the new infrastructure, however, was a plan to reduce the spread of content classified as Child Sexual Abuse Material (CSAM).

To that end, the tech giant announced that it would begin device-based photo scanning with iOS 15 and iPadOS 15. The scan would rely on an algorithm trained with a special database, so that photos containing child abuse could be detected directly on iPhones and iPads or in iCloud backups.
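For readers curious about the mechanics, the following is a minimal, generic Swift sketch of that idea: fingerprint each local photo and check it against a database of known fingerprints. It is not Apple's actual implementation; the real proposal used a perceptual hash (NeuralHash) plus cryptographic blinding, a match threshold and human review, whereas the SHA-256 digest here is only a stand-in to keep the example self-contained.

```swift
import Foundation
import CryptoKit

// Hypothetical database of fingerprints of known illegal images.
struct HashDatabase {
    let knownDigests: Set<Data>

    func contains(_ digest: Data) -> Bool {
        knownDigests.contains(digest)
    }
}

// Compute a fingerprint for one photo. A real system would use a
// perceptual hash that tolerates resizing/recompression; SHA-256 is
// used here only so the sketch compiles with standard CryptoKit.
func digest(of photoData: Data) -> Data {
    Data(SHA256.hash(data: photoData))
}

// Return the indices of photos whose fingerprints match the database.
// In the proposed system a single match would not trigger a report;
// a threshold of matches and human review came first.
func scan(photos: [Data], against db: HashDatabase) -> [Int] {
    photos.indices.filter { db.contains(digest(of: photos[$0])) }
}
```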

Those who hosted or spread such photos would then be reported to the authorities, and the necessary steps would be taken to arrest them. Photos flagged by the system would be checked by human reviewers before anyone could be charged. Due to the backlash it provoked, the infrastructure was delayed before release. At that time Apple did not cancel the project; it announced that it would review the feedback it received and make corrections. Later, it was noticed that all phrases related to Child Sexual Abuse Material (CSAM) had been removed from the company’s child-safety page.

However, an Apple spokesperson speaking to The Verge said at the time that no step back had been taken on the system. Even so, the company never announced when the system would be fully rolled out. So, after all of this, what was the final outcome? Apple has now completely canceled the system that drew so much reaction. In a statement to WIRED, an Apple spokesperson said: “We have decided not to move forward with the CSAM detection system we previously proposed for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates and other companies to help protect young people, preserve their right to privacy and make the internet a safer place for children and for us all.”

Incidentally, Apple today also expanded its end-to-end encryption infrastructure for iCloud to 23 data categories under the name “Advanced Data Protection”. This step, which will significantly increase security, was welcomed. iCloud data in all of the categories below is now end-to-end encrypted, and no one, including Apple, can access it (a brief sketch of what that means in practice follows the list):

- Device Backups
- Message Backups
- iCloud Drive
- Notes
- Photos
- Reminders
- Safari Bookmarks
- Siri Shortcuts
- Voice Memos
- Wallet Cards
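To make the phrase concrete, here is a minimal, generic Swift sketch of what end-to-end encryption means for cloud storage: data is sealed on the device with a key only the user controls, so the server only ever stores ciphertext it cannot read. This illustrates the concept, not Apple’s actual Advanced Data Protection implementation; the key handling below is an assumption made for the example.

```swift
import Foundation
import CryptoKit

// A key that, in an end-to-end design, never leaves the user's devices.
let deviceOnlyKey = SymmetricKey(size: .bits256)

// Encrypt on the device before upload. The combined blob contains the
// nonce, ciphertext and authentication tag in one piece of data.
func encryptForUpload(_ plaintext: Data) throws -> Data {
    try AES.GCM.seal(plaintext, using: deviceOnlyKey).combined!
}

// Decrypt after download, again only on the device holding the key.
func decryptAfterDownload(_ blob: Data) throws -> Data {
    try AES.GCM.open(AES.GCM.SealedBox(combined: blob), using: deviceOnlyKey)
}

// The cloud service only ever sees the output of encryptForUpload,
// which it cannot decrypt without deviceOnlyKey.
```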
