European Union wants to scan all sent messages

The European Union wants to scan all sent messages on a “CSAM” basis. As you might guess, the plan is drawing plenty of reactions.

The European Union will soon introduce new rules that will deeply affect the world of instant messaging. Under them, bulk scanning of digital messages, including encrypted ones, will become mandatory. The EU’s stated aim is to combat Child Sexual Abuse Material (CSAM), that is, content portraying children as sexual objects. When the rules, which have been on the table since 2022, take effect, an “upload control” period will begin in the messaging world. The system will automatically scan all digital messages of citizens in Europe, including shared images, videos and links.

Services that deploy this tracking technology will ask users for permission before scanning their messages; those who refuse will no longer be able to share images or URLs in messages. This infrastructure fundamentally breaks end-to-end encryption, which poses serious risks. Commenting on exactly this issue, Signal president Meredith Whittaker said: “The EU’s new message screening plan is simply a rebranding of old-style surveillance. Whether you call it a backdoor, a front door, or ‘upload control,’ these steps will weaken encryption and create significant security vulnerabilities.” WhatsApp head Will Cathcart is also among those reacting most strongly, but the EU does not seem inclined to back down.
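The “upload control” idea described above amounts to client-side scanning: content is checked against a database of known material on the sender’s device before it is encrypted and sent, and users who refuse the check lose the ability to share images or URLs. A minimal, purely illustrative sketch of that flow, assuming an exact-match digest database and a simple consent flag (none of this reflects the EU’s actual technical specification):

```python
import hashlib

def digest(data: bytes) -> str:
    """Exact-match digest of an attachment; real proposals envisage
    perceptual hashing or classifiers, not plain SHA-256."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of digests of known illegal images (illustrative only).
KNOWN_BAD_DIGESTS = {digest(b"example-known-bad-content")}

def client_side_check(attachment: bytes, user_consented: bool) -> str:
    """Decide what happens to an attachment before encryption and sending."""
    if not user_consented:
        # Per the proposal, refusing the scan disables image/URL sharing.
        return "blocked: sharing disabled without consent"
    if digest(attachment) in KNOWN_BAD_DIGESTS:
        return "flagged: matched known-content database"
    return "allowed: encrypt and send"

print(client_side_check(b"holiday photo", user_consented=True))
print(client_side_check(b"holiday photo", user_consented=False))
```

The point critics make is visible even in this toy: the check necessarily runs on plaintext before encryption, which is why opponents argue it breaks the end-to-end guarantee regardless of what the mechanism is called.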

Apple, which previously had to shelve its own CSAM plans, has come up again in this context. The technology giant has made notable progress in recent years with features aimed at improving child safety. As a reminder, three new features developed jointly with experts were introduced. First, with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey, the built-in Messages app was said to be able to use machine learning to hide sensitive media content (nude photos and the like). The most important part of the new infrastructure was a plan to reduce the spread of CSAM content, which is why the company announced that it would start device-based photo scanning with iOS 15 and iPadOS 15.

The photo scanning would run on an algorithm trained on a special database, so that photographs containing child abuse material could be detected directly on iPhones and iPads, or in iCloud backups. The company later had to cancel the system after heavy criticism. In a statement to WIRED, an Apple spokesperson said: “We’ve decided not to move forward with the CSAM detection system we previously proposed for iCloud Photos. Children can be protected without companies scanning their personal data, and we will continue to work with governments, child advocates and other companies to help protect young people, protect their right to privacy, and make the internet a safer place for children and for us all.”
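The mechanism described above matched on-device photos against hashes of known abuse imagery. Apple’s actual system (NeuralHash) is far more sophisticated, but the core idea of perceptual hashing can be sketched with a toy “average hash”: near-duplicate images produce nearby bit strings, compared by Hamming distance. Everything here (the hash, the threshold, the tiny 2x2 “images”) is an illustrative assumption:

```python
def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255).
    Each pixel becomes one bit: 1 if above the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a: str, b: str) -> int:
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_database(image_hash, database, threshold=1):
    """An image 'matches' if its hash is within threshold bits of any known hash."""
    return any(hamming(image_hash, h) <= threshold for h in database)

original  = [[10, 200], [220, 30]]   # a known image
near_copy = [[12, 198], [219, 33]]   # slightly re-encoded copy of it
unrelated = [[200, 10], [30, 220]]   # a different image

db = {average_hash(original)}
print(matches_database(average_hash(near_copy), db))   # True
print(matches_database(average_hash(unrelated), db))   # False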

Apple’s director of user privacy and child safety, Erik Neuenschwander, had said of the issue: “Scanning every user’s privately stored iCloud data would create new risks for data thieves and would also create a slippery slope that could lead to unintended consequences. Scanning for even one type of content opens the door to mass surveillance and may create a desire to search other encrypted messaging systems.” So, after collaborating with a number of privacy and security researchers, digital rights groups, and child safety advocates, Apple concluded that it would not proceed with developing the CSAM scanning mechanism, even one built specifically to protect privacy.
