“Apple Is Rolling Out Mass Surveillance To The Entire World”: Edward Snowden And Other Experts Criticize Apple’s Plans To Fight Child Pornography

Apple has announced the addition of a new cryptographic monitoring system, CSAM scanning, to iOS and iPadOS to limit the distribution of child pornography. The company said it will now scan photos uploaded to iCloud and images sent via Messages for sexually explicit content involving children, using machine learning.

However, many experts are appalled at what Apple plans to do under the pretext of protecting children; they regard Apple’s new measures as potentially dangerous.

Whistleblower Edward Snowden called Apple’s digital inspection a tool of mass surveillance. He warned, “they can scan for anything tomorrow.”

The Electronic Frontier Foundation (EFF) criticized Apple in an article and noted that the implementation of the new measures opens up potential backdoors in a secure encryption system.

According to the EFF, Apple now has a “fully built system just waiting for external pressure to make the slightest change.” By “change,” the EFF means that Apple could be pushed to scan for anything, not just child pornography.

The organization indicated that scanning content against a predefined database could lead to dangerous use cases. For example, in a country where homosexuality is a crime, the government “might require the classifier to be trained to restrict apparent LGBTQ+ content.”

Matthew Green, a security professor at Johns Hopkins University, said the people who control the database that identifies sensitive photographs have too much power.

WhatsApp head Will Cathcart also criticized the move, writing on Twitter: “This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.”

It also remains an open question how Apple will deal with demands from other governments that want to search for certain kinds of content.

Apple plans to scan photos directly on users’ devices. For this purpose, a hash is created for each file and compared locally against a database containing hashes of known child pornographic material; this database is to be updated regularly through iOS and iPadOS updates. Each match produces a safety voucher that is stored in iCloud alongside the photo. If the number of matches exceeds a threshold, the system sounds the alarm and Apple reviews the flagged content manually.
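To make the mechanism more concrete, here is a minimal Python sketch of on-device hash matching with a reporting threshold. This is not Apple’s implementation: Apple’s system uses a perceptual “NeuralHash” and enforces the threshold cryptographically (so the device never learns the match count), while this sketch uses an ordinary SHA-256 hash, an invented `KNOWN_HASHES` database, and a plain counter purely to illustrate the match-and-count idea.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of known-CSAM image hashes that
# would ship with OS updates. Apple's real system uses perceptual
# "NeuralHash" values, not SHA-256.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Illustrative only; Apple has not published its exact threshold.
MATCH_THRESHOLD = 3


def image_hash(path: Path) -> str:
    """Hash an image file's raw bytes.

    A real system would use a perceptual hash so that resized or
    re-encoded copies of the same image still match."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_photos(photo_paths: list[Path]) -> bool:
    """Return True once the number of database matches reaches the threshold.

    In Apple's design each match instead produces an encrypted 'safety
    voucher' uploaded with the photo, and the threshold is enforced with
    threshold secret sharing rather than a visible counter."""
    matches = sum(1 for path in photo_paths if image_hash(path) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```

The threshold exists so that a single false match cannot trigger a report; only accounts that accumulate multiple matches are surfaced for Apple’s manual review.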

Apple has not yet responded directly to the public criticism. According to 9to5Mac, a senior executive said in an internal memo that “Apple acknowledges the ‘misunderstandings’ around the new features, but doubles down on its belief that these features are part of an ‘important mission’ for keeping children safe.”
