CSAM: Apple Stops Development of Its Child Pornography Detection Feature

Apple has finally stopped development of the CSAM detection feature presented in the summer of 2021. The feature was meant to recognize child pornography by matching special hashes when photos were uploaded to iCloud.

When it was announced, the technology was widely misunderstood and controversially discussed, and the rollout was later paused.

Apple's senior vice president of software engineering, Craig Federighi, confirmed the decision in an interview with The Wall Street Journal. Federighi said that Apple's focus on child protection has been on areas such as communication and providing parents with tools to protect their children in iMessage. “Child sexual abuse can be headed off before it occurs. That’s where we’re putting our energy going forward.”

He also noted that abandoning the project was indirectly related to the introduction of end-to-end encryption for iCloud data.

For CSAM (child sexual abuse material) detection, hashes of photos uploaded to iCloud were to be compared with hashes of known child sexual abuse material from a database maintained by the National Center for Missing & Exploited Children (NCMEC). Under the original plan, Apple would have notified law enforcement authorities only if a threshold of around 30 matches was exceeded.
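The threshold logic can be illustrated with a minimal, hypothetical sketch: it simply counts how many uploaded photo hashes appear in a set of known hashes and compares that count against a threshold of 30. All names and data here are placeholders; Apple's actual proposal relied on its NeuralHash algorithm and cryptographic private set intersection rather than plain string comparison.

```swift
import Foundation

// Hypothetical stand-ins: a known-hash database and hashes of uploaded photos.
// In Apple's proposal these would have been perceptual (NeuralHash) values,
// and the comparison would have happened under cryptographic protection.
let knownHashes: Set<String> = ["a1b2", "c3d4", "e5f6"]
let uploadedPhotoHashes = ["a1b2", "zz99", "c3d4"]

// Count how many uploaded hashes match known material.
let hits = uploadedPhotoHashes.filter { knownHashes.contains($0) }.count

// The reported plan escalated only after roughly 30 matches.
let threshold = 30
if hits >= threshold {
    print("Threshold exceeded: \(hits) matches; the account would be flagged for human review.")
} else {
    print("\(hits) matches; below the reporting threshold.")
}
```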

