Apple Scans iPhones for Images of Child Abuse

Tech giant Apple is going to scan iPhones to find out whether photos uploaded to iCloud contain images of child abuse. Critics see it as opening a Pandora’s box: it erodes privacy, leaves room for mistakes, and can be abused by authoritarian regimes.

In a statement, Apple says it will now do more against the distribution of images of child abuse, known in the US as Child Sexual Abuse Material (CSAM). Starting with the next versions of its operating systems (notably iOS 15, iPadOS 15, watchOS 8 and macOS 12 Monterey), it will add tools that detect when child abuse images are uploaded to iCloud.

As soon as a certain number of those images is detected, a human review follows and law enforcement is informed about who is collecting them. “The system is designed for privacy and will help Apple transfer valuable information to law enforcement about CSAM collections in iCloud Photos.”
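Stripped of the cryptography, the description boils down to threshold logic: uploads are matched against a database of fingerprints of known abuse imagery, and only once a set number of matches accumulates is the account escalated to a human reviewer. The sketch below illustrates that idea only; the fingerprint type, database contents and threshold are hypothetical placeholders, and Apple’s real system relies on its NeuralHash perceptual hashes and cryptographic threshold techniques rather than the plain comparison shown here.

```swift
import Foundation

/// Hypothetical image fingerprint; in Apple's system an on-device
/// perceptual "NeuralHash" plays this role.
typealias ImageFingerprint = String

struct UploadScanner {
    /// Fingerprints of known abuse imagery (in practice supplied by
    /// child-safety organisations such as NCMEC).
    let knownFingerprints: Set<ImageFingerprint>

    /// Number of matches required before an account is flagged;
    /// below this threshold nothing is reported.
    let matchThreshold: Int

    var matchCount = 0

    /// Called for every photo uploaded to the cloud photo library.
    /// Returns true once the account crosses the threshold and should
    /// be handed over to a human reviewer.
    mutating func scan(_ fingerprint: ImageFingerprint) -> Bool {
        if knownFingerprints.contains(fingerprint) {
            matchCount += 1
        }
        return matchCount >= matchThreshold
    }
}

// Usage: only after the threshold is reached does human review start.
var scanner = UploadScanner(knownFingerprints: ["abc123", "def456"],
                            matchThreshold: 2)
for upload in ["zzz999", "abc123", "def456"] {
    if scanner.scan(upload) {
        print("Threshold reached – escalate account for human review")
    }
}
```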

For now, the intention is to apply the automatic scanning only to US users and only to child abuse imagery. But Apple itself indicates that it will not stop there. “This is an ambitious program, and protecting children is an important responsibility. These efforts will evolve and expand over time.”

At the same time, there will also be a tool for parents that sends a notification if their child receives ‘sensitive material’ via iMessage. That check, too, is done with the help of AI.
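The announcement gives no implementation details beyond “AI”, but the behaviour described amounts to a classifier score gating a parental notification. A minimal sketch under that assumption follows; the function names and threshold are invented for illustration and are not Apple APIs.

```swift
import Foundation

/// Hypothetical classifier score in [0, 1]; Apple has not published how its
/// model works, so this placeholder stands in for the real check.
func sensitivityScore(of imageData: Data) -> Double {
    // A real implementation would run an ML model on the image data.
    return 0.92
}

/// Assumed policy: notify the parent only when the score exceeds a cutoff.
func handleIncomingAttachment(_ imageData: Data,
                              notifyParent: (String) -> Void) {
    let threshold = 0.9
    if sensitivityScore(of: imageData) >= threshold {
        notifyParent("A possibly sensitive image was received in Messages.")
    }
}

handleIncomingAttachment(Data()) { message in
    print("Parent notification:", message)
}
```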
