Apple Scraps Plan to Scan iPhones for Child Abuse Images

Apple is scrapping its plan to scan photos on iOS devices and in iCloud for child abuse imagery. The plan had caused a great deal of controversy.


The American tech giant announced the plan in August 2021, going further than its competitors: while many companies only scan content in the cloud, Apple also wanted to check content on users’ devices. The plan drew heavy criticism, especially over privacy.

Later in 2021, Apple said the controversial scanning feature was being postponed. Now, in a statement to Wired, Apple says it is abandoning the scans altogether.

“Children can be protected without companies going through personal data, and we will continue to work with governments, children’s advocacy groups and other businesses to help protect young people, ensure their right to privacy and make the internet a safer place for children and all of us,” Apple said.

The company is therefore dropping the scans and focusing instead on its Communication Safety feature, which became available in December 2021. This is an opt-in feature aimed at parents that warns both children and their parents if they receive or send sexually explicit images via iMessage.

Apple is not the only company to run such scans, but the practice is often messier than the theory. Last summer, for example, it emerged that Google locked a man out of his account because he had shared photos of his child’s genitals with a doctor, purely for medical reasons. Because the photos passed through Google’s servers, the company deemed it necessary to close the account.
