
iPhones Will Now Scan for Child Sexual Abuse Images

The feature will analyze photos in users’ iCloud Photos accounts for sexually explicit images of children.


Apple Inc. said it will analyze photos on iPhones and iPads for child sexual abuse images and will report any relevant findings to the authorities. The company explains the technical process in a news release.

The company is also rolling out two related features to Siri and search. The systems will be able to respond to questions about reporting child exploitation and abusive images, and will provide information on how users can file reports.

The second feature warns users who search for material that is abusive to children. The Messages and Siri features are coming to the iPhone, iPad, Mac, and Apple Watch.


The feature is not limited to iCloud photos; it also covers chats. According to Al Jazeera, as part of new safeguards involving children, the company announced a feature that will analyze photos sent and received in the Messages app to or from children to see if they are explicit.

Apple is also adding features to its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.

Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind, Apple’s announcement said. “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
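To make that design concrete, here is a minimal sketch of the on-device matching step in Swift. It is not Apple’s implementation: the empty hash set, the function names, and the use of SHA-256 are all placeholders. Apple’s actual system uses a perceptual hash called NeuralHash and stores the NCMEC database in a blinded form the device cannot read, neither of which is reproduced here.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for the on-device database of known CSAM hashes.
// Apple says the real database, provided by NCMEC, is transformed into an
// unreadable (blinded) form before it is stored on devices; a plain
// Set<String> is used here only to show the matching step.
let knownHashes: Set<String> = []

// Placeholder hash function. Apple's system uses a perceptual hash
// (NeuralHash), so visually similar images produce matching hashes even
// after resizing or re-encoding; SHA-256 over raw bytes does not have
// that property and is used here only to keep the sketch runnable.
func imageHash(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// On-device matching: each photo is checked against the local hash set,
// so no photo content has to leave the device for the scan itself.
func matchesKnownCSAM(_ fileURL: URL) -> Bool {
    guard let hash = try? imageHash(of: fileURL) else { return false }
    return knownHashes.contains(hash)
}
```

The property the sketch illustrates is that the comparison happens locally; under Apple’s design, only photos that match the known-hash database can ever surface to the company.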


Apple said the Messages feature uses on-device analysis, so the company cannot view message contents. The feature applies to Apple’s iMessage service and other protocols such as Multimedia Messaging Service, and it is optional: parents can enable it on devices used by their children.
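For the Messages feature, a rough sketch of the shape of such an on-device check is below. Apple has not published its classifier, so the protocol, the score threshold, and the names here are all hypothetical; the point is that the decision to blur a photo and warn the child is made entirely on the device.

```swift
import Foundation

// Hypothetical interface for an on-device explicit-image classifier.
// Apple has not published its model; this protocol and the threshold
// below are placeholders that show only the shape of the check.
protocol ExplicitImageClassifier {
    /// Returns a score in 0...1, where higher means more likely explicit.
    func explicitScore(for imageData: Data) -> Double
}

struct MessagesSafetyCheck {
    let classifier: ExplicitImageClassifier
    let threshold = 0.9  // illustrative cutoff, not Apple's value

    // The decision runs locally: no image data or scores are uploaded,
    // which is consistent with Apple's statement that it cannot view
    // message contents.
    func shouldBlurAndWarn(imageData: Data) -> Bool {
        classifier.explicitScore(for: imageData) > threshold
    }
}
```

Because the feature is opt-in, a real implementation would also check the parental setting before running the classifier at all.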

Addressing users’ concerns about safety and privacy, the company said in a statement:

Apple said the system’s chance of incorrectly flagging an account is “less than one in 1 trillion” per year and that it protects user privacy.

“Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even then, Apple only learns about images that match known CSAM.”

Apple Inc.

Source: Al Jazeera

This is a commendable initiative from Apple to reduce child abuse. As abuse imagery has grown with the help of technology, Apple has found a way to at least curb it, and we believe other major companies will follow suit. The company also expects iPhone sales to increase by 50% despite the global chip shortage.


Written by Hammad Khalid
