
WhatsApp’s CEO and Other Tech Officials Fire Back at Apple’s Child Safety Measures

WhatsApp’s head says Apple’s new child-safety feature amounts to a surveillance system and that the app won’t be adopting it.


On Thursday, Apple Inc. said it will analyze iPhones and iPads for child sexual abuse imagery and report any relevant findings to the authorities, and some tech officials have serious concerns about that.

WhatsApp’s head says the app won’t be adopting Apple’s Child Safety measures, which are meant to stop child abuse imagery from spreading digitally with the help of technology. But is there any transparency in how it works? Only Apple knows.

As reported by The Verge, WhatsApp head Will Cathcart explained in a Twitter thread his belief that Apple “has built software that can scan all the private photos on your phone,” and said that Apple has taken the wrong path in trying to improve its response to child sexual abuse material, or CSAM.

The news arrived on Thursday and shook everyone, as few expected this to happen in the near future. Apple explained the technical process in a news release: the system will not be limited to the phone’s photo library but will also cover photos sent in Messages.

Regarding users’ concerns about safety and privacy, the company said in a statement:

Apple said its system has an error rate of “less than one in 1 trillion” per year and that it protects user privacy.

Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account, and even then it only learns about the images that match known CSAM.
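The matching idea Apple describes can be illustrated with a simplified sketch: each photo is hashed and compared against a database of hashes of known CSAM, and an account is flagged only after the number of matches crosses a threshold. Note this is only an illustration of the hash-matching concept — Apple’s real system uses a perceptual hash called NeuralHash plus cryptographic protocols, not the plain SHA-256 and toy threshold shown here; all names and values below are hypothetical.

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Hash raw image bytes. Apple's actual system uses a perceptual
    hash (NeuralHash); SHA-256 stands in here for illustration only."""
    return hashlib.sha256(data).hexdigest()

# Toy stand-in for the database of hashes of known CSAM images.
known_hashes = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}

# Hypothetical threshold: the account is flagged only after this many matches.
THRESHOLD = 2

def count_matches(photos: list[bytes]) -> int:
    """Count how many of the user's photos match the known-hash database."""
    return sum(1 for p in photos if image_hash(p) in known_hashes)

def account_flagged(photos: list[bytes]) -> bool:
    """Only a collection of matches, not a single photo, triggers a report."""
    return count_matches(photos) >= THRESHOLD

uploads = [b"vacation-photo", b"known-image-1", b"known-image-2"]
print(account_flagged(uploads))  # True: two matches meet the threshold
```

The key privacy claim is captured by the threshold: photos that match nothing reveal nothing, and even one match alone does not flag the account.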

Apple Inc.

Matthew Green, an associate professor at Johns Hopkins University, pushed back on the feature before it was publicly announced. He tweeted about Apple’s plans and about how the hashing system could be abused by governments and malicious actors.

Many security experts have major concerns about this feature, and Twitter is exploding with reactions. Edward Snowden retweeted the Financial Times article about the system, giving his own characterization of what Apple is doing.

Epic CEO Tim Sweeney also criticized Apple, saying that the company “vacuums up everybody’s data into iCloud by default.” He also promised to share more thoughts specifically about Apple’s Child Safety system.

Source: The Verge

The tweets go on, and there are many. Most importantly, these tweets come from tech officials and security experts affiliated with major organizations like Johns Hopkins.

But the general public (who worry less about privacy than about child abuse cases) and people doing advocacy work to end child sex trafficking largely agree with Apple.

Ashton Kutcher (who has done advocacy work to end child sex trafficking since 2011) calls Apple’s work “a major step forward” for efforts to eliminate CSAM.

No one condones child abuse, and everyone wants to stop this brutality, but some people and major officials are unwilling to let tech companies invade their privacy to do it. We hope that, at some point, we can all agree on a way to stop this abuse.


What do you think?

Written by Hammad Khalid

