Apple delays controversial child protection features after privacy outcry

Last month Apple announced a set of child protection features, but it is now delaying the rollout of the feature that scans users’ photos for child sexual abuse material (CSAM). The company has faced criticism over the plan, with critics arguing the feature could undermine user privacy. The changes are now scheduled to roll out later this year.

Apple’s statement on the rollout of the child protection features

In its statement, Apple said that it announced plans last month for new child protection features intended to protect children from child sexual abuse material, but that it is now delaying the rollout to take additional time to gather feedback from customers, researchers, and advocacy groups and to make improvements before releasing these child safety features.

Apple’s original press release described the changes the company planned to make through these new child protection features to limit the spread of child sexual abuse material. The delay statement now appears at the top of that page, but the release still details the three key changes in the works. The first concerns Search and Siri, which would point users to resources for reporting child sexual abuse material, or warn them, if they searched for information associated with it.


How will the child protection features work?

The other two changes have drawn significant scrutiny. One would notify parents when their children receive sexually explicit photos or videos, and would blur those images for the children. The other would scan images stored in iCloud Photos for child sexual abuse material and report matches to Apple’s moderators, who could then refer the reports to the National Center for Missing and Exploited Children (NCMEC).

Apple detailed the iCloud photo scanning system at some length to argue that it would not compromise user privacy. In short, the system scans photos destined for iCloud on the iOS device itself, comparing their hashes against a database of known CSAM image hashes supplied by several child safety organizations.
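To make the hash-matching idea concrete, here is a minimal, purely illustrative Swift sketch. It is not Apple’s implementation: Apple describes its system as using a perceptual hash (NeuralHash) with blinded, on-device matching, whereas this sketch substitutes SHA-256 and an in-memory set of placeholder fingerprints, which are assumptions made only for the example.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for an image fingerprint. Apple's real system uses a
// perceptual hash (NeuralHash), not a cryptographic hash like SHA-256.
func fingerprint(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Hypothetical database of known-image fingerprints (placeholder values).
// In the described system, this database comes from child safety organizations.
let knownFingerprints: Set<String> = [
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
]

// Check a photo about to be uploaded against the database of known fingerprints.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}

let photo = Data("example photo bytes".utf8)
print(matchesKnownImage(photo)) // false for this placeholder data
```

The key design point this sketch illustrates is that the comparison happens against fingerprints of known images rather than by analyzing photo content directly, which is how Apple framed the privacy argument for on-device matching.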

Criticism from privacy and security experts

Though Apple says it has no intention of compromising user privacy with these new child protection features, many security and privacy experts strongly criticized the company over them. They argue that the feature could function as a surveillance system, undermining the trust of users who rely on Apple devices for their strong security and privacy protections.

Statement from the Electronic Frontier Foundation

The Electronic Frontier Foundation argued that, however good the intention behind the feature, the new system would break key promises of the messaging system’s encryption and open the door to broader abuses. Apple is facing heavy criticism from many sides over the feature but says it is working to address these concerns, which is presumably why it is taking extra time, with the aim of delivering a more secure and trustworthy version in a future rollout.
