Apple is one of those companies famous for bundling ‘Privacy’ and ‘Security’ into its products.

The Cupertino giant has repeatedly portrayed itself as a privacy-oriented brand, emphasizing it at every opportunity.

While not all of that holds up, and some of its products, like the AirTags, can be exploited, the company still claims to care about user privacy.

And now, Apple’s privacy reputation is at risk.

Last week, Apple announced a new CSAM (Child Sexual Abuse Material) detection system that will enable it to flag images of child exploitation uploaded to iCloud storage in the U.S. and report them to the authorities.

This step received huge accolades from child-protection agencies and associations, but not so much support from privacy advocates.

What does this actually mean?

According to Apple, its algorithms will scan images on a user’s iPhone or in iCloud for child abuse imagery or illegal pornography and flag them to the authorities.

The company believes CSAM detection will help it provide valuable information to law enforcement about collections of CSAM in iCloud Photos.

Apple has always positioned privacy as a stronghold of its products

Apple also claims this approach is different from the usual industry standard because it uses its control over hardware and sophisticated mathematics to learn as little as possible about the images.

The actual images are not inspected; the system only compares hashes, the unique numbers derived from image files, against a database of hashes of known abuse material.
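To make the idea concrete, here is a minimal sketch in Swift of what hash-based matching looks like. To be clear, this is an illustration, not Apple’s implementation: Apple’s system uses a perceptual “NeuralHash” matched against a database of known CSAM hashes, while this sketch substitutes an ordinary SHA-256 digest and a made-up placeholder hash list.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only. Apple's real system uses a perceptual
// "NeuralHash" matched against a database of known CSAM hashes;
// here a plain SHA-256 digest stands in for the hash, and the
// known-hash set below is a hypothetical placeholder.

/// Compute a hex-encoded SHA-256 digest of an image file.
func imageHash(at url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Hypothetical set of digests of known illegal images (placeholder value).
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Return true if the image's digest matches an entry in the known-hash set.
/// Only digests are compared; the image content itself is never inspected.
func matchesKnownHash(_ url: URL) -> Bool {
    guard let hash = try? imageHash(at: url) else { return false }
    return knownHashes.contains(hash)
}

// Example usage:
// let flagged = matchesKnownHash(URL(fileURLWithPath: "/path/to/photo.jpg"))
```

Even in this toy version, only the digests are compared, never the photos themselves; the point critics raise is that nothing technical stops the known-hash list from being swapped for something else entirely.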

While this all sounds reasonable, security researchers are concerned that the technique could be repurposed for political ends if Apple comes under pressure from foreign governments.

Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow

– Edward Snowden, NSA whistleblower

WhatsApp head Will Cathcart was also unhappy with the decision.

He tweeted,

Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone, even photos you haven’t shared with anyone

Contradicting yourself, Apple?

In 2019, Apple bought a giant billboard in Las Vegas during CES with the slogan “What happens on your iPhone, stays on your iPhone.”

That claim no longer holds.

Apple sees this new system as part of its privacy-protecting tradition, framing it as a win-win: protecting user privacy while eliminating illegal content.

While the company claims the system can’t be repurposed for other kinds of content, there is no guarantee that a foreign government or other authority won’t pressure Apple to use it for its own ends.

That might become a reality soon.

After all, Apple already censors apps and content in countries like China to avoid being banned there.

I, for one, chose Apple products for their simplicity and privacy. But now I feel dejected, as Apple is contradicting its own policies and commitments.

A few months back, Apple tweaked its user agreement in China to give a third party (Guizhou-Cloud Big Data – GCBD) legal ownership of Chinese customers’ iCloud data.

This gave the Chinese government a legal route to access customers’ data with little resistance.

All I am saying is that we are responsible for what data we share and with whom we share it.

At the time of writing, Apple is still going ahead with this controversial feature in the U.S., and we expect it to come to other countries soon.

Stay away from Apple?

Thanks for reading our article. If you liked it, share it with your family and friends.

Follow our Facebook and Twitter pages for more content and news.
