How Edward Snowden responded to Apple’s plan to scan photos


Apple’s plan to automatically scan users’ photos has come under fire, despite the project’s stated goal: a CSAM detection tool intended to track and identify child sexual abuse images.

The latest criticism of the CSAM tool has come from Edward Snowden, who is known for his consistent advocacy of users’ privacy and digital security.

Snowden described the company’s plan as a disaster in the making and a tragic move, saying that with this step Apple is declaring war on users’ privacy and digital security.

Apple scans users’ photos

Apple planned to include the new tool in iOS 15, the upcoming version of the iPhone operating system. The term CSAM is an acronym for Child Sexual Abuse Material.

The company’s plan was to scan photos uploaded to iCloud in order to identify this harmful content, and to notify the authorities directly as soon as it detected any images containing child abuse material.

Snowden describes this as Apple making iPhones work for the company rather than for their users; in other words, iPhones remain owned and controlled by Apple even after users have paid for them.

In Snowden’s view, Apple has “betrayed” its users with this feature, even though the company is known for its privacy efforts and constantly uses the word privacy to promote its phones and services.

Users can opt out of child abuse image recognition by preventing their photos from being uploaded to iCloud.

There have been many concerns that governments could misuse Apple’s CSAM tool. In addition, this feature, and others like it, could expose smartphone owners to serious problems if it does not work properly.

Although the company has stated that it will not comply with government directives in this regard, concerns remain, and specialists describe Apple’s policies in dealing with governments as very flexible.

Although the CSAM tool relies on automated algorithms to match images, that alone is not enough to make it completely safe for users. Some describe the company’s decision to develop this feature as largely reckless, driven by political and promotional purposes rather than solely by Apple’s concern for protecting children.
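
To make the matching idea concrete, here is a minimal, purely illustrative Swift sketch of hash-based matching against a database of known image hashes. It is not Apple’s actual CSAM system (which reportedly uses a perceptual “NeuralHash” and on-device cryptographic matching); the hash function, database, and threshold below are assumptions made only for illustration.

```swift
import Foundation
import CryptoKit

// Illustrative only: a toy hash-matching check, not Apple's CSAM pipeline.
// A real system would use a perceptual hash so that resized or re-encoded
// copies of an image still match; SHA-256 is used here just to keep the
// example self-contained.

// Hypothetical database of known-bad image hashes (empty placeholder here;
// in practice such a list would be supplied by child-safety organizations).
let knownHashes: Set<String> = []

// Hypothetical threshold: the number of matches before an account is flagged.
let matchThreshold = 30

// Hash the raw bytes of a photo and return a hex string.
func hashImageData(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Count how many of the uploaded photos match a known hash.
func countMatches(in photos: [Data]) -> Int {
    photos.filter { knownHashes.contains(hashImageData($0)) }.count
}

// Only flag an account once the match count crosses the threshold.
func shouldFlagAccount(photos: [Data]) -> Bool {
    countMatches(in: photos) >= matchThreshold
}
```

The point the sketch captures is that matching happens against a fixed list of known images and only triggers above a threshold; the controversy is over who controls that list and how the scanning could later be expanded.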
