Apple’s new stance on iPhone privacy is troubling – and where is Tim Cook, by the way? All agree on photos of children, but what about widely spread photos symbolizing political, religious, or sexual orientations? What if @Apple is asked by governments to add them to the database? A privacy nightmare, great material for our uni courses…
In other words, could a country like China come to Apple and say, you know what, since you all already have a system in place to take action on child porn as informed by a database of specific material, we want to give you another database of material. Instead of CSAM, though, we want you to search for and take action whenever you find something different. That something we want you to look for? Users with non-government-approved content on their devices.
You can't look at anything Apple does without looking at how the Chinese government will use it. They are a de facto Chinese company, bound by all its whims and rules. They cannot leave China without destroying their business.
originally posted by: marg6043
Scanning your phone without permission...
originally posted by: BuddytheYorkie
a reply to: AugustusMasonicus
I must agree with you. I have not “given” permission for an update since before the lockdowns last year. I simply say “remind me later”. It USED to work fine. After reading a report about the OP’s topic last week, I checked my current OS.
Surprise... my phone has been updating all along on its own.
I’m not concerned about them looking for kiddy pics as I don’t have any, but it was alarming that my phone has updated without my permission at least 4 times, even with the appropriate check marks ticked.
I was late joining the Apple dumpling gang, but this will be the last fruit purchase I ever make. I’ll be going to a “burner” phone from this point forward.
First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac can warn children and their parents when they receive or send sexually explicit photos. When the feature is enabled, Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, the photo will be automatically blurred and the child will be warned.
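To make that flow concrete, here is a minimal sketch of how such a pipeline could be structured, assuming Pillow for image handling and a stub is_sexually_explicit() function standing in for Apple's unpublished on-device classifier; none of these names come from Apple's implementation. The relevant point from the announcement is that the evaluation happens on the device itself, not on a server.

```python
# A minimal sketch of the Communication Safety flow described above, assuming a
# placeholder classifier in place of Apple's (unpublished) on-device model.
# Requires Pillow: pip install pillow
from PIL import Image, ImageFilter


def is_sexually_explicit(image: Image.Image) -> bool:
    """Placeholder for the on-device machine-learning classifier.

    Apple has not published its model; this stub always returns False and
    exists only to show where such a check would sit in the pipeline.
    """
    return False


def process_incoming_attachment(path: str) -> Image.Image:
    """Blur the attachment and warn the child if the classifier flags it."""
    image = Image.open(path)
    if is_sexually_explicit(image):
        # Blur the photo before it is shown, as the feature description says.
        image = image.filter(ImageFilter.GaussianBlur(radius=25))
        print("Warning shown to child: this photo may be sensitive.")
    return image
```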
Second, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple confirmed today that the process will only apply to photos being uploaded to iCloud Photos and not videos.
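For illustration, known-image detection of this kind comes down to comparing a hash of each upload against a database of hashes of previously identified material. Apple's system uses a neural perceptual hash it calls NeuralHash; the sketch below substitutes a much simpler 8x8 average hash and a Hamming-distance check just to show the matching idea, so the functions and distance threshold here are illustrative, not Apple's.

```python
# A simplified illustration of matching uploads against a database of known
# image hashes. Apple's system uses a neural perceptual hash ("NeuralHash");
# the 8x8 average hash below is only a stand-in to show the matching idea.
from PIL import Image


def average_hash(path: str) -> int:
    """64-bit average hash: downscale to 8x8 grayscale, threshold at the mean."""
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_database(path: str, known_hashes: set[int], max_distance: int = 5) -> bool:
    """True if the upload's hash is close to any hash in the known database."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= max_distance for k in known_hashes)
```

Perceptual hashes are used instead of ordinary file checksums because they tolerate resizing, re-encoding, and small edits, so a known image still matches after routine transformations.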
Third, Apple will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
The hash list is built into the operating system. We have one global operating system and don't have the ability to target updates to individual users, and so hash lists will be shared by all users when the system is enabled.

And secondly, the system requires the threshold of images to be exceeded, so trying to seek out even a single image from a person's device or set of people's devices won't work, because the system simply does not provide any knowledge to Apple for single photos stored in our service.

And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity. And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM, and we don't believe that there's a basis on which people will be able to make that request in the U.S.

And the last point that I would just add is that it does still preserve user choice: if a user does not like this kind of functionality, they can choose not to use iCloud Photos, and if iCloud Photos is not enabled, no part of the system is functional.
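Read as a decision flow, the quote describes two gates before any referral: a match threshold and a human review step, with the whole system bypassed if iCloud Photos is disabled. The sketch below models only that flow; Apple's actual design enforces the threshold cryptographically (with private set intersection and threshold secret sharing) rather than with a plain counter, and the function names and threshold value here are assumptions for illustration.

```python
# A sketch of the decision flow described in the quote: nothing is acted on for
# an account until a threshold of matches is exceeded, and even then a flagged
# collection goes through manual review before any referral. Apple's real
# design hides sub-threshold matches cryptographically; this counter is only
# an illustration, and THRESHOLD is an arbitrary placeholder value.
THRESHOLD = 30  # illustrative value, not Apple's


def review_account(match_count: int, icloud_photos_enabled: bool,
                   human_review_confirms: bool) -> str:
    """Return the outcome for an account under the described flow."""
    if not icloud_photos_enabled:
        # If iCloud Photos is not enabled, no part of the system is functional.
        return "system inactive"
    if match_count < THRESHOLD:
        # Below the threshold the system provides no knowledge about the account.
        return "no action"
    if not human_review_confirms:
        # Manual review must confirm the flagged collection is a correct match.
        return "no referral"
    return "refer to NCMEC"
```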