For years, customers have entrusted Apple with their mobile privacy, but a recent announcement has many worried that may no longer be the case. Apple announced a plan to combat CSAM (child sexual abuse material) by scanning iPhones for images that match known abuse material. Working with child safety experts, Apple has developed several new features intended to curb the spread of sexually explicit content involving children. According to the company’s website, the new child safety measures will be implemented in three areas:
Messages
Special tools in iMessage will warn children about sexually explicit images. Received images will first appear blurred, and the child will be prompted to decide whether or not to view them; when such images are sent or received, both the child and a parent can be notified, though the choice to view still remains with the child.
iCloud Photos
To combat the online spread of CSAM and prevent further exploitation of children, the iCloud Photos feature will use on-device machine learning to scan photos saved to iCloud, comparing and matching their image “hashes” against known CSAM hashes already in the NCMEC (National Center for Missing and Exploited Children) system. This feature will allow Apple to report any photos that match known CSAM to NCMEC.
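In concept, the matching step works like a set lookup: the device computes a fingerprint of each photo and checks whether that fingerprint appears in the list of known hashes. The sketch below is a simplified illustration of that idea only, not Apple’s implementation; the real system relies on a perceptual “NeuralHash” and additional cryptographic protections, whereas this example uses an ordinary SHA-256 digest, and the `HashMatcher` type and `knownHashes` list are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch: hash-list matching in its simplest form.
// Apple's real system uses a perceptual "NeuralHash" plus cryptographic
// protocols; a plain SHA-256 digest and an in-memory set stand in here
// purely to illustrate the lookup step.
struct HashMatcher {
    /// Hashes of known images, supplied by child safety organizations
    /// (represented here as lowercase hex strings).
    let knownHashes: Set<String>

    /// Digest the photo's bytes and check whether the result is on the list.
    func matches(imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Usage: only a photo whose digest appears in the provided list is flagged.
let matcher = HashMatcher(knownHashes: ["<hash from the provided list>"])
let photoData = Data()  // stand-in for the actual photo bytes
let flagged = matcher.matches(imageData: photoData)
print(flagged ? "Match found; would be reported" : "No match")
```

Unlike the cryptographic digest used above, a perceptual hash produces matching fingerprints for visually similar images, which is how slightly altered copies of a known image can still be detected.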
Siri and Search
In addition to providing resources that help parents and children make their own CSAM reports, Siri and Search will now intervene when users perform searches related to CSAM, displaying warnings that such material is harmful and illegal.
At a glance, these new features may appear to be a revolutionary step for child safety, but the announcement sparked concern among many users over mobile privacy. Scanning for harmful or offensive material is already common practice among tech companies; the concern in this specific case stems from the fact that Apple scans not only iCloud but also the iCloud photos stored on iPhones themselves, raising many questions about data security and what it means for future technology. Will Apple have access to information in other apps? Will it have access to text messages on every iPhone? If mobile restrictions start now, where will they stop? Could the system be used to expand government surveillance?
Apple defended the new update in a Q&A PDF posted on its website addressing frequently asked questions on the subject. The company dismisses accusations that the new features harvest personal information, stating that the communication safety feature in iMessage is separate from the CSAM detection feature for iCloud Photos. It also explains that the communication safety features must be enabled on the device, which means users still control what they send and to whom. Parents can opt in to safety notifications only for accounts of children twelve and under; accounts for children ages thirteen through seventeen will not have the parental notification option.
According to the document, CSAM photos are never downloaded to or stored on iPhones in order to perform matching; only hashes of known images are used. CSAM detection can flag only images already vetted and provided by child safety organizations, and Apple has promised to reject any request from the U.S. or any other government to add other images to the hash list.
The safety features are scheduled for release as part of the new iOS update coming later this year.