Apple has announced that the new versions of its operating systems (including iOS 15), due to be released later this year, will include new features to help restrict the spread of child sexual abuse material for customers in the USA.
Before an image is stored in iCloud (Apple’s storage service, which allows users to keep documents, photos and videos on remote servers), the technology will compare that image against images already flagged as child sexual abuse material. If a match is found, a human reviewer will assess the results and, where the match is confirmed, report the user to law enforcement.
The comparison is made against a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.
The images are translated into numerical codes called “hashes”, which can then be matched against images on an Apple device. Apple has also confirmed that the technology will catch edited but similar versions of original images.
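The idea of matching edited-but-similar images can be illustrated with a toy “perceptual hash”: unlike a cryptographic hash, it changes only slightly when the image changes slightly, so near-duplicates can be caught by comparing bit differences. This is a minimal sketch of that general technique only; Apple’s actual system (NeuralHash) is far more sophisticated, and the function names and threshold below are illustrative assumptions.

```python
# Toy perceptual-hash matching ("average hash") -- an illustration of the
# general technique, NOT Apple's NeuralHash. All names/thresholds are
# illustrative assumptions.

def average_hash(pixels):
    """64-bit hash of an 8x8 grayscale image (list of 8 rows of 8 ints).
    Each bit is 1 if that pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(image_hash, known_hashes, max_distance=4):
    """A small Hamming-distance tolerance catches lightly edited copies
    that an exact (cryptographic) hash comparison would miss."""
    return any(hamming(image_hash, h) <= max_distance for h in known_hashes)

# Example: a slightly edited image still hashes close to the original.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
edited = [row[:] for row in original]
edited[0][0] += 30  # small edit: brighten one pixel

known_db = [average_hash(original)]
print(matches_known(average_hash(edited), known_db))  # True: edit is within tolerance
```

The design point this sketch captures is why perceptual rather than exact hashes are needed: cropping, recompression or minor edits change every bit of a cryptographic hash, but only a few bits of a perceptual one.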
Apple states that the system has an “extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging an account”. It also says it will manually review each report to confirm a match before disabling the user’s account and reporting to the police.
As the tool only looks for images that are already in the NCMEC database, it is understood that content such as a parent’s photo of their child in the bath would not be flagged or reported.
In a statement on Apple’s website, it confirms: “At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).”
As well as this tool, there will also be the introduction of additional new child safety features in areas developed in collaboration with child safety experts. These are:
- New communication tools to enable parents to play a more informed role in helping their children navigate online communication. The Messages app will warn users about sensitive content, while keeping private communications unreadable by Apple. When receiving this type of content, the photo will be blurred and the child will be warned that the content may be dangerous as well as being presented with helpful resources, and also being reassured it is okay if they don’t want to view the photo. The child will also be told that their parents will get a message if they do view it. Similar protections will be in place if a child attempts to send a sexually explicit photo.
- Updates to Siri and Search will provide parents and children extra information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for topics/material related to child sexual abuse.
Despite the intention behind the new updates, there have been significant concerns among users that their phones and devices could be turned into surveillance tools by governments, and that the technology is effectively a “backdoor” into Apple’s software. Critics believe the new updates raise significant legal and policy questions regarding privacy and data protection. Apple, however, has said that the new technology offers significant privacy benefits, as it only learns about users’ photographs if they have known child sexual abuse material in their iCloud account. The technology is limited to detecting child sexual abuse material stored in iCloud, and Apple “will not accede to any government’s request to expand it”. Despite this assurance, it has been noted that Apple has made concessions to governments in the past in order to continue operating in their countries, such as removing thousands of apps from its App Store in China and storing user data on the servers of a state-run telecommunications company.
The President and Chief Executive of the NCMEC has stated: “Apple’s expanded protection for children is a gamechanger. With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”
Thorn, a company co-founded by Ashton Kutcher and Demi Moore to develop new technologies to combat online child sexual abuse and eliminate this material from the internet, reported that in 2020 there were 21.4 million reports of child sexual abuse material containing 65.4 million images and videos. Many of these involved children 10 years old or younger. Many more incidents go unreported and 75% of children trafficked or sold for sex are advertised online.
On its website, Thorn commends Apple for taking this step. Julia Cordua, CEO of Thorn, said: “We consistently advocate for the will and resources from some of the brightest minds in technology to be directed at creating a future free of online child sexual abuse. This is not a battle of one, but a battle of many. There is a growing community of those who demand an end to the spread of child sexual abuse material (CSAM); who say yes to solving the world’s most difficult challenges, recognizing that we — the builders, creators, and defenders of the internet — choose to use technology for good. And to be successful we need companies like Apple — and many others — to continue to collectively turn their innovation and ingenuity to this issue, creating platforms that prioritize both privacy and safety. It’s something we have to do — for every child victim and every survivor whose most traumatic moments have been disseminated across the internet. And as is often the case when doing this work, someone has to take the first step.”