Apple has confirmed that it will scan iPhones and iPads to detect images related to child sexual abuse and report them to the relevant authorities. Apple has stated that it wants to protect children from sexual predators and, more broadly, to curb the spread of child sexual abuse material (CSAM).
Apple will match existing images on a user’s iPhone against a database of known CSAM images to identify photos that potentially depict child sexual abuse. The tool, reportedly called “neuralMatch,” scans a user’s photos on the device and flags any that match known CSAM. Apple has not left this task entirely to AI: if there is a potential match, a human reviewer examines the image and, if it is confirmed, reports the image and the user in possession of it to the responsible authorities.
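To make the matching idea concrete, here is a minimal sketch. Apple’s actual system uses a proprietary perceptual hash (NeuralHash) combined with cryptographic private set intersection, none of which is public in full detail; the toy code below only illustrates the general concept of comparing a perceptual hash of a photo against a database of known hashes. The average-hash function and the match threshold are illustrative assumptions, not Apple’s implementation.

```python
# Illustrative sketch ONLY. Apple's real pipeline uses NeuralHash and
# private set intersection; this toy "average hash" just demonstrates
# the idea of matching image fingerprints against a known-hash database.

def average_hash(pixels):
    """Compute a 64-bit hash from an 8x8 grid of grayscale values (0-255).

    Each bit is 1 if the pixel is at or above the image's mean brightness.
    Similar-looking images therefore produce similar bit patterns.
    """
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Count the number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(image_hash, known_hashes, threshold=5):
    """Flag an image if its hash is within `threshold` bits of any known
    hash. The threshold value here is made up for illustration."""
    return any(hamming(image_hash, h) <= threshold for h in known_hashes)
```

For example, an image that is a slightly brightened copy of a known image produces a hash within a few bits of the original and is flagged, while an unrelated image is not. In a deployed system, a flagged match would not be reported automatically; as described above, it would first go to a human reviewer.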
However, there have been many arguments against this system, warning that governments and organizations could misuse the technology to frame innocent people by sending them questionable content. Matthew Green, a well-known cryptography researcher, shares these concerns, stating that Apple’s algorithm could be easy to fool and could be used to frame innocent people.
Apple, however, maintains that it will take every step to protect users’ data and privacy from organizations that have pressured it to provide access to user data. Many digital rights groups counter that Apple may be diminishing its brand image, which has consistently promoted privacy and security, because this tool compromises end-to-end encryption.
Other digital rights groups want Apple to drop the idea altogether, arguing that it weakens the end-to-end encryption of its messaging platform and enables surveillance and censorship. Apple, for its part, focuses on the positive picture the tool is painting: protecting children from sexual abuse and regulating CSAM content. Critics respond that it nevertheless creates a backdoor that governments and organizations could exploit to erode user privacy and security.
Moving on to the positive reactions: John Clark, CEO of the National Center for Missing & Exploited Children, stated that with this tool Apple could make a big difference, potentially saving the lives of many children and keeping horrific and brutal CSAM content from circulating on the internet. The CEO of Thorn added that Apple would make a big difference by providing digital safety for children in the USA.
The tool looks promising for protecting children, given that there are millions of Apple users in the USA. With it, Apple may be in a powerful position to rightfully limit and regulate CSAM content and bring justice to children who are, or might be, in danger.