Apple Against Child Pornography: How Its Artificial Intelligence Tools Work and the Controversies Around Them


Recognized for protecting its users' personal data, Apple has so far lagged behind its competitors when it comes to detecting illegal content. The company recently unveiled new tools to detect and report child pornography.

Having built its reputation on protecting its users' personal data, Apple is now raising concerns among both users and privacy experts with this new system. They fear it could later be repurposed by governments for mass surveillance and spying on citizens.


How does Apple's artificial intelligence plan to fight the sexual exploitation of children?

These new tools, grouped under CSAM Detection (for Child Sexual Abuse Material), operate on iMessage and on Apple's iCloud servers. They are designed to identify images of child sexual exploitation using artificial intelligence, by scanning all images that pass through the messaging service, whether sent or received (for children's accounts linked to a family subscription).

For the iMessage scan, images are analyzed directly on the iPhone or iPad and are never transmitted off the device by the artificial intelligence tool. Each image receives a digital identifier, which is then compared against the identifiers of known images of sexual abuse of minors.
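To make the idea of a "digital identifier" concrete, here is a minimal Swift sketch. It uses SHA-256 as a stand-in for Apple's proprietary perceptual hash (NeuralHash), which is not public; the function name and placeholder bytes are illustrative assumptions, not Apple's actual implementation.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in: Apple's real system uses a proprietary
// perceptual hash ("NeuralHash"). SHA-256 is used here only to show
// the idea of deriving a fixed-size identifier from image bytes, so
// the image itself never has to leave the device.
func digitalIdentifier(forImageData data: Data) -> String {
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Placeholder bytes stand in for a real photo; only the identifier
// would be compared, not the image.
let imageData = Data([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A])
print("On-device identifier: \(digitalIdentifier(forImageData: imageData))")
```

A key difference in practice: unlike SHA-256, a perceptual hash is designed so that visually similar images (resized, cropped, recompressed) produce the same identifier.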

As for iCloud, the tool scans all images and compares them against a database of child pornography images provided by NCMEC (the National Center for Missing & Exploited Children) in the United States and by other organizations campaigning for the protection of children.
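The comparison step can be pictured as a membership test against a set of known identifiers. In Apple's published design the database is delivered in a blinded, encrypted form and matched with cryptographic techniques; the plain Swift Set and sample values below are simplifying assumptions for illustration only.

```swift
import Foundation

// Simplified sketch of the matching step. In the real system the
// NCMEC-supplied database is blinded and compared cryptographically;
// a plain Set is used here only to illustrate the lookup.
let knownIdentifiers: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c7", // sample values, not real hashes
    "5f2c1b9e8d4a7c6b0e3f2a1d9c8b7a65",
]

func matchesKnownDatabase(_ identifier: String) -> Bool {
    knownIdentifiers.contains(identifier)
}

print(matchesKnownDatabase("5f2c1b9e8d4a7c6b0e3f2a1d9c8b7a65")) // true
```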

In the event of a match, the images are submitted for human review, the user's account is blocked, and the police are notified. For some, this amounts to an invasion of privacy. Apple says the system is so precise that the risk of a false positive is virtually nil.
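Part of Apple's false-positive argument is that a single match is not enough: a threshold of matches must be crossed before anything is escalated to a human reviewer. The sketch below assumes a threshold on the order of 30, a figure Apple has cited publicly; the type and method names are hypothetical.

```swift
// Hypothetical sketch of the escalation logic: individual matches are
// accumulated, and only once the count crosses a threshold is the
// account flagged for human review (and, per the article, potentially
// blocked and reported to the police).
struct AccountScanState {
    var matchCount = 0
    let reviewThreshold: Int

    mutating func record(match: Bool) -> Bool {
        if match { matchCount += 1 }
        return matchCount >= reviewThreshold
    }
}

var state = AccountScanState(reviewThreshold: 30)
var escalate = false
for _ in 1...30 {
    escalate = state.record(match: true)
}
print(escalate ? "Escalate for human review" : "Below threshold")
```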

Controversies in Apple's fight against child pornography

This artificial intelligence from Apple is drawing attention from many quarters: users, whistleblowers, and privacy experts. Without questioning Apple's willingness to join the fight against child sexual abuse, they fear that its fight against child pornography and these new tools will gradually give way to mass surveillance.

Some believe that governments could ask Apple to extend this detection system to other purposes, something the company promises to refuse. Apple, for its part, seeks to be reassuring, affirming that those who have nothing to reproach themselves for have nothing to fear, and that this artificial intelligence will not sacrifice its users' privacy.

As we know, computer systems unfortunately sometimes contain vulnerabilities that can be bypassed and exploited by people whose intentions are rarely good. It is therefore not impossible that, in the future, these new tools from Apple will create more problems than they solve, both for individual freedoms and for the fight against child sexual abuse.
