Apple CSAM FAQ addresses misconceptions and concerns

Apple has responded to misconceptions and concerns about its photo-scanning announcements by publishing a CSAM FAQ, answering frequently asked questions about the features.

While child safety organizations have hailed Apple’s plans to help detect possession of child sexual abuse material (CSAM) and to protect children from predators, there has been a mix of informed and uninformed criticism …


The mainstream media confusion arose because Apple simultaneously announced three separate measures, with many non-technical people conflating the first two:

  • Explicit warnings on iMessage photos for children in iCloud Family groups
  • Detection of known CSAM photos by scanning their digital fingerprints
  • Siri and Search responses to CSAM-related queries, with warnings and links to get help

There was also a lack of understanding about the methods Apple uses. In the case of iMessage, it uses on-device AI to detect images that appear to be nudes; in the case of CSAM, it compares digital fingerprints of known CSAM images against fingerprints generated from a user’s stored photos.

In neither case can anyone at Apple see the photos, with the sole exception of accounts flagged for multiple matches against known CSAM photos, when someone at Apple will manually check low-resolution copies to make sure they are true matches before law enforcement is informed.
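The fingerprint-matching-with-threshold idea can be illustrated with a deliberately simplified sketch. Note the real system uses Apple’s perceptual NeuralHash and on-device private set intersection, neither of which is shown here; a cryptographic hash stands in for the fingerprint, and the threshold value is an assumption based on later reporting:

```python
import hashlib

# Hypothetical database of known fingerprints (hex digests). In Apple's
# actual system these would be NeuralHash values supplied by NCMEC.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

# Assumed threshold of matches before human review (reported ~30).
MATCH_THRESHOLD = 30


def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an image (stand-in for NeuralHash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photo_library: list) -> int:
    """Count photos whose fingerprints appear in the known database."""
    return sum(fingerprint(p) in KNOWN_FINGERPRINTS for p in photo_library)


library = [b"known-image-bytes", b"holiday-photo", b"cat-photo"]
matches = count_matches(library)

# Only when matches exceed the threshold would an account be flagged
# for manual review in the real system; a single match reveals nothing.
flagged = matches >= MATCH_THRESHOLD
```

The key design point the sketch captures is that matching happens against a fixed database of known images, and no single match triggers any report; only crossing the threshold does.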

There has also been confusion between the privacy and abuse risks of the features as they exist today (which range from zero to extremely low) and the potential for abuse by authoritarian governments at some future date. Cybersecurity experts have warned of the latter, not the former.

Apple had previously attempted to address concerns about repressive governments by launching only in the United States for now, and by stating that any expansion would proceed country by country, taking individual legislative environments into account. The FAQ now attempts to address this and other issues.


Apple has posted a six-page FAQ designed to address some of the concerns that have been raised. It begins by acknowledging the mixed response.

We want to protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material (CSAM). Since we announced these features, many stakeholders, including privacy and child safety organizations, have expressed their support for this new solution, and some have asked questions. This document serves to answer those questions and to bring more clarity and transparency to the process.

The company then emphasizes that the first two features are completely separate.

What are the differences between communication security in Messages and CSAM detection in iCloud Photos?

These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect children from sending and receiving sexually explicit images in the Messages app. It works only on pictures sent or received in the Messages app by child accounts set up in Family Sharing. It analyzes images on the device, and so does not change the privacy guarantees of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured that it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will receive a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM out of iCloud Photos without providing Apple with information about any photos other than those that match known CSAM images. Possession of CSAM images is illegal in most countries, including the United States. This feature affects only users who have chosen to use iCloud Photos to store their photos; it does not affect users who have not. There is no impact on any other data on the device, and the feature does not apply to Messages.

Other points highlighted in the FAQ include:

  • iMessages to and from children are never shared with law enforcement
  • iMessages remain end-to-end encrypted
  • Children with abusive parents can safely seek help via iMessage as long as they use text only
  • Parents are notified only if children aged 12 and under proceed despite a warning
  • CSAM fingerprint matches are manually reviewed before law enforcement is notified

The thorniest problem remains

The greatest concern, raised by the EFF and others, remains. While the system today flags only CSAM images, a repressive government could supply Apple with a database containing other material, such as the famous Tank Man photo from Tiananmen Square, which is censored in China.

Apple’s response is that it would not allow this:

Could governments force Apple to add non-CSAM images to the hash list?

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands before to build and deploy government-mandated changes that degrade the privacy of users, and we have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear: this technology is limited to detecting CSAM stored in iCloud Photos, and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed with NCMEC.

This statement, however, is predicated on Apple having the legal freedom to refuse. In China, for example, Apple has been legally required to remove VPN, news, and other apps, and to store the iCloud data of Chinese citizens on a server owned by a government-controlled company.

There is no realistic way for Apple to promise that it will not comply with future requirements to process government-supplied databases of “CSAM images” that also include matches for materials used by critics and protesters. As the company has often said when defending its actions in countries like China, Apple complies with the law in each of the countries in which it operates.
