Apple shares child sexual abuse FAQ to fight rumors of photo scans

Apple’s approach to tackling child sexual abuse material on the Internet came under heavy scrutiny once people began to question how the functionality works. To address the worries and the whole bunch of misconceptions, Apple is posting a Child Sexual Abuse Material (CSAM) FAQ (Frequently Asked Questions), a step-by-step guide answering the questions that have been raised. Apple had the plan vetted by various security organizations beforehand, so it carried their seal of approval, but people still doubted the true intention of an initiative based solely on protecting children from predators and detecting abusive material.

In the initial plan, Apple announced three main areas the system would cover. The first is iMessage, where the main purpose is to show notifications or warnings when explicit images are sent or received. The second is iCloud, where Apple will detect images already present in the CSAM database by matching their digital fingerprints. Finally, search requests and Siri queries for such material will return a warning along with links to websites that could help.

There has been a lot of misunderstanding about these three major points and the methods the tech giant is using. To clarify, Apple explained that on-device AI in Messages will be used to assess images that may hint at nudity, while CSAM detection applies only to images run through the digital-fingerprint process. The entire process is handled by the system itself, without any interference from Apple employees. Someone in the company only needs to get involved when an image requires additional confirmation to be matched against the pre-existing CSAM database, or when multiple CSAM matches accumulate on one account; this is how Apple determines the moment when involving the police becomes necessary.
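
To make the mechanics a little more concrete, here is a minimal sketch of what threshold-based fingerprint matching can look like in principle. This is not Apple's actual implementation (which relies on NeuralHash and on-device cryptographic matching): the fingerprint function, the database contents, the threshold value, and the function names below are all placeholder assumptions.

```python
# Illustrative sketch only: generic fingerprint matching with a review
# threshold. The database contents, threshold value, and function names
# are hypothetical placeholders, not Apple's real system.
import hashlib

KNOWN_CSAM_FINGERPRINTS = {"a1b2c3", "d4e5f6"}  # placeholder fingerprint values
REVIEW_THRESHOLD = 3                            # assumed number, not Apple's

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image."""
    return hashlib.sha256(image_bytes).hexdigest()[:6]

def needs_human_review(uploaded_images: list[bytes]) -> bool:
    """Count fingerprint matches; only a pile-up of matches triggers manual review."""
    matches = sum(1 for img in uploaded_images
                  if fingerprint(img) in KNOWN_CSAM_FINGERPRINTS)
    return matches >= REVIEW_THRESHOLD
```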

Another major concern is the misuse of this capability by higher authorities, i.e. governments, and this is, surprisingly, something that even cybersecurity experts have warned users about. To limit that risk, the feature is only being launched in the United States for the time being, in line with United States law. When it expands to other countries, Apple says it will follow the same procedure for each of them.

Apple first thanked those who supported its CSAM initiative and then addressed the main concerns by answering the questions posed. The main points covered in the CSAM FAQ include:

  1. CSAM detection on iCloud and the communication safety protection in iMessage are two different features and are in no way related. CSAM detection on iCloud applies only to users who store images in iCloud Photos, while the iMessage protection keeps children from sharing explicit content. Content on both platforms remains hidden and secure.
  2. Exchanged iMessages will never surface in front of a law enforcement institution, since iMessage remains end-to-end encrypted.
  3. Children subjected to abusive parents can still use iMessage to ask for help via text.
  4. Parents are not notified immediately, but only when a child under 12 continues the activity despite a warning (see the sketch after this list).
  5. All CSAM fingerprint matches will first be verified manually before any law enforcement involvement.
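
Point 4 above describes a simple gating rule, so here is a small sketch of how such logic could be expressed, assuming warnings are tracked per child account. The age cutoff mirrors the FAQ summary above; the class and function names are hypothetical.

```python
# Hypothetical sketch of the notification gating described in point 4 above;
# the age cutoff comes from the article, everything else is assumed.
from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    warnings_shown: int = 0

def handle_explicit_image(account: ChildAccount) -> str:
    """Warn on the first attempt; notify parents only for under-12 accounts that continue anyway."""
    if account.warnings_shown == 0:
        account.warnings_shown += 1
        return "warn_child"
    if account.age < 12:
        return "notify_parents"
    return "warn_child_only"
```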

While Apple has a fairly solid list of all the images included in the CSAM database, users are concerned that governments will add their own databases to the existing one, merging them to create match sets better suited to their criteria. Apple states that it would refuse any such demands. It has a clear line on what is considered child sexual abuse material and it will stick to it. As Apple always has, it says it will continue to resist government-imposed changes and stand by its original policies. While that is a good promise, there is no rule that legally guarantees Apple such freedom: to launch its products and keep them accessible to users, it must comply with local governments. Countries like China already force the company to comply with government requirements, and in such cases there is a real risk of the CSAM database being loaded with photos of protesters and critics, defeating the entire purpose of the initiative.

Now we just have to wait patiently while Apple rolls out the feature to more countries, adapting to their policies. Overall, we think this is a fairly positive and constructive initiative that everyone needs.


