Apple Urged to Drop Project to Analyze iMessages and Images for Sexual Abuse

More than 90 political and advocacy groups around the world have published an open letter urging Apple to abandon its plans to scan children's messages for nudity and adults' phones for images of child sexual abuse.

"While these capabilities are intended to protect children and reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the groups wrote in the letter, according to a Reuters report Thursday.

The campaign, the largest to date over a single company's encryption practices, was organized by the US-based Center for Democracy and Technology (CDT).

Some overseas signatories are particularly concerned about the effect the changes could have in countries with different legal systems, including some already embroiled in heated fights over encryption and privacy.

“It’s so disappointing and upsetting that Apple is doing this because they’ve been a staunch ally in the defense of encryption in the past,” said Sharon Bradford Franklin, co-director of CDT’s Security and Surveillance Project.

An Apple spokesperson said the company addressed privacy and security concerns in a document last week, explaining why the complex architecture of the scanning software should resist attempts to subvert it.

The signatories included several groups in Brazil, where courts have repeatedly blocked Facebook's WhatsApp for failing to decrypt messages in criminal investigations, and where the Senate has passed a bill requiring message traceability, which would mean marking messages' content in some way.

A similar law was passed in India this year.

"Our main concern is the consequence of this mechanism, how it could be extended to other situations and to other companies," said Flavio Wagner, president of the independent Brazilian chapter of the Internet Society, one of the signatories of the letter.

"This represents a serious weakening of encryption."

Other signatories were based in India, Mexico, Germany, Argentina, Ghana and Tanzania.

Surprised by the outcry that followed its announcement two weeks ago, Apple offered a series of explanations and documents to claim that the risks of false detections are low.

Apple said it would reject requests to expand the image-detection system beyond child sexual abuse images flagged by clearinghouses in multiple jurisdictions, though it has not said it would withdraw from a market rather than obey a court order.

While most objections so far have concerned on-device scanning, the coalition's letter also criticizes a change to iMessage for family accounts that would attempt to identify and blur nudity in messages sent to children, allowing them to view the images only if their parents are notified.

The signatories said this step could endanger children in intolerant households, or those seeking educational material. More broadly, they said the change would break end-to-end encryption for iMessage, which Apple has strongly championed in other contexts.

"Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts and to detect images that are objectionable for reasons other than being sexually explicit," the letter said.

Other groups that have signed on include the American Civil Liberties Union, the Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.


