Are Apple’s Child Abuse Tools Bad for Your Privacy?



Apple unveiled a plan two weeks ago based on good intentions: to eliminate images of child sexual abuse from iPhones.

But as is often the case when changes are made to digital privacy and security, tech experts quickly identified the downside: Apple’s approach to scanning people’s private photos could give law enforcement officials and governments a new way to surveil citizens and persecute dissidents. Once one chink in the privacy armor is identified, anyone can attack it, they argued.

The conflicting concerns have exposed an intractable problem the tech industry seems no closer to solving today than when Apple first fought with the FBI over a dead terrorist’s iPhone five years ago.

Technology that protects the privacy of an ordinary person can also cripple criminal investigations. But the alternative, according to privacy groups and many security experts, would be worse.

“Once you create this backdoor, it will be used by people you don’t want to use it,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, a digital rights group. “It’s not a theoretical harm. It is a harm that we have seen happen time and time again.”

Apple did not expect such a backlash. When the company announced the changes, it sent reporters complex technical explanations and glowing statements from child safety groups, computer scientists and Eric Holder Jr., the former U.S. attorney general. After the news went public, an Apple spokesperson tweeted a message from Ashton Kutcher, the actor who helped found a group that fights child sexual abuse, cheering the moves.

But their voices were largely drowned out. Cybersecurity experts, the head of the messaging app WhatsApp and Edward Snowden, the former intelligence contractor who leaked classified government surveillance documents, all denounced the move as setting a dangerous precedent that could allow governments to look into people’s private phones. Apple scheduled four more press briefings to combat what it called misunderstandings, admitted it had bungled its messaging and announced new safeguards meant to address some concerns. More than 8,000 people signed an open letter calling on Apple to halt its plans.

For now, Apple has said it is moving forward with its plans. But the company is in a precarious position. It has worked for years to make iPhones more secure, and in turn it has made privacy a central part of its marketing pitch. But what has been good for business has also turned out to be bad for abused children.

A few years ago, the National Center for Missing and Exploited Children began disclosing how often tech companies reported cases of child sexual abuse material, commonly known as child pornography, on their products.

Apple was near the bottom of the pack. The company reported 265 cases to the authorities last year, compared with Facebook’s 20.3 million. That enormous gap was largely due to Apple’s choice not to scan for such images, in order to protect the privacy of its users.

In late 2019, after reports in The New York Times about the proliferation of child sexual abuse images online, members of Congress told Apple that it had better do more to help law enforcement or they would force the company to do so. Eighteen months later, Apple announced that it had found a way to tackle the problem on iPhones while, in its view, protecting the privacy of its users.

The plan included modifying its virtual assistant, Siri, to direct people who ask about child sexual abuse to appropriate resources. Apple said it would also soon let parents turn on technology that scans images in their children’s text messages for nudity. Children 13 and older would be warned before sending or viewing a nude photo, while parents could ask to be notified if children under 13 did so.

Those changes drew little controversy compared with Apple’s third new tool: software that scans users’ iPhone photos and compares them against a database of known child sexual abuse images.

To avoid false positives while still catching images of abuse, Apple has taken a complex approach. Its software reduces each photo to a unique set of numbers – a kind of image fingerprint called a hash – and then compares them against hashes of known child abuse images provided by groups like the National Center for Missing and Exploited Children.

If at least 30 of a user’s photos appear to match the abuse images, an Apple employee reviews the matches. If any of the photos show child sexual abuse, Apple sends them to the authorities and locks the user’s account. Apple said it will turn on the feature in the United States in the coming months.
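In rough terms, the logic amounts to a threshold check over fingerprint matches. The short Python sketch below illustrates that idea only; the function names, the exact-match comparison and the 30-match constant stand in for Apple’s actual system, which uses a perceptual “NeuralHash” and cryptographic techniques so that matches are not visible to anyone until the threshold is crossed.

```python
# Hypothetical sketch of threshold-based hash matching, not Apple's implementation.
# Apple's real pipeline pairs a perceptual hash with private set intersection and
# threshold secret sharing; the simple exact-hash lookup here is illustrative only.

import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # per the article, at least 30 apparent matches trigger human review


def fingerprint(photo_bytes: bytes) -> str:
    """Stand-in for an image fingerprint (hash); a real system would use a perceptual hash."""
    return hashlib.sha256(photo_bytes).hexdigest()


def count_matches(user_photos: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many of a user's photos match the database of known-image hashes."""
    return sum(1 for photo in user_photos if fingerprint(photo) in known_hashes)


def flag_for_human_review(user_photos: Iterable[bytes], known_hashes: Set[str]) -> bool:
    """Surface an account for manual review only once the match threshold is reached."""
    return count_matches(user_photos, known_hashes) >= MATCH_THRESHOLD
```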

Law enforcement officials, child safety groups, abuse survivors and some computer scientists have praised the measures. In statements provided by Apple, the president of the National Center for Missing and Exploited Children called it a “game changer,” while David Forsyth, chair of computer science at the University of Illinois Urbana-Champaign, said the technology would catch child abusers and that “innocent users should experience minimal or no loss of privacy.”

But other computer scientists as well as privacy groups and civil liberties lawyers immediately condemned the approach.

Other tech companies, like Facebook, Google and Microsoft, also scan users’ photos for child sexual abuse imagery, but they do so only on images stored on the companies’ computer servers. In Apple’s case, much of the scanning happens directly on people’s iPhones. (Apple said it will scan only photos that users choose to upload to its iCloud storage service, but the scanning still takes place on the phone.)

For many technologists, Apple has opened a Pandora’s box. The tool would be the first technology built into a phone’s operating system that can examine a person’s private data and report it to law enforcement authorities. Privacy groups and security experts worry that governments looking for criminals, dissidents or other targets will find plenty of ways to exploit such a system.

“As we understand it now, I’m not that worried about Apple’s specific implementation being abused,” said Alex Stamos, a Stanford University researcher who previously led Facebook’s cybersecurity efforts. “The problem is that they’ve now opened the door to a class of surveillance that had never been opened before.”

If governments had previously asked Apple to analyze people’s photos, the company could have responded that it couldn’t. Now that it has built a system that can, Apple must argue that it won’t.

“I think Apple has clearly tried to do this in the most responsible way possible, but the fact that they are doing it at all is the problem,” Galperin said. “Once you’ve built a system that can be pointed at any database, you will be asked to point the system at a database.”

In response, Apple has assured the public that it will not accede to such requests. “We have already faced demands to build and deploy government-mandated changes that degrade the privacy of users, and we have steadfastly refused those demands. We will continue to refuse them in the future,” the company said in a statement.

Apple has indeed fought off demands to weaken smartphone encryption in the United States, but it has also bowed to governments in other cases. In China, where Apple manufactures nearly all of its products, it stores its Chinese customers’ data on computer servers owned and managed by a state-owned company, at the government’s request.

In the United States, Apple has been able to avoid more intense fights with the government in part because it still hands over plenty of data to the police. From January 2018 through June 2020, the most recent period for which data is available, Apple turned over the contents of the iCloud accounts of 340 customers a month to U.S. authorities with warrants. Apple still hasn’t fully encrypted iCloud, which gives it access to customers’ data, and the company scrapped plans to add more encryption after the FBI balked, according to Reuters.

Apple’s fights with the FBI over smartphone encryption have also been defused because other companies have regularly been able to hack into iPhones for the police. It remains expensive and time-consuming to get into a locked iPhone, which has created an effective middle ground: the police can access the devices they need for investigations, but it is harder for them to abuse the technology.

This encryption stalemate has also allowed Apple to maintain its brand as a champion of privacy, since it does not actively give the police access. But that compounds the potential damage from its new tools, security experts said.

For years, technologists have argued that giving the police access to phones would fundamentally undermine device security, but now governments can point to Apple’s endorsement of its photo-scanning tools as an approach that helps the police while preserving privacy.

Apple “took their entire platinum privacy brand, and they applied it to this idea,” Stamos said. “This Apple solution ruins the whole debate and sets us back years.”

[This article originally appeared in The New York Times.]
