You’ve probably heard of Pegasus before. It is the brand name of a family of spyware tools developed, marketed and licensed to intelligence agencies, law enforcement and armed forces around the world by the Israeli firm NSO Group.
A massive data leak investigation by the Guardian and 16 other media organizations around the world suggests widespread abuse of NSO Group’s hacking software by government customers. The company insists it is intended for use only against criminals and terrorists, but the investigation found that journalists, human rights activists and opposition politicians were also targeted. As our phones increasingly become external brains, storing our lives in digital form, a successful deployment of Pegasus can be devastating. Messages, emails, contact details, GPS location, calendar entries and more can be extracted from the device in minutes.
On Sunday, the Guardian and its media partners began publishing the results of the investigation into NSO Group, Pegasus and the people whose numbers appear on the leaked list:
The Guardian and its media partners will release the identities of those whose numbers were on the list in the coming days. They include hundreds of business leaders, religious figures, academics, NGO workers, union representatives and government officials, including ministers, presidents and prime ministers.
The list also contains the numbers of close family members of one country’s ruler, suggesting that the ruler may have asked his intelligence agencies to explore the possibility of monitoring his own relatives.
The presence of a number in the data does not reveal whether an attempt was made to infect the phone with spyware such as Pegasus, the company’s signature surveillance tool, or whether any such attempt succeeded. There are a very small number of landlines and US numbers in the list, which NSO says are “technically impossible” to access with its tools – revealing that some targets were selected by NSO customers even though they could not be infected with Pegasus.
There is much more to read on our site, including the fact that the numbers of almost 200 journalists have been identified in the data; links to the murder of Jamal Khashoggi; and the discovery that a political rival of Narendra Modi, the autocratic leader of India, was among those whose numbers were found in the leaked documents.
But this is a technical newsletter, and I want to focus on the technical side of the story. Mainly: how the hell did that happen?
The messages are coming from inside the house
Pegasus affects the two largest mobile operating systems, Android and iOS, but I’m going to focus on iOS here for two reasons: one is a technical point that I’ll get to in a moment, but the other is that, although Android is by far the most widely used mobile operating system, iPhones have a disproportionately high market share among many of the demographic groups targeted by NSO Group’s customers.
This is partly because they mostly sit in the upper tiers of the market, with price tags that keep them out of the reach of most smartphone users around the world, but well within the reach of the politicians, activists and journalists potentially targeted by governments.
But it’s also because they have a reputation for security. Since the early days of the mobile platform, Apple has worked hard to ensure that hacking iOS is difficult, that downloading software is easy and safe, and that installing patches against newly discovered vulnerabilities is the norm.
And yet Pegasus has been running, in one form or another, on iOS for at least five years. The latest version of the software is even capable of running on a brand-new iPhone 12 with iOS 14.6, the latest version of the operating system available to ordinary users. More than that: the version of Pegasus that infects those phones is a “zero-click” exploit. There is no dodgy link to click or malicious attachment to open. Simply receiving the message is enough to become a victim of the malware.
It’s worth pausing to note what is and isn’t worth criticizing Apple for here. No software on a modern computing platform can ever be bug-free, and so no software can ever be completely hack-proof. Governments will pay vast sums for working iPhone exploits, which motivates plenty of unscrupulous security researchers to spend a lot of time trying to find ways to break Apple’s security.
But security experts I’ve spoken to say there is a deeper unease at work here. “Apple’s self-assurance is simply unprecedented,” Patrick Wardle, a former NSA employee and founder of the Mac security developer Objective-See, told me last week. “They basically believe that their way is the best way.”
What this means in practice is that the only thing that can protect iOS users from an attack is Apple – and if Apple fails, there is no other line of defense.
Security for the 99%
At the heart of the criticism, Wardle accepts, is a sound motivation. Apple’s security model is based on ensuring that, for the 99% – or more – of users for whom the biggest security threat they will ever face is downloading a malicious app while trying to find an illegal stream of a Hollywood movie, their data is safe. Apps can only be downloaded from the company’s own App Store, where they are supposed to be vetted before publication. Once installed, they can only access their own data, or data that a user explicitly decides to share with them. And no matter what permissions they are granted, a host of the device’s capabilities are permanently blocked off from them.
But if an app finds a way to escape that “sandbox”, the security model is suddenly turned on its head. “I have no idea if my iPhone is hacked,” Wardle says. “My Mac computer, on the other hand: yes, it’s an easier target. But I can look at a list of running processes; I have a firewall that I can ask to show me what programs are trying to talk to the internet. Once an iOS device is successfully penetrated, unless the attacker is very unlucky, that implant will remain undetected.”
A similar problem exists at the macro scale. An increasingly common way to ensure critical systems stay protected is to harness the fact that an almost limitless pool of highly talented professionals is constantly trying to break them – and to pay them money for the vulnerabilities they find. This model, known as the “bug bounty”, has become mainstream in the industry, but Apple has lagged behind. The company does offer bug bounties, but for one of the richest organizations in the world its rates are pitiful: an exploit like the one NSO Group deployed would earn a reward of about $250,000, which would barely cover the cost of the salaries of a team capable of finding it – let alone offer a chance of outbidding the competition, who want the same vulnerability for darker ends.
And security researchers who decide to try to help fix iPhones are hampered by the same security model that allows successful attackers to hide their tracks. It’s hard to successfully research weaknesses in a device that you can’t physically or digitally take apart.
In a statement, Apple said:
Apple unequivocally condemns cyberattacks against journalists, human rights activists and others who seek to make the world a better place. For more than a decade, Apple has led the industry in security innovation and, as a result, security researchers agree the iPhone is the safest, most secure consumer mobile device on the market. Attacks like the ones described are highly sophisticated, cost millions of dollars to develop, often have a short shelf life, and are used to target specific individuals. While that means they are not a threat to the overwhelming majority of our users, we continue to work tirelessly to defend all our customers, and we are constantly adding new protections for their devices and data.
There are ways around some of these issues. Digital forensics does still work on iPhones – despite, rather than because of, Apple’s stance. In fact, this is the other reason I have focused on iPhones rather than Android devices here. Because while NSO Group was good at covering its tracks, it was not perfect. On Android devices, the platform’s relative openness appears to have allowed the company to successfully erase all its traces, meaning we have very little idea which of the Android users targeted by Pegasus were successfully infected.
But iPhones are, as ever, trickier. There is a file, DataUsage.sqlite, that records the software that has run on an iPhone. It is not accessible to the device’s owner, but if you back up the iPhone to a computer and search through the backup, you can find the file. Records of Pegasus had been deleted from that file, of course – but only once. What NSO Group did not know, or perhaps did not notice, is that every time any software runs, it is listed twice in that file. So by comparing the two lists and looking for inconsistencies, Amnesty’s researchers were able to spot when the infection landed.
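That double-bookkeeping check can be sketched in a few lines of Python. This is a toy illustration only: the table names (`process_log`, `usage_log`) and the implant name are invented for the example and are not the real DataUsage.sqlite schema – but the logic is the same, namely diffing two records that should agree and flagging entries scrubbed from only one of them.

```python
import sqlite3

def find_suppressed_processes(con):
    """Return process names present in the usage log but scrubbed
    from the process log -- the telltale inconsistency."""
    logged = {row[0] for row in con.execute("SELECT name FROM process_log")}
    used = {row[0] for row in con.execute("SELECT name FROM usage_log")}
    return sorted(used - logged)

def make_toy_db():
    """Build an in-memory stand-in for the double bookkeeping:
    every process is recorded in two tables."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE process_log (name TEXT)")
    con.execute("CREATE TABLE usage_log (name TEXT)")
    for name in ("mail", "maps", "bh_implant"):  # 'bh_implant' is a made-up name
        con.execute("INSERT INTO process_log VALUES (?)", (name,))
        con.execute("INSERT INTO usage_log VALUES (?)", (name,))
    # The attacker deletes its implant's record from one table
    # but misses the second copy.
    con.execute("DELETE FROM process_log WHERE name = 'bh_implant'")
    return con

if __name__ == "__main__":
    print(find_suppressed_processes(make_toy_db()))  # ['bh_implant']
```

The point of the design is that the attacker must scrub every copy of the record to stay hidden; miss one, and a simple set difference exposes the deletion.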
So there you have it: the same opacity that makes Apple devices generally safe also makes them harder to protect when that security is broken. But it makes it harder for attackers to clean up after themselves, too. Maybe two wrongs do make a right?
If you’d like to learn more, please sign up to have TechScape delivered to your inbox every Wednesday.