
The Scourge of Commercial Spyware—and How to Stop It – Lawfare


The past year has been a politically challenging one for the spyware industry. Years of public revelations have spotlighted a shadowy set of spyware companies selling and servicing deeply intrusive surveillance technologies that are used against journalists, activists, lawyers, politicians, diplomats, and others. Most notorious among them, of course, is Israel’s NSO Group and its Pegasus spyware, though it is by no means alone. In turn, the United States and others have increasingly come to view commercial spyware as not only a human rights risk but also a national security threat. They are acting accordingly, with the United States blacklisting bad actors and partnering to develop national, regional, and global efforts to counter commercial spyware. The Biden administration has imposed domestic constraints on the spyware industry, and 10 other governments joined the United States during the Summit for Democracy this year in recognizing “the threat posed by the misuse of commercial spyware and the need for strict domestic and international controls on the proliferation and use of such technology”—despite the fact that they (thus far) lag behind the United States in executing spyware-related policy commitments.

In this context, Asaf Lubin’s Lawfare Digital Social Contract paper aims to contribute to the ongoing debate over the largely unaccountable spyware industry. As longtime observers of the commercial spyware market and advocates for accountability for the human rights abuses this industry perpetuates, we welcome efforts such as Lubin’s to advance concrete reforms—and want to constructively engage with three topics raised by his approach. 

First, Lubin begins with a recognition that spyware poses “significant risks” in the absence of a “rule of law framework,” situating himself within the mainstream consensus of civil society and those governments trying to tackle the industry’s lawlessness. We are glad that he raises the question of rule of law. From our perspective, that question requires that scholars ask not only what a global legal framework should look like, but whether—and, if so, how—spyware meets the requirements of law in general as well as in specific application. Since this is a complex global conversation involving demonstrable interferences with individual privacy, due process, and freedoms of expression and association (among other rights), the appropriate legal framework should be located in international human rights law, which requires (a) a legitimate interest in any given interference with a fundamental right, and a demonstration of (b) legality and (c) necessity and proportionality. (See, for instance, the Human Rights Committee’s General Comment 34.) 

We accept, for the sake of this argument, that states turn to surveillance tools out of a legitimate interest in national security or public order (though in many of the publicly known cases involving, for instance, journalists or politicians, that interest is not detectable). Even so, the existence of such interest is only one aspect of the legal inquiry. The legality requirement means that the use of spyware must be provided by law with specificity and accompanied by strict safeguards that sufficiently eliminate the risk of arbitrary interference. Necessity and proportionality, often cited together, require that, among other things, the interference be the least intrusive instrument among those that might achieve the legitimate objective and be proportionate to the aim pursued. 

Lubin does not undertake this critical analysis of legality, necessity, and proportionality, asserting instead that “[s]pyware solutions are … essential in the age of ‘going dark.’” This claim of necessity echoes spyware industry talking points. However, basing a discussion on the premise that “spyware is essential”—without interrogating that claim—precooks an outcome to a “rule of law system” (Lubin’s phrase, page 2) that would preserve spyware tools like Pegasus. It flips the appropriate analysis. 

Beginning with the legality test, spyware proponents must identify practical human rights constraints that can be applied in the context of the extraordinary invasions that advanced digital surveillance technologies enable (remote unencumbered access to a device and all its contents as well as real-time commandeering of audio, video, geolocation, and text functions). There may be technological and legal solutions to this considerable problem, but neither the industry nor its defenders and clients have shown what that might look like in law or in practice. Given the evidence of such widespread human rights abuses by the industry’s clients, it is easy to suspect that any genuine, enforceable constraints would interfere with the economics of an exceptionally lucrative industry. No wonder they are not proposing serious rules. 

At the same time, when it comes to necessity and proportionality, the burden is on the industry (and the states using spyware) to demonstrate the unavailability of other, less intrusive law enforcement tools, including powerful surveillance practices that do not rely on remote penetration of one’s private device. In the absence of government transparency regarding the equities at stake, however, there is little public evidence supporting the assertion of necessity; most of the public evidence involves not crime prevention but human rights violations uncovered by civil society organizations like Citizen Lab, Amnesty International, and Access Now. In this connection, Lubin raises the specter that, without spyware, governments would require vulnerable encryption and rely on backdoors to access personal devices. We too have seen government officials make this kind of claim, and we do not doubt that law enforcement and intelligence services require technical solutions to effectively address security threats. But we question the assertion that, in the event commercial spyware were not available, “law enforcement would have imposed the requirements that backdoors be designed by software vendors and would further compel the assistance of those vendors,” as Lubin puts it on page 23. A range of other “lawful intercept” options exist, the application of which depends on the remit of the entity seeking to conduct such digital interception and the practical circumstances of any given case. Essential considerations include applicable legal restraints, due process requirements, and whether extraterritorial surveillance is required, which should trigger the application of mutual legal assistance treaties among states.


Until these legal questions are answered, it is premature to suggest that spyware is “essential.” Nor is it the job of victims or human rights organizations to do the work of answering them for the industry or law enforcement. At the same time, policymakers must fold the groundwork of defining what lawful use of spyware would involve into their consideration of what global legal constraints should require. For example, we doubt that a case can be made for the legality of Pegasus specifically, which allows us to begin to imagine a global regime that identifies which functions of spyware—that is, of offensive digital surveillance tools with the advanced capabilities exhibited by Pegasus and its peers in the commercial spyware industry—cannot be lawful and thus cannot be approved for global transfer.

Second, Lubin’s paper implicitly assumes that the private sector is entitled to develop a commercial market to supply digital intrusion tools to governments. The question of private-sector participation, however, is separate and distinct from that of whether governments have a right to engage in digital intrusion. We think this issue deserves further inquiry.  

There is a real normative danger in asserting or assuming the “legitimacy” of a commercial market for spyware (as the paper’s subtitle suggests). The use of digital intrusion techniques by governments should remain truly exceptional and subject to the standards of international human rights law. Yet the spyware trade turns the human rights legal assessment on its head: Private companies presume a right to develop and supply the means to engage in digital intrusion, without reference to how the tools are actually used. Predictable incidents of rights abuse are then characterized as unavoidable collateral damage, of which the companies often claim they have no knowledge and on which they cannot comment due to contractual obligations or national security concerns (see, for example, responses of Cyberbit and NSO Group). 

It is unclear how any effort to “rein in” spyware companies, such as the U.S. executive order prohibiting certain uses of commercial spyware, will effectively tackle the crux of the problem: By design, the private sector has little control or bargaining power over a government’s secret use of the invisible products provided to it. It is difficult to imagine a government client knowingly purchasing a digital intrusion system that does not guarantee such opacity or, frankly, a company wanting to assume the responsibility of policing product use by state intelligence and law enforcement. Hence, the international community is relegated to a “wait and see” approach to accountability, whereby rights abuses perpetrated through commercial spyware are not prevented and are only half-heartedly addressed if they are ever uncovered. Why should a multibillion-dollar industry be permitted to thrive on this basis? Why are private investors profiting at the expense of activists, dissidents, journalists, lawyers, and the security of the digital ecosystem as a whole?

The involvement of the private sector in the development and dissemination of state digital intrusion tools adds another layer of self-serving interests to an already complex geopolitical landscape. Whereas state interests in digital surveillance generally revolve around security and political influence and control, surveillance companies are motivated by profit and the need to generate returns on investment. Financial factors drive product diversification and innovation in exploit development, as well as the need to expand the customer base. In the case of NSO Group, for example, following the revelations of the Pegasus Project, disputes arose between Berkeley Research Group, which managed the company, and company creditors over whether to continue to sell to states with track records of human rights abuse. 


Indeed, there is no “off-ramp” for private-sector participants that supply exceptionally dangerous means of digital intrusion. When confronted with the negative human rights impacts of their products, how should they handle the offending intellectual property? How will they accommodate investors? It is no surprise that a private equity-backed enterprise such as NSO Group simply cannot reverse course, even after being blacklisted, because there is too much money at stake. NSO Group has instead focused on influence campaigns and relentless lobbying to continue unfettered operations (as demonstrated in U.S. filings required under the Foreign Agents Registration Act). Yet it should go without saying that an approach to addressing human rights abuses that depends on accommodating the financial interests of spyware companies and their backers is fundamentally flawed.

Moreover, recent history has shown that the outsourcing of spyware tools to the private sector has resulted in not only human rights abuses on a massive scale, national security threats, and degradation of the digital security environment, but also an inherently compromised product with the potential to undermine legitimate law enforcement and intelligence investigations. When a common commercial solution is provided across multiple states and government entities, the likelihood that one or more of those entities will abuse and expose the product increases. Extensive technical research by the Citizen Lab, Amnesty International, and others, as well as the blockbuster findings of the Pegasus Project, provide ample evidence of this effect.    

We assert that, in fact, state deployment of advanced digital intrusion techniques that have the potential to undermine human rights and global cybersecurity should be difficult, requiring not only robust vulnerabilities equities processes (VEPs), but also stringent human rights assessments undertaken with respect to every intrusion, and a state’s own participation in the technical development and limitations of the tools it employs. We agree with Lubin that the critical work of VEPs is effectively bypassed through the work of spyware companies. We would go further, however, to argue that if state agencies wish to engage in targeted digital surveillance, they must not rely on third-party commercial solutions to do so. The development and deployment of offensive digital surveillance tools should qualify as an “inherent state function,” undertaken by state actors pursuant to a legal framework in compliance with international human rights law, for which they remain directly responsible and accountable. Such activity should be subject to the kinds of public transparency and debate, accountability, and oversight that are further eroded by the participation of the private sector. 

Third, looking specifically at policy, Lubin finds fault with the proposal to impose a moratorium on the spyware trade, something supported widely by civil society organizations, UN experts, and at least one state (Costa Rica). He asserts, without more, that there are “misconceptions” about the proposal. For more than four years, however, the idea (initiated in the UN by one co-author) has been explicit that governments should impose an “immediate moratorium on the global sale and transfer of the tools of the private surveillance industry until rigorous human rights safeguards are put in place to regulate such practices and guarantee that Governments and non-State actors use the tools in legitimate ways.” There is nothing fuzzy about this proposal: Stop the global trade until effective safeguards are adopted. Lubin incorrectly asserts that “none of these organizations and human rights experts [supporters of a moratorium] has ever proposed what a human-rights-compliant design and use of spyware could even look like.” Along the same lines, he continues on page 21, “There is something disingenuous about calling for a temporary moratorium—subject to an ultimate human-rights-protecting legal framework—when no one who calls for the moratorium is willing to commit to the work of developing that very framework.”

In truth, governments and international organizations are already engaging with an extensive body of work done by nongovernmental organizations, academics, independent experts, and others that has laid the groundwork for global attention to the spyware trade and has identified human rights standards and frameworks that should apply to spyware. The 2019 UN report that called for a moratorium includes one section entitled “Framework for the protection of fundamental rights against targeted surveillance”—and it sketches out national legal constraints, public mechanisms for approval and oversight of spyware, the provision of tools of redress for victims, and other international and private-sector actions. As far back as 2017, a co-author, in her capacity as a lawyer with the Citizen Lab, co-wrote with Lab Director Ronald Deibert a widely circulated paper subtitled “a checklist for accountability in the industry behind government hacking.” Many others have engaged, including a civil society declaration identifying key elements of a framework of control. In an essay in Foreign Affairs this year, Deibert—whose Citizen Lab has devoted more than a decade of work to this subject, effectively bringing mercenary spyware to public attention—noted a range of rules that should be adopted as part of a regulatory framework. Lubin proposes a regulatory process drawing from ideas in the regulation of the private military contractor industry, which we identified four years ago (here, here, and, more recently, here). Nongovernmental organizations’ reports on the abuses of spyware regularly include proposals for legal control.


The point is not that Lubin was under some kind of obligation to cite these reports; it’s that these reports reflect careful study and thinking about the very work he says people are unwilling to commit to doing. They do what Lubin says their “disingenuous” authors won’t do. They identify legal frameworks, they highlight ideas for global regulation, they propose approaches at national levels, and so on. Many experts have testified before legislatures, courts, and public forums and engaged constructively with government officials in North America, Europe, India, and elsewhere in a wide-ranging effort to build a global framework. Engaging with these ideas would have given the paper a firmer footing in the reality of today’s global effort to constrain the scourge of commercial spyware.

Lubin concludes his critique of the moratorium proposal with the assertion that “those who call for a temporary moratorium secretly harbor the sense that no framework would ever be good enough.” He cites to one of the co-authors, who not-so-secretly has testified before the European Parliament that he doubts whether spyware like Pegasus—with its enormous power of intrusiveness and difficulty of control—can ever meet the standards of human rights law, regardless of legal framework. If that is found to be accurate (and we think that it is), there is no basis to think that, once a human rights framework is in place, there could be a “restoration of business” (as Lubin says on page 21) for companies like NSO Group or products like Pegasus, although Lubin claims moratorium advocates “envision” just that. Indeed, Lubin, who believes “spyware is here to stay,” misleadingly suggests that Professor Fionnuala Ní Aoláin, as UN Special Rapporteur on counterterrorism and human rights, has embraced his version of “pragmatism and abandonment of purism.” In fact, in a report to the UN Human Rights Council in April, Ní Aoláin emphasized, “Spyware which fails to display such features [of auditability, transparency, specificity, and so forth] cannot, however otherwise tightly regulated, be capable of human rights compliance.” We wholeheartedly share that view. This is not to say that all surveillance technology would be unlawful, but neither is it a “pragmatic” approach to simply accept that regulation should work around the industry—rather than the other way around.

What’s next for global regulation of the spyware industry? A moratorium remains an important way forward, a stopgap measure at a time of ongoing global abuse and intimidation, but in fact, we do not believe there is one silver bullet that will eliminate the threat spyware like Pegasus poses to human rights and national security. 

Some work has made and continues to make a major impact: the reporting of human rights organizations and journalists, continued engagement by UN human rights experts, lawsuits by victims and corporate actors like WhatsApp and Apple, investigations by legislatures, restrictions and blacklists imposed by governments, and emerging multilateral conversations. Identifying key features of transparency and accountability, as Ní Aoláin has done, will allow the international community to distinguish technologies that can be regulated consistent with human rights law from those that cannot. Government transparency, oversight and accountability, and effective pathways to remedy must be part of the regulatory ecosystem, at national and international levels, but so must the question of whether private-sector involvement itself must be ended or at the very least subject to the narrowest boundaries. It is also critical that those governments that committed to take action at the Summit for Democracy actually do so—to establish their promised “robust guardrails” on use, prevent export to malicious actors, and “drive reform” internationally, among other things.

Ultimately, the global community sees the extraordinary abuses facilitated by the spyware industry and perpetrated by its clients. A global regime of control and accountability, driven by human rights standards and with which governments find it in their best interests to comply, must be the endgame.


