As I type these words, I have to take on faith that the Washington D.C. police, the FBI, the DEA, and the Secret Service are not raiding my house. I also have to take on faith that federal and state law enforcement authorities are not tapping my various phones. I have no way of knowing they are not doing these things. They certainly have the technical capability to do them. And there’s historical reason to be concerned. Indeed, there is enough history of government abuse in the search and seizure realm that the Founders specifically regulated the area in the Bill of Rights. Yet I sit here remarkably confident that these things are not happening while my back is turned—and so do an enormous number of other Americans.
The reason is that the technical capability for a surveillance event to take place does not alone amount to the reality—or likelihood—of that event’s taking place. And though the D.C. police certainly have the battering rams to take down my door, there are at least two other less-visible barriers to their entry. One is the substance of the law, which forbids their entry in the absence of probable cause of a crime. The other is the compliance and oversight mechanisms that ensure the police follow the law. If one has confidence in those two things, the technical capability of government to conduct an abuse actually does not pose an unmanageable threat.
For much the same reason as I am not rushing home to guard my house, I have a great deal of confidence that the National Security Agency is not spying on me. No doubt it has any number of capabilities to do so. No doubt those capabilities are awesome—in the wrong hands the tools of a police state. But there are laws and rules that protect me, and there are compliance mechanisms that ensure that the NSA follows those laws and rules. These systems are, to be sure, different from those that restrain the D.C. cops, but they are robust enough to reassure me.
They are not, however, robust enough to reassure Julian Sanchez. Sanchez does not especially question NSA’s compliance mechanisms—nor does he question the integrity of the people who operate the agency’s programs. At a recent Brookings event, in fact, he declared of former NSA Deputy Director John “Chris” Inglis—with whom he was appearing—that “If we absolutely must have a nigh-omniscient, planet-spanning electronic Panopticon, then Chris Inglis is the sort of person whose hands I want on the lever.”
Sanchez’s problem, by and large, is instead with the substance of the law, which in his view permits too much and poses inherent dangers. This is the right focus. The fundamental problem here is not the government’s capabilities. It is not the names of scary-sounding programs. It is not the occasionally bombastic language of PowerPoint presentations. And it is not ultimately a concern that the NSA will fail to follow the rules we give it.
At the base of the NSA controversies, rather, is a lack of social agreement about the proper contours of the rules. It is a lack of agreement about the nature and integrity of the judicial and legislative oversight mechanisms we have created and the degree to which they can reasonably function in secret. It is a lack of agreement about the degree to which the Internet requires patrolling both for threats to the platform itself and for evidence on the platform of threats exogenous to it. It is a lack of agreement about the relative weight we should give to each of several different kinds of security—both personal and collective security—against each of several different kinds of violation.
But like a lot of NSA’s critics in the current debate, Sanchez sometimes mingles fears about the rules with fears about the capabilities themselves. Indeed, he concludes with the arresting observation that “the question we should ask about … systems [like the NSA’s] is the question we should ask about, say, biological weapons: Not whether we are satisfied with how (as far as we know) they are currently being used, but whether the consequences of their misuse are so great that, if and when it occurs, it will be too late to do much about it.”
The analogy here is, of course, imprecise. Biological weapons are not being used against us every day—though cyber attacks, exploitations, and espionage are a daily reality. More fundamentally, we have not all opened our veins to a collective intravenous drip to which the whole world has access and can introduce pathogens the way we have all put the Internet on our desks, in our briefcases, in our pockets, and on our bodies. We can plausibly talk about banning biological weapons, at least at the state level. We cannot plausibly talk about banning Internet spying or signals intelligence or even offensive cyber operations. We are in a land, rather, in which some significant amount of this activity is simply part of the landscape and is going to remain so. And the restraints are not going to be—as with biological weapons—flat categorical bans on the development of capabilities or even on their situational deployment and use. The restraints are going to lie, rather, in the development of rules and compliance mechanisms in which we feel comfortable reposing trust.
In other words, we need to separate more rigorously discussion of capabilities from discussion of rules. Yes, these capabilities are dangerous. But so are any number of other government capabilities—the ability to conduct air strikes, for example. Sanchez worries about programs that insert vulnerabilities into commercial encryption systems, implant back doors in routers, and track individuals. But despite his analogy to banning biological weapons, he is presumably not arguing against ever doing these things. If he is, he is arguing for a remarkable unilateral disarmament in an ongoing international cyber arms race. But if he’s not, the question necessarily pivots away from the inherent menace of the programs and precisely to the question Sanchez disclaims: “whether we are satisfied with how … they are currently being used,” and to what extent the rules we have in place do or do not prevent the uses with which we are uncomfortable.
There is no one right answer to this question. A democratic polity can come down in any number of places along a spectrum of aggressiveness and restraint, reflecting different allocations of different sorts of risk—the risk that the NSA is tapping you as you tap your keyboard and the risk that the NSA is dark on subjects against whom we expect aggressive collection.