In this post, I want to note one aspect of what Julian has written with which I agree. In particular, I am of the view that changing technology is creating a world in which huge amounts of data are becoming pervasively available for analysis. And the automation of the analysis of that data may well work a sea change in how we approach privacy.
What is surprising to me is how little of what Julian (and Glenn) seem to worry about has anything to do with this fundamental change. Let’s leave aside (just for this post) our differences about the implementation of National Security Letters and FISA warrants and see if we can’t at least agree that they aren’t fundamentally different from administrative subpoenas and Title III warrants. Yes, I know, the issuing authorities are different and the standards of issuance are different, and that matters to Glenn and Julian more than it does to me.
But at the highest level of discussion they operate much like existing law enforcement tools: they have rules; they are governed by laws; and they are subject to the potential for abuse. That abuse, however, is a well-known phenomenon, and we would no more eliminate FISA warrants because of potential intelligence abuses than we would eliminate Title III warrants because of law enforcement abuses.
Why is that? Because we think the costs of doing so outweigh the benefits (or, to put it conversely, the advantages we gain from having these tools outweigh the dangers that arise from them). This is a calculus we make all the time in law enforcement and intelligence activity. To put it most prosaically, we arm the police because doing so stops crime, and the gains we get in stopping crime outweigh the abuses that arise from police who misuse their weapons, or so we think.
To be clear, my point here is not to assert that my weighing of values is the right one or that my assessment of the relative costs and benefits is correct. Though I am quite certain of my own views, my point is rather that these sorts of questions share enough characteristics that we already know how to discuss them.
The surprise, for me, is that we don't spend enough time talking about how the changing nature of surveillance changes that paradigm. There is a crying need for that discussion (as the recent case involving GPS surveillance, United States v. Maynard, demonstrates).
I think one of the reasons we don't is that we are locked into concepts of privacy that were developed before the data analysis revolution. One thinks of the old DOJ v. Reporters Committee case, in which the Supreme Court developed the concept of "practical obscurity" as a principle of privacy. In practice, that concept is eroding. And given the utility of this sort of data analysis, and the likely persistence of the terrorist threat, it is unlikely as a practical matter that governments will give up these analytical tools anytime soon, if ever. A realistic appraisal suggests they are a permanent part of the national landscape.
Yet I join Julian in thinking that the use of such analytical tools is not without risks. The same systems that sift layers of data to identify concealed terrorist links are just as capable, if set to the task, of stripping anonymity from many other forms of conduct: personal purchases, politics, and peccadilloes. The question then becomes: how do we empower data analysis for good purposes while providing oversight mechanisms that deter malfeasant ones?
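To make that risk concrete: stripping anonymity often requires nothing more exotic than a database join on shared quasi-identifiers. (This is the well-known observation, due to Latanya Sweeney, that ZIP code, birth date, and sex alone suffice to identify most Americans.) Here is a minimal sketch of the technique in Python; every record, name, and field in it is invented purely for illustration.

```python
# Hypothetical illustration of re-identification by record linkage.
# All data and field names below are invented; only the technique
# (joining datasets on shared quasi-identifiers) is real.

# An "anonymized" dataset: names removed, quasi-identifiers retained.
anonymized_records = [
    {"zip": "20008", "birth_date": "1961-07-31", "sex": "M", "purchase": "book on dissent"},
    {"zip": "20008", "birth_date": "1975-02-14", "sex": "F", "purchase": "campaign donation"},
]

# A second, public dataset (think: a voter roll) that includes names.
public_records = [
    {"name": "J. Doe", "zip": "20008", "birth_date": "1961-07-31", "sex": "M"},
    {"name": "A. Roe", "zip": "20008", "birth_date": "1975-02-14", "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

def link(anon, public):
    """Join the two datasets on the quasi-identifiers alone."""
    index = {tuple(p[k] for k in QUASI_IDENTIFIERS): p["name"] for p in public}
    for record in anon:
        key = tuple(record[k] for k in QUASI_IDENTIFIERS)
        if key in index:
            yield index[key], record["purchase"]

for name, purchase in link(anonymized_records, public_records):
    print(f"{name} -> {purchase}")  # the "anonymous" conduct, re-identified
```

The unsettling point is how little machinery this takes: no names were ever collected in the first dataset, and yet the conduct in it is attributable the moment a second dataset is at hand.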
Old concepts of privacy (I call them "Antique Privacy," just for fun) focused on prohibitions and limitations on collection and use, and those are precisely the conceptions that technology is destroying. In this modern world of widely distributed networks with massive storage and computational capacity, so much analysis becomes possible that the old principles no longer fit. We could, of course, still apply them, but only at the cost of disabling the new analytic capacity entirely. In the current time of threat that seems unlikely. Alternatively, we can abandon privacy altogether, allowing technology to run rampant with no control. That, too, seems unlikely and unwise.
What is needed, then, is a modernized conception of privacy—one with the flexibility to allow effective government action but with the surety necessary to protect against government abuse. Perhaps we can agree on that and begin thinking of privacy rules as both protective and enabling?
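What might rules that are both protective and enabling look like in practice? One familiar design idea is to permit the query but make it attributable: every analytic run requires a stated justification and leaves an audit trail for an independent reviewer. The sketch below is purely illustrative; every interface and name in it is hypothetical, a design posture rather than any actual system.

```python
# Hypothetical sketch of a "protective and enabling" rule in code:
# analysis is permitted, but every query is attributed, justified,
# and logged for after-the-fact oversight.

import datetime

audit_log = []  # in a real design: append-only, reviewed independently

def audited_query(analyst_id, justification, dataset, predicate):
    """Run an analyst's query, but only with an attributed justification,
    recording enough context for a later auditor to second-guess it."""
    if not justification.strip():
        raise PermissionError("query refused: no stated justification")
    audit_log.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "who": analyst_id,
        "why": justification,
        "what": predicate.__doc__ or "unspecified predicate",
    })
    return [row for row in dataset if predicate(row)]

# Hypothetical usage:
calls = [{"caller": "555-0100", "callee": "555-0199"}]

def links_to_target(row):
    """calls touching the number 555-0199"""
    return "555-0199" in (row["caller"], row["callee"])

hits = audited_query("analyst-17", "lead from case 42", calls, links_to_target)
print(hits)
print(audit_log)
```

The posture is what matters: rather than forbidding the analysis outright, the rule makes misuse detectable, attributable, and therefore deterrable.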