
Data Surveillance

Jeffrey Rosen
George Washington University law professor
Tuesday, April 20, 2004; 2:00 PM

Most Americans do not care about exposing themselves to massive data surveillance, but they should, says George Washington University law professor and New Republic legal affairs editor Jeffrey Rosen in his new book, "The Naked Crowd." Rosen discussed technology and the uneasy balance between security and privacy on April 20 at 2 p.m. on washingtonpost.com.

A transcript follows.

Jeffrey Rosen (Photo courtesy of Random House Inc.)

Editor's Note: Washingtonpost.com moderators retain editorial control over Live Online discussions and choose the most relevant questions for guests and hosts; guests and hosts can decline to answer questions.


washingtonpost.com: Good afternoon and welcome to our Live Online session with Jeffrey Rosen. I'm Robert MacMillan, technology policy editor at washingtonpost.com. Jeffrey, I'd like to start off by asking what gave you the idea to write this book.

Jeffrey Rosen: The book is a response to a challenge by my friend and teacher Lawrence Lessig, who writes about cyberspace. We were on a panel about liberty and security after 9/11, and I denounced the British surveillance cameras, which I had just written about for the New York Times Magazine, as a feel-good technology that violated privacy without increasing security. Lessig politely but firmly called me a Luddite. These technologies will proliferate whether you like it or not, he said, and you should learn enough about them to be able to describe how they can be designed in ways that protect privacy rather than threaten it. I took Lessig's challenge seriously, and spent a year learning about the technologies and describing the legal and architectural choices they pose. The rest of the book followed naturally; it's an attempt to think through the behavior of the relevant actors who will decide whether good or bad technologies are adopted -- that is, the public, the executive, the courts, and Congress.


Washington, D.C.: What is the status of the Pentagon's Total Information Awareness program, later renamed Terrorism Information Awareness? Is the program really dead, or is the CIA or another outfit retooling it to deploy in secrecy?

Jeffrey Rosen: TIA has been formally defunded in the Homeland Security bill, but research continues on projects like it. I debated Admiral Poindexter recently, and he stressed that the project was devoted to studying technologies rather than deploying them. The TSA, FBI, and other agencies have parallel projects that remain in force. Some privacy advocates fear that the CAPPS II program that will be deployed at airports later this year is a version of TIA in sheep's clothing, but I think there are important differences that make CAPPS II less threatening.


Washington, D.C.: Are there any types of data surveillance programs that you think are worth keeping? Or should our country scrap them all?

Jeffrey Rosen: The CAPPS II program, as currently conceived, might be worth considering in a limited context. Unlike TIA, it's devoted to authentication -- proving that I am who I say I am -- not identification -- trying to decide whether my consumer behavior resembles that of the 9/11 terrorists. Also, there are important use limitations that prevent intelligence officers from sharing data with law enforcement unless there's evidence of an outstanding warrant for a violent felony. Both of these restrictions will help to protect privacy. Whether improving identification systems at airports is an effective way of fighting terrorism is another matter -- most of the 9/11 terrorists had valid IDs. But I'm not opposed to all data mining, and I am trying to think about ways to construct these systems well rather than badly.


Arlington, VA: How do you suggest that our nation improve security and protect against future terrorist attacks if we don't have a way to mine data and "connect the dots"?

Jeffrey Rosen: Connecting the dots and information sharing are indeed important. There's nothing inherently good about inefficiencies and stovepipes -- they help to protect privacy, but only at the cost of making it harder for security forces to do their jobs. What we need are sensible use limitations that allow information sharing but don't allow the political prosecution of people for lower-level crimes. The Germans have understood this lesson -- because of the cautionary tale of the communists and fascists, the German intelligence service has broad surveillance authority to collect information without probable cause, but it's only allowed to share the information with law enforcement authorities if it finds evidence of terrorism or violent crimes. Evidence of low-level crimes -- such as adultery -- can't be shared and is excluded from court. That's a good model, and we need to think creatively about other controls on the use of data to allow information sharing while protecting privacy at the same time.


Reston, Va.: Did you spend much time studying biometric security technologies? Are they really as "secure" as their proponents contend? And how do I safeguard my personal biometric data?

Jeffrey Rosen: I did spend a fair amount of time studying biometrics -- there was a trip through Silicon Valley that felt like the dot-com boom days, as I visited little startups trying to hawk their wares to the Homeland Security Department. (My favorite was a small mom-and-pop operation devoted to hand geometry!) That trip convinced me that there are huge differences in effectiveness between different types of biometrics -- face recognition seems more or less like a scam, because its accuracy rates are so low, while something like hand geometry is fairly accurate for one-to-one authentication and can't be used for secondary purposes. The question of the security of the biometrics is separate from their effect on privacy, and the danger that data can be hacked is always present. But this depends on how the data is stored. The best way to safeguard data is not to store it centrally -- to use a thumbprint, for example, as a private key that could unlock data on a card, without storing the thumbprint in a database. That way you have complete control over the circumstances in which the data is revealed.
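The decentralized design Rosen describes -- keep the biometric on the card, never in a government database -- is roughly what engineers call "match-on-card" verification. A minimal sketch of the idea in Python follows; all names here are hypothetical illustrations, and a real fingerprint reading is noisy, so production systems use fuzzy extractors or template matching rather than the exact hashing shown:

```python
import hashlib
import hmac
import os

def enroll(template: bytes) -> dict:
    """Write only a salted, slow hash of the biometric template to the
    card. The raw template is never stored anywhere, centrally or not."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", template, salt, 100_000)
    return {"salt": salt, "digest": digest}

def verify(card: dict, presented: bytes) -> bool:
    """Hash a fresh reading with the card's salt and compare on the card
    itself; no database lookup is involved."""
    candidate = hashlib.pbkdf2_hmac("sha256", presented, card["salt"], 100_000)
    return hmac.compare_digest(candidate, card["digest"])

# The holder's reading unlocks the card; anyone else's does not.
card = enroll(b"thumbprint-template-bytes")
print(verify(card, b"thumbprint-template-bytes"))  # True
print(verify(card, b"someone-else's-template"))    # False
```

The privacy property Rosen points to falls out of the architecture: because nothing leaves the card, there is no central thumbprint database to breach or repurpose.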


Foggy Bottom, DC: Where is a guarantee of privacy rooted in the U.S. Constitution anyway?

P.S. Any other tips for your Con Law II final would be much appreciated!

Jeffrey Rosen: After Con Law II, you should know where the right to privacy appears in the Constitution!

Why not start with the Fourth Amendment?

As for more tips, see you at the review session on Thursday!


Olney, MD: THANK YOU for finally addressing the myth of the offensiveness of frisking "innocent little old ladies" and such. I've been saying for years that to avoid racial profiling, terrorists would obviously turn to the very young or old to use as mules. All it takes is one gray-haired granny whose grandson's nice friend fixes that broken heel for her, or gives her his grandfather's old cane that he doesn't need anymore, right before her flight.

By the way, I've had "The Unwanted Gaze" on my list of highly recommended books for quite a while, but which would you recommend I read first, that or "The Naked Crowd"?

Jeffrey Rosen: Thank you for this nice note. You're absolutely right that creating a category of trusted travelers increases the incentive for terrorists to use "cleared passengers," or ones who seem more trustworthy, as moles and hidden operatives.

As for the books, I'd recommend both! But since I'm plugging The Naked Crowd these days, why not start with that?

Thanks again for asking.


Augusta, ME: The problem with self-awareness of our privacy rights seems to stem from the classic American fear of embarrassment. People who trumpet their privacy rights often seem didactic, boorish, loud and subject to conspiracy theory beliefs. Seemingly normal people, feeling no pain from intrusion into their lives, don't think about it. Does this seem like a plausible sociological explanation?

Jeffrey Rosen: Yes indeed, the question of how to define privacy is much harder than it appears. In general, people are more concerned about embarrassment (or losing control over the conditions of their own exposure) than they are about reticence, anonymity, discretion, or many of the other values privacy protects. That's why Monica Lewinsky was happy to talk about her abortion on TV but objected to the subpoena of her bookstore receipts. Some cultures -- France, for example -- believe that dignity is inherent in privacy, and that people shouldn't be able to violate their own privacy even if they want to. That's why French courts have prevented the posting on the Internet of embarrassing pictures for which people voluntarily posed. But Americans are much more uncomfortable with the paternalistic idea that people should be protected from violating their own privacy. Generally, you're right that people don't care about surveillance or other intrusions until they feel some tangible loss or consequence, and by that point it's too late.


Mill Valley, CA: A UK study this week found that most computer users would happily give up their passwords for a chocolate bar, and that many would do so without any incentive at all. How is it possible to build secure systems when their users are so cavalier about their personal information?

Jeffrey Rosen: Thanks for this wonderful (and alarming) example of people's cavalier attitudes about personal information. In the US, people are willing to sell their personal data for a toaster, or less! It's impossible to protect privacy unless individuals care about it, and as the previous poster suggested, it's not clear that many people do. The only way to make people care about building secure systems is to try to make a case describing the harms of abandoning privacy -- for better or worse, that's what I'm trying to do in these books.


Washington, DC: While your book mainly focuses on government-backed intrusions, it seems that most privacy policy groups focus entirely on what corporations are tracking. Personally, I find it annoying that many of them shout "big brother" when corporations mine for data, but seem to be completely oblivious to what our own government is doing.

Do you think enough is being done to educate the general public about the issues you discuss in your book? Now that parts of the Patriot Act are set to expire, and the administration sets out not only to renew them but to expand them, do you think more privacy groups and civil liberties groups should pay a little more attention to this and less to what Google is doing with its email service?

Jeffrey Rosen: The question of whether the government or the private sector poses a bigger threat to privacy is debated hotly across different cultures. In general, Americans tend to be more fearful of government intrusions, and Europeans of intrusions by the private sector. This reflects our different historical experiences: Europe is more concerned about violations of honor and dignity because of an aristocratic tradition in which everyone knew his or her place, while in America suspicion of government has long united civil libertarian liberals and libertarian conservatives.

These groups have done a good job of trying to educate the general public about the harms of government surveillance; the conservatives in particular are less concerned about private sector violations and more resistant to government regulation of the private sector. The public/private divide is a bit artificial in an age when the government is contracting out its data collection to private companies like ChoicePoint. But the bottom line is that the number of people who care a lot about privacy will always be a minority in America. The Attorney General has cited polls suggesting that 50 percent of the public thinks the Patriot Act strikes the right balance between privacy and security, 20 percent think it doesn't go far enough, and only 20 percent think it goes too far. That sounds like a pretty good approximation of the number of people who will ever get really fired up about privacy.


Washington, DC: Come on... is there REALLY a naked machine?

washingtonpost.com: And here, after all, is the burning question that's truly on people's minds...

Jeffrey Rosen: There is -- check out the Web site for the Pacific Northwest National Laboratory, where it's described!


La Jolla, Calif.: It seems like the mainstream press falls short in its abilities to explore complex privacy issues, often sounding like trade press with obscure terminology and failing to explain difficult concepts. Do you see this trend changing in any way? Do you even agree that this happens? Tx.

Jeffrey Rosen: Privacy is such a complicated issue -- and it means so many different things to different people -- that it's hard to blame the mainstream press for reducing complex technological issues to the oversimplified questions that people care most about (identity theft! DoubleClick! the naked machine!). The real problem here isn't the laziness of the press but the fact that the public has a short attention span and people only respond to threats they think will affect their own lives. The pleasure of writing books about privacy is that you have more space to try to explore these fascinating issues in all of their complicated dimensions. Whether people respond is another matter ...


San Francisco, Calif.: Do you think popular demand for privacy will eventually drive the growth of a privacy technology industry to rival the size of the security technology industry?

Jeffrey Rosen: I fear not. As those Patriot Act polls suggest, there isn't that much popular demand for privacy, and before 9/11, companies that tried to market themselves as pro-privacy (such as Zero Knowledge) failed, because people care more about convenience than privacy. After 9/11, it's even more of an uphill battle. Still, for a good argument that companies should care about privacy or face the economic consequences of embarrassment and scandal, I'd recommend Ann Cavoukian's The Privacy Payoff. She's the privacy commissioner of Ontario and makes the economic case as well as it can be made.


Wilmington, Delaware: What do you think of companies such as Acxiom, which specialize in housing details about persons for marketing purposes? Recently they have purchased companies internationally to expand their reach. What regulations do you think will govern the use of that information nationally and internationally? Do you think the information could be used by the federal government?

Jeffrey Rosen: This is an important and complicated question. Acxiom and ChoicePoint are currently engaged in partnerships with the federal government to engage in data mining, and there are plans to expand these partnerships. Right now, much of the government's piggybacking on private data warehouses is unregulated by law -- courts have tended to hold (unconvincingly, in my view) that when you surrender information for one purpose, you've surrendered all expectations of privacy in it for all purposes. Ideally, Congress would think about imposing the kind of controls on the use of data that I explored in an earlier answer, and these would apply to the private as well as the public sector. But because of the libertarian resistance in Congress to regulation of the private sector, I'm not optimistic that these sensible regulations will pass.


Philadelphia, Pa.: Monitoring huge amounts of data to try to mine for security risks does seem like a tortuous idea, and perhaps a futile use of technology. But trying to station adept people-watchers in places where millions of people walk by all the time seems equally daunting. How do we handle the sheer bulk of the problem without subjecting ourselves to mass surveillance?

Jeffrey Rosen: The idea that you can stop terrorism through data mining, surveillance cameras, or other forms of suspicionless surveillance hasn't panned out in countries that have struggled with terrorism in recent years. Britain found that it caught no terrorists after wiring itself up with 4.2 million surveillance cameras. Similarly, Israel, which has done better in stopping hijackings than any other country, hasn't relied on data mining at all but has focused on human intelligence -- in-depth interviews by trained intelligence officers of every passenger who boards a plane. I'd feel more secure if Americans were less suspicious of human discretion, and not so willing to search for technological silver bullets.


Clarksburg, Maryland: The vast majority of American citizens don't understand how the Patriot Act's provisions adversely affect their privacy.

Who is to blame:

-- the average citizen who is apathetic about privacy concerns,

-- the government for passing such draconian legislation, or

-- the press for not doing a better job of publicizing the invasive aspects of the legislation?

Jeffrey Rosen: The Patriot Act is very complicated. I'm a junkie, but it took me two years of teaching the act in a law school seminar before I felt confident enough about its technical details to form a relatively informed opinion. It's hard to know who to blame -- for better or worse, the American public isn't great about sitting still to absorb complicated legal and technological debates, and there's no reason individual citizens should feel the need to do so. But it's important not to politicize the Patriot Act -- that is, not to defend or attack it hyperbolically -- unless you have some idea what it says. For those who want a quick summary of the most important provisions, my favorite cocktail party guide is by Dahlia Lithwick in Slate magazine. Just put her name in the archives and you'll find it right away.


Silver Spring, Md.: You say that CAPPS II is less threatening, but didn't the gov't secretly get passenger data from several major airlines? Isn't that a major privacy violation?

Jeffrey Rosen: Some of the airlines that "voluntarily" shared data with the TSA are indeed being legally challenged -- and the statutory arguments are complicated, although worth puzzling through. The reason I think CAPPS II is ultimately less threatening is because no human being can ever connect an individual to his or her personally identifiable data. That is, the system is designed to confirm that I am who I say I am, not to reveal my bookstore purchases. Of course, oversight mechanisms need to be put in place to ensure that the system operates as the designers promise, and for this, transparency is important.


washingtonpost.com: By the way, Professor Rosen mentioned Stanford law professor Lawrence Lessig earlier in this discussion. washingtonpost.com reporter David McGuire recently hosted a Live Online session with Professor Lessig that is very interesting.


Maryland: I work for the government and I attended a meeting that highlighted commercial resources to better track individuals. It was, essentially, CAPPS II. There are several corporations that harvest names/addresses and associated data for mass-mailing purposes and have now found a niche for police/intelligence. I am really bothered by this and very unsettled.

washingtonpost.com: Robert here. This sounds a little more dire than CAPPS II. What does the intelligence grapevine say about other ambitious projects like this that might be out there now? Any idea of what this might be?

Jeffrey Rosen: Yes, this is indeed more troubling. This sounds more like TIA -- a system designed for identification and predictive behavior analysis, rather than simple authentication. There are projects devoted to this sort of potentially invasive public-private partnership throughout the government. Robert O'Harrow, The Washington Post's superb technology reporter, is currently writing a book that will explore these partnerships, and their dangers, in some detail.


Lexington Park: A few years ago there was a lot of commotion about ECHELON. Does this system target individuals or does it just monitor for patterns?

Jeffrey Rosen: ECHELON, as I understand it, is a general surveillance system that engages in surveillance without individualized suspicion. Because it operates overseas, it's not subject to the legal restrictions that prevent the government from engaging in this sort of surveillance in the U.S. It monitors for patterns and looks for unusual or suspicious words. In a famous story on 60 Minutes, a Canadian woman was investigated after a word-search program run by ECHELON revealed that she had told a friend that her son "bombed" in the school play.


Fort Worth, Texas: Should we be any more or less worried about how easy it is for loan officers, landlords and other assorted companies to get information about our financial histories through the big credit companies? Private interests having all the goods on me frightens me more than the thought of inefficient government bureaucrats seeing me "naked."

Jeffrey Rosen: Aha -- you have a European sensibility! In Europe, as I mentioned, people are much more concerned about easy access to financial information, and financial privacy laws there (unlike in the U.S.) prohibit loan officers, banks, and other commercial interests from sharing consumer information without the individual's consent. In America, however, Congress has resisted efforts to pass comprehensive financial privacy laws, although individual laws protect different aspects of financial privacy.


Govt guy in Maryland again: You say that CAPPS II is meant to "confirm that I am who I say I am, not to reveal my bookstore purchases."

The computer architecture I saw can see your bookstore purchases, as well as your mortgage payments -- to whom you make them and how much you have left to pay in full. And much, much more.

Jeffrey Rosen: Yes indeed, but I assume that the architecture you saw wasn't CAPPS II. Was it? If so, the system has been misdescribed.


Fullerton, Calif.: I am confused about some basic issues. What, really, do we have to hide that's worth hiding? Are we worried about public exposure of our pay grades, our personal foibles and failings? What will technology take from us that the government or megacorporations or all sorts of conglomerates aren't already aware of? What's really at stake here?

Jeffrey Rosen: A big and complicated question -- for a fuller answer, please read the book! Different people are concerned about different values when they talk about privacy -- some are concerned about control over information, others about dignity in the eyes of fellow citizens, others about autonomy from government. Some people are happy to be naked at airports but don't like having their emails exposed to the world. Others strike the balance in different ways. But the argument "nothing to hide, nothing to fear" -- although politically popular -- doesn't withstand close inspection. Even the most virtuous citizen can be destroyed in the eyes of his neighbors by having his telephone conversations recorded and broadcast on the Internet. All of us behave differently in different contexts -- telling dirty jokes to friends, saying things we don't really mean to amuse ourselves, and so forth -- and a world without privacy makes intimacy impossible, because it destroys the boundaries on which intimacy depends. Again, a satisfying answer to this hard question needs more space -- this is what I try to address in the book, and whether or not I'm convincing, only you can decide.


Rapid City, S.D.: Hello. Could you tell us a little bit about the process of putting this book together? I imagine that the research involved in doing this book must have been daunting at times, to say the least!

Jeffrey Rosen: Two of the chapters in the book grew out of articles for the New York Times Magazine, and I very much appreciated the generosity of the editors in sending me to England and Silicon Valley to learn about the technologies. That would have been hard to pull off on my own. But once the basic reporting was completed, I spent some time reading broadly in psychology, sociology, and history. After that, the argument itself emerged without too much trouble.


Arlington, VA: What can people do to protect their identity and help keep personal information from Big Brother?

Jeffrey Rosen: Learn about privacy-enhancing technologies, if you're really determined to cover your tracks. Don't share information without having a sense of what's likely to happen to it. Understand that sharing with the private sector may pose risks of its own, since the government has access to data held by private companies. But don't be either a paranoiac or a technopositivist -- just make intelligent choices about the data you share.


Arlington, VA: DOD has just instituted a new program as an adjunct to its periodic reinvestigation program for security clearances. They have started to run employees with TS clearances through hundreds of public databases. It should be interesting to see how this plays out.

washingtonpost.com: Do you think that this move will subject government employees to a higher level of scrutiny? Is this legal or fair?

Jeffrey Rosen: Government employees already have higher levels of scrutiny, as do citizens in professions with public responsibilities. A student of mine just complained (with justification, in my view) that she had to turn over a fingerprint in order to register for the bar. Given a choice between higher scrutiny for government employees or extending the high scrutiny to everyone, I think I prefer that government employees bear a disproportionate burden for the moment. But the courts have tended to start with the government and then allow intrusive surveillance to creep out to everyone else.

Thanks to all for a stimulating discussion. I enjoyed it a lot -- and learned from it as well!


washingtonpost.com: We're going to wrap it up now. Thanks to Jeffrey Rosen and thanks to all of you for joining the discussion. We don't have any more time to explore this subject, but here's one final interesting note from "Govt Guy" regarding the project he was talking about earlier... Keep your eyes peeled!


Govt guy in Maryland: It wasn't CAPPS II. Those who demonstrated their architecture made references to CAPPS II as a good first step and to how they will build on its potential for, basically, spying on "flagged" individuals. Like Robert said, it seems like TIA. I wasn't the only one who felt rather creeped out by the entire meeting. The architecture is an advanced database that can process billions of records in seconds, cross-checking an immense number of variables. It was pitched as a police-dragnet tool, which is why our gov't unit wasn't very receptive -- our mandate doesn't fit the Pentagon's, the FBI's, or the CIA's.


© 2004 Washingtonpost.Newsweek Interactive