BY CAROLINE PERRY
Privacy isn’t what it used to be. Post-Sony, post-Snowden, we know our digital world is insecure, yet most of us continue to share a vast amount of personal information over networks. Balancing anxiety with convenience, autonomy with value, we negotiate a new definition of privacy every time we download a new app.
“It’s not the right to be left alone anymore,” said Lee Rainie ’73, the Pew Research Center’s Director of Internet, Science, and Technology Research, speaking at Harvard on January 23. “It’s the right to be in control of what people understand about you, … what kind of sharing is done, who has access to your data, and if you can correct mistakes that others make about you.”
But, he said, it’s not really working. Fifty percent of U.S. adults responding to a recent Pew survey said they have “not much control” or “no control” over how their personal information is collected and shared.
For all our technological advancement, legacy systems persist in our email and Internet service providers, online social networks, search engines, e-commerce, electronic medical records, and other networked databases. And most of them are at risk—if not prone to security breaches or secret back doors, then certainly failing to live up to the expectations of a privacy-conscious public.
The ubiquitous online systems on which citizens depend “are all collapsing,” declared Latanya Sweeney (ALB ’95), Chief Technology Officer of the U.S. Federal Trade Commission, Professor of Government and Technology in Residence, and director of the Data Privacy Lab at Harvard. “Every single one of them is headed for a major disaster.”
Rainie and Sweeney were among hundreds of privacy experts, computer scientists, mathematicians, lawyers, policy scholars, and interested citizens who gathered at Harvard for the fourth annual Symposium on the Future of Computation in Science and Engineering. Hosted by the Institute for Applied Computational Science (IACS) and the Center for Research on Computation and Society (CRCS)—both based at the Harvard School of Engineering and Applied Sciences (SEAS)—the daylong symposium addressed “Privacy in a Networked World.”
Edward Snowden himself, the former National Security Agency (NSA) systems administrator who leaked classified records of NSA surveillance efforts, called in from Moscow via Google Hangouts for a live, unscripted Q&A with security expert Bruce Schneier, a fellow at Harvard Law School’s Berkman Center for Internet and Society, drawing a large audience in the Science Center.
Their conversation centered on the technological changes that have rendered once-secure systems vulnerable.
Risky by design
“The NSA has to balance two different focuses: defend our networks and attack their networks,” Schneier said. “And those missions, I think, made a lot more sense during the Cold War when you could defend the U.S. radios and attack the Soviet radios, because the radios were different. It was ‘us’ and ‘them.’
“What’s changed since then is that we’re all using the same stuff. Everyone uses TCP/IP, Microsoft Word, Firefox, Windows computers, Cisco routers… And whenever you have a technique to attack their stuff you are necessarily leaving our stuff vulnerable. And conversely, whenever you fix our stuff you are fixing their stuff.”
Top-secret documents revealed by Snowden over the past 15 months have suggested that the NSA has influenced the design of many commercial technologies, demanded lists of encryption keys, and placed backdoor vulnerabilities in widely used software, enabling the agency to access private communications.
The documents also describe a world in which human networks, especially relationships between business and government, play as large a role as the fiber-optic ones.
“To me, the biggest surprise in all the NSA documents is the lack of surprises—that we don’t see any major ‘secret sauce’ of quantum computers or anything that says they, or really any intelligence agency, can do magical things,” Schneier said. “Ten, twenty years ago, we would have assumed that we, the academic world, were a decade behind the NSA and other countries, and it seems that that might not be true, that there is more parity than we thought.”
Peer-reviewed, mathematically proven cryptography remains the gold standard for unbreakable communications, Snowden said, and for the most part the NSA and other countries would “rather dig a hole” under the wall than climb over it.
“Encryption—[when] the mathematics [are] properly implemented—really is one of the few things that we can rely on,” Snowden said. “And this is fundamental when we talk about computer security because we’ve got to have some foundation. We’ve got to have some basis for trust.
“There’s no magic key that unlocks [all] crypto.”
“What we really have to worry about is the rest of everything,” Schneier added: “the bad implementations, the weak keys, any kind of back doors that were inserted in the software, and we’re seeing a lot of that.”
An Internet built for surveillance
As much consternation as Snowden’s NSA revelations have caused, the fact remains that consumers seem content to entrust their location data, email content, and private photos to corporations.
“Whether [the NSA is] getting a copy by putting an implant in a Cisco router, or by going into Google’s trunk links between their data centers, or going to Microsoft Skype with a court order and demanding a key, this is all data that is sloshing around the corporate world,” Schneier said. “We’ve built an Internet for surveillance. We’ve decided that advertising, that marketing, that personal information is the currency by which we all buy our Internet, and that fuels what countries can do.”
While some have criticized the NSA for exceeding its authority, John DeLong ’97, JD ’05, the NSA’s former director of compliance, vigorously denied that the agency had overstepped. “We can have big discussions about what the law is and what an interpretation of the law is, but there are legions of lawyers inside NSA; there are platoons of compliance officers,” DeLong said in his talk following Snowden’s conversation with Schneier. “The idea that NSA’s activities were unauthorized is wrong. It’s wrong in a magnificent way and I think that’s got to be clearly said.”
“The NSA’s very good at tactical oversight,” Schneier agreed: “Are we doing things right, are we following the rules?” But he added: “It’s very different than [asking], are these the right rules?”
Or, as Snowden put it, “Just because we can do it doesn’t mean we should.”
Securing privacy at the clinic
In addition to exploring the ethical and social implications of network surveillance, the symposium examined how similar themes apply to other spheres of technology policy, such as the sharing of private medical records.
With good reason, Americans are not yet ready to entrust their medical data to just anyone. However, there is a common interest at stake: better and larger longitudinal studies of cancer or Parkinson’s disease will be crucial to improving health care for all.
John Wilbanks, Chief Commons Officer at Sage Bionetworks, is working to improve patients’ understanding of the ways their data will be used when they consent to participate in such epidemiological and health studies. Wilbanks described how thoughtfully designed user interfaces can ensure that only the essential data are collected, that consent forms are granular and interactive, and that valuable findings are communicated back to the participants.
Cynthia Dwork, a Distinguished Scientist at Microsoft Research, addressed the desire among social scientists and medical researchers to share and jointly analyze the data collected in those large studies.
Dwork’s seminal work in cryptography, data analysis, and distributed computing has established the mathematical foundation for many modern systems. She is the co-creator of a relatively new model for data analysis, called differential privacy, that quantifies (and minimizes) the privacy risk of exposing certain information. A differentially private system would allow researchers to submit queries on a data set, up to a point. Before so much has been revealed that individual participants might become identifiable, the system would prevent access. Symposium organizer Salil Vadhan, Vicky Joseph Professor of Computer Science and Applied Mathematics at SEAS, is leading an interdisciplinary project that builds on this work to create computational, legal, and policy tools that make it easier for lay researchers to share sensitive data.
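The query-limiting idea behind differential privacy can be illustrated with a toy sketch. This is not code from Dwork's or Vadhan's work; the class name, parameters, and the simple per-query budget accounting are all illustrative assumptions. It shows the two ingredients the article describes: noise added to each answer (here via the Laplace mechanism, sampled as the difference of two exponentials), and a privacy budget that cuts off access before too much has been revealed.

```python
import random

class PrivateCounter:
    """Toy differentially private query interface (illustrative only;
    real systems use far more careful privacy accounting)."""

    def __init__(self, data, epsilon_per_query=0.1, budget=1.0):
        self.data = data              # the sensitive records
        self.eps = epsilon_per_query  # privacy cost charged per query
        self.budget = budget          # total privacy budget for this data set

    def count(self, predicate):
        """Answer 'how many records satisfy predicate?' with Laplace noise."""
        if self.budget < self.eps:
            # Enough has been revealed; refuse further queries.
            raise RuntimeError("privacy budget exhausted")
        self.budget -= self.eps
        true_count = sum(1 for row in self.data if predicate(row))
        # A counting query has sensitivity 1: adding or removing one person
        # changes the count by at most 1, so Laplace noise with scale
        # 1/epsilon masks any individual's presence. The difference of two
        # rate-epsilon exponentials is Laplace-distributed with that scale.
        noise = random.expovariate(self.eps) - random.expovariate(self.eps)
        return true_count + noise
```

A researcher would then query the data set only through this interface, never seeing raw records:

```python
db = PrivateCounter([{"age": 34}, {"age": 61}, {"age": 47}])
db.count(lambda row: row["age"] > 40)  # a noisy answer near the true count of 2
```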
But differential privacy can only solve certain types of privacy problems, Dwork emphasized. Other problems urgently require solutions based in ethics, policy, and new business practices. Technologies that seem innocuous and trustworthy today could be very different tomorrow.
“My advertiser is not my friend,” Dwork said. “I don’t want my computer to know my mood. If I’m sad, maybe my advertiser will suggest that I buy something expensive.
“And I don’t want my advertiser to try to keep me sad. What I’m now starting to worry about is not only my advertiser exploiting my emotional state for financial gain, but also manipulating my emotional state for continued financial gain.”
“In 10 or 15 years the ability of some of these companies to analyze that data and know a fair bit may actually be far more advanced than we can even understand today,” said Betsy Masiello, a senior manager for global public policy at Google. “There will end up being conversations about the ethical use of data that are vastly different than the ones that we’re having today.”
Or, as Sweeney put it during a lively panel discussion, “The technology’s changing. He’s talking about two-factor authentication; in two years your chair will be able to tell that you’re sitting there.”