BY SUSAN LANDAU
The president gave a clear and thoughtful interview on the encryption debates Friday, and he nailed the issue. It is not about perfect security versus privacy and civil liberties; it is about our society’s willingness to accept risk.
As I write on this snowy Sunday morning, the New York Times has a headline about an attack in Copenhagen that killed two people, one at a cafe hosting an event with a Swedish cartoonist who had drawn caricatures of the Prophet Muhammad and one at a synagogue, and another headline regarding an online bank heist that hit multiple financial institutions and netted many millions over a two-year period. These two stories capture the essence of the security dilemma surrounding the use of encrypted communications. Do we protect against terrorists by making it easier to eavesdrop on their — and everyone else’s — communications? Or, in order to secure our businesses, infrastructure, and daily lives, do we protect society’s communications at the risk of facilitating terrorists’ and criminals’ ability to communicate in secret?
Although better tools, such as two-factor authentication, can help secure systems, they are only part of the cybersecurity story. Communications security, ensuring no one is eavesdropping on your call or email, is essential to securing society. Such security requires end-to-end encryption, permitting the communication endpoints—and no one else—to decrypt the message. This means that only the endpoints should manage and hold the encryption keys. This fact is crucially important. Any other form of encryption—keys stored with the government (Clipper) or with communications providers—increases complexity and creates security risks, which can and will be exploited. That point bears repeating: communications tools built with law-enforcement access to the keys will not be secure against skilled opponents. But the use of encryption where the end users—and not Apple or Google, for example—hold the keys means, as the president observed, “Even though the government has a legitimate request [to wiretap], technologically we cannot do it.”
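The key-management point above can be made concrete with a textbook Diffie-Hellman sketch. This is a toy illustration only, with deliberately tiny parameters (p = 23, g = 5) chosen for readability, not an implementation of any real messaging protocol: each endpoint generates and keeps its own private key, the relay in the middle sees only the public values, and the shared secret that would protect the conversation is never held by the provider at all.

```python
import secrets

# Toy Diffie-Hellman parameters (NOT secure -- real systems use large
# groups or elliptic curves). Chosen small purely for illustration.
p, g = 23, 5

def keypair():
    """Each endpoint generates its own keys; the private half never leaves."""
    private = secrets.randbelow(p - 2) + 1   # kept secret on the device
    public = pow(g, private, p)              # safe for the relay to see
    return private, public

a_priv, a_pub = keypair()   # endpoint A
b_priv, b_pub = keypair()   # endpoint B

# The provider relays a_pub and b_pub, but each endpoint alone derives
# the shared secret from its private key and the other's public value.
a_shared = pow(b_pub, a_priv, p)
b_shared = pow(a_pub, b_priv, p)
assert a_shared == b_shared  # same key at both ends; the relay has neither private key
```

The design point is exactly the one in the paragraph above: because only the endpoints ever hold the private keys, there is nothing stored at the provider for a court order, or an attacker, to compel or steal.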
The fact that the communications service provider doesn’t have the encryption keys doesn’t mean law enforcement can’t listen in when it is authorized to do so. Wiretaps can work in multiple ways. For example, by exploiting vulnerabilities already present on the target’s phone—and every large software system has vulnerabilities—it is possible to eavesdrop. The FBI has already used this technique in various cases. But the vulnerabilities route is admittedly more expensive and more complex, and it may not provide access to communications as quickly as when law enforcement can tap content directly using cryptographic backdoors* built in for its use.
In permitting continued use of communications devices employing encryption without law-enforcement backdoors, we are opting to trade one type of risk — inability to protect our communications from criminals and other eavesdroppers (e.g., nation-state spying)—for another—some inability to listen in (or immediately listen in) to terrorists’ and criminals’ communications. But despite law enforcement’s continued vehemence about the need for cryptographic backdoors, there is no equivalence between the two types of risks. Our society’s heavy reliance on electronic communications for everything from banking transactions to business communications makes securing electronic communications crucial—and the correct security choice. The law-enforcement “solution” of backdoor government access to the keys is insecure. The president acknowledged as much in his interview: “Some in Silicon Valley would make the argument—which is a fair argument, and I get it—the harms done by having any kind of compromised encryption are far greater.” For communications to be truly secure, the only ones who know the encryption keys should be the users—or their devices.
The U.S. government made this choice in 2000, when it loosened export controls on cryptography without requiring that cryptography for communications applications include law-enforcement backdoors. Enabling the use of cryptography creates risk, but widespread use of insecure communications devices creates far greater risks for society.
In the aftermath of an attack, when television is full of pictures of dazed and bloodied victims being carried to ambulances, it is easy to grandstand about how the attack should have been prevented and how we must do everything to track the terrorists. It’s not that easy. The same tools that make it simpler to track terrorists expose the rest of us to security risks that, in aggregate, do more to undermine society’s security than a gunman at a cafe or a bomb on a plane. And while it is the case that complex plots have multiple actors and multiple points where interception might occur, small plots involving a handful of actors are almost impossible to detect, especially in a free and open society. Although our politicians have been loath to admit it, it is in fact impossible to prevent all attacks. It is long past time to accept that not all threats will be uncovered, and that attacks can, and will, occur. That is a crucial aspect of the encryption policy issue and must inform the discussion regarding encryption backdoors.
(Note that we’re seeing a somewhat parallel debate concerning bulk collection of domestic telephone metadata. The chairman of the Privacy and Civil Liberties Oversight Board (PCLOB), David Medine, wrote in Lawfare on Friday that it had been a year since the board had recommended eliminating the Section 215 collection — and yet the collection was still ongoing. Both PCLOB and the NSA Review Committee had recommended the program’s end, with the PCLOB report concluding the program was ineffective. I served on the National Academies study on technical alternatives to bulk signals collection; we concluded that, “There is no software technique that will fully substitute for bulk collection where it is relied on to answer queries about the past after new targets become known.” If bulk collection continues, we proposed various types of software controls on use (some of which are already employed today) and research on auditing methods. Notably, we did not conclude that the Section 215 collection should continue—such policy recommendations were outside our remit—we simply observed that some investigative material would not be available if bulk collection ceased.)
President Obama observed that we needed to discuss the consequences of such a decision now: “The only concern is our law enforcement is supposed to stop every plot, every attack, any bomb on a plane. The first time an attack takes place in which it turns out that we had a lead and we couldn’t follow up on it, the public is going to demand answers.” The president indicated that, despite law-enforcement opposition, he would support a policy of continued public access to secured—no backdoors—communications tools. He also made clear that he would take the flak in the event of an attack that we might have been able to prevent if only …
While privacy and civil liberties are strong reasons for secure encrypted communications devices uncompromised by law-enforcement backdoors, they are not the only ones. Secure communications prevent more damage than they risk; our reliance on electronic communications means we must have secure, uncompromised communications devices generally available. It is time for others, including members of Congress and members of law enforcement, to acknowledge the realities, both of our ability to prevent attacks and of where our society’s vulnerabilities actually lie. Such a realistic discussion will enable us to make the right choices on the encryption issue, as well as to handle consequences as they emerge.
* Of late, the FBI has been calling this a “front door” rather than a backdoor. It is largely irrelevant which name is used; the point is that such access defeats the intended purpose of providing encryption in the first place.