BY DOUG BERNARD
VOICE OF AMERICA
It’s no secret that James Comey, director of the Federal Bureau of Investigation, is not a fan of encryption – at least not when it puts data outside the reach of the FBI.
For over a year now, Comey has pressured congressional leaders and Silicon Valley executives to curb the use and spread of encryption tools for computers, tablets and mobile phones, but with little effect.
Comey said he wants encryption services to create secret “back doors” and “key escrows” for their products so agencies like the FBI can get emergency access to data critical for fighting crime and protecting national security.
So far, the debate has been largely low-key and one-sided, with few corporate executives eager to pick a public fight with the FBI.
But now, a group of cybersecurity professionals is pushing back at the FBI, warning that what Comey wants is not only infeasible, but could make the entire Internet significantly more vulnerable to hack attacks.
With high-profile cyberattacks on the increase and continuing revelations of U.S. government surveillance activities leaked by former NSA contractor Edward Snowden, encryption has become a booming business.
There are hundreds of encryption apps and services currently available that allow users to lock away their messages, texts, pictures, contacts and other data from prying eyes, with more on the way every month.
Sensing this explosive growth, industry leaders such as Apple, Microsoft and Google now offer robust encryption options for devices using their operating systems.
Last September, Apple introduced an iOS security change that makes it nearly impossible for police agencies to unlock devices such as an iPhone without the owner’s consent.
Previously, the operating system allowed Apple to unlock devices if the government provided a search warrant.
One month later, at a forum hosted by the Brookings Institution, Comey delivered a caustic broadside against the spread of encryption services.
Comey cautioned that law enforcement agencies like the FBI are increasingly vexed by “bad guys” using encrypted data storage and keeping nearly all their information – some of which might be evidence of illegal activity – on encrypted phones, outside the traditional access of wiretaps or database searches.
Encryption, he said, is turning every electronic device into a “safe that can never be cracked.”
“If the challenges of real-time interception threaten to leave us in the dark, encryption threatens to lead all of us to a very dark place,” he warned.
Since then, the FBI director has taken nearly every opportunity to warn about encryption’s spread.
Just last week, while announcing the arrest of a number of individuals suspected of plotting terror attacks, Comey said that some of those arrested had been communicating with ISIS militants through encrypted message platforms.
Perhaps recognizing that the encryption genie is already out of the cyber bottle, Comey is pushing tech companies to build in secret access points to their services, allowing data to be recovered without the key holder’s consent. He’s also calling for “key escrows” – essentially a digital bank holding the secret encryption keys for every user.
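To see why critics object, the key-escrow idea can be sketched in a few lines. The following is a toy illustration, not real cryptography: the XOR "cipher" and the names `escrow_db` and `xor_cipher` are invented for this example. The point it demonstrates is structural — whoever holds the escrow database can decrypt every user's data, so the escrow becomes a single, high-value target.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "encryption" -- NOT secure; for illustration only.
    # Applying it twice with the same key recovers the original data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Each user encrypts with a personal key, and a copy of every key
# is deposited with a hypothetical central escrow.
escrow_db = {}      # the escrow: user -> secret key
ciphertexts = {}
for user in ["alice", "bob"]:
    key = secrets.token_bytes(16)
    escrow_db[user] = key
    ciphertexts[user] = xor_cipher(f"{user}'s secrets".encode(), key)

# An attacker who compromises the single escrow database can now
# read every user's data, without touching any individual device:
for user, ct in ciphertexts.items():
    print(xor_cipher(ct, escrow_db[user]).decode())
```

In today's end-to-end designs no such central table exists, which is why attackers must go after devices one at a time — the distributed attack pattern Rohloff contrasts with an escrow below.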
Keys, back doors, doormats
While Comey has yet to score any significant concessions, his campaign to this point has been more lecture than debate, with corporate executives offering noncommittal assurances of working with the government to study the issue.
Some of that may have changed, however, with the release last week of a report that pushes back against Comey and warns that his proposals would make private data even more vulnerable to hacking and theft.
“Keys Under Doormats” was written by a group of 15 elite cybersecurity and cryptographic professionals and published by the Massachusetts Institute of Technology.
In brief, they argue that secure secret entry points into encrypted systems don’t exist today because no one has found a way to make them work without weakening the system as a whole.
Moreover, they say such paths “pose far more grave security risks, imperil innovation on which the world’s economies depend, and raise more thorny policy issues than we could have imagined when the Internet was in its infancy.”
Kurt Rohloff is one of the nation’s leading computer scientists working in cryptography.
Before serving as professor of computer science at the New Jersey Institute of Technology, Rohloff worked as a defense contractor at DARPA, and has long experience with encryption software.
VOA spoke with Rohloff to get his perspective on the arguments both for and against Director Comey’s encryption campaign.
VOA: Director Comey says encrypted devices are like safes that can never be unlocked, putting potentially life-saving information out of reach of law enforcement. Does he have a point?
ROHLOFF: In point of fact, we’re pretty much there already. There are encryption technologies that can encrypt everything on a phone, or a hard drive, or even your car. It’s just not going away.
In terms of making things better, I’d push back and say that his proposals would actually roll back the clock in terms of security, making us less secure.
That said, I’ve worked [developing encryption software] for years, and from a personal point of view, I frankly think I fundamentally have a right to privacy. I don’t want for other folks to have access to what I consider my private information.
But specifically, as for backdoors, that’s just a bad idea. That’s just another point of access for bad guys to get my private information.
VOA: Beyond negotiating the privacy issues, could his proposals actually work? Or, by definition, would they make data more vulnerable?
ROHLOFF: The latter. At a very basic level, it’s a math problem. Can we prove the math actually works? My understanding is that the math is only partially there, but not all the way there, and not as secure as it needs to be.
And there are other problems, like implementation. It’s actually very, very hard to write crypto software – software that runs systems that are truly secure, that don’t have any bugs or any holes. There’s a reason so few people like me do this, because it’s so hard.
Then there’s the liability. Suppose there’s a bug in the crypto system and there’s a resulting financial loss: Who’s going to pay for that? Is the government liable? Is the contractor that wrote the code?
Another problem is actually setting up the crypto environment: How do you host it? Where do you put it? Who has access to it, manages it and where are the keys stored? These are huge issues that may make the math beside the point.
VOA: So back doors and key escrows are, at their core, fundamentally insecure?
ROHLOFF: Yes. Building these things in is fundamentally building insecurity into the system by putting in a back door. Look at some of the recent high-profile attacks – say the OPM breach. In these kinds of attacks, the adversaries had a relatively well-distributed attack pattern. They were going after different companies and different branches of the government.
But now with a key escrow or back door, you concentrate all this vulnerability into one point. It would probably lead to greater attacks on this one known vulnerability, rather than the more distributed attacks we’re seeing now.
VOA: Shouldn’t law enforcement have limited access to data that could protect the nation or actually save someone’s life?
ROHLOFF: This actually gets at one of the biggest issues I’ve had with the overall formulation of this debate. A lot of [Comey’s] justifications have been primarily anecdotal.
I was at a talk that Director Comey gave in January, and he whipped out the old saw of “protecting children from pedophiles.”
Yeah, it might [help] in one or two cases, but I have yet to see any real hard statistics that there’s going to be any statistically significant advantage that we’ll get in protecting these children from abuse or worse. I just have a hard [time] justifying widespread privacy intrusion if it’s just Director Comey pointing out one or two cases that happened over the past couple of years where encryption prevented a timely investigation.
VOA: Great Britain, Russia and China are also considering similar proposals for mandating access to encrypted data. Do you worry the rest of the world might get out ahead of the U.S. on this?
ROHLOFF: Well, that’s an interesting analogy: A potential justification for implementing this technology is because the Chinese are doing it. They’re not exactly a paragon of freedom of speech.
I would turn it around. If you look back to the ’90s when the Clipper Chip was an issue, a lot of businesses actually left the U.S. and moved, for example, to Israel.
Israel right now has a very vibrant cryptographic community, a vibrant cybersecurity community. That’s driven by several reasons, but among them are U.S. businesses off-shoring a lot of technological development into a more fertile environment.
Rather than the world rushing ahead of the U.S., I think we would actually see more U.S. companies setting up foreign subsidiaries, giving those foreign nations the upper hand.
VOA: Some cryptographers we’ve spoken with have expressed concern that Congress or the courts just don’t “get” encryption and the technological and social issues involved. Is encryption just too hard to understand?
ROHLOFF: It’s a difficult subject to penetrate. I think there’s a general challenge for us in the technology community being able to properly educate decision-makers and policymakers. We’ve all seen that YouTube clip of [former Alaska Senator Ted Stevens] calling the Internet “a series of tubes.” This is a concern a lot of us should have. Of course, there are folks that have a relatively good understanding.
For example, the prime minister of Estonia is a former software developer. I think Estonia actually gets it right, because they have such a large stake in pushing technology forward. So, yes, the understanding of technology is uneven across our government, but I think there’s a serious effort to correct that.