Recent cryptography news serves as a microcosm of the development of computer security technologies. The discovery of fully homomorphic encryption by Craig Gentry, a Stanford PhD student working at IBM this summer, is by far the biggest headline in cryptography theory this week, month, year, and (probably) decade. Essentially, fully homomorphic encryption allows arbitrary computations to be performed on encrypted data while preserving the encryption. For example, a spam filter could flag encrypted emails containing spam without ever reading them, or an audit logging system could append an entry to an encrypted log file without first decrypting the log and re-encrypting it.
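
Gentry’s actual construction is built on ideal lattices and is far beyond a blog post, but the flavor of “computing on ciphertexts” shows up even in textbook RSA, which is homomorphic with respect to multiplication. Here’s a toy sketch in Python; the parameters are tiny and utterly insecure, and it illustrates the homomorphic property, not Gentry’s scheme:

```python
# Textbook RSA is multiplicatively homomorphic: the product of two
# ciphertexts decrypts to the product of the two plaintexts. Fully
# homomorphic encryption extends this idea to arbitrary computation.
# Toy key (p=61, q=53); never use parameters anywhere near this small.

n, e = 3233, 17   # public key
d = 2753          # private key

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

c1, c2 = encrypt(6), encrypt(7)
c_product = (c1 * c2) % n            # computed entirely on ciphertexts
assert decrypt(c_product) == 6 * 7   # ...yet it decrypts to the right answer
print(decrypt(c_product))            # -> 42
```

“Fully” homomorphic means supporting both addition and multiplication on ciphertexts, arbitrarily many times; that combination is what allows arbitrary computation, and it is exactly what no one had achieved until now.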

Now, nothing is perfect right out of the gate, and there are caveats to this discovery. For the scheme to work, one must know in advance a bound on the number of operations that will be performed on a given piece of encrypted data. It’s not practical; the discovery shows only that fully homomorphic encryption is possible. Last but not least, we’ve had schemes that allow certain limited operations, such as search, on encrypted data for years, and some have even been reported on technical news sites. But even taking these caveats into account, the discovery is legitimately headline news.
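
To give a feel for those limited-operation schemes, here’s a minimal sketch of equality search over encrypted keywords, in the spirit of searchable symmetric encryption. The names and structure are illustrative, not any particular published scheme:

```python
# Each keyword is mapped to a deterministic token with a keyed PRF (HMAC),
# so a server can match search queries against the index without ever
# seeing the words themselves. Deterministic tokens leak which entries
# share keywords -- one reason these schemes are "limited."

import hashlib
import hmac

KEY = b"a secret key held only by the client"

def token(word):
    return hmac.new(KEY, word.encode(), hashlib.sha256).hexdigest()

# The client uploads tokens alongside its (separately encrypted) documents.
index = {token(w): "doc1" for w in ["budget", "roadmap", "launch"]}

# To search, the client sends only the token; the server matches it blindly.
print(index.get(token("roadmap")))  # -> doc1
print(index.get(token("payroll")))  # -> None
```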

The media loves to report juicy computer security stories, particularly those relating to the discovery of new cryptographic techniques. Unfortunately, these headlines distract from the primary concern of the average computer security professional: We are just not using the tools we have! Consider last summer, when a flaw in the DNS protocol became huge news. It was a problem that could have been completely avoided using existing cryptography. We just weren’t using it. In fact, despite Dan Kaminsky’s recent efforts, we still aren’t using it. Here’s a great quote from Dan:

DNS is the world’s largest PKI without the ‘K.’ All DNSSEC does is add keys.

Why haven’t we “added the ‘K’” yet? DNSSEC has been sitting in a drawer, and even after last summer, it doesn’t appear to be a priority. It was designed with security in mind from the start; it is real, practical, and can be deployed without another breakthrough in cryptography. Only, we aren’t using it. (A quick way to check whether your own resolver validates DNSSEC is sketched after the list below.) And this has been the pattern of cryptography technologies for the last few decades:

  1. Some smart people create something like public key encryption and/or fight against ludicrous export controls on cryptography tools.
  2. The story becomes headline news for a day or two, and we all walk around feeling great about how we ‘solved’ the security problem and we’re all going to be ‘safe’ soon.
  3. A few weeks pass and we find that no one is actually using the inventions that were just created and/or saved from oppressive regulation.
  4. Eventually, we start all over from Step 1 with a new miracle discovery in computer security. That’s what happened this week.
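
As promised above, here’s a minimal sketch of how a client can ask whether its upstream resolver actually validated an answer with DNSSEC, using the third-party dnspython library (version 2.0 or later is assumed):

```python
# A validating resolver sets the AD ("authenticated data") flag only when
# the DNSSEC signature chain checks out. We set the DO ("DNSSEC OK") bit
# on our query to signal that we want DNSSEC processing.

import dns.flags
import dns.resolver

resolver = dns.resolver.Resolver()
resolver.use_edns(0, dns.flags.DO, 4096)  # request DNSSEC processing

answer = resolver.resolve("example.com", "A")
if answer.response.flags & dns.flags.AD:
    print("resolver validated this answer with DNSSEC")
else:
    print("unvalidated answer: a forged response would look the same")
```

Run this against a typical ISP resolver and odds are you’ll land in the second branch, which is exactly the point.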

Consider email encryption. Gmail (like most other webmail providers) still doesn’t support GPG. Gmail also doesn’t use persistent SSL connections by default, which means that your emails are delivered to your web browser in plain text, even though a cheap and effective form of encryption could easily be enabled. This was old news when I blogged about it here nearly two years ago, but only recently has Google said it is “looking into whether it would make sense” to change the default, perhaps because of a letter organized earlier this month by Chris Soghoian and signed by numerous computer security experts.
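
The “cheap and effective” fix is simply serving the whole session over SSL. Here’s a minimal sketch of how one might check whether a webmail host forces that, by seeing whether plain-HTTP requests get redirected to an https:// URL; mail.example.com is a placeholder host, not a claim about Gmail’s actual behavior:

```python
# Fetch the plain-HTTP front page and inspect the redirect target.
# A security-conscious default is to bounce every request to HTTPS.

import http.client

conn = http.client.HTTPConnection("mail.example.com", timeout=10)
conn.request("GET", "/")
response = conn.getresponse()
location = response.getheader("Location", "")

if response.status in (301, 302) and location.startswith("https://"):
    print("good: plain-HTTP requests are redirected to SSL")
else:
    print("bad: the session may continue over unencrypted HTTP")
```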

I’m not saying that fully homomorphic encryption isn’t important, or that solving this longstanding, open academic question isn’t an achievement. It is important, exciting, and a huge achievement. All I’m saying is that fully homomorphic encryption, or any security technology, won’t solve computer security and privacy problems unless we start using the tools we have.

Edited to add: Here’s a nice piece by Brian Krebs with more about the letter urging Google to encrypt by default. In particular, I love this quote:

“What we’re saying in this letter is that as an iconic service, and one that professes to be concerned about user safety, Google could set a good example and set the right defaults, and if users want to switch back to something less secure, then they can.”

Dr. Eugene Spafford

(Full Disclosure: I am working with Dr. Spafford this summer at CERIAS on campus at Purdue University.)