I have always had a passing interest in encryption and security. My PhD is on network file systems, where managing who has access to what data is an important aspect. I also spent the best part of a year working for a biometric security company (fingerprints and one-time passcodes).
When Practical Cryptography by Bruce Schneier first came out, I immediately purchased a copy, at least partially because in that era it seemed plausible that the government might try to restrict knowledge of cryptography. It has never liked the idea that people might be able to talk without the government being able to listen, and it still doesn't. I think back then encryption software was still regarded as a "munition" and subject to full-on export regulation.
But the Internet came along and we all use cryptography every day, although it is hidden from us in our web browsers and our smartphones.
When I was at VLSI Technology, we were the foundry for the "Clipper" chip, announced in 1993. This was an encryption chip intended for telephones (this was before cellphones), but there were two keys. The first key was used by the two parties to encrypt their conversation. The second key would be held in a key-escrow database managed by the government and could be used to decrypt any intercepted conversation without requiring access to the first key. However, there was enormous backlash, and eventually the whole project was dropped. It wasn't dropped so much because of the backlash, but because other strong encryption technologies became widespread (such as PGP and PGPfone) and because it was obviously an opportunity for foreign suppliers of competing chips without the escrow requirements. One issue that was not a big deal at the time was the risk that the key-escrow database itself would be compromised, at which point all encrypted phone calls would be vulnerable. Having seen what happened to, for example, the Office of Personnel Management (OPM), where the records of over 22M people were compromised, this is not in the least far-fetched. Or the NSA stealing millions of cellphone SIM card keys from the manufacturer, Gemalto, in France.
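The escrow scheme can be sketched in a few lines of toy Python. This is an illustration of the concept only: the XOR "cipher" and the names below are stand-ins, not the actual Skipjack cipher or Law Enforcement Access Field (LEAF) format that Clipper used.

```python
# Toy sketch of the Clipper-style key-escrow idea. NOT real cryptography:
# XOR against a hash-derived keystream stands in for a real cipher, and
# the names are illustrative, not the actual LEAF format.
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream 'cipher': XOR data with a SHA-256-derived keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# 1. The two parties share a session key for their conversation.
session_key = os.urandom(16)

# 2. The device also emits the session key wrapped under the escrow key,
#    which sits in the government-managed escrow database.
escrow_key = os.urandom(16)  # held by the escrow agency
wrapped_session_key = keystream_xor(escrow_key, session_key)

# 3. The parties encrypt and decrypt with the session key as normal.
ciphertext = keystream_xor(session_key, b"hello, this call is private")

# 4. An agency that intercepts the call and holds the escrow key can
#    recover the session key without either party's cooperation.
recovered_key = keystream_xor(escrow_key, wrapped_session_key)
assert recovered_key == session_key
plaintext = keystream_xor(recovered_key, ciphertext)
```

The last step is the whole objection in miniature: anyone who holds, or steals, the escrow key database recovers every wrapped session key at once.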
Encryption has been in the news due to the iPhone of the San Bernardino killer. Since he destroyed his personal phone and the iPhone was provided by his employer, it seems unlikely that there would be anything much on it, but presumably the FBI thought they would never have a stronger case than with the phone of a known terrorist. If they expected everyone to support them, they were probably surprised when the entire tech industry and many others opposed their access. The most recent news is that they have backed off and supposedly feel that an Israeli firm can get them in without Apple's help, which I find hard to believe.
The big problem is that once you compromise security, it is not just the so-called good guys who can exploit the loophole. Potentially all the bad guys, from Chinese security agencies to Russian crooks, have access. And experience has shown that once the government has access in the special case of terrorism, it will rapidly degenerate until every little law enforcement agency wants access too, for drug deals or tax evasion or other offenses that are not in the same league as terrorism. Just look at how RICO, originally created to address organized crime, has been misused for trivial matters.
And that is before worrying about the precedent: if the US government can force Apple to provide a way to decrypt a suspect's phone, then what about the Chinese government, or Iran's?
This is not a theoretical worry that other people will gain access. Apple already had a lot of data stolen through an interface that was added to give access to law enforcement. The Greek government, a few years ago, discovered that all their senior politicians' cellphone calls had been tapped through an interface in Ericsson phone switches that had been added to allow law enforcement to tap calls when required. Adding a backdoor to encryption is like deliberately adding a security flaw: it will, probably sooner rather than later, get discovered. But unlike an accidental security flaw, since it is meant to be there, it can't be patched and fixed.
Just last week, security researchers discovered a security flaw in Apple's iMessage that allowed images (and perhaps text) to be stolen during transmission. The bug was quickly fixed in iOS 9.3, which conveniently was scheduled to go out earlier this week. The underlying fact remains that encryption is hard, in practice impossible, to get 100% right even when trying as hard as possible. Adding backdoors and other loopholes is asking for trouble: asking for criminals to take our identities and empty our bank accounts, or for foreign governments to compromise whatever they want.
Also, just as with the Clipper chip, compromising Apple's security would be a gold mine for non-American suppliers who would not be vulnerable in the same way and could provide secure phones, such as the GranitePhone (French). Or for calls, encryption could be provided in software and operate over the data connection in much the same way as Skype does (which apparently used to be secure until Microsoft purchased them, and then, at least according to Der Spiegel in Germany, the security was weakened so that the NSA has access).
In the words of Ian Miers, one of the researchers at Johns Hopkins who discovered the flaw in Apple's iMessage:
The main point is that encryption is hard to get right. Imagine the number of things that could go wrong if you have more complicated requirements like a backdoor.
If you are interested in security and encryption, I recommend Bruce Schneier's blog, Schneier on Security, and his books. He is the person who coined the phrase "security theater" to describe measures (such as much of what the TSA does) that are just for show, with little impact on actual security.
Most compromises of security happen not through "breaking" the encryption itself but through alternative weaknesses: implementation bugs, stolen keys, social engineering. That goes to show just how hard security is.
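One classic example of such a weakness, entirely unrelated to the strength of the cipher, is checking a secret with ordinary equality. A normal `==` comparison returns as soon as a byte differs, and that timing difference can leak a valid authentication tag byte by byte. A minimal sketch in Python (the names here are illustrative):

```python
# A classic "alternative weakness": verifying a message authentication
# code (MAC) with ordinary '==', which short-circuits on the first
# mismatched byte. The timing difference lets an attacker guess a valid
# tag byte by byte without ever attacking the cryptography itself.
import hashlib
import hmac

SECRET_KEY = b"server-side secret key"  # illustrative key

def sign(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag for a message."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).digest()

def verify_leaky(message: bytes, tag: bytes) -> bool:
    # BAD: comparison time depends on how long the matching prefix is.
    return sign(message) == tag

def verify_safe(message: bytes, tag: bytes) -> bool:
    # GOOD: constant-time comparison, same duration for any input.
    return hmac.compare_digest(sign(message), tag)

msg = b"transfer $100"
good_tag = sign(msg)
```

Both functions return the same answers; the flaw is invisible in testing, which is exactly the point: the cipher is fine, and the system is still broken.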
A couple of weeks ago, John Oliver chose encryption for the main segment of his HBO show Last Week Tonight. He ended up making pretty much the same point: in the encryption arms race, the good guys are only six months or a year ahead of the bad guys. Like the Red Queen and Alice in Through the Looking Glass, it takes all the running you can do to stay in the same place. You can watch the segment online.
I mentioned iOS 9.3 above. This has nothing to do with encryption, but if you have an iPhone, use it in the evening, and have installed iOS 9.3, then I recommend doing the following: go to Settings, then Display & Brightness. Tap Night Shift and turn it on. Make sure that the from/to schedule is set to "Sunset to Sunrise". This will cut back on blue light, which inhibits the melatonin production you need in order to sleep. If you have used f.lux on your Mac (I assume it is on the PC too) then it is very similar. When you sleep better, you can thank me.