Crypto Regulation, Part 1: What is it all about?

Over the last few weeks, we’ve had a slew of politicians calling for new legislation in response to the Paris attacks. The proposed new regulations range from a new Data Retention directive (here’s my opinion on that) to PNR (Passenger Name Records, data about the passengers of all flights within Europe) data exchange within the EU.

By far the most worrying suggestion initially came from UK Prime Minister David Cameron, but was taken up by Barack Obama, the EU’s Counterterrorism Coordinator (pdf), and the German Innenminister (Minister of the Interior), de Maizière: a regulation of encryption technology. The reasons they give are very similar: We need to be able (in really important cases, with proper oversight, a signed warrant, et cetera) to read the contents of any communication in order to “protect all of us from terrorists”.1)

The irony of justifying this with the Paris attack, a case where the terrorists were known to the relevant authorities and used unencrypted communication, is apparently lost on them.

In this series of posts, I will take a look at what crypto regulation means, how it could (or could not) work in practice, and why it appeals to pro-surveillance politicians either way.

An (extremely) brief primer on cryptography

Cryptography is used to hide information from unauthorized readers. In order to decrypt a message, you need three things: The encrypted message (obviously), knowledge of the algorithm that was used to encrypt it (which can often, but not always, be easily determined), and the cryptographic key that was used. When we talk about crypto regulation, we usually assume that the algorithm and the encrypted message are known to whoever wants to read it, and that the only missing piece is the key.
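To make those three ingredients concrete, here is a minimal sketch using Python’s third-party cryptography package (my choice for illustration, not something the politicians or I are specifically talking about). It shows that ciphertext plus algorithm gets you nowhere without the key:

```python
# A minimal sketch of the three ingredients: ciphertext, algorithm, key.
# Uses the (assumed installed) "cryptography" package and its Fernet recipe.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()                       # the cryptographic key
cipher = Fernet(key)                              # the algorithm (an AES-based recipe)
ciphertext = cipher.encrypt(b"meet me at noon")   # the encrypted message

# With all three pieces, decryption is trivial:
print(Fernet(key).decrypt(ciphertext))            # b'meet me at noon'

# Knowing the ciphertext and the algorithm, but guessing the key, fails:
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("wrong key: decryption fails")
```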

Cryptography is all around you, although you may not see it. In fact, you are using cryptography right now: This website is protected using SSL/TLS (that’s the https:// you see everywhere). You are also using it when you withdraw money from an ATM, when you send an eMail, when you log into any website, and so on. All of those things use cryptography, although its strength (meaning how hard it is to break) varies.
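You can even look at this machinery directly. The following sketch (the hostname is just an example) asks a website for the TLS certificate behind that https:// padlock, using nothing but Python’s standard library:

```python
# Fetch and inspect the TLS certificate a website presents over https.
import socket
import ssl

hostname = "example.org"  # placeholder; any https site works
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        print("TLS version:", tls.version())
        print("Issued to:", dict(item[0] for item in cert["subject"]))
        print("Issued by:", dict(item[0] for item in cert["issuer"]))
```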

A (very) brief history of crypto regulation to date

Crypto regulation is hardly a new idea. For a long time, the export of encryption technology was regulated as a munition in the United States (the fight for the right to freely use and export cryptography was called the Crypto Wars and spawned some interesting tricks to get around the export restrictions). These restrictions were later relaxed, but never completely removed (it is still illegal to export strong encryption technology to “rogue states” like Iran).

During the last 10 years or so, there haven’t really been serious attempts to limit the use and development of encryption technology2), leading to the rise of many great cryptographic tools like GnuPG, OTR, Tor and TextSecure.3) But now, there appears to be another push to regulate or even outlaw strong encryption.

What is “strong encryption”?

In cryptography, we distinguish between two4) different kinds of encryption: transport encryption and end-to-end encryption. Transport encryption means that your communication is encrypted on its way from you to the server, but decrypted on the server. For example, if you send a regular eMail, your connection to the server is encrypted (no one eavesdropping on that connection can read the message), but the server itself can read your eMail without a problem. This type of encryption is used by almost every technology you use, be it eMail, chats (except for a select few), or telephony like Skype.
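As a sketch of what transport encryption looks like in practice, here is how a mail client might hand an eMail to its server over a TLS-protected connection (server name and credentials are placeholders, not a real service). The link is encrypted, but the server receives the message in the clear:

```python
# Transport encryption for eMail: the client-server link is encrypted via
# STARTTLS, but the server still sees (and stores) the plaintext message.
import smtplib
import ssl
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.org"
msg["Subject"] = "hello"
msg.set_content("Encrypted in transit, but readable on the server.")

context = ssl.create_default_context()
with smtplib.SMTP("mail.example.org", 587) as server:   # placeholder host
    server.starttls(context=context)    # upgrade the connection to TLS
    server.login("alice", "app-password")                # placeholder credentials
    server.send_message(msg)            # the server handles the plaintext
```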

The major drawback of transport encryption is that you have to trust the person or organization operating the server to not look at your stuff while it is floating around on their server. History has shown that most companies simply cannot be trusted to keep your data safe, be it against malicious hackers (see Sony), the government (see PRISM), or their own advertising and analytics desires (see Google Mail, Facebook, …).

The alternative is end-to-end encryption. For this, you encrypt your message in a way that only allows the legitimate receiver to decrypt it. That way, your message cannot be read by anyone except the legitimate receiver.5) The advantage should be obvious: You can put that message on an untrusted server and the operators of said server cannot read it.
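For contrast, here is a small end-to-end sketch using the PyNaCl library (my choice for illustration; the names “Alice” and “Bob” are the usual stand-ins). Alice encrypts to Bob’s public key, so the server in the middle only ever sees ciphertext:

```python
# End-to-end encryption with PyNaCl: only the holder of Bob's private key
# can read the message, so an untrusted server can store it safely.
from nacl.public import PrivateKey, SealedBox

# Bob generates a key pair and publishes only the public half.
bob_private = PrivateKey.generate()
bob_public = bob_private.public_key

# Alice encrypts to Bob's public key; she never needs his private key.
ciphertext = SealedBox(bob_public).encrypt(b"see you at the usual place")

# The server only ever sees `ciphertext`. Bob decrypts with his private key.
plaintext = SealedBox(bob_private).decrypt(ciphertext)
print(plaintext)  # b'see you at the usual place'
```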

The drawback is the logistics: The recipients need to have their cryptographic keys available to decrypt the message, which can be a hassle if you have a lot of devices. The key can also be stolen and used to decrypt your messages. For some usage scenarios like chats, there are solutions like the aforementioned OTR and TextSecure (which you should install if you own an Android phone), but there is no comparably convenient solution for eMails. End-to-end encryption also does not protect the metadata (who is talking to whom, when, for how long, et cetera) of your messages, only their contents.

When politicians are talking about “strong encryption”, they are probably referring to end-to-end encryption, because that data is much harder to obtain than transport-encrypted data, which can still be seized on the servers it resides on. To read your end-to-end encrypted data, they would have to seize both the encrypted data and your encryption keys (and compel you to give them the passwords you protected them with), which is a lot harder to do.

Conclusion

Now that we have a basic understanding of the different types of encryption used in the wild, we can talk about how to regulate them. This will be covered in part 2 of this series.


Thanks go out to niemalsnever, FreeFall and DanielAW for proofreading and suggestions. Any remaining mistakes are solely mine.

Footnotes
1 I dislike the term “Terrorist” because it can be (and has been) expanded to include pretty much anyone you disagree with. However, for readability, I will use it in the connotation most used by western media, i.e. meaning the Islamic State, al-Qaeda, et cetera.
2 Although it appears that these efforts were simply put into introducing backdoors into algorithms and implementations instead.
3 And not-so-great, but still necessary tools like OpenSSL.
4 We obviously distinguish between many more than that, but for this article, two will be enough.
5 This is, again, a gross simplification, but sufficient for this article.