
Sustainable data services

Today’s development in the spin around the Investigatory Powers Bill came via the Telegraph, which stated that end-to-end encryption would be banned after all. One line from an anonymous Home Office spokesman struck me:

“That means ensuring that companies themselves can access the content of communications on their networks when presented with a warrant, as many of them already do for their own business purposes, for example to target advertising.”

One reading of this is that internet companies leave people no privacy anyway, so there’s no point complaining about intrusion by the intelligence services. I’ve heard that particular line from government agencies before.

But there’s another way to look at all this. It’s not just end-to-end communication services that “have” customers’ data in such a way that they can’t actually get at it. It’s an entire fast-growing industry of “privacy enhancing technologies”. “Sustainable data services”, you might even call them (see where this is going?). Cloud providers, genomic data analysers — all of them safeguarding people’s data that the UK government might at some point want to throw a warrant at. And there’s a history of using a generous definition of “communications services” when that delivers more surveillance data.

If the government can throw us three different lines of spin per day, I can indulge in a silly speculation. Here it comes. What if the UK government intends to kill off those internet services that aim to use privacy as a selling point? Maybe Apple specifically, maybe the entire privacy enhancement sector. Makes no sense whatsoever, does it?

Except … Same could have been said for the solar energy industry. A growing industry, good for the economy, good for the environment, good for the carbon targets – and still it got zapped, with companies closing down, jobs lost, and economic capital mindlessly destroyed. Best guess why? Vested interests behind the government. The oil industry and their fracking friends.

So if this government are prepared to sacrifice one healthy growing branch of the economy to satisfy the vested interests behind the screens, why not another? And we can speculate further about what these vested interests may be – the securocrats, or even those internet industries that have more or less given up on privacy.

Of course this is a silly speculation. Silly silly silly. Shouldn’t let myself get dragged into conspiracies.

But if the ban on end-to-end encryption remains on the table (see the previous post on this blog), I still think the “privacy enhancing technologies” industry is at risk. The next line of the Home Office spokesman’s quote is:

These companies’ reputations rest on their ability to protect their users’ data.

That’s so nearly right that it’s really wrong. It is the ability to protect their users’ data so well that even the companies themselves can’t get to them.

Mr Cameron has lost his keys

It’s almost a year now since David Cameron started his attack on encryption: on 25 November 2014, in the debate following the ISC’s Lee Rigby report, he said [Hansard, column 764]:

The question we must ask is: are we prepared to have a means of communication —the internet and a number of modern methods— that we are not able to intercept? My answer is clear: we should not accept that. We should legislate to ensure that that is the case.

On the assumption that this refers to confidentiality and thus encryption, Mr Cameron clearly wants to be able to forbid encryption that he cannot decrypt – no matter how many denials and “U-turns” (including today’s) have followed this story. In other words, in Mr Cameron’s view, all decryption keys are ultimately his. Question is, where did he leave them?

Didn’t he have them all along?

If the US government had had their way in the late 1990s when strong cryptography first came within reach of the broad population, they and other “trusted parties” would have safely overseen crypto, by keeping copies of our keys. This idea of a “key recovery mechanism” or “key escrow” would have helped us, hapless individuals: if we lost our keys we could ask them for a spare copy. For house keys, this works quite well: my neighbours’ sons are indeed quite happy that I have a spare one in escrow. So far, I haven’t been tempted to use this key to find out my neighbours’ secrets: it’s a matter of trust — then again, it’s not my job to convince myself that my neighbours aren’t terrorists. In terms of technical risks (such as safe storage and transmission of keys), key escrow is not all that different from having encryption back doors. In the international context of the internet, the question also arises of which government(s) gets those keys. For all of those reasons, it is good that key escrow never took off.
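
To make the idea concrete, here is a toy sketch of key escrow in a few lines of Python. The names and the dictionary standing in for the “trusted party” are mine, purely for illustration, and it uses the third-party cryptography package; the point is simply that the copy which lets you recover a lost key is exactly the copy that lets the escrow agent read everything.

    from cryptography.fernet import Fernet

    escrow_registry = {}              # the "trusted party's" store of key copies

    def generate_escrowed_key(user: str) -> bytes:
        key = Fernet.generate_key()
        escrow_registry[user] = key   # a copy is deposited in escrow
        return key

    alice_key = generate_escrowed_key("alice")
    ciphertext = Fernet(alice_key).encrypt(b"my private notes")

    # Key recovery and a warrant look exactly the same to the escrow agent:
    recovered_key = escrow_registry["alice"]
    print(Fernet(recovered_key).decrypt(ciphertext))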

Can’t he just ask nicely?

The last time the UK government flagged up encryption as a problem, in the early 2000s, they ended up with Part 3 of the Regulation of Investigatory Powers Act. This gives government services the power to demand the keys for any encrypted information they come across. Secrecy may be imposed on such requests, and jail may follow when keys are refused – up to five years if the investigation concerns national security or child indecency. Obvious excuses like “I lost the key” (and “you need to prove that I had it in the first place”) or “helping you decrypt my child abuse images would incriminate me” have already been declared invalid.

So this seems to be sufficient for Cameron’s purposes – five years in jail surely is a serious threat. But Cameron wants more – what he hasn’t admitted so far is that he not only wants to be able to read all communications, but also doesn’t want the people being spied on to be aware of it. That may be a justifiable requirement if the person is part of a larger conspiracy, but more broadly it is an expectation that has become all too natural in this golden age of surveillance.

So if he won’t ask users, where does he expect to find keys?

Cameron’s hope is that the encrypted communications of individuals can still be snooped on without them knowing, by going for the internet services that enable those communications. He and his government colleagues have been reassuring us that they don’t want to restrict the use of encryption in electronic commerce. No wonder, and not just because secure e-commerce is crucial for the UK economy. In any case, the intelligence information there is in the transactions rather than in the communications, and the internet firms store those transactions and, if needed, share them anyway. Amazon, for example, are on record as having mostly provided information when asked, and eBay help government agencies ensure their customers aren’t involved in illegal activity.
Internet services outside the jurisdiction of the UK and its friends are less likely to be helpful, but the government needs to impose draconian internet measures to punish non-compliance in other areas anyway — though EU “net neutrality” rules may yet prevent them.

End-to-end encryption, and keeping the keys?

So for encrypted communications between users and their internet services, certainly those with a UK business presence, Cameron can get all the keys he needs by just asking. How about for end-to-end encryption, where the communication is enabled by an internet service but takes place between two of its users? Joanna Shields gave the most explicit characterisation of the government position on this in the Lords this week:

The Prime Minister did not advocate banning encryption; he expressed concern that many companies are building end-to-end encrypted applications and services and not retaining the keys.
[…]
It is absolutely essential that these companies which understand and build those stacks of technology are able to decrypt that information and provide it to law enforcement in extremis.

As he has done consistently before, Cameron displays a lack of technical sophistication here. If the companies “retained” the keys, it would not be called “end-to-end encryption”. The point of “end-to-end” is not just that the communication isn’t decrypted in the middle, but foremost that it can’t be decrypted by anyone other than the intended recipient.

So how do the endpoints get a key for end-to-end encryption?

Surely to set up end-to-end encryption, the enabling service provides the key initially for both parties to communicate? That would indeed be the simple obvious solution — but it is based on the assumption that the service can be trusted. If that trust is not absolute, the security model must assume the key can and will be abused (the service might as well have the key in escrow), and thus encryption must be considered insecure. With the public’s awareness of surveillance programs like PRISM, service providers that use privacy as a selling point should not even want to ask for such absolute trust.

Skype, for example, uses AES encryption for calls, which means that there is a single key shared between the two parties. Skype avoids saying whether the central service ever knows or retains this key, and this is one of the reasons the EFF refuses to call Skype’s encryption end-to-end.
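
To see why a single shared key is the whole story, here is a small sketch in Python using the third-party cryptography package. It has nothing to do with Skype’s actual protocol – it just shows the general shape of shared-key AES encryption: whoever holds a copy of the key can decrypt every packet.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    call_key = AESGCM.generate_key(bit_length=128)     # the single shared key

    def encrypt(key: bytes, plaintext: bytes) -> bytes:
        nonce = os.urandom(12)                         # fresh nonce per message
        return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

    def decrypt(key: bytes, blob: bytes) -> bytes:
        return AESGCM(key).decrypt(blob[:12], blob[12:], None)

    packet = encrypt(call_key, b"hello from one end of the call")

    # Either endpoint can decrypt -- and so can anyone else who retained call_key.
    print(decrypt(call_key, packet))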

So does the service have to forget the key?

Under normal security assumptions, once something is known to a party, it is never forgotten. So if the service is initially involved in establishing the key, it must be assumed that it will keep and potentially use any knowledge gained from that.

Fortunately, and surprising the first time you see it, two parties that want to communicate securely can in fact safely agree a key between them without involving a third party, using only the insecure communication channels between them. The most famous method for this is Diffie-Hellman key exchange, and protocols based on it are still used in practice; WhatsApp’s end-to-end encryption, for example, relies on such a protocol. In that setting the central service never knows the key — which is in this case a frequently changing series of AES keys.
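
A minimal sketch of the idea, with toy numbers far too small for real use (actual protocols, including the one behind WhatsApp, use very large primes or elliptic curves):

    import secrets

    p = 0xFFFFFFFB     # toy prime modulus -- far too small for real use
    g = 5              # public generator

    # Each party picks a private exponent; only g^x mod p ever leaves the device.
    alice_private = secrets.randbelow(p - 2) + 1
    bob_private = secrets.randbelow(p - 2) + 1

    alice_public = pow(g, alice_private, p)    # sent over the insecure channel
    bob_public = pow(g, bob_private, p)        # sent over the insecure channel

    # Both ends compute the same value; the service only ever saw the public parts.
    alice_shared = pow(bob_public, alice_private, p)
    bob_shared = pow(alice_public, bob_private, p)

    assert alice_shared == bob_shared
    print(hex(alice_shared))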

Shields probably understands this, and WhatsApp was actually mentioned explicitly in that Lords exchange. She does not talk of “retaining” a key that the service never knew – merely about the service being able to decrypt. But only “in extremis” – and maybe that should really be interpreted as: going beyond what is possible, or asking the service to lie to its users about the true level of security provided.

Overall then, over the past year we have seen the government’s policy statements on encryption become more focused — but hardly more realistic.