Listen to Tim Berners-Lee: Don’t weaken encryption!

It’s a bad idea to intentionally weaken the security that protects hardware, software, and data. Why? For many reasons, including the basic right (in many societies) of individuals to engage in legal activities anonymously. Another reason: knowledge about weakened encryption, back doors, and secret keys can be leaked or stolen, leading to unintended consequences and breaches by bad actors.

Sir Tim Berners-Lee, the inventor of the World Wide Web, is worried. Some officials in the United States and the United Kingdom want to force technology companies to weaken encryption and/or provide back doors to government investigators.

In comments to the BBC, Sir Tim said that there could be serious consequences to handing over the keys that unlock coded messages, and to forcing carriers to help with surveillance. The BBC story said:

“Now I know that if you’re trying to catch terrorists it’s really tempting to demand to be able to break all that encryption but if you break that encryption then guess what – so could other people and guess what – they may end up getting better at it than you are,” he said.

Sir Tim also criticized moves by legislators on both sides of the Atlantic, which he sees as an assault on the privacy of web users. He attacked the UK’s recent Investigatory Powers Act, which he had criticized when it went through Parliament: “The idea that all ISPs should be required to spy on citizens and hold the data for six months is appalling.”

The Investigatory Powers Act 2016, which became U.K. law last November, gives broad powers to the government to intercept communications. It requires telecommunications providers to cooperate with government requests for assistance with such interception.

It Started with Government

Sir Tim’s remarks appear to have been prompted by statements from his own government. U.K. Home Secretary Amber Rudd said it is “unacceptable” that terrorists were using apps like WhatsApp to conceal their communications, and that there “should be no place for terrorists to hide.”

In the United States, there have been many calls for officials to be given back doors into secure hardware, software, and data repositories. One that received widespread attention came in 2016, when the FBI tried to compel Apple to unlock an iPhone used by one of the San Bernardino attackers. Apple refused, sparking a widespread public debate about the government’s power to pursue terrorists and suspected criminals – and about whether companies can be forced to break into their own products, or to build intentional weaknesses into encryption.

Ultimately, of course, the FBI got the data by using a third-party tool to break into the iPhone. That didn’t settle the question, and indeed, the debate continues to rage. So why not provide a back door? Why not use crippled encryption algorithms that can be easily broken by anyone who knows the flaw? Why not give law-enforcement officials a “master key” to encryption algorithms?

Aside from the legal and moral issues, weakening encryption puts everyone at risk. Someone like Edward Snowden, or a spy, might steal information about the weakness and offer it to criminals or a state-sponsored organization, or sell it on the dark web. And then everyone – not just the FBI, not only MI5 – can break into systems, potentially without leaving a fingerprint or a log entry.
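To make that single point of failure concrete, here is a minimal Python sketch (using the “cryptography” package) of a hypothetical key-escrow scheme. The names and structure are illustrative assumptions, not any real proposal: each message’s session key is wrapped twice, once for the recipient and once under a single escrowed “master key.”

```python
# Minimal sketch of a hypothetical key-escrow ("back door") scheme,
# using the "cryptography" package. All names are illustrative.
from cryptography.fernet import Fernet

master_key = Fernet.generate_key()   # the single escrowed master key
escrow = Fernet(master_key)

def send_message(recipient_key: bytes, plaintext: bytes):
    """Encrypt for one recipient, but also wrap the session key for escrow."""
    session_key = Fernet.generate_key()
    ciphertext = Fernet(session_key).encrypt(plaintext)
    for_recipient = Fernet(recipient_key).encrypt(session_key)
    for_escrow = escrow.encrypt(session_key)   # <-- the back door
    return ciphertext, for_recipient, for_escrow

alice_key = Fernet.generate_key()
ct, for_alice, for_escrow = send_message(alice_key, b"meet at noon")

# The recipient decrypts normally...
session = Fernet(alice_key).decrypt(for_alice)
assert Fernet(session).decrypt(ct) == b"meet at noon"

# ...but so can ANYONE who steals the one master key:
stolen = Fernet(master_key)
session = stolen.decrypt(for_escrow)
assert Fernet(session).decrypt(ct) == b"meet at noon"   # no recipient key needed
```

The point of the sketch: every message ever sent now depends on one secret, and once that secret leaks, no patch can protect traffic that was already escrowed.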

Stolen Keys

Consider the widely distributed Content Scramble System (CSS) used to secure commercial movies on DVD discs. In theory, the DVDs were encoded so that they could be played only on authorized devices (like DVD players) whose makers had licensed the decryption keys. The 40-bit cipher, introduced around 1996, was cracked in 1999. It’s essentially worthless.
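Some back-of-envelope arithmetic shows why. A 40-bit keyspace holds only about 1.1 trillion keys; at an assumed, purely illustrative rate of one billion key trials per second, the sketch below puts the average search time at roughly nine minutes – and cryptanalytic weaknesses in CSS cut the real effort further still.

```python
# Back-of-envelope: average time to brute-force a key, assuming the
# attacker must try half the keyspace on average. The trial rate is an
# illustrative assumption, not a benchmark.
SECONDS_PER_DAY = 86_400

def brute_force_days(key_bits: int, keys_per_second: float) -> float:
    return (2 ** key_bits / 2) / keys_per_second / SECONDS_PER_DAY

for bits in (40, 56, 128):
    days = brute_force_days(bits, keys_per_second=1e9)
    print(f"{bits:>3}-bit key: ~{days:.2e} days on average")

# 40-bit:  ~6.4e-03 days (about nine minutes)
# 56-bit:  ~4.2e+02 days on one machine; hours across a cluster
# 128-bit: ~2.0e+24 days, i.e. effectively never
```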

Or consider “TSA-approved” luggage locks, nominally secured by a key or combination. There are seven master keys that allow airport security staff to open any “TSA-approved” lock without cutting it off – and all seven have been compromised. One famous breach of that system: The Washington Post published a photograph of the master keys, and based on that photo, hackers could easily reproduce them. Whoops!

Speaking of WhatsApp: as was revealed this January, the software had a flaw in its end-to-end encryption that could let others listen in. The story was first reported by the Guardian, which wrote:

WhatsApp’s end-to-end encryption relies on the generation of unique security keys, using the acclaimed Signal protocol, developed by Open Whisper Systems, that are traded and verified between users to guarantee communications are secure and cannot be intercepted by a middleman.

However, WhatsApp has the ability to force the generation of new encryption keys for offline users, unbeknown to the sender and recipient of the messages, and to make the sender re-encrypt messages with new keys and send them again for any messages that have not been marked as delivered.

The recipient is not made aware of this change in encryption, while the sender is only notified if they have opted-in to encryption warnings in settings, and only after the messages have been re-sent. This re-encryption and rebroadcasting of previously undelivered messages effectively allows WhatsApp to intercept and read some users’ messages.
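To see why that behaviour matters, here is a toy Python model of the mechanism the Guardian described. It is emphatically not the Signal protocol – the `registered_keys` directory and all names are illustrative assumptions – but it shows the policy question: if a client silently re-encrypts undelivered messages under whatever new key the server reports, then whoever controls the key directory can substitute a key of their own.

```python
# Toy model (NOT the Signal protocol) of silent re-encryption after a
# recipient "key change". Names and structure are illustrative.
from cryptography.fernet import Fernet

registered_keys = {"alice": Fernet.generate_key()}   # server's key directory

def resend_undelivered(plaintext: bytes, recipient: str,
                       old_key: bytes, block_on_key_change: bool) -> bytes:
    """Sender-side logic when the server reports the recipient's key changed."""
    new_key = registered_keys[recipient]       # value supplied by the server
    if block_on_key_change and new_key != old_key:
        raise RuntimeError("key changed: wait for the user to verify it")
    return Fernet(new_key).encrypt(plaintext)  # silent re-encryption

old_key = registered_keys["alice"]

# A malicious (or compelled) server swaps in its own key for Alice:
server_key = Fernet.generate_key()
registered_keys["alice"] = server_key

ct = resend_undelivered(b"meet at noon", "alice",
                        old_key=old_key, block_on_key_change=False)
print(Fernet(server_key).decrypt(ct))   # the server reads the message
```

Flip `block_on_key_change` to `True` and the resend raises instead: safer, but messages can silently stall until the user verifies the new key. That delivery-versus-safety trade-off is exactly what the Guardian’s report was about.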

Just Say No

Most (or all) secure systems have flaws. Yes, they can be broken, but the expectation is that when a defect or vulnerability is found, the system will be patched and upgraded. In other words, we expect secure systems to be exactly that: secure. So let’s say “no” to intentional loopholes, back doors, master keys, and crippled encryption. We’ve all seen that government secrets don’t stay secret – and even if we believe that government spy agencies should have the right to unlock devices or decrypt communications, none of us wants those abilities to fall into the wrong hands.