Cryptography Policy Laundering
By Susan Landau
Professor Landau is Bridge Professor of Cyber Security and Policy at the Fletcher School of Law and Diplomacy and the School of Engineering, Department of Computer Science, Tufts University.
Last fall, law enforcement officials from Australia, Canada, New Zealand, the United Kingdom, and the United States—the Five Eyes—issued a Statement of Principles on Access to Evidence and Encryption. The conflict between tech firms and governments over encryption is four decades old, and while the intelligence community seems largely to have accepted that communications are encrypted, law enforcement continues to battle the technology’s availability.
Cryptosystems consist of two parts: an algorithm—a method for encrypting—and a secret—the key—a variable that is applied during the use of the algorithm. These abstract definitions are most easily explained by an example; the easiest is the Caesar shift, a two-thousand-year-old method that shifts each letter by a fixed amount—that’s the algorithm for the Caesar shift. The key indicates how much the shift should be; a shift of two would take the letter “a” to “C,” “b” to “D,” … “y” to “A,” and “z” to “B.”
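The Caesar shift described above is simple enough to sketch in a few lines of code. The function name and the uppercase-ciphertext convention are just illustrative choices, following the article's "a" to "C" example:

```python
def caesar_encrypt(plaintext: str, key: int) -> str:
    """Shift each letter forward by `key` positions, wrapping from z back to a.

    The algorithm is the fixed shifting procedure; the key is how far to shift.
    Ciphertext letters are written in uppercase, per the article's convention.
    """
    result = []
    for ch in plaintext.lower():
        if ch.isalpha():
            shifted = (ord(ch) - ord('a') + key) % 26
            result.append(chr(ord('A') + shifted))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

print(caesar_encrypt("yz", 2))  # prints "AB"
```

With a key of 2, “y” wraps around to “A” and “z” to “B,” exactly as in the example above.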
Now a Caesar shift is a quite elementary cryptosystem; the ones used to authenticate a user at an ATM or to securely send a credit card number over the Internet are far more complex.
Let’s return to the issue plaguing law enforcement. Strong forms of encryption—essentially encryption unbreakable by current methods—secure the data on devices and protect communications. But because the data is locked away, encryption can prevent government investigators from obtaining evidence.
One solution might be to make encryption keys recoverable; then if a user loses a key, or a government investigator with a lawful warrant needs access to the encrypted information, it’s possible to provide the data, whether encrypted voice communications or data on a locked device. Such an escrow method would solve the law-enforcement issue—and also save the day when a user forgets her key and can’t access an account.
There are, in fact, many encryption systems that do store the keys; companies, for example, typically escrow employee encryption keys so that if an employee is hit by the proverbial bus, the company can recover the employee’s files. Such key recovery “solutions,” however, create their own set of security risks.
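The corporate escrow arrangement can be sketched in a few lines of toy code. Everything here is illustrative: the XOR “cipher,” the `toy_encrypt` helper, and the `escrow_db` dictionary are stand-ins, not real cryptography; an actual system would use a vetted cipher such as AES and a hardened key-storage service.

```python
import hashlib
import os

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy stream cipher: XOR the data with a SHA-256-derived keystream.
    # For illustration only -- this is NOT a secure construction.
    stream = b''
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, 'big')).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

toy_decrypt = toy_encrypt  # XOR is its own inverse

# An employee encrypts a file under a random data key...
data_key = os.urandom(32)
ciphertext = toy_encrypt(b"quarterly report", data_key)

# ...and the employer escrows a copy of that key.
escrow_db = {"alice": data_key}

# Later, if the employee loses her key (or is hit by the proverbial
# bus), the escrowed copy recovers the plaintext.
recovered = toy_decrypt(ciphertext, escrow_db["alice"])
print(recovered)  # prints b'quarterly report'
```

The security problem discussed below is visible even in this sketch: whoever controls `escrow_db` can read every escrowed file, so the escrow store itself becomes the prize.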
Concentrating encryption keys in one location—or a few—creates a rich target for attackers; what’s worse is that a key recovery center is extremely hard to secure. If the key recovery center is used frequently, securing it is even harder. That’s why, again and again, computer security experts strongly recommend against such systems, calling them “inherently less secure, more expensive, and much more complex than those without [escrow capabilities].” (Disclosure: I was a co-author on the second work.) The point is that requiring key recovery systems at a national scale—for that’s what would be required for law-enforcement access—would be a serious security risk. It’s not just computer security experts who believe that such escrow systems are a mistake; so do many from the intelligence and law-enforcement communities.
In 2015, former US Secretary of Homeland Security Michael Chertoff, former NSA Director Mike McConnell, and former US Deputy Defense Secretary William Lynn III wrote that, “With almost everyone carrying a networked device on his or her person, ubiquitous encryption provides essential security.” During the height of the battle between Apple and the FBI over the San Bernardino terrorist’s locked iPhone, former NSA Director Michael Hayden told New America that, “American security is better served with unbreakable end-to-end encryption than it would be served with one or another front door, backdoor, side door, however you want to describe it.” And Jonathan Evans, the former head of the UK’s Security Service, MI5, told The Guardian, “I’m not personally one of those who thinks we should weaken encryption because I think there’s a parallel issue, which is cybersecurity more broadly … It’s very important that we should be seen and be a country in which people can operate securely; that’s important for our commercial interests as well as our security interests, so encryption in that context is very positive.”
That’s what makes the September Five Eyes statement so surprising. For this is not a statement by the governments of the Five Eyes; rather, it came from the “Attorneys General and Interior Ministers of the United States, the United Kingdom, Canada, Australia and New Zealand.” In other words, the government ministers who have been fighting for controls on the use of strong encryption issued an international statement arguing that unless the tech companies came to heel, laws would follow restricting the use of encryption. This is a policy the Australian government has been pushing for quite some time; it fits with long-term U.S. law enforcement interests as well. But it’s not U.S. government policy, nor, with the exception of Australia, is it government policy elsewhere.
What followed the September Five Eyes statement was somewhat predictable. In December, the Australian government passed an encryption bill enabling law enforcement to demand that, in the event a communications company or provider and law enforcement are unable to access a suspect’s data, the company build a new function to help police get at that data, or face fines (this is what FBI Director James Comey sought to force Apple to do during the 2016 standoff over the San Bernardino terrorist’s iPhone). This is what we call an encryption backdoor. The reason backdoors are such a poor idea is that they don’t provide access only to law enforcement; weakening the encryption system so that others can get at the data also gives the bad guys a way to break in. That’s why the intelligence and law-enforcement experts quoted above oppose these backdoored “solutions.”
It’s too soon to tell the consequences of the Australian action. But The Economist reported that tech firms are not very invested in Australia and may choose to pull out of the market rather than comply. For if the ability to sell in a market of 25 million people means compromising security, it’s not worth it.
It might be that the ministers involved themselves had second thoughts about the statement—or perhaps they even got pushback from the national security side of their respective governments. The statement disappeared from its original posting shortly after it went up, though it may still be found on the Australian government site. As policy laundering goes, the Five Eyes Statement of Principles on Access to Evidence and Encryption may not have been a particularly wise effort.