One of the most hotly contested issues in information security is whether the government should encourage the ubiquitous use of strong encryption to protect data both at rest and in transit, or whether the government (and by this, I mean any government) should require users to use only “government approved” encryption.
That is, crypto algorithms that have been deliberately weakened to permit government agents – presumably acting with some lawful authority – to obtain access to the contents of encrypted messages.  With the transition from the Obama administration’s mixed messages on backdoor encryption to the new Trump administration, what can we expect?
The short answer is, more mixed messages.
The good thing about crypto is that, when properly implemented and universally used, it makes it really, really hard to obtain the contents of messages.  This is why most breach-notification laws carve out a safe harbor: the loss of properly encrypted data doesn’t count as a data “breach.”
Encrypted data ain’t data.  They’re just files.  It’s also why failure to encrypt databases, laptops, thumb drives, and other storage media is considered by the FTC and others to constitute both “deceptive” and “unfair” trade practices.  Encryption is a baseline security control under the NIST Framework, the ISO standards, the PCI-DSS requirements, and just about everything else.  That’s because crypto works.
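The “encrypted data ain’t data” point is easy to see in code.  Below is a toy sketch, not production crypto (a real system would use a vetted library and an authenticated cipher such as AES-GCM): it builds a throwaway stream cipher from HMAC-SHA256 in counter mode just to show that, without the key, an encrypted record is an opaque blob of bytes.  The record contents and function names here are purely illustrative.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from key + nonce via HMAC-SHA256 in counter mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR with the keystream; running it again with the same key/nonce decrypts.
    return bytes(d ^ k for d, k in zip(data, keystream(key, nonce, len(data))))

key = secrets.token_bytes(32)       # stays with the data owner
nonce = secrets.token_bytes(16)
record = b"SSN: 078-05-1120"        # illustrative sensitive record
ciphertext = xor_crypt(key, nonce, record)

# Without the key, the stolen file is just bytes -- not a "breach" of the record.
assert ciphertext != record
# With the key, the original record comes right back.
assert xor_crypt(key, nonce, ciphertext) == record
```

The design point is that security lives entirely in the key: lose the laptop and the thief holds noise; lose the key and the encryption bought you nothing.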
There are currently all kinds of ways to defeat crypto – even when it is properly designed and implemented.  They range from RAM scrapers that capture data before encryption or after decryption, to man-in-the-middle (MitM) attacks on SSL/TLS, to keyloggers and Trojan horse programs that steal or generate crypto keys, to screen captures of data displayed on a screen (e.g., a camera over your shoulder).
And that’s when crypto is properly implemented.  As with the German Enigma machine, whose codes were broken partly because of errors in deployment and patterns and routines in transmission, even strong crypto can be broken if the user or the software makes mistakes.  And finally, there’s always brute force: try every combination and you’re bound to get lucky sometime.  These are just the unclassified ways.
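The brute-force point is worth making concrete, because it cuts both ways.  Here is a minimal sketch, under the assumption of a deliberately tiny keyspace (a short lowercase passphrase protecting a SHA-256 digest; the scenario and function name are hypothetical), showing why “try every combination” works against weak keys and is hopeless against real ones.

```python
import hashlib
import string
from itertools import product

def brute_force(target_digest: bytes, length: int):
    """Try every lowercase string of the given length until one hashes to the target."""
    for combo in product(string.ascii_lowercase, repeat=length):
        candidate = "".join(combo)
        if hashlib.sha256(candidate.encode()).digest() == target_digest:
            return candidate
    return None

# A 3-letter "key" falls in at most 26**3 = 17,576 tries -- milliseconds of work.
target = hashlib.sha256(b"key").digest()
assert brute_force(target, 3) == "key"

# But each added character multiplies the work by 26, and a real 256-bit key has
# about 1.2e77 possibilities -- which is why "getting lucky" is not a strategy
# against properly implemented crypto, and why governments want another way in.
```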
The problem is that properly encrypted data may be really hard to decrypt – particularly on the fly.  So it would be really convenient if the government had a master key to unlock all encrypted files, to be used if – and only if – there was lawful authority.  Of course, nobody else would ever figure out how to generate such a master key, right?  I mean, remember Cosmo from the movie Sneakers?  (Kids, ask your parents.)
So how does the incoming administration feel about such a mandatory backdoor encryption key for US law enforcement and intelligence agencies?  They strongly support it, and they also oppose it as worthless and a waste of time and energy.
Both incoming Attorney General nominee Jeff Sessions and incoming CIA Director nominee Mike Pompeo have come out in favor of such crypto backdoors.  Politico has reported that Pompeo told the Washington Examiner in January:
“Our intelligence folks and our law enforcement people need to have access to the information in the same way they do other kinds of activities.  It would not be permissible for you to build a home and not let law enforcement in if they had a search warrant…. It has to be the case that persons who built the house and control the technology have to respond to that lawful request for information.”
It’s not clear whether this means that Pompeo is advocating that device manufacturers, ISPs, and hardware, software, firmware, and crypto providers all install mandatory backdoors, or something else.
In addition, in order to be effective, the use of such “backdoor crypto” would have to be ubiquitous and mandatory.  Otherwise you end up with a situation where the good guys are using weakened crypto (for which the government and maybe hackers have keys) but the bad guys are using homebrew crypto, or available foreign crypto with no such restriction.
In fact, in a January 3 op-ed piece, Pompeo himself recognized this fact, noting that mandatory crypto backdoors would “do little good, since terrorists would simply switch to foreign or home-built encryption.”  No duh.
Continuing with the criminalization of security, Pompeo also raised the specter of targeting individuals or organizations who attempt to protect their data and that of their customers in the same op-ed noting that “the use of strong encryption in personal communications may itself be a red flag” of criminal or terrorist activity.
So just like paying cash for something, or using P2P programs, or browsing with Tor, refusing to expose your activities to public scrutiny becomes a sign of criminal activity.  If you don’t have anything to hide, why are you using crypto?  Under this regime, the mere use of encryption technology – something that virtually every security standard compels – becomes probable cause for a search, or at least grounds for a watch list.  Damned if you do.  Same thing if you don’t.
Similar views were echoed by Attorney General nominee Jeff Sessions, who criticized Apple for not designing its phones to provide for easy decryption of customer messages on demand by the government (and the courts).
Sessions noted that “In a criminal case, or [it] could be a life and death terrorist case, accessing a phone means the case is over. Time and time again, that kind of information results in an immediate guilty plea, case over.”
That’s a curious way to put it.  I can understand the “ticking time bomb” case, where you need immediate access to an iPhone to prevent some imminent harm.  But Sessions seems to suggest that everyone’s privacy and security should be weakened in order to spare the government the time and expense of taking someone (anyone, in any kind of case) to trial, just to get an “immediate guilty plea.”
Remember, to get a guilty plea, you need to have a defendant.  And if the phone belongs to that defendant, you can likely already compel the defendant to give up his or her own PIN, key, or biometric.  If you can’t compel such production, it’s because the Constitution forbids it.  Sure, there could be times when the locked phone is not the defendant’s – but inducing a guilty plea is not a great justification for weakening everyone’s privacy and security.
President-elect Trump expressed similar sentiments when he called for a boycott of Apple and its products in the wake of the company’s refusal to create a backdoor around the iPhone’s security to permit the government to investigate the San Bernardino attacks.  So in an incoming Trump administration, we can expect, if not open hostility, at least skepticism toward the tech industry’s desire to continue to promote strong encryption.
This raises a dilemma for the tech industry’s dealings with the new administration.  Should it attempt to work with the new administration to develop a legal and technical compromise that will give the government access to information with appropriate safeguards while otherwise protecting the security of that data?
If you answered “yes” to that question, then you were fooled.  That’s because there ultimately is no “compromise” legal position or technology.  Sure, we would LOVE for there to be magic software that could determine the motives and legal authority of the person seeking access and let in only good people with good motives and pure intent.
Like pulling Excalibur from the stone, right?  But technology doesn’t work like that.  A multiparty master key (one part held by, say, the executive branch and another by the judiciary, working like epoxy: effective only when the two parts are put together) simply creates a vulnerability that can be exploited by hackers or others.  And even if it doesn’t, why should the US government be the only one with access to the key?  What about the Canadians?  Germans?  French?  Iranians?  Sudanese?  North Koreans?  You get the idea.
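To be fair, the “epoxy” part of a multiparty key is real and simple math.  Here is a minimal sketch of XOR secret splitting, the textbook version of such a two-part key (no specific government scheme is being described; the share names are purely illustrative): either share alone is indistinguishable from random noise, and only the combination reconstructs the master key.

```python
import secrets

def split_key(master: bytes):
    """Split a key into two XOR shares; either share alone is uniform random noise."""
    share_a = secrets.token_bytes(len(master))
    share_b = bytes(m ^ a for m, a in zip(master, share_a))
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    # Only the epoxy-style combination of both shares yields the master key.
    return bytes(a ^ b for a, b in zip(share_a, share_b))

master_key = secrets.token_bytes(32)
executive_share, judicial_share = split_key(master_key)

# Neither branch's share is the key by itself...
assert executive_share != master_key and judicial_share != master_key
# ...but put together, they reconstruct it exactly.
assert combine(executive_share, judicial_share) == master_key
```

Which is precisely the problem: the math works, but the combined master key now exists, and whoever steals or subpoenas both shares – a hostile intelligence service, an insider, another government – holds every user’s data.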
Imagine if the law mandated that all webcams include a secret access code that the government could input to surreptitiously turn the webcam and microphone on in case the government got a warrant for the webcam.  I think people would soon stop buying devices with webcams.
And with the explosion of IoT devices and wearables, I can certainly imagine law enforcement wanting to infiltrate your Fitbit, crock pot, or pacemaker.  But that doesn’t mean that the manufacturer of the device should (or even could) insert a backdoor into the device for law enforcement purposes.  If there really were a way to let “good” people in and keep “bad” people out, we would have a solution for the entire problem of cybersecurity, right?
While it may be productive for people in the tech industry to continue to engage law enforcement on these important issues, neither side should delude itself into thinking that there are any easy solutions, whether policy or technical.  Every solution likely comes with a host of new problems.  It’s important to keep engaging, however, if only to keep the other side from acting unilaterally.
It’s not likely that Apple, Google, Microsoft, Cisco, and others will willingly weaken the crypto in their products in the future.  It’s also not likely that the CIA, NSA, FBI, DOJ, or others would be happy to buy products with such weakened crypto.  With the incoming Trump administration enjoying majorities in the House and Senate, and holding court appointments, we can expect a dramatic shift in position on encryption.  However, like a clipper ship tacking in the wind, that shift may end up being 360 degrees, leading us right back to where we started.
