In the continuing stream of revelations about reported NSA hacking in the name of national interests comes more news. Data moving between data centers operated by the world’s largest Internet email companies allegedly was intercepted and collected for analysis because the encryption protecting the data was bypassed.

The prevailing theories about how the NSA apparently did this vary. Did the NSA receive the decryption keys from the email providers themselves, perhaps under court order? Did the NSA steal the keys? Did the agency use custom-built supercomputers to break the keys? Did the NSA use backdoors it had managed to get inserted into encryption chips earlier (remember the Clipper Chip)? Or did the NSA grab the data before it was encrypted and sent on its way? I offer another theory: the NSA used experimental quantum computers (QCs) to break the keys. But as with so many theories, there is no proof.

Nevertheless, this leads to a question about the state of the technology when it comes to quantum computing and, more specifically, quantum cryptanalysis.

Messages or transactions protected with “weak” keys can sometimes be broken by brute-force attacks. To do this, powerful computers, sometimes networked to multiply their computing power, jointly work through the key space until the right key is found. Increasing key length increases key strength exponentially. “Factoring” attacks public-key encryption from a different angle: it means decomposing the very large integer used as a public key into the two large prime numbers from which it was built, which reveals the private key. Factoring very large integers isn’t a feasible code-cracking technique on conventional computers. Today it would take multiple lifetimes of the universe to break the protection provided by commercially used algorithms employing recommended key lengths.
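To make the brute-force idea concrete, here is a minimal sketch of my own (not from any real attack tool). It uses a deliberately weak toy XOR cipher with a 16-bit key so the loop finishes quickly; the key and plaintext values are hypothetical, and real ciphers with 128-bit or longer keys make this kind of exhaustive search hopeless.

```python
import itertools

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy cipher: XOR the data with a repeating key (illustration only, not real crypto)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Hypothetical intercepted message, encrypted under an unknown 16-bit key.
secret_key      = bytes([0x4a, 0x9c])
known_plaintext = b"ATTACK AT DAWN"
ciphertext      = toy_encrypt(secret_key, known_plaintext)

# Brute force: walk the entire 2**16 key space until the known plaintext reappears.
# Each extra byte of key multiplies this loop by 256, which is the exponential wall.
for candidate in itertools.product(range(256), repeat=2):
    key = bytes(candidate)
    if toy_encrypt(key, ciphertext) == known_plaintext:
        print("recovered key:", key.hex())
        break
```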

Ever faster computers mean the key space can be searched more quickly and the cryptography broken more easily. But because increasing key length strengthens cryptographic protections exponentially, something new is needed: a proverbial quantum leap from today’s computers to machines exploiting the strange laws of quantum physics. In 1994, Peter Shor demonstrated that a quantum computer could, in principle, factor numbers far faster than digital computers. But building a QC isn’t easy.
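The arithmetic behind that exponential claim is simple. The sketch below is illustrative only, assuming a hypothetical attacker who can test one trillion keys per second; at 128 bits the estimated search time is already on the order of 10^19 years, far longer than the age of the universe.

```python
SECONDS_PER_YEAR  = 365.25 * 24 * 3600
TRIALS_PER_SECOND = 1e12   # hypothetical attacker testing a trillion keys per second

for bits in (40, 56, 80, 128, 256):
    keyspace = 2 ** bits                       # each added bit doubles the search space
    years = keyspace / TRIALS_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits:3d}-bit key: ~{years:.2e} years to exhaust the key space")
```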

Quantum computers exploit the counter-intuitive world of the very, very small, down to the level of individual photons. They use tiny physical systems, such as photons, ions, or molecules, to represent quantum bits of data, or “qubits,” which weirdly can represent a one and a zero simultaneously. A pair of qubits can be 00, 01, 10, and 11 at the same time. This capacity increases exponentially: n qubits can be in 2^n simultaneous states. While a 32-bit register in a conventional computer holds only one of its roughly 4 billion possible values at any moment, a QC with 32 qubits can hold a superposition of all of them.

Furthermore, any operation performed on a set of qubits is performed on all encoded numbers simultaneously, yielding massively parallel computation within a single QC. In a single computational step, a quantum computer with 32 qubits can perform the same mathematical operation on over four billion different numbers. A conventional computer would do this by repeating the same computation over four billion times or by using over four billion processors in parallel. Accordingly, QC processing power can be increased dramatically, resulting in machines with effective throughput in the petaflop and exaflop range.
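The parallelism described in the last two paragraphs can be made concrete with a small classical simulation (a sketch of my own using NumPy, not anything from the article): an n-qubit register is described by 2^n complex amplitudes, and a single transform applied to the register updates every one of those amplitudes, that is, every encoded number, at once.

```python
import numpy as np

n = 10                                   # 10 qubits -> 2**10 = 1024 amplitudes
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                           # register starts in the |000...0> state

# Build the n-qubit Hadamard transform and apply it once: the register ends up
# in an equal superposition of all 2**n basis states, the quantum analogue of
# "holding every number at once". One gate touches every amplitude.
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
transform = np.array([[1.0]])
for _ in range(n):
    transform = np.kron(transform, hadamard)
state = transform @ state

print("non-zero amplitudes:", np.count_nonzero(state))   # 1024
print("each amplitude:", state[0])                        # 1/sqrt(1024)
```

Note that simulating n qubits classically requires storing 2^n numbers, which is exactly why real quantum hardware, which stores that state in n physical qubits, is so attractive and so hard to build.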

University and national government laboratories have been researching quantum computing for years. D-Wave (Burnaby, British Columbia, Canada), born from research at the University of British Columbia, is commercially selling early-stage QCs.

There are many scientific problems that can be addressed using quantum computing. We expect, but cannot confirm, that interested funding sources for research include intelligence agencies that want to use QCs to decrypt messages, as well as to analyze massive amounts of collected data.

And if QCs can be used to break encryption and decrypt protected data, then quantum cryptography will be needed to keep pace and protect that data. And so there is a sort of cryptographic arms race underway, albeit in slow motion, hindered by the technical problems involved.

Quantum cryptography research isn’t looking for better encryption algorithms; rather, it focuses on Quantum Key Distribution (QKD) to strengthen encryption. Some approaches are limited in that they require dedicated fiber-optic connections for key exchanges, although testing is also being done using satellites and over-the-air links. Companies to watch are ID Quantique (Geneva, Switzerland), MagiQ Technologies (Somerville, Mass.) and the Australian company QuintessenceLabs Inc. (with an office in Mountain View, CA). While this work would appear secondary to the motivation to decipher secret messages, reported NSA eavesdropping may accelerate interest in both. But keep in mind that potential solutions are immature, expensive, and highly complex. It will be many years before either quantum computing or cost-effective quantum key exchange is commercially available.
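To make “key distribution” concrete, here is a toy simulation of BB84, the best-known QKD protocol. This is my own minimal sketch, not something from the article or from any vendor’s product: it models only the ideal case, with no eavesdropper, no channel noise, and no error correction or privacy amplification.

```python
import random

def bb84_sift(n_photons: int = 64, seed: int = 1):
    """Toy BB84 run with no eavesdropper: returns Alice's and Bob's sifted keys."""
    rng = random.Random(seed)

    # Alice encodes random bits in randomly chosen bases (0 = rectilinear, 1 = diagonal).
    alice_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_photons)]

    # Bob measures each photon in his own randomly chosen basis.
    # Measuring in the wrong basis gives a random result.
    bob_bases = [rng.randint(0, 1) for _ in range(n_photons)]
    bob_bits  = [bit if ab == bb else rng.randint(0, 1)
                 for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Sifting: the bases (not the bits) are compared over a public channel, and
    # both sides discard the positions where the bases differed, keeping ~half.
    keep = [i for i in range(n_photons) if alice_bases[i] == bob_bases[i]]
    alice_key = "".join(str(alice_bits[i]) for i in keep)
    bob_key   = "".join(str(bob_bits[i]) for i in keep)
    return alice_key, bob_key

alice_key, bob_key = bb84_sift()
print("keys match:", alice_key == bob_key, "| key length:", len(alice_key))
```

In a real deployment the bits ride on individual photons over fiber or free-space links, and an eavesdropper who measures them in the wrong basis disturbs them in a detectable way; that physical guarantee is what the commercial QKD systems mentioned above are selling.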

An effect of quantum computing research is likely to be similar to the effect that recent NSA revelations have had: creating distrust and disbelief that anything can be hidden from well-motivated, well-funded, interested parties, be they spy agencies or commercial rivals. Companies and government agencies need to convince their business partners that the information they are required to keep secret is adequately protected. One way to provide such reassurance is to demonstrate a realistic understanding of how quantum technologies are advancing. Furthermore, organizations need to test and re-test the current, and largely adequate, data protection methods used to maintain the privacy of their sensitive data today. Proper implementation is the “key.” Quantum computing and Quantum Key Distribution, while interesting, remain elusively over the horizon.
