When Confidentiality and Security Collide
Defense researchers are developing exotic cryptography techniques that marry privacy and data sharing.
You might think that homomorphic cryptography, obfuscation techniques and privacy concerns have nothing in common. You would be mistaken.
The Defense Advanced Research Projects Agency (DARPA), a division of the U.S. Defense Department that creates breakthrough technologies, is advancing these complex but intrinsically connected concepts in a series of efforts that could alter the art of making and breaking code.
The holy grail of cryptography could emerge from one of these efforts, known as the Programming Computation on Encrypted Data (PROCEED) initiative, as technology and security evolve, says John Launchbury, director of the agency’s Information Innovation Office (I2O). PROCEED resulted in the ability to perform computations on encrypted data without first decrypting it. The discovery strengthens data protection and makes it more difficult for malware programmers to write viruses, he explains.
“Any computer or web-friendly device connected to the Internet could gain unauthorized access to pools of computing power, applications or files, compromising information security in cloud computing environments,” says Launchbury, who notes that governments increasingly are leveraging cloud services.
PROCEED makes computations on encrypted data practical, fundamentally changing how they occur in untrusted environments. Users maintain control of the decryption key, and no one else can decrypt either data or computational results, Launchbury says. The program builds on the profound 2009 discovery by IBM researcher Craig Gentry, who documented the first fully homomorphic encryption scheme that lets operators compute arbitrary functions over encrypted data without the decryption key.
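The idea of computing on data while it stays encrypted can be illustrated with the Paillier scheme, an additively homomorphic cipher that predates Gentry's fully homomorphic breakthrough. The sketch below uses tiny, insecure toy primes purely for illustration; it is not any PROCEED implementation.

```python
# Toy additively homomorphic encryption (Paillier scheme) with tiny,
# insecure parameters chosen for illustration only.
import math
import random

p, q = 17, 19            # toy primes; real deployments use large primes
n = p * q                # public modulus
n_sq = n * n
g = n + 1                # standard generator choice
lam = math.lcm(p - 1, q - 1)   # private key component

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n_sq)), -1, n)  # private key: modular inverse

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    return (L(pow(c, lam, n_sq)) * mu) % n

c1, c2 = encrypt(3), encrypt(4)
# Multiplying ciphertexts adds the underlying plaintexts: computation
# happens on encrypted data, without ever using the decryption key.
assert decrypt((c1 * c2) % n_sq) == 7
```

Only the holder of the private key (`lam`, `mu`) can read the result; an untrusted machine sees nothing but ciphertexts, which is the property PROCEED sought to make practical for arbitrary computations.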
Before then, professionals had become quite proficient at safeguarding information in transit and at rest, Launchbury says, but they lacked a way to do computations in the cryptographic space that would correspond directly with computations in the plain text space. Gentry achieved a eureka moment, but the scheme was much too time-consuming and incredibly inefficient, he says.
Estimates of computing using Gentry’s method “are something like 100 trillion times slower than doing it in plain text. We often complain if our machine is running at half-speed. At 100 trillion times slower, it would take two seconds to do one logical operation and then another two seconds for another logical operation,” Launchbury says.
Still, Gentry’s discovery laid the groundwork for DARPA’s quest: “Could we take this incredibly slow thing ... and bring it into the kind of speed that if you have a specialized need, it might actually make sense to use it?” Launchbury proposes.
DARPA researchers did. The goal was to improve from a 100 trillion times slowdown to a whopping billion times slowdown. “You might say, ‘Well, that’s still really, really slow.’ On the other hand, the speed difference between a centralized supercomputer and a handheld [device] is about a billion,” Launchbury explains.
Researchers not only cracked the speed code, so to speak, but also outdid themselves. “We got that technique to about 100 million times slower than doing it in the clear,” says Launchbury, using terminology for working on unencrypted data. “We brought it to a place where you could imagine using it. Sometimes ... a little bit of really good cryptography opens up the opportunities for many other uses.”
PROCEED, which concluded in 2015, was “very much a technical success, and it has really helped the space move forward so that there are times when this technology is indeed now usable,” he adds.
DARPA researchers are not resting on their laurels, however. They are working on the related research area of secure multiparty computation (SMC), in which two or more operators perform computations simultaneously, but each maintains data privacy.
The agency ran a pilot study to assess schemes, using as the test bed the widely deployed Advanced Encryption Standard (AES) algorithm that secures HTTPS website communications. The hypothetical scenario centered on whether a third party could run computations on encrypted steering data for satellites operated by separate companies and accurately predict whether the satellites might collide. Researchers advanced SMC techniques to allow a third party to step in and run detailed computations on the encrypted steering data without ever having to decrypt it, and at speeds that made running the calculations feasible. “It sounds impossible, but the mathematics work out,” Launchbury shares.
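One classic building block behind SMC is additive secret sharing: each party splits its private value into random shares, and only the sum of everyone's inputs can be reconstructed. The sketch below is a minimal illustration of that principle, not the AES-based protocol from the pilot; the operator names and values are hypothetical.

```python
# Minimal additive-secret-sharing sketch: three parties jointly learn the
# SUM of their inputs, but no individual input is ever revealed.
import random

MOD = 2**32  # all arithmetic is done modulo a fixed ring size

def share(secret, n_parties=3):
    """Split a secret into n random shares that sum to it mod MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares

# Each (hypothetical) satellite operator secret-shares its private value.
inputs = {"operator_a": 120, "operator_b": 45, "operator_c": 77}
all_shares = {name: share(v) for name, v in inputs.items()}

# Party i holds the i-th share from every operator and sums them locally;
# each partial sum on its own looks uniformly random.
partial_sums = [
    sum(all_shares[name][i] for name in inputs) % MOD for i in range(3)
]

# Recombining the partial sums reveals only the total.
total = sum(partial_sums) % MOD
assert total == sum(inputs.values())
```

Real SMC protocols extend this idea to multiplications and full programs, which is what makes joint collision predictions over private steering data possible.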
Another DARPA effort focuses on obfuscated computation, which expands on homomorphic cryptography. Obfuscation is based primarily on “security through obscurity” strategies, typified by inserting passive junk code into a program’s source code to clutter it, according to DARPA. The agency’s SafeWare program aims to develop obfuscation technology that would render software, such as proprietary algorithms, incomprehensible to criminal reverse engineers. Some criminals have found it easy to reverse engineer software, making for lucrative transgressions with national security implications.
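The “junk code” style of obfuscation DARPA describes can be seen in miniature below: the obfuscated function adds an opaque predicate and dead state but behaves identically to the original. This is the weak, obscurity-based approach SafeWare aims to move beyond; the checksum function is a made-up example.

```python
# "Security through obscurity" in miniature: junk code and an opaque
# predicate clutter the source without changing its behavior.
def checksum(data):
    return sum(data) % 256

def checksum_obfuscated(data):
    acc = 0
    junk = 0x5A  # dead state; never affects the result
    for b in data:
        if (junk * junk) % 4 != 3:      # opaque predicate: a square is
            acc = (acc + b) & 0xFF      # never 3 mod 4, so always true
        else:
            acc = (acc ^ junk) & 0xFF   # unreachable junk branch
        junk = (junk * 31 + b) & 0xFF   # churn the dead state
    return acc

assert checksum(b"darpa") == checksum_obfuscated(b"darpa")
```

An attacker who spots that the predicate always holds can strip the clutter mechanically, which is why obscurity alone offers little protection compared with cryptographically grounded obfuscation.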
Connected closely to this work is the agency’s push to better protect people’s privacy through its Brandeis program. The effort is named after Louis Brandeis, a former U.S. Supreme Court justice who co-wrote the influential 1890 Harvard Law Review article “The Right to Privacy” with Samuel Warren following the development of the portable Kodak camera, which stirred privacy concerns.
The DARPA program’s privacy-protecting approach taps advanced cryptography, differential privacy and machine-learning software in this era of emerging smart cities and uber-connected devices that can track almost every movement. “How can we gather information from lots and lots of devices in a secure way, in a privacy-preserving way, and yet still do interesting computations that may be useful for all sorts of things, from traffic to law enforcement and anti-terrorism?” Launchbury wonders.
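One of the privacy techniques the article names, differential privacy, can be sketched with the Laplace mechanism: a counting query is answered with calibrated random noise so no single device's contribution is exposed. The query, data and epsilon below are illustrative assumptions, not a Brandeis-specified design.

```python
# Hedged sketch of differential privacy via the Laplace mechanism.
import math
import random

random.seed(42)  # fixed seed so the sketch is repeatable

def laplace_noise(scale):
    """Sample Laplace(0, scale) by inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(values, predicate, epsilon=1.0):
    """Counting query (sensitivity 1) released with Laplace noise."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# e.g., how many sensed vehicle speeds exceeded 40, released without
# exposing whether any single reading was included
speeds = [31, 44, 28, 52, 39, 47]
noisy = private_count(speeds, lambda s: s > 40)
```

Aggregates released this way remain useful for traffic or public-safety analysis while bounding what can be inferred about any one contributor, which is the trade-off Launchbury describes.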
He outlines an example of the program’s application using the 2013 Boston Marathon bombing. Envision an app for smartphones that can scan photographs and identify people, he says. Police have a possible suspect but cannot simply disseminate a photograph to the public. They do have the technology, however, to tap the app to find out whether any user’s phone contains photographs of the suspect. “When I say it just like that, that sounds like privacy violations all over the place,” Launchbury shares.
The Brandeis project aims to break the polarity between maintaining privacy and accessing valuable data, particularly if the data is useful for law enforcement, according to DARPA. Rather than balancing the two concerns, Brandeis would create a third option: enabling safe and predictable data sharing that preserves privacy.
In the hypothetical scenario Launchbury describes, it “turns out you can cryptographically process the [suspect’s] photograph to find some information about it—a sort of visual fingerprint that could be sent to the phones. Then, the software on the phone could compare that with [the] cryptographic fingerprint of the faces within their photographs. For most people, it will be no match. But maybe on one or two phones, a message will come to the user saying, ‘The police are looking for a suspect, and it appears the suspect is in this photograph of yours. Are you willing to share this photograph with the police?’”
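The matching flow Launchbury describes can be sketched with salted hashes standing in for the real cryptographic “visual fingerprint” (the actual Brandeis protocols are more sophisticated). All file names and feature strings below are hypothetical.

```python
# Illustrative on-device matching: police publish only a fingerprint;
# each phone compares locally and nothing leaves the device without
# the user's consent. Hashes are a stand-in for the real cryptography.
import hashlib

def fingerprint(features, salt=b"round-salt"):
    # Placeholder: hash a feature string; a real system would derive
    # the fingerprint from the image itself.
    return hashlib.sha256(salt + features.encode()).hexdigest()

# Police disseminate only the suspect's fingerprint, not the photograph.
suspect_fp = fingerprint("face:suspect-123")

# Each phone fingerprints its own photos and compares locally.
phone_photos = {"beach.jpg": "face:friend-9",
                "crowd.jpg": "face:suspect-123"}
matches = [name for name, feat in phone_photos.items()
           if fingerprint(feat) == suspect_fp]

assert matches == ["crowd.jpg"]
```

Only the matching phone's owner learns anything and is then asked whether to share, mirroring the consent step in Launchbury's scenario.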
The potential impact of such technology, and the program itself, could be dramatic, he offers. Imagine if businesses and governments could guarantee data privacy. “Democracy and innovation depend on creativity and the open exchange of diverse ideas, but fear of a loss of privacy can stifle those processes,” Launchbury says. The program marries the best of the two worlds, he reasons.