Encryption is an obscure but critical part of everyday life. The padlock in the address bar of the website you’re visiting represents the ‘s’ after ‘http’, which stands for ‘secure’: the connection is protected by Transport Layer Security (TLS). Together with its predecessor, Secure Sockets Layer (SSL), these digital security technologies allow encrypted communication between two parties, such as websites or servers, and web browsers.
Like the Internet itself, these technologies were breakthroughs when conceived. Whereas previously, encrypted secure communication required a physical exchange of keys, the new approaches allowed secure communication between parties unknown to each other.
Public-key cryptography, also described as asymmetric encryption, did so through a pair of keys: one public, which can be shared widely, and the other private, which is kept secret.
Common deployments of public-key infrastructure (PKI) rely on the Diffie-Hellman key exchange, which stands behind the secure icon in your browser’s address bar, and on the RSA algorithm, named after its inventors: Ron Rivest, Adi Shamir and Leonard Adleman.
Both of those algorithms originated in the 1970s.
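To make the Diffie-Hellman idea concrete, here is a toy sketch with deliberately tiny, insecure numbers (real deployments use 2048-bit groups or elliptic curves). The point is that both parties arrive at the same secret without ever transmitting it:

```python
# Toy Diffie-Hellman key exchange (illustrative only; these parameters
# are far too small to be secure).
p, g = 23, 5                       # public prime modulus and generator
a, b = 6, 15                       # Alice's and Bob's private exponents

A = pow(g, a, p)                   # Alice publishes 5^6 mod 23 = 8
B = pow(g, b, p)                   # Bob publishes 5^15 mod 23 = 19

# Each side combines the other's public value with its own private exponent.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob == 2  # identical shared secret
```

An eavesdropper sees only p, g, A and B; recovering the secret from those values is the discrete logarithm problem, which is believed to be computationally hard at realistic key sizes.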
Attribute-Based Encryption: A Brief History
Cryptography is a highly mathematical and esoteric discipline, but most tech-savvy readers have at least a passing familiarity with TLS or SSL. Many have worked at companies that require the use of RSA SecurID authentication tokens (the inventors of the RSA algorithm also founded a company with the same three-letter name). Less well known is the story of how this field has evolved behind the scenes over the past few decades, and what new approaches are on the horizon.
Public keys were a leap forward, but challenges in managing them led one of the RSA inventors, Adi Shamir, to introduce the idea of identity-based encryption (IBE) in 1984. Seventeen years later, another cryptographer, Stanford University Professor Dan Boneh, together with Matthew Franklin, proposed a practical implementation of IBE using a variant of the computational Diffie-Hellman problem (cryptographic systems are built upon mathematical problems that are very difficult to solve). This proposal advanced the cause, yet it relied upon a private key generator (PKG), which created certain drawbacks, especially for general use.
In 2005, Amit Sahai, Symantec Chair Professor of Computer Science at the UCLA Samueli School of Engineering and director of the Center for Encrypted Functionalities, and Brent Waters, professor of computer science at the University of Texas at Austin and distinguished scientist at NTT Research (then at Princeton), approached the idea from another angle.
In a paper titled “Fuzzy Identity-Based Encryption,” they suggested — and proved — that a secure system was possible using multiple private keys with a single public key. The paper also introduced a class of IBE in which public keys were defined in terms of attributes. In what became known as attribute-based encryption (ABE), decisions to decrypt turned on policies rather than individual identities.
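The decryption condition in fuzzy IBE can be sketched as simple set overlap. This is a conceptual model only, with hypothetical attribute names; in the actual scheme the threshold is enforced mathematically during decryption, not by an access check in software:

```python
# Conceptual model of fuzzy IBE: a key for attribute set S can decrypt a
# ciphertext for attribute set W only if the sets overlap in at least d
# attributes, where d is the scheme's error-tolerance threshold.
def can_decrypt(key_attrs: set, ct_attrs: set, d: int) -> bool:
    return len(key_attrs & ct_attrs) >= d

ct_attrs = {"dept:finance", "clearance:secret", "site:hq"}

# A key sharing two of the three ciphertext attributes succeeds...
assert can_decrypt({"dept:finance", "clearance:secret", "role:auditor"},
                   ct_attrs, d=2)
# ...while a key with too little overlap fails.
assert not can_decrypt({"role:auditor"}, ct_attrs, d=2)
```

This error tolerance is what makes the scheme "fuzzy": decryption turns on holding enough of the right attributes, not on matching a single exact identity.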
The paper proved influential. Building upon it, in particular the idea of tying private keys to sets of attributes, the authors later proposed the broader concept of Functional Encryption (FE). While FE remains in development, ABE has gained momentum. In 2018 the European standards body ETSI issued specifications for using ABE to secure access control. And in April 2020, the scientific organization that hosted the conference at which the original Sahai-Waters paper was presented gave it a Test of Time award.
ABE Use Cases
To better appreciate how ABE would differ from the status quo, let’s look at some examples.
Consider a document that needs to be locked down, with limited access. It could be classified intelligence, privileged client information, health care data, intellectual property, etc. In this case, a trusted server is typically used to store the data. To access the document, you connect to the server and present credentials. The server delivers all the data in the clear to you if, and only if, your credentials match the data-access policy. This is the classic all-or-nothing model, typically aimed at a single recipient. It remains the prevailing paradigm.
But consider that trusted server, for instance one with a TLS certificate. Unfortunately, server compromise occurs, making trust more difficult to assume. Compromise comes in various forms: an operator or owner of a third-party cloud server may want to read your data; the operator may be honest but running hacked software; or the operator may have discarded the physical storage medium, which a bad actor then found and exploited.
Suppose, on the other hand, that the document was encrypted such that it could be stored on an untrusted server. What if retrieved data or files could be used only when you, or others with the right privileges, applied cryptographic keys based on a set of attributes?
Imagine access moving from the realm of software engineering into mathematics, based on the attributes that you, and possibly others, hold, not simply on your identity. An attribute could be membership in a particular department for a certain amount of time, or being part of a budgeting group within the CFO’s office. That is the kind of enhanced efficiency, security and utility that ABE offers.
Take another scenario involving a ridesharing app. Today the company stores your credit card and personally identifiable information (PII) on a trusted server, accessing it when needed to complete a verified transaction or to use for other authorized purposes.
Within an ABE framework, the company could encrypt that sensitive information and tag it with attributes such as the GPS location of the ride, the time and the driver’s name. Then it could decide how much access to grant employees. Say a policy allows them to read all data that (1) exists within a certain GPS bounding box of the region and (2) was created after the employee was hired into the position. The data becomes at once more usable and more secure, being subject to both flexible policies and restricted access.
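That two-part policy could be modeled as follows (the coordinates, field names and thresholds are hypothetical; under ABE the equivalent condition would be enforced cryptographically at decryption time rather than by application code like this):

```python
from datetime import date

# Hypothetical sketch of the two-condition policy: a record is readable
# only if it lies inside a GPS bounding box AND postdates the employee's
# hire date.
BOX = {"lat": (40.5, 40.9), "lon": (-74.3, -73.7)}   # assumed region

def policy_allows(record: dict, employee: dict) -> bool:
    lat_ok = BOX["lat"][0] <= record["lat"] <= BOX["lat"][1]
    lon_ok = BOX["lon"][0] <= record["lon"] <= BOX["lon"][1]
    time_ok = record["created"] >= employee["hired"]
    return lat_ok and lon_ok and time_ok

ride = {"lat": 40.7, "lon": -74.0, "created": date(2021, 3, 1)}
assert policy_allows(ride, {"hired": date(2020, 6, 1)})       # access granted
assert not policy_allows(ride, {"hired": date(2022, 1, 1)})   # hired too late
```

The difference under ABE is where this check lives: instead of trusting the server to run it faithfully, an employee whose key attributes fail the policy simply cannot decrypt the record at all.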
The Case for ABE and Standards
Are there barriers to deploying ABE? One technical factor is speed. Decrypting data under ABE can take 20 times longer than standard decryption, depending on the size of the policy. Context is important here, however: the difference could be on the scale of one millisecond of latency versus 20 milliseconds, which is undetectable by a user.
A more general rule is that innovation itself takes time. ABE was first proposed in 2005, but the world has since moved beyond the one-to-one, browser-to-website paradigm. In 2018, with 5G, highly distributed IoT systems and recent European privacy laws in mind, the ETSI Technical Committee on Cybersecurity issued two specifications for applying ABE to protect personal data with fine-tuned access controls:
- ETSI TS 103 458, which codifies the high-level requirements for applying ABE to protect personally identifiable information (PII) and personal data in four use cases: IoT devices, wireless LANs, cloud and mobile services.
- ETSI TS 103 532, which specifies an ABE toolkit, trust models, procedures for distributing attributes and keys, and an attribute-based access control layer.
According to the ETSI press release at the time, a standard using ABE has several advantages. First, it offers greater security: “because ABE enforces access control at a cryptographic (mathematical) level, it provides better security assurance than software-based solutions.” At the same time, ABE is “space-efficient”, requiring only one ciphertext to handle the access control needs of any given data set.
An ABE-based standard also intrinsically supports privacy. “It provides an efficient, secure-by-default access control mechanism for data protection that avoids binding access to a person’s name, but instead to pseudonymous or anonymous attributes.” Both of the ETSI specifications enable compliance with the General Data Protection Regulation (GDPR).
The standards organization described this new scheme as especially relevant in an IoT world, where data are widely distributed yet access must be limited: “ABE offers an interoperable, highly scalable mechanism for industrial scenarios where quick, offline access control is a must, and where operators need to access data both in a synchronous manner from the equipment as well as from a larger pool of data in the cloud.”
Finally, the fine-tuning enabled by ABE allows for introducing access control policies after data has been protected, which “provides forward-compatibility with future business and legal requirements.”
Security Plus Utility
Whether or how soon a new encryption scheme comes to a device near you, one takeaway here is that cryptography is far from a static field.
As a final note, apart from ABE, work on FE is also advancing. The Functional Encryption Technologies (FENTEC) project, funded by the EU’s Horizon 2020 research and innovation program, is pressing ahead to develop FE as an efficient alternative to the all-or-nothing approach of traditional encryption. In academia, the Center for Encrypted Functionalities, which Amit Sahai directs at UCLA, continues to advance the mathematics underlying both FE and ABE.
To users of encryption, little appears to have changed, even as cybersecurity threats continue undiminished. In academic labs and R&D shops, however, cryptographers have been busy. Ongoing work surrounding ABE and FE aims to enhance security and privacy, without sacrificing functionalities.
Indeed, the goal is that heightened security can coexist with even more efficiency, flexibility and utility.