Confidential Computing: The Emerging Paradigm for Protecting Data In-Use

Co-authored by NIST NCCoE Computer Scientist, Murugiah Souppaya

As computing expands to span multiple environments, from on-premises infrastructure to public cloud to the edge, organizations need security controls that can safeguard sensitive intellectual property (IP) and workload data wherever that data resides. Highly regulated and mission-critical applications need data protection in all three states (at rest, in transit, and in use) before organizations can migrate them to the cloud, where they have less control and visibility in a multitenant environment. As an industry, we have generally figured out how to protect data at rest and in transit. Confidential Computing (CC) is an emerging industry paradigm focused on securing data in use.

Today, it is standard practice in cloud and enterprise environments to protect data at rest with strong encryption in local and/or network-attached storage. However, when that same data is being processed by the central processing unit (CPU), it sits in memory as plaintext, unprotected by encryption. Memory holds high-value assets such as storage encryption keys, session keys for communication protocols, IP, personally identifiable information (PII), and credentials. With containers, virtualization, and cloud, multi-tenancy adds a further dimension: virtual machines (VMs) or containers from two different customers can run on the same physical machine. For certain sensitive and regulated workloads, there is a desire to protect data in use even from the underlying privileged system stack. It is therefore critical that data in memory receive protection comparable to data at rest on storage devices. This is the focus of Confidential Computing: protecting data in use on compute devices using hardware-based techniques.
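To make the gap concrete, the short Python sketch below (not part of the original post) shows data that is well protected at rest by symmetric encryption but necessarily becomes plaintext in process memory the moment an application decrypts it for processing. The file name and sample record are illustrative assumptions; the point is that, without a hardware-based trusted execution environment, both the key and the decrypted data sit in ordinary RAM.

```python
# Illustrative sketch only: encrypted at rest, plaintext in use.
# Uses the widely available 'cryptography' package; file name and data are hypothetical.
from cryptography.fernet import Fernet

# Data at rest: ciphertext on disk, protected by a symmetric key.
key = Fernet.generate_key()
fernet = Fernet(key)
ciphertext = fernet.encrypt(b"customer PII: jane.doe@example.com")

with open("record.enc", "wb") as f:
    f.write(ciphertext)

# Data in use: to process the record, the application must decrypt it,
# leaving both the plaintext and the key exposed in process memory,
# where privileged software on the same host could potentially read them.
with open("record.enc", "rb") as f:
    plaintext = fernet.decrypt(f.read())

print(plaintext.decode())  # plaintext now resides unencrypted in RAM
```

Confidential computing addresses this last step by running the decryption and processing inside a hardware-isolated, memory-encrypted environment rather than in ordinary host memory.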


Read more at: Intel