Why should you care about privacy computing?

Data is the new oil in today's digital-driven economy. Companies and organizations are eager for new information, leading to an ever-increasing demand for cross-organizational and cross-industry data collaborations. Unfortunately, such collaborations are hampered by equally pressing concerns about data safety, privacy, and confidentiality.

For example, the medical industry wants to collaborate across industries to advance medical research and drug discovery with the aid of artificial intelligence (AI), but it must also contend with regulatory and legal restrictions around patient privacy. In the banking industry, cross-institutional data collaboration is crucial to combating financial crimes such as money laundering, yet the cost of such collaboration is often prohibitive due to data privacy and confidentiality restrictions.

Wouldn't it be nice to have a technology that can facilitate data collaboration and computation without ever exposing or jeopardizing the underlying data? Privacy computing (a.k.a. privacy-enhancing technology) is a broad term that describes a wide range of hardware and software solutions for extracting value from data without risking the privacy and security of the data itself.

Privacy computing has been identified by Gartner as one of the top strategic technology trends for 2021. In this blog, I will briefly discuss a few privacy computing approaches and share my view on each from a deep-tech venture capital perspective.

Multiparty computation (MPC)

MPC is a software-based security protocol in which multiple data owners jointly compute a function over their individual inputs while keeping those inputs private. Data security is achieved by splitting each party's data into randomized shares and distributing them across the parties for joint computation, all without the need to trust any single party (a.k.a. trustlessness).
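
To make this concrete, below is a minimal sketch of additive secret sharing, one common building block of MPC protocols. The hospital names, input values, and modulus are illustrative, and real MPC frameworks add authenticated channels and support far richer functions than a sum:

```python
import secrets

MODULUS = 2**61 - 1  # illustrative prime modulus for share arithmetic

def share(value, n_parties):
    """Split a value into n random shares that sum to the value mod MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Three hospitals jointly compute a total patient count without revealing inputs.
inputs = {"hospital_a": 1200, "hospital_b": 850, "hospital_c": 400}

# Each party splits its input and sends one share to every party.
all_shares = [share(v, len(inputs)) for v in inputs.values()]

# Each party sums the shares it received -- individually, a random-looking value.
partial_sums = [sum(col) % MODULUS for col in zip(*all_shares)]

# Only combining all partial values reveals the joint result.
print(sum(partial_sums) % MODULUS)  # 2450, with no party seeing another's count
```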

MPC is an attractive and secure approach, but it still has practical limitations. For example, MPC calculations involve a large amount of data exchange among the parties, making them susceptible to network latency; performance is often limited by the slowest data link among the parties. Startups such as Baffle and Inpher, to name a few, have successfully developed MPC applications.

Trusted execution environment (TEE)

TEE, also known as a trusted enclave, is a hardware-based privacy computing technique: computation takes place inside a secure, isolated region of the processor, while data remains encrypted everywhere outside the enclave. Intel, AMD, and other chip manufacturers offer various versions of TEE chips.
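
The data flow can be illustrated with a conceptual simulation. This is only a sketch of the pattern, written with the `cryptography` package: a real enclave is enforced in hardware (for example, Intel SGX or AMD SEV) and provisions keys through remote attestation, which the `SimulatedEnclave` class below merely stands in for:

```python
# pip install cryptography
from cryptography.fernet import Fernet

class SimulatedEnclave:
    """Stand-in for a hardware enclave: it alone holds the decryption key."""

    def __init__(self):
        # In real hardware, the key never leaves the enclave's protected memory.
        self._cipher = Fernet(Fernet.generate_key())

    def provision_key(self):
        # Stands in for remote attestation plus secure key exchange.
        return self._cipher

    def compute_average(self, ciphertexts):
        # Decryption and computation happen only "inside" the enclave.
        values = [float(self._cipher.decrypt(c)) for c in ciphertexts]
        result = sum(values) / len(values)
        return self._cipher.encrypt(str(result).encode())

enclave = SimulatedEnclave()
client = enclave.provision_key()

# Outside the enclave, only ciphertext is ever visible.
ciphertexts = [client.encrypt(str(v).encode()) for v in (98, 102, 97)]
print(float(client.decrypt(enclave.compute_average(ciphertexts))))  # 99.0
```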

TEE is a versatile and economical confidential computing technique that scales relatively easily. The approach is often questioned due to its vulnerability to hardware intrusions and vendor backdoors. Despite these concerns, TEE technology has seen good adoption, with Microsoft Cloud using Intel's SGX solution and Google Cloud using AMD's EPYC processors.

Federated learning (FL)

Federated learning (FL) is a privacy computing technique focused on data privacy in AI model training. Do you ever wonder how your smartphone's texting application can anticipate the next word you're about to type? Chances are, it has been trained using FL techniques.

Rather than collecting user input data (typed words, in this case) on a central server, FL sends the keyboard prediction model to the edge devices to be trained locally. Only the resulting model updates are returned to the central server, where they are aggregated into an updated model that is sent back to the edge for further training.
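
Below is a minimal FedAvg-style sketch, using a simple linear model in place of a keyboard prediction network; the two simulated devices, the synthetic data, and the hyperparameters are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One device trains the shared linear model on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

# Two devices hold private data; the raw (X, y) pairs never leave the device.
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    devices.append((X, X @ true_w + rng.normal(scale=0.1, size=50)))

# Federated averaging: the server ships the model out, then averages the
# locally trained copies that come back.
global_w = np.zeros(2)
for _ in range(20):
    updates = [local_update(global_w, X, y) for X, y in devices]
    global_w = np.mean(updates, axis=0)

print(global_w)  # approaches [2.0, -1.0] without centralizing any raw data
```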

This method by itself isn't entirely secure, since the central server could, in theory, reverse engineer the original data from the model updates it receives. For this reason, FL is often used in conjunction with other encryption methods, such as homomorphic encryption, which will be discussed next.

Fully homomorphic encryption (FHE)

Let's talk about FHE, a software-based security protocol that encrypts user data in such a way that mathematical computations can be performed directly on the encrypted data, without ever decrypting it.
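
Production FHE relies on heavyweight lattice-based libraries, but the core idea, computing directly on ciphertexts, can be illustrated with the simpler Paillier scheme. Note the substitutions: Paillier is only partially homomorphic (it supports addition on encrypted values, whereas FHE also supports multiplication), and the tiny primes below are deliberately insecure, for illustration only:

```python
import math
import secrets

# Toy Paillier keypair with tiny, insecure primes (illustration only).
p, q = 2003, 2011
n = p * q
n2 = n * n
g = n + 1                     # standard generator choice for Paillier
lam = math.lcm(p - 1, q - 1)  # private key
mu = pow(lam, -1, n)          # modular inverse of lambda mod n

def encrypt(m):
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n recovers lambda * m; multiplying by mu cancels lambda.
    return (pow(c, lam, n2) - 1) // n * mu % n

c1, c2 = encrypt(20), encrypt(22)
c_sum = (c1 * c2) % n2  # multiplying ciphertexts adds the underlying plaintexts
print(decrypt(c_sum))   # 42, computed without ever decrypting c1 or c2
```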

The idea of homomorphic encryption dates back to the late 1970s, but Craig Gentry constructed the first FHE scheme only in 2009. Since then, many FHE schemes have been developed with dramatically improved performance and security.

FHE is considered one of the most secure protocols because it requires no trust in any third party that touches any part of the data lifecycle: data in transit, data at rest, and data in use. FHE schemes built on lattice cryptography are also believed to be quantum-resistant, that is, resistant to attacks by quantum computers.

FHE has one major drawback: FHE computations are extremely slow, often 100,000 times slower than the same computation on plaintext. Many consider this to be FHE's Achilles' heel, but a venture investor may see it as an opportunity.

If history proves anything, there may be interesting parallels between FHE and RSA (Rivest-Shamir-Adleman) encryption. At its inception in the 1970s, a 1024-bit RSA encryption took more than 10 minutes to complete, making it impractical. Today, RSA is used in over 90% of secure data transmissions, and the same encryption takes less than 0.01 milliseconds on an edge device.

Similarly, software and hardware acceleration might be the key to unlocking FHE's full potential. In recent months, several FHE startups have successfully raised significant capital, including software provider Duality and high-performance computing chip developer Cornami*.

Privacy computing also encompasses other techniques, such as zero-knowledge proofs, differential privacy, and synthetic data. Together, these technologies are essential to resolving the seemingly intractable conflict between data collaboration and data safety.
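
As one concrete example of these, here is a minimal sketch of differential privacy using the Laplace mechanism; the dataset, query, and epsilon value are illustrative:

```python
import numpy as np

rng = np.random.default_rng()

def dp_count(values, predicate, epsilon=0.5):
    """Differentially private count via the Laplace mechanism.
    A counting query has sensitivity 1, so the noise scale is 1 / epsilon."""
    true_count = sum(predicate(v) for v in values)
    return true_count + rng.laplace(scale=1.0 / epsilon)

ages = [34, 41, 29, 57, 62, 38, 45]
print(dp_count(ages, lambda a: a > 40))  # noisy answer near the true count of 4
```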

Early adoption will likely happen where the need for data collaboration is most acute, such as in the medical and banking industries.

As privacy computing technologies become more sophisticated and their performance improves, wider adoption is expected. According to Gartner, half of large enterprises will implement privacy-enhancing computation for processing data in untrusted environments and multiparty data analytics applications by 2025.

This is a new area that presents enormous opportunities for both hardware and software development. I can't wait to see what privacy computing technologies will bring in the future.

*Cornami is a portfolio company of the author's firm, Applied Ventures.

John Wei is the director of investment at Applied Ventures, LLC.
