Confidential computing addresses the third, historically missing stage of data protection, securing data while it is in use, alongside encryption at rest and in transit. It does so by running workloads inside hardware-based trusted execution environments (TEEs), so data is protected throughout its lifecycle.
TEEs prevent unauthorized access to, or modification of, applications and data during computation, providing guarantees of data confidentiality, data integrity, and code integrity.
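As a rough illustration of how these guarantees get used in practice, the sketch below shows a data owner releasing a decryption key only after an attested code measurement matches an approved value. The `fetch_attestation_claims` helper, the expected measurement, and the key source are hypothetical placeholders standing in for a real attestation and key-release service, not an Azure API.

```python
import hashlib
import hmac
import secrets

# Hypothetical expected measurement (hash of the approved TEE code image).
EXPECTED_MEASUREMENT = hashlib.sha256(b"approved-enclave-image-v1").hexdigest()

def fetch_attestation_claims() -> dict:
    """Placeholder for obtaining verified attestation claims from an
    attestation service (e.g. a signed report describing the TEE)."""
    return {"measurement": EXPECTED_MEASUREMENT, "tee_type": "SEV-SNP"}

def release_key_if_trusted(claims: dict) -> bytes | None:
    """Release the data-encryption key only when the attested code
    measurement matches the value the data owner approved."""
    if hmac.compare_digest(claims.get("measurement", ""), EXPECTED_MEASUREMENT):
        return secrets.token_bytes(32)  # stand-in for a key from a key vault
    return None

key = release_key_if_trusted(fetch_attestation_claims())
print("key released" if key else "attestation check failed")
```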
Confidential computing can enable verifiable cloud computing, secure multi-party computation, and analytics on sensitive data sets.
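To make the multi-party computation case concrete, here is a minimal sketch of additive secret sharing, one common MPC building block: each party splits its private input into random shares, and the parties can learn the sum without any single party seeing another's value. This is a toy illustration with made-up inputs, not a production protocol.

```python
import secrets

PRIME = 2**61 - 1  # toy modulus for share arithmetic

def share(value: int, n_parties: int) -> list[int]:
    """Split a private value into n additive shares modulo PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Three parties with private inputs; each input is split into shares.
inputs = [42, 17, 99]
all_shares = [share(v, 3) for v in inputs]

# Each party sums the shares it holds (one from every input) locally...
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]

# ...and combining the partial sums reveals only the total, not the inputs.
total = sum(partial_sums) % PRIME
print(total)  # 158
```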
On Azure, GPU-based scenarios that demand high-performance computing and parallel processing, such as 3D graphics and visualization, scientific simulation and modeling, and AI and machine learning, also call for confidential computing.
In scientific simulation and modeling, confidential computing makes it possible to run simulations on sensitive data, such as genomic, climate, or nuclear data, without exposing the data or the code (including model weights) to unauthorized parties.
Confidential computing also allows healthcare professionals to analyze medical images, such as X-rays, CT scans, and MRIs, with advanced image-processing methods like deep learning, without exposing patient data or proprietary algorithms.
With confidential training, model builders can keep model weights hidden, along with intermediate data such as checkpoints and the gradient updates exchanged between nodes during training.
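As a hedged sketch of that idea, the example below authenticates and encrypts a serialized gradient update with AES-GCM before it leaves a training node, assuming the nodes already share a key provisioned inside the TEE (for example, via attestation-gated key release). The gradient values and the key-provisioning step are illustrative placeholders.

```python
import os
import pickle

from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

# Assumed: both training nodes obtained this key inside their TEEs;
# generated locally here only as a stand-in.
shared_key = AESGCM.generate_key(bit_length=256)

def seal_update(gradients: list[float], step: int, key: bytes) -> tuple[bytes, bytes]:
    """Encrypt and authenticate a serialized gradient update before
    sending it to a peer node over an untrusted network."""
    nonce = os.urandom(12)
    aad = f"step:{step}".encode()  # bind the ciphertext to the training step
    ciphertext = AESGCM(key).encrypt(nonce, pickle.dumps(gradients), aad)
    return nonce, ciphertext

def open_update(nonce: bytes, ciphertext: bytes, step: int, key: bytes) -> list[float]:
    """Decrypt a received gradient update, rejecting tampered messages."""
    aad = f"step:{step}".encode()
    return pickle.loads(AESGCM(key).decrypt(nonce, ciphertext, aad))

nonce, blob = seal_update([0.01, -0.02, 0.005], step=1, key=shared_key)
print(open_update(nonce, blob, step=1, key=shared_key))
```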
Azure is the sole cloud provider offering confidential virtual machines that combine 4th Gen AMD EPYC processors with SEV-SNP technology and NVIDIA H100 Tensor Core GPUs, available in the NCC H100 v5 series.
Containers are essential for confidential AI scenarios: they are modular, accelerate development and deployment, and reduce virtualization overhead, making AI and machine learning workloads easier to deploy and manage.