
Open-Source Confidential Compute

Deploy secure applications with hardware-guaranteed privacy using trusted execution environment (TEE) technology. Built for confidential AI, private cloud compute, and secure data processing.

Developer Friendly

Docker support means no code changes required. Package your existing applications and deploy them securely in minutes.

Confidential AI Ready

Purpose-built for confidential AI with TEE GPU support. Run private AI models on NVIDIA GPUs with hardware-guaranteed confidentiality.

Enterprise Security

Open source, independently audited by security experts, and shipped with secure services out of the box.

Verifiable Computing

Every application comes with cryptographic attestation and a Trust Center for real-time verification.

Simple Deploy Process

Deploy confidential applications in just a few clicks, with no complex setup required.

Step 1: Paste Docker Compose

Copy your existing docker-compose.yml file; no modifications needed (a sample file follows these steps).
Step 2: Click Deploy

Select your TEE hardware and deployment options.
Step 3: Application Launched

Your app runs with hardware-guaranteed confidentiality.
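As a concrete sketch of step 1, the compose file below is the kind of thing you would paste in unchanged. The service name, image, and port are hypothetical placeholders for your own application, not anything dstack requires.

```yaml
# Hypothetical example: an ordinary compose file, deployed as-is.
# The service name, image, and port are placeholders for your own app.
services:
  web:
    image: myorg/my-app:latest   # any OCI-compatible image works
    ports:
      - "8080:8080"
    environment:
      - LOG_LEVEL=info
```

Nothing in the file refers to TEEs; the confidentiality guarantees come from where dstack runs it, not from changes to the application.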

Trust Center

Complete Transparency. Every deployed application comes with a comprehensive Trust Center for full verification.

Source Code Verification

Review the exact source code running in your TEE environment

Hardware Information

Detailed specs of the TEE hardware running your application

Network Configuration

Complete network topology and security settings

Attestation Report

Cryptographic proof of execution environment integrity


Security Audit Report

An independent security audit by the zkSecurity team. Review our comprehensive security analysis and recommendations.

Sponsored by Phala, Flashbots, and NEAR.

zkSecurity Team Audit

Comprehensive security analysis and vulnerability assessment

dstack vs Others

Compare dstack with traditional cloud providers and other confidential computing solutions

The columns compared: dstack (open-source confidential computing), AWS / GCP / Azure (cloud providers), Confidential Containers (CNCF project), and Others (alternative solutions).

                dstack                  AWS / GCP / Azure    Confidential Containers   Others
Transparency    Open Source             Proprietary          Open Source               Proprietary
Control         Code Controlled         Vendor Controlled    Developer Controlled      Third-party
Auditability    Audited by zkSecurity   Limited              No audit                  None
Performance     <5% Overhead            Varies               Unknown                   Unknown
AI Focus        Purpose-built           Generic Cloud        Generic                   Limited

dstack — FAQ

Everything you need to know about confidential orchestration

1. What is dstack in Phala Cloud?

dstack is Phala's confidential orchestration layer — it manages GPU runners, scheduling, and attestation for AI workloads. Think of it as a trustless Kubernetes that can launch and verify fine-tuning, inference, or agent jobs across Phala's decentralized GPU network.

2. How does dstack differ from a normal cloud orchestrator?

Traditional orchestrators (like Docker Swarm or Kubernetes) manage containers but can't prove what actually ran. dstack extends orchestration with TEE attestation, encrypted storage, and verifiable job logs — so every AI job can be proven to run securely and untampered.

3. What types of workloads can dstack run?

dstack supports any containerized AI workload — from fine-tuning LLMs (PyTorch / TensorFlow) to serving inference APIs or autonomous agents. It handles both CPU and GPU nodes (H100 / H200 / A100 / A10) and integrates with Phala's confidential compute runtime for full isolation.

4. Is dstack open source?

Yes. dstack's core runner and job orchestration framework are open source and being upstreamed to the Phala ecosystem. Developers can self-host it or use Phala Cloud's managed control plane.

5. How do I launch a job with dstack?

You can start any containerized workload with a single command or YAML file. For example: dstack run --gpu H200 --image unslothai/unsloth:latest --mount data:/mnt/data train.py. This spins up a verified GPU node, mounts your encrypted dataset, and runs inside a TEE.
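For comparison, the same job expressed as a YAML file might look like the sketch below. dstack's actual spec format isn't shown on this page, so every field name here is an assumption mirroring the CLI flags above, not confirmed syntax.

```yaml
# Hypothetical YAML form of the CLI example above; field names are
# assumptions mirroring the flags, not confirmed dstack syntax.
gpu: H200
image: unslothai/unsloth:latest
mounts:
  - data:/mnt/data
command: train.py
```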

6. Does dstack work with existing ML frameworks?

Yes. dstack is framework-agnostic — it runs jobs that use PyTorch, TensorFlow, JAX, or even custom CUDA code. It automatically configures environment variables for CUDA, NCCL, and secure communication between GPUs.

7. Can I use my own Docker images?

Absolutely. Any OCI-compatible container can be used. You can publish your own image (with dependencies, model code, etc.) and dstack will pull and execute it inside an enclave. Images can also be signed and verified to prevent tampering.
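As an illustration, running a custom image reuses the same flags as the launch example in question 5; the image name and entry script below are hypothetical placeholders.

```sh
# Hypothetical example: launch your own published image inside an enclave.
# myorg/my-model:1.0 and serve.py are placeholders; the flags are the
# ones shown in the launch example above.
dstack run --gpu A100 --image myorg/my-model:1.0 serve.py
```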

8. How does dstack manage data privacy during training or inference?

All data volumes and environment secrets are encrypted. Decryption keys are only unsealed inside the TEE after successful remote attestation. This ensures that neither operators nor other workloads can access your raw data.

9. How can I verify my dstack job actually ran securely?

Every dstack job produces a Phala Attestation Bundle — a JSON report signed by the enclave hardware. It includes the code hash, image ID, node signature, and timestamps. You can verify it programmatically or attach it to your output artifacts for auditing.
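Shaped after the fields this answer names, a bundle might look roughly like the sketch below; the exact key names, encodings, and any additional fields are assumptions rather than the published schema.

```json
{
  "code_hash": "sha256:…",
  "image_id": "unslothai/unsloth:latest",
  "node_signature": "base64:…",
  "started_at": "2025-01-01T00:00:00Z",
  "finished_at": "2025-01-01T01:30:00Z"
}
```

Verifying it amounts to checking the node signature against the hardware vendor's attestation keys and comparing code_hash and image_id with what you expected to run.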

10. Can dstack scale across multiple GPUs or nodes?

Yes. dstack supports distributed GPU training and multi-node clusters using PyTorch DDP or TensorFlow's collective ops. It automatically provisions encrypted interconnects between enclaves, so gradients never leave the secure boundary.

11. What happens if a node fails during training?

dstack checkpoints encrypted model states periodically. If a node fails, a new enclave can resume from the last checkpoint — preserving both progress and confidentiality.

12. How do I monitor or debug jobs in a confidential environment?

dstack streams secure logs to your dashboard, but filters sensitive data. You can see real-time metrics (GPU usage, loss curves, throughput) while ensuring no raw dataset or intermediate tensor is ever exposed.

Ready for Confidential Computing?

Join the open-source community building the future of secure computing. Get started with confidential AI and private cloud applications.