Independent pricing guide. Not affiliated with Databricks, Inc. Always verify at databricks.com/pricing

Updated April 2026

Databricks DBU Pricing:
Complete Rate Tables for Every Workload

A Databricks Unit (DBU) is a normalised measure of processing power. Think of it like a kilowatt-hour for compute. It lets Databricks charge consistently regardless of whether you are running a tiny development instance or a massive GPU cluster. The DBU rate you pay depends on three factors: workload type, pricing tier, and cloud provider.

Compute (Jobs & All-Purpose)

| Workload | AWS ($/DBU) | Azure ($/DBU) | GCP ($/DBU) | Description |
|----------|-------------|---------------|-------------|-------------|
| Jobs Compute | $0.15 | $0.30 | $0.15 | Production ETL and scheduled batch workloads |
| Jobs Compute (Photon) | $0.20 | $0.24 | $0.20 | Photon-accelerated batch processing |
| Jobs Compute (Serverless) | $0.37 | $0.45 | $0.37 | Serverless jobs with zero idle cost |
| All-Purpose Compute | $0.55 | $0.55 | $0.55 | Interactive notebooks and development |
| All-Purpose Compute (Photon) | $0.65 | $0.65 | $0.65 | Photon-accelerated interactive compute |
| All-Purpose Compute (Serverless) | $0.75 | $0.95 | $0.75 | Serverless interactive notebooks |

SQL Warehouses

| Workload | AWS ($/DBU) | Azure ($/DBU) | GCP ($/DBU) | Description |
|----------|-------------|---------------|-------------|-------------|
| SQL Classic | $0.22 | $0.22 | $0.22 | Classic SQL warehouse (requires running cluster) |
| SQL Pro | $0.55 | $0.55 | $0.69 | Pro SQL warehouse with query profiling |
| SQL Serverless | $0.70 | $0.70 | $0.88 | Serverless SQL with instant scaling |

Delta Live Tables

| Workload | AWS ($/DBU) | Azure ($/DBU) | GCP ($/DBU) | Description |
|----------|-------------|---------------|-------------|-------------|
| DLT Core | $0.20 | $0.30 | $0.20 | Basic streaming and batch pipelines |
| DLT Pro | $0.25 | $0.38 | $0.25 | Pro pipelines with expectations and monitoring |
| DLT Advanced | $0.36 | $0.54 | $0.36 | Advanced pipelines with change data capture |

AI & Model Serving

| Workload | AWS ($/DBU) | Azure ($/DBU) | GCP ($/DBU) | Description |
|----------|-------------|---------------|-------------|-------------|
| CPU Model Serving | $0.07 | $0.08 | $0.07 | CPU-based model inference endpoints |
| GPU Model Serving | $0.07 | $0.08 | $0.07 | GPU-based model inference (high DBU/hr instances) |
| Foundation Model APIs | $0.07 | $0.08 | $0.07 | Pay-per-token for hosted foundation models |

Streaming

| Workload | AWS ($/DBU) | Azure ($/DBU) | GCP ($/DBU) | Description |
|----------|-------------|---------------|-------------|-------------|
| Streaming (DLT) | $0.20 | $0.30 | $0.20 | Real-time streaming with Delta Live Tables |
| Streaming (Structured) | $0.15 | $0.30 | $0.15 | Structured Streaming on Jobs Compute |

Standard Tier Rates (Being Retired)

These lower rates are being phased out. If you are still on Standard tier, plan for the cost increase when migrating to Premium.

| Workload | AWS Standard ($/DBU) | AWS Premium ($/DBU) | Increase |
|----------|----------------------|---------------------|----------|
| Jobs Compute | $0.07 | $0.15 | +114% |
| All-Purpose Compute | $0.30 | $0.55 | +83% |
| SQL Classic | $0.15 | $0.22 | +47% |

Migration Timeline

  • AWS: Already retired for new customers
  • Azure: October 2026 (no new workspaces after April 2026)
  • GCP: Already retired for new customers

Instance Type to DBU Mapping

Each instance type consumes a specific number of DBUs per hour. Larger instances with more vCPUs and memory consume more DBUs. This table shows the DBU consumption rate alongside the on-demand and spot pricing for common Databricks instance types.

AWS

| Instance | vCPUs | Memory | DBU/hr | On-Demand | Spot |
|----------|-------|--------|--------|-----------|------|
| i3.xlarge | 4 | 30.5 GB | 1.00 | $0.312/hr | $0.094/hr |
| i3.2xlarge | 8 | 61 GB | 2.00 | $0.624/hr | $0.187/hr |
| r5.xlarge | 4 | 32 GB | 1.00 | $0.252/hr | $0.063/hr |
| r5.2xlarge | 8 | 64 GB | 2.00 | $0.504/hr | $0.126/hr |
| m5.xlarge | 4 | 16 GB | 0.75 | $0.192/hr | $0.048/hr |
| m5.2xlarge | 8 | 32 GB | 1.50 | $0.384/hr | $0.096/hr |
| p3.2xlarge | 8 | 61 GB | 6.50 | $3.060/hr | $0.918/hr |
| g4dn.xlarge | 4 | 16 GB | 0.75 | $0.526/hr | $0.158/hr |

Azure

| Instance | vCPUs | Memory | DBU/hr | On-Demand | Spot |
|----------|-------|--------|--------|-----------|------|
| Standard_DS3_v2 | 4 | 14 GB | 0.75 | $0.293/hr | $0.059/hr |
| Standard_DS4_v2 | 8 | 28 GB | 1.50 | $0.585/hr | $0.117/hr |
| Standard_E4s_v3 | 4 | 32 GB | 1.00 | $0.252/hr | $0.050/hr |
| Standard_E8s_v3 | 8 | 64 GB | 2.00 | $0.504/hr | $0.101/hr |
| Standard_NC6s_v3 | 6 | 112 GB | 5.00 | $3.060/hr | $0.612/hr |

GCP

| Instance | vCPUs | Memory | DBU/hr | On-Demand | Spot |
|----------|-------|--------|--------|-----------|------|
| n1-standard-4 | 4 | 15 GB | 0.75 | $0.190/hr | $0.040/hr |
| n1-standard-8 | 8 | 30 GB | 1.50 | $0.380/hr | $0.080/hr |
| n1-highmem-4 | 4 | 26 GB | 1.00 | $0.237/hr | $0.050/hr |
| n1-highmem-8 | 8 | 52 GB | 2.00 | $0.474/hr | $0.100/hr |
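The instance-to-DBU mapping lends itself to a simple lookup table. The sketch below captures a few rows from the tables above in Python; the figures are illustrative and should be re-verified against current Databricks and cloud-provider pricing.

```python
# Illustrative lookup of DBU/hr and on-demand $/hr for a few instance
# types from the tables above (verify against current pricing).
INSTANCES = {
    "i3.xlarge":       {"dbu_per_hr": 1.00, "on_demand": 0.312},
    "m5.2xlarge":      {"dbu_per_hr": 1.50, "on_demand": 0.384},
    "Standard_E4s_v3": {"dbu_per_hr": 1.00, "on_demand": 0.252},
    "n1-highmem-8":    {"dbu_per_hr": 2.00, "on_demand": 0.474},
}

def hourly_cost(instance: str, dbu_rate: float) -> float:
    """Platform + infrastructure cost per node-hour for a given workload rate."""
    spec = INSTANCES[instance]
    return spec["dbu_per_hr"] * dbu_rate + spec["on_demand"]

# i3.xlarge running Jobs Compute (Premium, AWS, $0.15/DBU)
print(f"${hourly_cost('i3.xlarge', 0.15):.3f}/node-hr")  # $0.462/node-hr
```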

How to Calculate Your DBU Cost

Calculating your Databricks platform cost involves a straightforward formula. The key is knowing your instance type, cluster size, workload type, and expected runtime hours.

Total Databricks Cost = DBUs consumed x DBU rate per unit

DBUs consumed = (DBU/hr per instance) x (number of instances) x (hours running)
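The two formulas translate directly into code. This is a minimal sketch, with the function names chosen here for illustration; rates are taken from the tables above and should be verified against current pricing.

```python
# Minimal sketch of the DBU cost formulas above (rates change; verify
# at databricks.com/pricing before relying on these numbers).

def dbus_consumed(dbu_per_hour: float, instances: int, hours: float) -> float:
    """DBUs consumed = (DBU/hr per instance) x (number of instances) x (hours running)."""
    return dbu_per_hour * instances * hours

def platform_cost(dbus: float, rate_per_dbu: float) -> float:
    """Total Databricks platform cost = DBUs consumed x DBU rate per unit."""
    return dbus * rate_per_dbu

# Quick check: 8x i3.xlarge (1.0 DBU/hr each) running for 4 hours
dbus = dbus_consumed(1.0, 8, 4)       # 32.0 DBUs
cost = platform_cost(dbus, 0.15)      # Jobs Compute, Premium, AWS
print(f"{dbus:.0f} DBUs -> ${cost:.2f}")  # 32 DBUs -> $4.80
```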

Example 1: Production ETL

  • Workload: Jobs Compute (Premium, AWS)
  • Instance: 8x i3.xlarge (1.0 DBU/hr each)
  • Runtime: 4 hours/day, 30 days/month
  • DBUs = 8 x 1.0 x 4 x 30 = 960 DBUs/month
  • Platform cost = 960 x $0.15 = $144/month
  • Infrastructure = 8 x $0.312 x 4 x 30 = $299/month
  • Total: $443/month
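Recomputing Example 1 in code from the Premium rate table above ($0.15/DBU for Jobs Compute on AWS, $0.312/hr on-demand for i3.xlarge); treat this as a sketch against current rates, not authoritative figures.

```python
# Example 1 recomputed from the rate tables above (verify current rates).
dbu_per_hr, instances, hours_per_day, days = 1.0, 8, 4, 30
dbu_rate = 0.15        # Jobs Compute, Premium, AWS ($/DBU)
instance_rate = 0.312  # i3.xlarge on-demand ($/hr)

dbus = dbu_per_hr * instances * hours_per_day * days      # 960 DBUs
platform = dbus * dbu_rate                                # $144.00
infra = instance_rate * instances * hours_per_day * days  # $299.52
print(f"{dbus:.0f} DBUs, platform ${platform:.2f}, "
      f"infra ${infra:.2f}, total ${platform + infra:.2f}")
```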

Example 2: Interactive Analytics

  • Workload: All-Purpose Compute (Premium, Azure)
  • Instance: 4x Standard_E4s_v3 (1.0 DBU/hr each)
  • Runtime: 10 hours/day, 22 business days/month
  • DBUs = 4 x 1.0 x 10 x 22 = 880 DBUs/month
  • Platform cost = 880 x $0.55 = $484/month
  • Infrastructure = 4 x $0.252 x 10 x 22 = $222/month
  • Total: $706/month
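Example 2 follows the same pattern, using the All-Purpose Compute rate on Azure ($0.55/DBU) and the Standard_E4s_v3 on-demand price from the tables above; again a sketch to re-verify against current rates.

```python
# Example 2 recomputed from the rate tables above (verify current rates).
dbu_per_hr, instances, hours_per_day, days = 1.0, 4, 10, 22
dbu_rate = 0.55        # All-Purpose Compute, Premium, Azure ($/DBU)
instance_rate = 0.252  # Standard_E4s_v3 on-demand ($/hr)

dbus = dbu_per_hr * instances * hours_per_day * days      # 880 DBUs
platform = dbus * dbu_rate                                # $484.00
infra = instance_rate * instances * hours_per_day * days  # $221.76
print(f"{dbus:.0f} DBUs, platform ${platform:.2f}, "
      f"infra ${infra:.2f}, total ${platform + infra:.2f}")
```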

Remember: these calculations cover only the Databricks platform and basic compute costs. Storage, egress, and networking costs from your cloud provider are additional. Use our cost calculator to model your complete spend.

Frequently Asked Questions

What exactly is a DBU in Databricks?

A Databricks Unit (DBU) is a normalised unit of processing capability used as the billing metric for the Databricks platform. It measures compute consumption independently of the underlying cloud infrastructure. The number of DBUs consumed per hour depends on the instance type and size you run, while the cost per DBU depends on your workload type, pricing tier, and cloud provider.

How are DBUs calculated?

DBU consumption is calculated as: (DBU/hr per instance) x (number of instances) x (hours running). For example, a 4-node cluster of i3.xlarge instances (1.0 DBU/hr each) running for 8 hours consumes 32 DBUs. Your cost is then 32 DBUs multiplied by the per-DBU rate for your workload type.

Why do DBU rates vary so much between workload types?

Workload types reflect different levels of Databricks platform value and management overhead. Jobs Compute ($0.15/DBU on AWS) is the cheapest because it runs scheduled, non-interactive workloads. All-Purpose Compute ($0.55/DBU) costs more because it provides interactive notebook access. SQL Serverless ($0.70/DBU) is the most expensive per DBU but includes infrastructure costs in the rate.

Does Photon affect DBU rates?

Yes. Photon-enabled workloads have higher per-DBU rates (e.g., $0.20/DBU vs $0.15/DBU for Jobs Compute on AWS). However, Photon typically provides 3-8x query performance improvement, meaning your workloads consume fewer total DBUs despite the higher rate. The net effect is usually a cost reduction of 50-70% for compatible workloads.
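The Photon trade-off above can be sanity-checked with a little arithmetic: the rate ratio gives the break-even speedup, and any speedup beyond it is savings. A minimal sketch using the AWS Jobs Compute rates from the tables above, assuming speedup reduces DBU-hours proportionally (an approximation):

```python
# Photon break-even sketch using the AWS Jobs Compute rates above.
# Assumes a performance speedup reduces DBU-hours proportionally.
base_rate, photon_rate = 0.15, 0.20  # $/DBU, Jobs Compute on AWS
break_even_speedup = photon_rate / base_rate
print(f"Photon pays off above a {break_even_speedup:.2f}x speedup")  # 1.33x

# At a 4x speedup, a job that consumed 100 DBUs now consumes 25:
before = 100 * base_rate         # $15.00
after = (100 / 4) * photon_rate  # $5.00
print(f"${before:.2f} -> ${after:.2f} ({1 - after / before:.0%} saving)")  # 67% saving
```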

When is Standard tier being retired?

AWS and GCP have already stopped offering Standard tier to new customers. Azure stops accepting new Standard tier workspaces in April 2026 and will fully retire existing Standard workspaces by October 2026. All affected customers must migrate to Premium tier, which has higher DBU rates.

What is the cheapest way to run Databricks?

The cheapest production configuration is Jobs Compute on AWS or GCP with spot instances enabled, using storage-optimised instance types (i3 family on AWS). This gives you $0.15/DBU for the platform portion plus heavily discounted infrastructure costs. For development work, Databricks Community Edition is free but severely limited.
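To put a rough number on the spot-instance savings claimed above, the sketch below compares per-node-hour costs for Jobs Compute on an i3.xlarge, using the AWS rates from the tables in this guide (spot prices fluctuate; treat these as illustrative).

```python
# Jobs Compute on AWS with i3.xlarge: on-demand vs spot (rates from above;
# actual spot prices vary by region and over time).
dbu_rate, dbu_per_hr = 0.15, 1.0
on_demand, spot = 0.312, 0.094

cost_od = dbu_per_hr * dbu_rate + on_demand  # $0.462/node-hr
cost_spot = dbu_per_hr * dbu_rate + spot     # $0.244/node-hr
print(f"Spot saves {1 - cost_spot / cost_od:.0%} per node-hour")
```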