Azure Dedicated SQL Pool (formerly SQL Data Warehouse) is a powerful distributed analytics service designed for large-scale data warehousing, complex analytics, and business intelligence workloads. With its Massively Parallel Processing (MPP) architecture, Azure Dedicated SQL Pool lets you scale compute and storage independently, making it ideal for organizations managing vast amounts of data.
However, managing the cost of Azure Dedicated SQL Pools can be challenging, especially since you’re billed for compute resources on an hourly basis—even when the pool is idle. This means organizations often face the dilemma of managing costs without sacrificing performance. The good news is that you can automate the scaling process of your Dedicated SQL Pool, both scaling up during high-demand periods and scaling down when it’s not needed.
In this blog, we’ll walk through how to automate the process of scaling up and scaling down your Azure Dedicated SQL Pool using Azure Data Factory.
Why Scale Up or Scale Down Azure Dedicated SQL Pool?
Scaling up and scaling down your Azure Dedicated SQL Pool helps optimize both performance and costs. Here’s why you might consider scaling either way:
Why Scale Up?
Scaling up is beneficial when you need to handle higher data volumes or run complex queries that require more compute resources. Here are some common scenarios for scaling up:
- Increased Workload: If your team is running large-scale analytics or processing tasks that require more power, scaling up ensures your SQL Pool can handle the load.
- Peak Demand Periods: During business hours or specific periods when you expect heavy data usage or high query complexity, scaling up ensures optimal performance.
- Faster Query Execution: Scaling up the compute resources speeds up query execution, making it ideal when data processing or analytics tasks are critical and time-sensitive.
Why Scale Down?
Scaling down is just as important for cost management. If your SQL Pool is running at high capacity when it’s not needed, you’re essentially paying for unused resources. Consider scaling down when:
- Idle Periods: After business hours or on weekends when the demand for the SQL Pool is minimal, scaling down can save significant costs.
- Reducing Unused Capacity: If the SQL Pool is consistently underutilized, scaling down can help reduce overhead.
- Budget Management: If you’re trying to optimize costs for the long-term and only need high compute resources intermittently, scaling down is the best way to avoid unnecessary expenses.
Steps to Scale Up and Scale Down Azure Dedicated SQL Pool Using Azure Data Factory
Azure Data Factory provides an automated way to manage your dedicated SQL pool's performance: you can build pipelines that scale the pool up or down based on specific triggers. Here's how to set it up for both scaling up and scaling down.
You need the following details before building the Azure Data Factory pipeline:
- Subscription ID
- Resource Group name
- SQL Server name
- Dedicated SQL Pool name
Access required for the process: Azure Data Factory's system-assigned managed identity should have Contributor access on the SQL server.
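Under the hood, the Web activity we'll configure below simply issues an Azure Resource Manager REST call. As a rough sketch of that call (the `scale_request` helper and the placeholder values are mine, not part of any official SDK), it can be modeled in Python like this:

```python
# Sketch of the ARM REST call the Web activity issues.
# scale_request is a hypothetical helper; the argument values are placeholders.

API_VERSION = "2020-08-01-preview"

def scale_request(subscription_id, resource_group, server, pool, service_level):
    """Build the URL and JSON body for scaling a dedicated SQL pool."""
    url = (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}/providers/Microsoft.Sql"
        f"/servers/{server}/databases/{pool}?api-version={API_VERSION}"
    )
    body = {"sku": {"name": service_level, "tier": "DataWarehouse"}}
    return url, body

url, body = scale_request(
    "my-subscription-id", "my-resource-group", "my-sql-server", "my-pool", "DW200c"
)
# An actual call would then be a PATCH with a bearer token, e.g.:
# requests.patch(url, json=body, headers={"Authorization": f"Bearer {token}"})
```

In the pipeline, Azure Data Factory's managed identity supplies that bearer token for you, which is why no secrets appear in the Web activity settings.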
Now that everything is sorted, let’s start the process:
First, we'll build a pipeline to scale the dedicated SQL pool up to 200 DWU (service level DW200c):
Step 1: Launch Azure Data Factory.
Step 2: Create a new pipeline.

Step 3: Drag a Web activity from the General section of the Activities pane onto the canvas and name it ‘ScaleUpDW’.

Step 4: Fill in the details in the Settings tab.
URL: https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.Sql/servers/<sql-server-name>/databases/<dedicated-sql-pool-name>?api-version=2020-08-01-preview
Method: PATCH
Body: {"sku":{"name":"DW200c","tier":"DataWarehouse"}}
Authentication: System Assigned Managed Identity
Resource: https://management.core.windows.net
Headers:
Name: Content-Type
Value: application/json

Step 5: Click Debug.
Step 6: Once the pipeline succeeds, check whether the dedicated SQL pool has scaled up. The scale operation is asynchronous, so it may take a few minutes to complete; if the service level shows DW200c (200 DWU), the pipeline is working.
Step 7: Click Publish all to save the pipeline.
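If you'd rather verify the result from code than from the portal, you can issue a GET against the same ARM URL and read the sku from the response. A minimal sketch (the `current_dwu` helper is mine, and the sample payload is abbreviated; in the ARM databases GET response the current service level appears under the top-level `sku` property):

```python
# Extract the current service level (DWU) from an ARM databases GET response.
# current_dwu is a hypothetical helper; the sample payload below is abbreviated.

def current_dwu(database_json):
    """Return the sku name (e.g. 'DW200c') from a databases GET response."""
    return database_json["sku"]["name"]

# Abbreviated example of what the GET might return after scaling up:
sample = {
    "name": "my-pool",
    "sku": {"name": "DW200c", "tier": "DataWarehouse"},
    "properties": {"status": "Online"},
}
print(current_dwu(sample))  # -> DW200c
```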
How to Scale Down: You can clone the same pipeline to scale the dedicated SQL pool down to 100 DWU with just one minor change in the Body of the Web activity.
Body: {"sku":{"name":"DW100c","tier":"DataWarehouse"}}
You are done. To fully automate the process, attach schedule triggers to each pipeline, for example scaling up before business hours start and scaling down after they end.
Ready to take your skills to the next level? Click here for my Python book, here for my Machine Learning book, and here for my SQL book—now available on Kindle!