Azure Storage Account – making it cost-effective can be hard, but with my approach it will be a piece of cake. Azure Storage Account is perhaps the most popular Azure service. But have you ever considered how much this service costs you? Do you set a tier for your blobs? Wouldn't it be fantastic if I told you that you could make it cost-effective in only a few minutes?
First, consider the pricing of each storage tier per 1000 GB:
| Tier    | Cost for 1000 GB |
| ------- | ---------------- |
| Hot     | 21 USD           |
| Cool    | 15 USD           |
| Cold    | 3.6 USD          |
| Archive | 0.99 USD         |
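To put these numbers in perspective (they are monthly storage rates): keeping 10 TB of rarely touched data in the Hot tier costs roughly 10 × 21 USD = 210 USD per month for storage alone, while the same data in the Archive tier costs about 10 × 0.99 USD ≈ 10 USD. That is a saving in the region of 200 USD per month, before transaction and rehydration charges, which you should also factor in.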
Moving every blob to the most cost-effective tier by hand is unsustainable. But then lifecycle management comes to the rescue! You can use this feature to create rules that move blobs between tiers. An example rule for tier maintenance is provided below:
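The sketch below shows what such a rule can look like when defined with the azurerm Terraform provider (the article's tool of choice further down). The resource names, the `logs/` prefix filter, the referenced storage account, and the 30/90/365-day thresholds are all hypothetical.

```hcl
# A minimal sketch of a lifecycle rule: move block blobs to Cool after 30 days
# without modification, to Archive after 90 days, and delete them after a year.
resource "azurerm_storage_management_policy" "example" {
  storage_account_id = azurerm_storage_account.example.id

  rule {
    name    = "tier-maintenance"
    enabled = true

    filters {
      blob_types   = ["blockBlob"]
      prefix_match = ["logs/"] # hypothetical prefix
    }

    actions {
      # Rules can also contain snapshot and version blocks (see the list below).
      base_blob {
        tier_to_cool_after_days_since_modification_greater_than    = 30
        tier_to_archive_after_days_since_modification_greater_than = 90
        delete_after_days_since_modification_greater_than          = 365
      }
    }
  }
}
```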
Fairly simple, don't you think? Consider how much money you could save with this straightforward change across your subscription. And when files are no longer required, you can delete them, making cost optimization even more effective.
Rules can be applied to the following blob types:
- base_blob
- snapshot
- version
You can trigger a tier change based on the following dates (the fragment after this list shows how they map to azurerm arguments):
- last modification
- last access
- creation
- last tier change
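In the azurerm provider these dates map to arguments on a rule's `base_blob` actions block. The fragment below is illustrative only; the day values are hypothetical, and in practice you pick the condition that fits each transition rather than combining them all.

```hcl
# Illustrative fragment of a rule's actions block, listing the date-based triggers.
actions {
  base_blob {
    # last modification
    tier_to_cool_after_days_since_modification_greater_than = 30

    # last access (needs last access time tracking enabled on the account, see below)
    # tier_to_cool_after_days_since_last_access_time_greater_than = 30

    # creation
    # tier_to_cool_after_days_since_creation_greater_than = 30

    # last tier change (used with the Archive transition, e.g. to delay
    # re-archiving blobs that were recently rehydrated)
    # tier_to_archive_after_days_since_last_tier_change_greater_than = 7
  }
}
```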
Be patient when adding new rules: the lifecycle policy only runs once a day, so tier changes can take up to 24 hours to happen.
Base blob rules also have a feature that automatically moves a blob back from the Cool tier to the Hot tier when the file is accessed again. To use it, you must enable last access time tracking on the Azure Storage Account, for example as sketched below.
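A minimal sketch of how this could be wired up with the azurerm provider; the account name and the 30-day value are hypothetical, and `auto_tier_to_hot_from_cool_enabled` is used together with the last-access-time condition.

```hcl
# On the storage account, enable last access time tracking:
resource "azurerm_storage_account" "example" {
  name                     = "examplestorageacct" # hypothetical
  resource_group_name      = azurerm_resource_group.example.name
  location                 = azurerm_resource_group.example.location
  account_tier             = "Standard"
  account_replication_type = "LRS"

  blob_properties {
    last_access_time_enabled = true
  }
}

# Then, inside the lifecycle rule's base_blob actions:
#   base_blob {
#     tier_to_cool_after_days_since_last_access_time_greater_than = 30
#     auto_tier_to_hot_from_cool_enabled                          = true
#   }
```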
If you follow my articles, you know that I'm keen on Terraform, so here is a Terraform script that sets up lifecycle management for the Azure Storage Account! Feel free to use and change it.
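The full script isn't reproduced here, so the following is only a sketch of the surrounding wiring, assuming the storage account and management policy resources sketched above. The provider version matches the one mentioned in the next paragraph, while the variable names, defaults, and resource names are hypothetical (the defaults would normally come from default.tfvars).

```hcl
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.70"
    }
  }
}

provider "azurerm" {
  features {}
}

# Hypothetical input variables; defaults would normally live in default.tfvars.
variable "resource_group_name" {
  type    = string
  default = "rg-storage-lifecycle"
}

variable "location" {
  type    = string
  default = "westeurope"
}

resource "azurerm_resource_group" "example" {
  name     = var.resource_group_name
  location = var.location
}

# The azurerm_storage_account and azurerm_storage_management_policy resources
# sketched earlier would sit here and reference this resource group.
```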
The current version (3.70.0) of the azurerm provider doesn't support transitions from/into the Cold tier, so you'll have to wait for a newer version. This transition can, however, easily be set up in the portal.
If you run this Terraform locally, remember to set credentials. I use a Service Principal; here is the script to create one:
```bash
az ad sp create-for-rbac --name api://terraformspn --role Contributor --scopes /subscriptions/##SUBSCRIPTION_ID##
```
Then set the environment variables in your console. I use zsh, so for me it looks like this:
```bash
export ARM_CLIENT_ID="##appId##"
export ARM_CLIENT_SECRET="##password##"
export ARM_TENANT_ID="##tenant##"
export ARM_SUBSCRIPTION_ID="##subscription##"
```
The example solution already has default parameters set. To run my Terraform script, run:
```bash
terraform init
terraform apply -var-file=default.tfvars -auto-approve
```