Scripting Downloads and Uploads of Blob Objects from and to an Azure Storage Account

Introduction

This document is a guide to scripting the download and upload of blob objects to and from an Azure Storage Account using the Azure CLI.

Prerequisites

  • An active Azure subscription
  • Azure CLI installed
  • A bash-compatible shell (for the wrapper script at the end)

Using Azure CLI

Setting Up

First, log in to your Azure account:

az login -o table

Then set the subscription you want to work with:

az account set --subscription <your_subscription_id>
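
If you don't have the subscription ID at hand, you can list the subscriptions available to your account first; the query below is just one way to shape the output:

az account list --query "[].{name:name,id:id}" -o table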

Finding the Storage Account

If you already know the storage account name, you can skip to the end of this section.

Let's start by setting up the working session. First, set the resource group name as an environment variable:

export AZ_SA_RG_NAME="my-resource-group"

To find the storage account you want to work with, list the storage accounts and their resource groups. If you don't know which resource group your account is in, omit the --resource-group $AZ_SA_RG_NAME argument to list all storage accounts in the subscription:

az storage account list --query "[].{name:name,resourceGroup:resourceGroup}" --resource-group $AZ_SA_RG_NAME -otsv

Then, set the storage account name you want to work with:

export AZ_SA_NAME="anAwesomeStorageAccount"

Getting the Storage Account Key

To be able to interact with the storage account, you need to get the storage account key:

export AZ_SA_KEY=$(az storage account keys list --account-name $AZ_SA_NAME --resource-group $AZ_SA_RG_NAME --query "[0].value" -otsv)
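
The command substitution returns an empty value if the account or resource group name is wrong, so a quick check that the variable is non-empty can save some debugging later; a minimal sketch:

[[ -n "$AZ_SA_KEY" ]] && echo "Key retrieved" || echo "Key lookup failed; check AZ_SA_NAME and AZ_SA_RG_NAME"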

Listing Containers

Again, if you already know the container name, you can skip to the end of this section.

To list all containers in a storage account, use the following command:

az storage container list --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --query '[].name' -o tsv

Then, set the container name you want to work with:

export AZ_SA_CONTAINER_NAME="json-data"

Listing Blobs

To list all blobs in a container, use the following command:

az storage blob list --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --query '[].name' -o tsv

Set the blob name you want to work with:

export AZ_SA_BLOB_NAME="daily-log-data.json"
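
Optionally, you can confirm that the blob exists and check its size before downloading; contentLength is one of the properties returned by the show command:

az storage blob show --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --name $AZ_SA_BLOB_NAME --query properties.contentLength -o tsv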

Downloading a Blob

To download a blob from an Azure Storage Account, use the following command. It assumes that you want to download the blob into the directory you are running the command from and keep the blob's name:

az storage blob download --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --name $AZ_SA_BLOB_NAME --file $AZ_SA_BLOB_NAME

The command above prints some output, but it is purely informational; the downloaded content is written to a local file with the same name as the blob.
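
If you prefer a different local path or file name, point --file somewhere else; the target directory below is only an illustration:

az storage blob download --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --name $AZ_SA_BLOB_NAME --file /tmp/$AZ_SA_BLOB_NAME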

Uploading a Blob

To upload a blob to an Azure Storage Account, use the following command:

az storage blob upload --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --name $AZ_SA_BLOB_NAME --file $AZ_SA_BLOB_NAME --overwrite
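
The blob name and the local file name do not have to match, and a blob name containing slashes is treated as a virtual directory path; the names below are only placeholders:

az storage blob upload --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --name archive/daily-log-data-2024-01-01.json --file daily-log-data.json --overwrite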

Wrapping Up into a Single Script

The steps above can be wrapped into a single script. Here is an example of a bash script that downloads or uploads a blob, prompting for each choice along the way:

#!/bin/bash

export SKIP_LOGGING=true           # When set, skip az login and pick the subscription interactively
export AZ_SA_RG_NAME="logs-devs"   # Resource group of the storage account (used to list accounts and fetch keys)

# Proxy settings (adjust or remove for your environment)
export no_proxy="*.mycompany.com"
export https_proxy="http://squid-proxy.mycompany.com:80"
export http_proxy="http://squid-proxy.mycompany.com:80"

function login() {
    az login -o table
}

function set_subscription() {
    OIFS=$IFS
    IFS=$'\n'
    select subscription in $(az account list --query "[].{name:name,id:id}" -otsv | sort); do
        # The subscription id is the last field, so names containing spaces are handled correctly
        az account set --subscription "$(echo "${subscription}" | awk '{print $NF}')"
        break
    done
    IFS=$OIFS
}

function set_sa_name() {
    select sa in $(az storage account list --query "[].{name:name,resourceGroup:resourceGroup}" --resource-group $AZ_SA_RG_NAME -otsv | awk '{print $1}'); do
        export AZ_SA_NAME=$sa
        break
    done
}

function get_sa_key() {
    echo 'export AZ_SA_KEY=$(az storage account keys list --account-name $AZ_SA_NAME --resource-group $AZ_SA_RG_NAME --query "[0].value" -otsv)'
    export AZ_SA_KEY=$(az storage account keys list --account-name $AZ_SA_NAME --resource-group $AZ_SA_RG_NAME --query "[0].value" -otsv)
}

function set_container_name() {
    select container in $(az storage container list --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --query '[].name' -o tsv); do
        export AZ_SA_CONTAINER_NAME=$container
        break
    done
}

function set_blob_names() {
    select blob in $(az storage blob list --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --query '[].name' -o tsv | awk '{print $1}'); do
        export AZ_SA_BLOB_NAME=$blob
        break
    done
}

function download_blob() {
    echo 'az storage blob download --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --name $AZ_SA_BLOB_NAME --file $AZ_SA_BLOB_NAME'
    az storage blob download --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --name $AZ_SA_BLOB_NAME --file $AZ_SA_BLOB_NAME
}

function upload_blob() {
    az storage blob upload --account-name $AZ_SA_NAME --account-key $AZ_SA_KEY --container-name $AZ_SA_CONTAINER_NAME --name $AZ_SA_BLOB_NAME --file $AZ_SA_BLOB_NAME --overwrite
}

function select_operation() {
    select operation in "Download" "Upload"; do
        case $operation in
        Download)
            download_blob
            break
            ;;
        Upload)
            upload_blob
            break
            ;;
        esac
    done
}

[[ $SKIP_LOGGING ]] || login             # Log in unless SKIP_LOGGING is set
[[ $SKIP_LOGGING ]] && set_subscription  # When login is skipped, pick the subscription to work with
set_sa_name
get_sa_key
set_container_name
set_blob_names
select_operation

# Final cleanup
unset https_proxy no_proxy http_proxy SKIP_LOGGING AZ_SA_NAME AZ_SA_KEY AZ_SA_CONTAINER_NAME AZ_SA_BLOB_NAME
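
To try the script, save it to a file (the file name below is only an example), make it executable, and run it:

chmod +x blob-transfer.sh
./blob-transfer.sh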

Conclusion

This document provided a basic guide on scripting the download and upload of blob objects to and from an Azure Storage Account using the Azure CLI in bash. It can serve as a foundation for more advanced scenarios; for additional options, refer to the official Azure documentation.