

Working with Azure REST APIs from Powershell – Getting page and block blob information from ARM based storage account sample script

Hello Everyone,

I know this is a recurring subject on several blogs, and some of them are very good indeed, but I would like to write about it anyway, using a recurring question I have seen from customers and peers. It also pairs with my previous blog post, Working with Azure Active Directory Graph Api from Powershell, so you have a one-stop shop for both APIs on my blog.

The full script mentioned in this blog post can be downloaded from the TechNet Script Gallery here.

Overview

The example I’m posting is a sample script that helps answer the following questions about storage accounts and blobs in the Azure Resource Manager deployment model: what is the status of all VHD files across the storage accounts in my subscription? Are they attached to a VM? Which VM? Or are they not attached at all?

The process that I’m using here is:

  1. Get an authorization token in order to list all storage accounts from a defined subscription.
  2. Using the Azure Resource Manager REST API, list all storage accounts within the subscription.
  3. Using the Azure Storage Resource Provider REST API, get the storage account keys needed to build the authentication signature when working with containers and blobs.
  4. Using the Azure Blob Service REST API, list the containers within each storage account. This is where we need to create the signature string to authenticate against the Blob Service REST API: a string in a well-defined format, built from the canonicalized representation of the action being taken on a specific storage account and signed with the storage account primary or secondary key. This signed string is called a shared key and is fully described in the Authentication for the Azure Storage Services article.
  5. Following the same pattern as item 4, get the list of blobs within each container, also using the shared key obtained from the script function called GetAuthSignedStringSa.
  6. For each blob, grab all the needed information, build a blob Powershell custom object, fill in its properties, and add it to a result array (a minimal sketch of this object follows the list).
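
To give an idea of what step 6 produces, here is a minimal sketch of the kind of custom object the script builds for each blob. BlobName and VHDAttached are the two properties used later in this post; the remaining property names and the $saName, $containerName, $blob and $result variables are illustrative assumptions, so the published script may differ:

 # Minimal sketch of the per-blob custom object built in step 6 (illustrative only)
 $blobInfo = [PSCustomObject]@{
     StorageAccountName = $saName                     # assumed property name
     ContainerName      = $containerName              # assumed property name
     BlobName           = $blob.Name
     BlobType           = $blob.Properties.BlobType
     VHDAttached        = ($null -ne $blob.Metadata)  # attached VHDs carry metadata
 }
 $result += $blobInfo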

This sample script is not intended to be a full-featured script or cmdlet, but at a minimum it will list all page blobs and, optionally, block blobs in all storage accounts in your subscription. It shows two different ways to work with the REST APIs: the Resource Manager APIs, which only need the authorization token from Azure Active Directory, and the Blob Service API, which requires another kind of authentication, the shared key: a well-formatted string signed with the storage account key (obtaining which requires the token) using the HMAC-SHA256 signature algorithm.

This blog post assumes you already have the Azure Powershell 1.0 or greater module installed on your computer; if not, you can install it from https://aka.ms/webpi-azps.

In the same way as with the Graph API (mentioned in the Graph API blog post), the first thing we need in order to start working with the REST API is an authorization token, obtained by using ADAL (Active Directory Authentication Library).

Let’s get started by first defining a new, more flexible version (it can be used with the Graph API or the REST API) of the function I posted previously:

 function GetAuthToken
  {
     param
     (
          [Parameter(Mandatory=$true)]
          $ApiEndpointUri,
          
          [Parameter(Mandatory=$true)]
          $AADTenant
     )
   
     # ADAL assemblies that ship with the Azure PowerShell module
     $adal = "${env:ProgramFiles(x86)}\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\Services\" + `
              "Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
     $adalforms = "${env:ProgramFiles(x86)}\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\Services\" + `
                  "Microsoft.IdentityModel.Clients.ActiveDirectory.WindowsForms.dll"
     
     [System.Reflection.Assembly]::LoadFrom($adal) | Out-Null
     [System.Reflection.Assembly]::LoadFrom($adalforms) | Out-Null
     
     # Well-known client id used by the Azure PowerShell public client
     $clientId = "1950a258-227b-4e31-a9cf-717495945fc2"
     $redirectUri = "urn:ietf:wg:oauth:2.0:oob"
     $authorityUri = "https://login.windows.net/$AADTenant"
     
     $authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authorityUri
     
     $authResult = $authContext.AcquireToken($ApiEndpointUri, $clientId, $redirectUri, "Auto")
   
     return $authResult
  } 

 

Notice that in this version of the function a new parameter was added, ApiEndpointUri. The value of this argument can be either https://management.core.windows.net/ for the Azure management API or https://graph.windows.net for the Graph API; the focus of this blog post is the first one.

I’m only including and commenting on the most important parts of the script; the fully functional script can be downloaded from here (a link to a blob in my storage account that uses another authentication method, called Shared Access Signature (SAS) tokens, where I provide read-only access directly to my blob for a specific period of time).

I defined a handful of auxiliary functions, mainly to extract specific pieces of information from the URIs used in the REST API calls and avoid lots of repetitive code. I’m not pasting the original code here (it is in the full script mentioned in the previous paragraph), but a rough sketch of what they might look like follows the list. The function names are:

  • GetContainerName
  • GetStorageAccountName
  • GetBlobName
  • GetRestApiParameters
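
Since the original code is not reproduced in this post, below is only a rough sketch of what these helpers might look like, assuming they simply pick the pieces they need out of the request URI with the System.Uri class; the implementations in the downloadable script may well differ:

 # Rough sketch of the URI helper functions (assumptions, not the published code)
 function GetStorageAccountName ([string]$uri)
 {
     # First label of the host name, e.g. pmcstorage05 from pmcstorage05.blob.core.windows.net
     return ([System.Uri]$uri).Host.Split('.')[0]
 }

 function GetContainerName ([string]$uri)
 {
     # First path segment, e.g. vhds from /vhds/?comp=list... ($null when listing at account level)
     $segment = ([System.Uri]$uri).AbsolutePath.Trim('/').Split('/')[0]
     if ($segment) { return $segment }
 }

 function GetBlobName ([string]$uri)
 {
     # Last path segment, e.g. mydisk.vhd
     return ([System.Uri]$uri).Segments[-1]
 }

 function GetRestApiParameters ([string]$uri)
 {
     # Query string parameters as name=value strings, sorted ($null when there is no query string)
     $query = ([System.Uri]$uri).Query.TrimStart('?')
     if ($query) { return ($query.Split('&') | Sort-Object) }
 }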

Continuing with the main parts of the script and how to use REST API calls with Powershell, the next code snippet obtains the authorization token that is used on all calls to the Resource Manager API:

 # Defining the Azure management API endpoint; if working with the Graph API, change it to https://graph.windows.net
 $ApiEndpointUri = "https://management.core.windows.net/"
  
 # Getting authentication token
 $token = GetAuthToken -ApiEndPointUri $ApiEndpointUri -AADTenant $AzureAdTenant

After we get this token, we build the $header hashtable used by the Invoke-RestMethod cmdlet and perform the API call:

 # Defining Rest API header to be used to query storage account resources
  $header = @{
      'Content-Type'='application/json'
      'Authorization'=$token.CreateAuthorizationHeader()
  }
   
  # Obtaining the list of all storage accounts in the subscription
  $uriSAs = "https://management.azure.com/subscriptions/${subscriptionid}/resources?`$filter=resourceType eq 'Microsoft.Storage/storageAccounts'&api-version=2016-02-01"
  $storageAccounts = (Invoke-RestMethod -Uri $uriSAs -Headers $header -Method Get).value 

If you take a look at the $header hashtable, you will be able to see the authorization token:

[Screenshot: the $header hashtable showing the authorization token]

Also, let’s examine the $storageAccounts results. This is the list of all storage accounts in my subscription that are based on the Azure Resource Manager deployment model, returned by the REST API call to the Resource Manager provider:

[Screenshot: the list of ARM storage accounts returned by the REST API call]
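
To reproduce the same view in your own session, a quick one-liner like this works (name and id are used later in the script; location and type are also returned by this generic resources call):

 # Quick look at what came back from the Resource Manager call
 $storageAccounts | Select-Object name, location, id | Format-Table -AutoSize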

At this point we have our main iteration list. The real sample script iterates over all storage accounts and all containers and lists all blobs, but just for the sake of simplicity I’m executing parts of the script and focusing on one storage account, called pmcstorage05. So I’m tweaking my $uriSAs URI to filter the results down to just that storage account; the modified version and the repeated API call follow:

 $uriSAs = "https://management.azure.com/subscriptions/${subscriptionid}/resources?`$filter=resourceType eq 'Microsoft.Storage/storageAccounts' and name eq 'pmcstorage05'&api-version=2016-02-01"

The result now is a single storage account, the one I defined in my REST API query:

[Screenshot: the single pmcstorage05 storage account returned by the filtered query]

From this point on, most of the calls are directed to the Blob Service REST API, which uses a different authentication method. Before going into that, though, we need one last Resource Manager REST API call: the call that gets the authentication keys for the storage account. The following code snippet defines the query (a modified version of the one in the original script); notice that Invoke-RestMethod now uses the POST method instead of the GET used in other parts of the code:

 # Getting authentication key for current storage account
  $uriListKeys = "https://management.azure.com/$($storageAccounts[0].Id)/listKeys?api-version=2015-05-01-preview"
  $keys = Invoke-RestMethod -Uri $uriListKeys -Headers $header -Method Post

The $keys object has key1 and key2 attributes, as we can see below; either of these keys can be used to build the signed shared key used to perform the Blob Service REST API calls:

[Screenshot: the $keys object showing its key1 and key2 values]

At this point the original sample script published at the aforementioned link would list all containers inside the storage account (with the exception of the special $root container, which is not covered in this blog), but to keep this post as small and objective as possible, we are also going to tweak the next calls to use the vhds container directly and skip some code.
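
For reference only, the container enumeration follows the same pattern as the blob listing shown next. Here is a minimal sketch, assuming the GetAuthSignedStringSa function shown a bit further below and the $keys object obtained above; the container-listing code in the published script may differ:

 # Sketch only: list containers at the account level (comp=list, no container in the path)
 $uriListContainers = "https://$($storageAccounts[0].name).blob.core.windows.net/?comp=list"
 $headerContainers  = GetAuthSignedStringSa -uri $uriListContainers -key $keys.key1
 $containersText    = Invoke-RestMethod -Uri $uriListContainers -Headers $headerContainers -Method Get -ContentType application/xml
 [xml]$containersXml = $containersText.Substring($containersText.IndexOf("<"))
 $containersXml.EnumerationResults.Containers.Container.Name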

Let’s define our REST API call URI to list all blobs with their metadata (if available):

 $uriListBlobs = "https://$($storageAccounts[0].name).blob.core.windows.net/vhds/?comp=list&include=metadata&restype=container"

Where

  • comp=list => the list operation inside the vhds container
  • include=metadata => instructs the API to return any associated metadata (we need this to tell whether a page blob is attached to a VM)
  • restype=container => indicates that this operation targets a container resource type
  • For more information on the available options, please refer to the Blob Service REST API documentation

The resulting URI value will look like this:

[Screenshot: the resulting $uriListBlobs value]

Now comes the part that was trickiest to get working (just my opinion): building the string that must be signed with the storage account key. For that, I built a function to do that work for us, outlined below:

 function GetAuthSignedStringSa
  {
      param
      (
          [Parameter(Mandatory=$true)]
          [string]$uri,
   
          [Parameter(Mandatory=$false)]
          [string]$key
      )
   
      # Building Authorization Header for Storage Account
   
      $saName = GetStorageAccountName -uri $uri
      $containerName = GetContainerName -uri $uri
   
      # Request timestamp in GMT (RFC 1123 format), required for the x-ms-date header
      [string]$currentDateTimeUtc = (Get-Date).ToUniversalTime().ToString("r")
   
      # String to be signed with storage account key
      $signatureSb = New-Object System.Text.StringBuilder
      $null = $signatureSb.Append("GET`n`n`n`n`napplication/xml`n`n`n`n`n`n`nx-ms-date:$currentDateTimeUtc`nx-ms-version:2015-02-21`n/$saName/$containerName")
      
      if ($containerName -ne $null)
      {
          $null = $signatureSb.Append("/")
      }
   
      # Query string parameters are appended to the canonicalized resource as name:value, one per line
      $restParameters = GetRestApiParameters -uri $uri
   
      if ($restParameters -ne $null)
      {
          foreach ($param in $restParameters)
          {
              $null = $signatureSb.Append("`n$($param.Replace('=',':'))")   
          }
      }
   
      # Signing the UTF8-encoded string with the storage account key using the HMAC-SHA256 algorithm
      [byte[]]$signatureStringByteArray = [Text.Encoding]::UTF8.GetBytes($signatureSb.ToString())
      $hmacsha = New-Object System.Security.Cryptography.HMACSHA256
      $hmacsha.key = [convert]::FromBase64String($key)
      $signature = [Convert]::ToBase64String($hmacsha.ComputeHash($signatureStringByteArray))
   
      return  @{
          'x-ms-date'="$currentDateTimeUtc"
          'Content-Type'='application/xml'
          'Authorization'= "SharedKey $saName`:$signature"
          'x-ms-version'='2015-02-21'
      }
  }

What this function basically does is build and sign the following string, which is derived from the URI we defined above (remember those little functions I just mentioned? They basically do some string operations to get what we need):

[Screenshot: the canonicalized string to be signed]

This string may seem odd because it contains some empty lines, but that is because it follows a fixed format where each line has a meaning, as defined in the Authentication for the Azure Storage Services document.
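
To make the layout concrete, this is roughly what the string to sign looks like for this particular GET call, annotated field by field (the empty lines correspond to standard headers we are not sending):

 # Layout of the string to sign for this list-blobs GET, one field per line per the Shared Key spec:
 #   GET                               <- HTTP verb
 #   (empty)                           <- Content-Encoding
 #   (empty)                           <- Content-Language
 #   (empty)                           <- Content-Length
 #   (empty)                           <- Content-MD5
 #   application/xml                   <- Content-Type
 #   (empty)                           <- Date (x-ms-date is used instead)
 #   (empty)                           <- If-Modified-Since
 #   (empty)                           <- If-Match
 #   (empty)                           <- If-None-Match
 #   (empty)                           <- If-Unmodified-Since
 #   (empty)                           <- Range
 #   x-ms-date:<GMT timestamp>         <- canonicalized headers
 #   x-ms-version:2015-02-21
 #   /pmcstorage05/vhds/               <- canonicalized resource
 #   comp:list                         <- query parameters, one per line, sorted
 #   include:metadata
 #   restype:container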

After building this string, the function signs it with the HMAC-SHA256 algorithm using UTF8 encoding and builds another hashtable for us, just like this:

[Screenshot: the resulting header hashtable with the SharedKey authorization value]
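
One thing the snippets above do not show explicitly is the call that produces the $headerBlobs variable used next; in the full script it is presumably wired up along these lines, with key1 coming from the $keys object obtained earlier (key2 would work equally well):

 # Build the signed header for the list-blobs request (either storage account key works)
 $headerBlobs = GetAuthSignedStringSa -uri $uriListBlobs -key $keys.key1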

With this header in hand, we can perform our REST API call to obtain the list of blobs located in that container in that storage account. The following code snippet performs the call and converts the response to XML (this particular REST API does not return JSON):

 $responseText = Invoke-RestMethod -Uri $uriListBlobs -Headers $headerBlobs -Method Get -ContentType application/xml
 # Trim anything before the first "<" (such as the UTF-8 byte order mark) so the cast to [xml] succeeds
 [xml]$responseXml = $responseText.Substring($responseText.IndexOf("<"))

 

The XML schema of the response looks like this (the schema is defined in the List Blobs article):

 <?xml version="1.0" encoding="utf-8"?>
  <EnumerationResults ServiceEndpoint="https://myaccount.blob.core.windows.net/"  ContainerName="mycontainer">
    <Prefix>string-value</Prefix>
    <Marker>string-value</Marker>
    <MaxResults>int-value</MaxResults>
    <Delimiter>string-value</Delimiter>
    <Blobs>
      <Blob>
        <Name>blob-name</Name>
        <Snapshot>date-time-value</Snapshot>
        <Properties>
          <Last-Modified>date-time-value</Last-Modified>
          <Etag>etag</Etag>
          <Content-Length>size-in-bytes</Content-Length>
          <Content-Type>blob-content-type</Content-Type>
          <Content-Encoding />
          <Content-Language />
          <Content-MD5 />
          <Cache-Control />
          <x-ms-blob-sequence-number>sequence-number</x-ms-blob-sequence-number>
          <BlobType>BlockBlob|PageBlob|AppendBlob</BlobType>
          <LeaseStatus>locked|unlocked</LeaseStatus>
          <LeaseState>available | leased | expired | breaking | broken</LeaseState>
          <LeaseDuration>infinite | fixed</LeaseDuration>
          <CopyId>id</CopyId>
          <CopyStatus>pending | success | aborted | failed </CopyStatus>
          <CopySource>source url</CopySource>
          <CopyProgress>bytes copied/bytes total</CopyProgress>
          <CopyCompletionTime>datetime</CopyCompletionTime>
          <CopyStatusDescription>error string</CopyStatusDescription>
        </Properties>
        <Metadata>   
          <Name>value</Name>
        </Metadata>
      </Blob>
      <BlobPrefix>
        <Name>blob-prefix</Name>
      </BlobPrefix>
    </Blobs>
    <NextMarker />
  </EnumerationResults>

 

And since the response is converted to an XML object in Powershell, we can dig into it and obtain all the information we need. For example, here is the list of returned blobs:

 $responseXml.EnumerationResults.Blobs.Blob

[Screenshot: the list of blobs returned by the call]

In this example we can see that only the blob called win1020162262259.vhd contains a Metadata element. I can tell you in advance that this means the VHD is attached to a VM, so let’s use this line of code to check what properties it has:

 ($responseXml.EnumerationResults.Blobs.Blob | ? {$_.name -eq "win1020162262259.vhd"}).Metadata

 

Finally, we can see which VM this VHD is attached to, the disk type (whether it is a DataDisk or an OSDisk), and so on:

[Screenshot: the blob metadata showing the attached VM and disk type]

 

Executing the script

Basically, these are the most important explanations for this example. Now, how do you execute the full script? What type of output does it provide? Can you filter it?

To execute the script, follow these steps:

  1. Download from here (TechNet Gallery)

  2. Extract it anywhere

  3. Open a Powershell command prompt

  4. Optionally, authenticate to Azure Resource Manager with

     Add-AzureRmAccount
    
  5. Type the username and password of an Azure AD account or Microsoft account (MSA) that has admin rights to the subscription or to one or more resource groups (when using RBAC, role-based access control)
     [Screenshot: the Azure sign-in prompt]

  6. Switch to the folder where you extracted the script and run the following command line (you need to know your subscription ID and your Azure Active Directory tenant name beforehand)

     $blobs = .\Get-AzureRmBlobInfo.ps1 -AzureAdTenant <tenantname>.onmicrosoft.com -SubscriptionId <subscription id>
    
  7. That will return the blob objects into a list called $blobs; just type $blobs and you will get a list of all of them:
     [Screenshot: the $blobs output]

One final question: can you play with these objects? Yes. To get a list of non-attached page blobs (VHD files, for example), just use this command line:

 $blobs | ? {$_.VHDAttached -eq $false -and $_.BlobName -like "*.vhd"}

I have a bunch of them (screenshot of just the first 4):

[Screenshot: the first four unattached VHD blobs]
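
And if you want to keep that list around for review before cleaning anything up, the same filtered objects can be dumped to a CSV file (the file name below is just an example):

 $blobs | ? {$_.VHDAttached -eq $false -and $_.BlobName -like "*.vhd"} | Export-Csv -Path .\unattached-vhds.csv -NoTypeInformation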

For a list of all available REST APIs, please refer to Azure Reference documentation.

Well, that’s it for this blob, I mean blog :-). Have fun, and be careful if deleting any blob that shows up as not attached: be very sure you don’t need it before deleting it ;-).

Comments

  • Anonymous
    April 22, 2016
    Hi,

    Thanks for the awesome article. I am trying to use something similar, but in a non-interactive session, and for the life of me I can't get it to work. Any ideas how you could adapt what you have here to work non-interactively, without Windows prompting you for a user ID and password?
  • Anonymous
    April 23, 2016
    Hi Satk,

    You can try looking at "Authenticating a service principal with Azure Resource Manager" at https://azure.microsoft.com/en-us/documentation/articles/resource-group-authenticate-service-principal/#authenticate-with-certificate---powershell which explains how to work with service principals for automation, which I guess is your case.

    Regards

    Paulo.
  • Anonymous
    May 22, 2016
    Hi Paulo,

    I have used your GetAuthToken function to invoke the Azure API, but it doesn't work anymore. First of all, "urn:ietf:wg:oauth:2.0:oob" is no longer accepted as a redirect URI. When I change it to my application's redirect URI (for example https://myauthapp.azure.com) it accepts the URI, but the token request returns the error AADSTS90014: The request body must contain the following parameter: client_secret or client_assertion. But I don't want to pass client_secret, I want to sign in and run the code on my behalf like it used to work. Do you know how the code should be changed in order to work again?

    Regards

    Kacper
  • Anonymous
    May 22, 2016
    Nevermind. I forgot that I have to use well known clientId for azure powershell. I changed it to my application's clientId and this was the issue.
    • Anonymous
      May 27, 2016
      Glad you sorted it out, Kacper :-).