Automate Azure VM Storage

The Problem

One of the best things about Azure, and the cloud in general, is that we can automate almost anything, and in this post we are going to look at how to automate Azure VM storage.  That ability opens the door to some outside-of-the-box solutions.  I had a customer with a roadblock that we were able to work around by automating a few things on their Azure Virtual Machines.  Their challenge was that they wanted to move their test and development environments to Azure, but the storage cost was prohibitive.  They needed premium storage to mimic their production environment, yet it was not financially viable for test and development, so they were going to keep it all on premises.  During our conversations I learned that they only test between 8am and 5pm, Monday through Friday.  My suggestion was that we put their databases on cheaper storage during off hours and on premium storage only while they are actively using it.

The Plan

This is where the automation comes into play.  My challenge was to find a way to automate moving everything between standard and premium storage.  Here is my plan:

  • Create a Storage Account
    • Create a container for my backups
    • Create a container for my Custom Script Extensions and SQL scripts
    • Create a credential for SQL to access my Storage Account
  • Create an Azure Automation Account
    • Create Runbooks
  • Add Premium Storage Drive
    • Create a Premium Azure disk
    • Add the Premium disk to my VM
    • Initialize and format the new disk in the VM
  • Move the Database
    • Back up the database to blob storage
    • Restore the database from blob storage to the new Premium drive
  • Remove Premium Storage Drive
    • Back up the database to blob storage
    • Restore the database to the standard disk
    • Remove the Premium disk from the OS
    • Remove the Premium disk from Azure

This looks like a good plan if I do say so myself.  I'll go out on a limb and bet that it works!

Storage Account for Backups

We need to get a few things prepared before we jump into any automation.  The first thing we need is a storage account where we can store backups of our database.  As we move between standard and premium storage, we'll use this account both to back up to and to restore from.  We can use an existing account or create a new one.  Once we have the account, we need to create a container to house the backups.  The other thing we have to think about is creating a credential in SQL Server to access this container.  We only need to do this once, so let's go ahead and do that now.  Here is what our code looks like so far:

#Install-Module AzureRM
Import-Module AzureRM

# Define global variables for the script. Make sure to change these to the names you want to use on your account.
$prefixName = 'storagetest' # used as the prefix for the name for various objects
$subscriptionID='MySubID' # the ID of the subscription you will use
$locationName = 'South Central US' # the data center region you will use
$storageAccountName= $prefixName + 'allbackups' # the storage account name you will create or use
$containerName= $prefixName + 'server1backups' # the storage container name to which you will attach the SAS policy with its SAS token
$policyName = $prefixName + 'sqlbackups' # the name of the SAS policy
$resourceGroupName= 'VMStorageTestRM' # Name of the resource group you will create or use

# adds an authenticated Azure account for use in the session
Login-AzureRmAccount

# set the tenant, subscription and environment for use in the rest of the script
Set-AzureRmContext -SubscriptionID $subscriptionID

# create a new resource group - comment out this line to use an existing resource group
New-AzureRmResourceGroup -Name $resourceGroupName -Location $locationName

# Create a new storage account - comment out this line to use an existing storage account
New-AzureRmStorageAccount -Name $storageAccountName -ResourceGroupName $resourceGroupName -Type Standard_RAGRS -Location $locationName

# Get the first access key for the ARM storage account
$accountKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -AccountName $storageAccountName).Value[0]

# Create a new storage account context using the storage account key
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKey

# Creates a new container in blob storage
$container = New-AzureStorageContainer -Context $storageContext -Name $containerName
$cbc = $container.CloudBlobContainer

# Sets up a Stored Access Policy and a Shared Access Signature for the new container. Note that it expires in 10 years.
$permissions = $cbc.GetPermissions();
$policy = new-object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$policy.SharedAccessStartTime = $(Get-Date).ToUniversalTime().AddMinutes(-5)
$policy.SharedAccessExpiryTime = $(Get-Date).ToUniversalTime().AddYears(10)
$policy.Permissions = "Read,Write,List,Delete"
$permissions.SharedAccessPolicies.Add($policyName, $policy)
$cbc.SetPermissions($permissions);

# Get the Shared Access Signature for the policy
$policy = new-object 'Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy'
$sas = $cbc.GetSharedAccessSignature($policy, $policyName)
Write-Host "Shared Access Signature = $($sas.Substring(1))"

# Output the Transact-SQL to the clipboard and to the screen to create the credential using the Shared Access Signature
Write-Host 'Credential T-SQL'
$tSql = "CREATE CREDENTIAL [{0}] WITH IDENTITY='SHARED ACCESS SIGNATURE', SECRET='{1}'" -f $cbc.Uri,$sas.Substring(1)
$tSql | clip
Write-Host $tSql

Once the script completes, you will get two pieces of output.  The first is the Shared Access Signature.  The second is the T-SQL you will need to create a credential in SQL Server to access this container.  You do not even have to copy and paste it; the script automatically puts it on your clipboard, so you just need to paste it into SSMS connected to your SQL Server.  Here is an example of what it will look like:

CREATE CREDENTIAL [https://MyStorageAccountName.blob.core.windows.net/MyStorageContainerName] WITH IDENTITY = 'SHARED ACCESS SIGNATURE', SECRET = 'YourSecretWillBeHere';
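
If you want to confirm the credential landed before moving on, a quick query of sys.credentials will show it.  This is a minimal check, assuming the same SQL login used later in this post:

# Hedged check: list server-level credentials so we can see the one we just created
Invoke-Sqlcmd -Query "SELECT name, credential_identity FROM sys.credentials" -Username 'AzureAutomation' -Password 'AA123'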

Add Premium Disk

The next step is to add a premium disk to our VM.  This piece of code is going to be placed into an Azure Automation Account Runbook.  I thought this part would be really easy, and adding the disk to the VM in Azure is easy, but what gets tricky is formatting it within the VM.  I don't want to log in to the VM, initialize the disk, and format it by hand.  We are automating this for a reason, after all.

The good news is we have something called Custom Script Extensions that will take a PowerShell script sitting in a blob container and execute it locally on the VM.  It took some time to work this part out, but it's a really powerful capability, and it allows us to do pretty much anything we want.  I'm not going to walk through the setup itself, but you can read more about Azure Custom Script Extensions in the Microsoft documentation.
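
One prerequisite worth calling out: the extension pulls its script from a blob container, so the .ps1 and .sql files used throughout this post need to be uploaded first.  Here is a minimal sketch, reusing the $storageContext from the first script; the local file paths are placeholders:

# Create the container that will hold our Custom Script Extensions and SQL scripts
New-AzureStorageContainer -Context $storageContext -Name 'vmscripts'

# Upload the scripts our Runbooks will call
Set-AzureStorageBlobContent -Context $storageContext -Container 'vmscripts' -File '.\4_Initialize_Format_VM_Disk.ps1' -Force
Set-AzureStorageBlobContent -Context $storageContext -Container 'vmscripts' -File '.\5_Azure_Backup_Restore_To_Premium.sql' -Force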

In the code for our new Runbook you will see that I use the Custom Script Extension to call "4_Initialize_Format_VM_Disk.ps1".  That script does exactly what its name says, and we'll look at it in the next section.  Here is the code to create a Premium disk in Azure and add it to our SQL VM.

# Define global variables for the script
$subscriptionID='MySubID' # the ID of the subscription you will use
$resourceGroupName= 'VMStorageTestRM'
$myVMName = "VMStorageTest" #The name of our Azure VM running SQL Server
$locationName = 'South Central US'
$fileName = '4_Initialize_Format_VM_Disk.ps1' #The script blob our Custom Script Extension downloads from storage
$scriptName = '4_Initialize_Format_VM_Disk.ps1' #The extension name; also the script it runs to initialize and format the drive within the VM
$storageAccountName= 'storagetestallbackups' #This is the same storage account we created in the previous step
$typeHandlerVersion = '1.8'
$storageContainer = 'vmscripts' #This is a different container in our storage account to hold our Custom Script Extensions

$connectionName = "AzureRunAsConnection"
try
{
    # Get the connection "AzureRunAsConnection"
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

    "Logging in to Azure..."
    Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch {
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    }
    else {
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}

# Add the premium drive to the VM
$VirtualMachine = Get-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $myVMName
Add-AzureRmVMDataDisk -VM $VirtualMachine -Name "premiumdisk1" -LUN 2 -Caching ReadWrite -DiskSizeInGB 1 -CreateOption Empty -StorageAccountType PremiumLRS
Update-AzureRmVM -ResourceGroupName $ResourceGroupName -VM $VirtualMachine

# Get the first access key for the storage account
$accountKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -AccountName $storageAccountName).Value[0]

# Set and execute the custom script extension. It initializes and formats the drive, then backs up and restores the DB to the new drive
Set-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -Location $locationName -VMName $myVMName -Name $scriptName -Run $scriptName `
-TypeHandlerVersion $typeHandlerVersion -StorageAccountName $StorageAccountName -StorageAccountKey $accountKey -FileName $fileName -ContainerName $storageContainer
Update-AzureRmVM -ResourceGroupName $resourceGroupName -VM $VirtualMachine

#Good to keep things clean and remove our extension. The extension also contains a password, so we really want to clean up here.

Remove-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -Name $scriptName -VMName $myVMName -Force
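
For completeness, here is how this script could make its way into an Automation account as a Runbook.  This is a hedged sketch using the AzureRM.Automation cmdlets; the Automation account name, Runbook name, and local file path are placeholders:

# Import the saved script as a PowerShell Runbook, then publish it so it can be started or scheduled
Import-AzureRmAutomationRunbook -ResourceGroupName 'VMStorageTestRM' -AutomationAccountName 'MyAutomationAccount' `
-Name 'Add-PremiumDisk' -Type PowerShell -Path '.\Add-PremiumDisk.ps1'
Publish-AzureRmAutomationRunbook -ResourceGroupName 'VMStorageTestRM' -AutomationAccountName 'MyAutomationAccount' `
-Name 'Add-PremiumDisk'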

Initialize and Format

It's time to write that Custom Script Extension that will execute locally on our SQL VM to initialize and format our newly added Premium disk.  First we need to grab a reference to our new drive (formatting the C:\ drive is still considered really bad form).  Since the disk is brand new, we can identify it by its partition style being RAW.  After that we pick a drive letter, initialize the disk, format it, and assign the letter.  We also create folders on the new drive to house our data and log files.

This script also sets us up for the next step of restoring our database out of blob storage and onto the new drive.  For this we create a new folder on the VM to house a SQL script that will do the restore for us.  Next we download that script from our storage account, and we finish by executing it.  We are going to call the SQL script "5_Azure_Backup_Restore_To_Premium.sql", and we will discuss it in the following section.  For now, let's look at our code to initialize the new VM disk, format it, and set the stage for the next script.

# Setup variables
$storageAccountName = "storagetestallbackups"
$resourceGroupName = "VMStorageTestRM"
$subscriptionId = "YourAzureSubscriptionID"

# Now that we added the disk in Azure we need to initialize and format it in the OS.
# Brand new disks show up with a RAW partition style, which is how we find ours.
$disks = Get-Disk | Where-Object partitionstyle -eq 'raw' | Sort-Object number

# Candidate drive letters G through Y
$letters = 71..89 | ForEach-Object { [char]$_ }
$count = 0
$labelprefix = "data"

foreach ($disk in $disks) {
    $driveLetter = $letters[$count].ToString()
    $disk |
        Initialize-Disk -PartitionStyle MBR -PassThru |
        New-Partition -UseMaximumSize -DriveLetter $driveLetter |
        Format-Volume -FileSystem NTFS -NewFileSystemLabel ($labelprefix + $count) -AllocationUnitSize 65536 -Confirm:$false -Force
    $count++
}

# We need to add the directories we plan to restore to before we do the restore
New-Item G:\Data -type directory
New-Item G:\Log -type directory

# Create a directory to store our local SQL script
New-Item C:\Scripts -type directory -Force

############ This script is executing locally on the VM as a Custom Script Extension so we need to authenticate to Azure ##################
$password = "%SupersweetPassword%" #Don't panic. This script is stored in a secured storage container and removed from the VM after it runs.
#$azureAdApplication.ApplicationId.Guid #If you are running this in the same session as the setup this will give you the application ID in case you forgot it
$accountId = "YourAccountID"
$tenantId = "YourTenantID" #You get this from just running Login-AzureRmAccount and logging in at the prompt.
$login = $accountId.ToString() + "@microsoft.onmicrosoft.com"

#Create Credentials ($password is used instead of $pwd, which PowerShell reserves as an automatic variable)
$pass = ConvertTo-SecureString $password -AsPlainText -Force
$cred = New-Object -TypeName pscredential -ArgumentList $login, $pass

#Login
Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantId $tenantId
###############################################################################################

# Set context to the desired subscription
Set-AzureRmContext -SubscriptionID $subscriptionID

# Get the first access key for the storage account
$accountKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -AccountName $storageAccountName).Value[0]

# Create a new storage account context using an ARM storage account
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKey

# Download the script from Azure to our new folder
$storageContext | Get-AzureStorageBlobContent -Container vmscripts -Blob 5_Azure_Backup_Restore_To_Premium.sql -Destination 'C:\Scripts' -Force

# Run our SQL script
Invoke-Sqlcmd -InputFile 'C:\Scripts\5_Azure_Backup_Restore_To_Premium.sql' -Username 'AzureAutomation' -Password 'AA123'  #You will need to use a SQL account in your target instance

Move the Database

The previous script kicked off "5_Azure_Backup_Restore_To_Premium.sql" at the very end, which handles moving our database.  Remember that the database currently exists and is running on the standard storage drive.  I chose to back it up to Azure blob storage using the "TO URL" option of the "BACKUP DATABASE" command, and then restore it from blob storage to the new drive.  This is not the only way I could have done it, but I like the consistency and the fact that I am backing up my database every time I make a major change.  I could just as easily have backed it up to a local drive and restored from there.  I could even have detached and attached it, but it is less risky to have a backup taken with CHECKSUM.  Let's see what this script looks like.

/* Put the DB in SIMPLE Recovery */
ALTER DATABASE [Example] SET RECOVERY SIMPLE
GO

/* Perform a Full Backup */
BACKUP DATABASE [Example]
TO URL = 'https://storagetestallbackups.blob.core.windows.net/storagetestserver1backups/Example_Full_1.bak'
WITH FORMAT, INIT, COMPRESSION, CHECKSUM
GO

/* Restore the full backup to the new drive */
USE [master]
GO

ALTER DATABASE [Example] SET SINGLE_USER WITH ROLLBACK IMMEDIATE

RESTORE DATABASE [Example]
FROM URL = 'https://storagetestallbackups.blob.core.windows.net/storagetestserver1backups/Example_Full_1.bak'
WITH MOVE 'Example' TO 'G:\Data\Example.mdf',
MOVE 'Example_Log' TO 'G:\Log\Example_Log.ldf',
REPLACE /* the database already exists, so overwrite it */
GO

ALTER DATABASE [Example] SET MULTI_USER
GO

Reverse It

Things are looking really good now.  We have successfully created a Runbook that adds a Premium storage disk to our VM and moves our database to the new drive.  That's great because we can manually kick it off in the morning or schedule it to run automatically so we are ready to do our testing or development work for the day.  The last thing we need to do is reverse all of this and put everything back on a standard storage disk. Let's work on that.
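
As an aside on the scheduling piece: an Automation schedule can fire the Runbook on weekday mornings automatically.  Here is a minimal sketch, assuming an Automation account named 'MyAutomationAccount' and a published Runbook named 'Add-PremiumDisk':

# Create a weekday schedule that starts tomorrow at 7:30 AM
New-AzureRmAutomationSchedule -ResourceGroupName 'VMStorageTestRM' -AutomationAccountName 'MyAutomationAccount' `
-Name 'WeekdayMorning' -StartTime (Get-Date '07:30').AddDays(1) -WeekInterval 1 -DaysOfWeek Monday,Tuesday,Wednesday,Thursday,Friday

# Tie the schedule to the Runbook that adds the Premium disk
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName 'VMStorageTestRM' -AutomationAccountName 'MyAutomationAccount' `
-RunbookName 'Add-PremiumDisk' -ScheduleName 'WeekdayMorning'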

We have already created our storage account and created a credential to access it, so we don't have to do all that work again.  We just need to move the database back to the standard storage drive and get rid of the premium drive.  We'll start by creating our second Runbook to handle this.

Remove the Premium Disk

This Runbook is again a chain of scripts.  The code uses our Custom Script Extension to fire off another script that executes locally on the VM.  The local script downloads a SQL script that moves our database.  Once that is done, the Runbook code removes the Premium drive from our VM.  Here is what the Runbook code looks like:

# Define global variables for the script
$resourceGroupName= 'VMStorageTestRM'
$myVMName = "VMStorageTest"
$storageAccountName = 'storagetestallbackups'
$locationName = 'South Central US'
$scriptName = '4-1_Execute_Restore_To_Standard.ps1'
$fileName = '4-1_Execute_Restore_To_Standard.ps1'
$typeHandlerVersion = '1.8'
$storageContainer = 'vmscripts'

#Connect to Azure
$connectionName = "AzureRunAsConnection"
try
{
    # Get the connection "AzureRunAsConnection"
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

    "Logging in to Azure..."
    Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch {
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    }
    else {
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}

# Get the first access key for the storage account
$accountKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -AccountName $storageAccountName).Value[0]

# Grab VM object reference
$VirtualMachine = Get-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $myVMName

# Set the custom extension
Set-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -Location $locationName -VMName $myVMName -Name $scriptName -Run $scriptName `
-TypeHandlerVersion $typeHandlerVersion -StorageAccountName $StorageAccountName -StorageAccountKey $accountKey -FileName $fileName -ContainerName $storageContainer

# Update the VM.  This executes the custom script extension on the VM. The extension uses a SQL script to move the DB to standard storage
Update-AzureRmVM -ResourceGroupName $resourceGroupName -VM $VirtualMachine

#We want to remove our script extension from the VM here

Remove-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $myVMName -Name $scriptName -Force

Update-AzureRmVM -ResourceGroupName $resourceGroupName -VM $VirtualMachine

#This will detach the disk from the VM and the OS will automatically remove it
Remove-AzureRmVMDataDisk -VM $VirtualMachine -Name "premiumdisk1"
Update-AzureRmVM -ResourceGroupName $ResourceGroupName -VM $VirtualMachine

#We detached it above, but that doesn't delete the disk resource itself from Azure.
Remove-AzureRmDisk -DiskName "premiumdisk1" -ResourceGroupName $resourceGroupName -Force

Move the Database Back

The Runbook above calls a Custom Script Extension, so let's inspect that script and see what it does.

# Setup variables
$storageAccountName = "storagetestallbackups"
$resourceGroupName = "VMStorageTestRM"
$subscriptionId = "YourSubscriptionID"

# Create a directory to store our local SQL script
New-Item C:\Scripts -type directory -Force

############ Use this section in other scripts to authenticate going forward ##################
$password = "%SupersweetPassword%"
#$azureAdApplication.ApplicationId.Guid #If you are running this in the same session as the setup this will give you the application ID in case you forgot it
$accountId = "YourAccountID"
$tenantId = "YourTenantID" #You get this from just running Login-AzureRmAccount and logging in at the prompt.
$login = $accountId.ToString() + "@microsoft.onmicrosoft.com"

#Create Credentials ($password is used instead of $pwd, which PowerShell reserves as an automatic variable)
$pass = ConvertTo-SecureString $password -AsPlainText -Force
$cred = New-Object -TypeName pscredential -ArgumentList $login, $pass

#Login
Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantId $tenantId
###############################################################################################

# Set context to the desired subscription
Set-AzureRmContext -SubscriptionID $subscriptionID

# Get the first access key for the ARM storage account
$accountKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -AccountName $storageAccountName).Value[0]

# Create a new storage account context using an ARM storage account
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $accountKey

# Download the script from Azure to our new folder
$storageContext | Get-AzureStorageBlobContent -Container vmscripts -Blob 5-1_Azure_Backup_Restore_To_Standard.sql -Destination 'C:\Scripts' -Force

# This runs a SQL script to back up and restore the DB to the Standard storage disk
Invoke-Sqlcmd -InputFile 'C:\Scripts\5-1_Azure_Backup_Restore_To_Standard.sql' -Username 'AzureAutomation' -Password 'AA123'

The code here is pretty simple compared to what we wrote to add the Premium drive.  All we do is download a SQL script from our storage account and execute it on the VM.  The SQL script moves the database back to standard storage.  It should be pretty obvious what the SQL script looks like, but let's take a peek.

Restore to Standard

Nothing surprising here.  We back up our Example database to our blob storage account and then restore it using the WITH MOVE option to put it back on our standard storage drive.  Totally straightforward, so here it is:

/* Put the DB in SIMPLE Recovery */
ALTER DATABASE [Example] SET RECOVERY SIMPLE
GO

/* Perform a Full Backup */
BACKUP DATABASE [Example]
TO URL = 'https://storagetestallbackups.blob.core.windows.net/storagetestserver1backups/Example_Full_1.bak'
WITH FORMAT, INIT, COMPRESSION, CHECKSUM
GO

/* Restore the full backup */
USE [master]
GO

ALTER DATABASE [Example] SET SINGLE_USER WITH ROLLBACK IMMEDIATE

RESTORE DATABASE [Example]
FROM URL = 'https://storagetestallbackups.blob.core.windows.net/storagetestserver1backups/Example_Full_1.bak'
WITH MOVE 'Example' TO 'F:\Data\Example.mdf',
MOVE 'Example_Log' TO 'F:\Log\Example_Log.ldf',
REPLACE /* the database already exists, so overwrite it */
GO

ALTER DATABASE [Example] SET MULTI_USER
GO

Food for Thought

Here is a list of things that could be altered to streamline this process if you were so inclined.

  • The code to create the credential for SQL to access your storage account could be called through the Custom Script Extension instead of the copy-and-paste method we used here.
  • When I initialize the disk in the VM, I know what drive letter I want because I know what my VM looks like.  It would be better to check which drive letters are already in use and grab the next free one (see the sketch after this list).  This was a contrived example, so I knew I wanted G:\.
  • Instead of using separate scripts for the SQL code, this code could be brought into the Custom Script Extension.  This would allow for the above change of dynamically choosing the drive letter.
  • We backed up the database to blob storage and then restored it.  We could have backed it up to a local drive (most likely the standard storage drive) and restored from there, which would be faster with a larger database.  Mine was really small, so it was no big deal, and I like the idea of taking a backup right before a major change, plus getting it off of the box I am working on.  If the box fails, backups stored on it are no good.
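
Here is the sketch mentioned above for picking the next free drive letter instead of hardcoding one.  It is a minimal example that looks at the filesystem drives already in use and takes the first unused letter from D through Z:

# Drive letters currently in use on the VM
$used = (Get-PSDrive -PSProvider FileSystem).Name

# First unused letter in the range D..Z
$driveLetter = 68..90 | ForEach-Object { [char]$_ } | Where-Object { "$_" -notin $used } | Select-Object -First 1

That $driveLetter could then feed straight into the New-Partition call from the initialize-and-format script.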

Conclusion

This customer wanted to move their test and development systems to Azure, but they wanted to do it in the most cost-efficient way.  Only paying for what you use is one of the bigger value propositions of the cloud.  Sometimes we have to think outside the box to accomplish our goal, and that is exactly what we did in this post.  It might seem like a lot of work, but I just gave you a giant leap in the right direction.  Enjoy automating your Azure VMs!