Creating a Custom SQL Server VM Image in Azure

Recently I had the opportunity to work on a project where I needed to create a custom SQL Server image for use with Azure VMs.  The process was a little more challenging than I initially anticipated, mostly because I was not familiar with the process of preparing a SQL Server image.  Perhaps this isn’t much of a challenge for an experienced SQL Server DBA or IT Pro.  For me, it was a great learning experience.

Why a Custom SQL Server Image?

The Azure VM image gallery already contains a SQL Server image.  It’s very easy to create a new SQL Server VM using this image.  However, doing so has a few important trade-offs to consider:

  • Unable to fully customize the base install of SQL Server.  This is a template/image after all – you get a VM configured the way the image was configured.
  • Unable to use your own SQL Server license.  If your company has an Enterprise Agreement (EA) with Microsoft, it’s likely there are already some SQL Server licenses built into that agreement.  Depending on the details, it may be significantly cheaper to use the licenses from the EA instead of paying the SQL Server VM image upcharge from Azure.

The Basic Steps

There are 6 basic steps to creating a custom SQL Server VM image for use in Azure.

  1. Provision a new base Windows Server VM
  2. Download the SQL Server installation media
  3. Run SQL Server setup to prepare an image
  4. Configure Windows to complete the installation of SQL Server
  5. Capture the image and add it to the Azure VM image gallery
  6. Create a new VM instance using the custom SQL Server image

The basic idea here is to create a base VM, customize it with a SQL Server image, capture the VM to create an image, and then provision new VMs using that captured VM image.


Let’s dive into each of these in a little more detail.

Note: the terminology here can be a little confusing. When referring to the VM used to create the template/image, I’ll use the term “base VM”. When referring to the VM created from the base VM, I’ll use the term “VM instance”.

1. Provision a new base Windows Server VM

There are multiple ways to create a Windows Server VM in Azure.  Creating a VM via the Azure management portal and PowerShell are probably the two most popular options.  Be sure to check out this tutorial to learn how to do so via the portal. For the purposes of this post, I’ll do so via PowerShell.

$img = Get-AzureVMImage |
	Where-Object { $_.PublisherName -ilike "Microsoft*" -and $_.ImageFamily -ilike "Windows Server 2012 Datacenter" } |
	Sort-Object -Descending -Property PublishedDate |
	Select-Object -First 1

$vmConfig = New-AzureVMConfig -Name "sql-1" -InstanceSize Small -ImageName $img.ImageName |
    Add-AzureProvisioningConfig -Windows -AdminUsername "[admin-username-here]" -Password "[admin-password-here]" 

New-AzureVM -ServiceName "SQLServerVMTemplate" -VMs $vmConfig -Location "East US" -WaitForBoot

2. Download the SQL Server installation media

With the base Windows Server 2012 VM created, we can now get ready to prepare (sysprep) the SQL Server installation.  To do that, we need to get the SQL Server installation media onto the machine.  The easiest way I found to do this was to leverage Azure blob storage.

  1. Upload the SQL Server ISO file to Azure blob storage
  2. Remote Desktop (RDP) into the base VM
  3. From the VM, download the SQL Server ISO file to the local disk
  4. Mount the SQL Server ISO file to the VM
  5. Copy the ISO contents (not the ISO file itself) to the VM’s C:\ drive.  For example, use C:\sql

The SQL Server installation media files need to be copied to the local C: drive so they can be used later to complete the SQL Server installation (when provisioning the actual SQL Server VM instance).
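As a sketch, the upload and download can be scripted with the Azure storage cmdlets.  The storage account name, key, container, and file paths below are placeholders you would replace with your own:

```
# Run on your local PC: upload the SQL Server ISO to blob storage
# (storage account name, key, container, and paths are placeholders)
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "[storage-key-here]"
New-AzureStorageContainer -Name "installers" -Context $ctx
Set-AzureStorageBlobContent -File "C:\iso\SQLServer2012.iso" -Container "installers" -Context $ctx

# Run on the base VM: download the ISO to the local disk
Get-AzureStorageBlobContent -Blob "SQLServer2012.iso" -Container "installers" -Destination "C:\" -Context $ctx
```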

3. Run SQL Server setup to prepare an image

In order to prepare the (sysprep’d) SQL Server VM image (which we can use as a template for future VMs), we need to run the SQL Server installation and instruct it to prepare an image – not run the full installation.  An easy way to do this is with a SQL Server configuration file, an example of which I’ve included below.  The parameter values shown are the standard defaults for a PrepareImage run – adjust as needed.

;SQL Server 2012 Configuration File
[OPTIONS]
; Specifies a Setup workflow, like INSTALL, UNINSTALL, or UPGRADE. This is a required parameter.
ACTION="PrepareImage"
; Detailed help for command line argument ENU has not been defined yet.
ENU="True"
; Parameter that controls the user interface behavior. Valid values are Normal for the full UI, AutoAdvance for a simplified UI, and EnableUIOnServerCore for bypassing Server Core setup GUI block.
UIMODE="Normal"
; Specifies that Setup should not display any user interface.
QUIET="False"
; Specifies that Setup should display progress only, without any user interaction.
QUIETSIMPLE="True"
; Specifies whether SQL Server Setup should discover and include product updates. The valid values are True and False or 1 and 0. By default SQL Server Setup will include updates that are found.
UpdateEnabled="True"
; Specifies features to install, uninstall, or upgrade. The list of top-level features include SQL, AS, RS, IS, MDS, and Tools. The SQL feature will install the Database Engine, Replication, Full-Text, and Data Quality Services (DQS) server. The Tools feature will install Management Tools, Books online components, SQL Server Data Tools, and other shared components.
FEATURES=SQLENGINE
; Specifies the location where SQL Server Setup will obtain product updates. The valid values are "MU" to search Microsoft Update, a valid folder path, a relative path such as .\MyUpdates or a UNC share. By default SQL Server Setup will search Microsoft Update or a Windows Update service through the Windows Server Update Services.
UpdateSource="MU"
; Displays the command line parameters usage
HELP="False"
; Specifies that the detailed Setup log should be piped to the console.
INDICATEPROGRESS="False"
; Specifies that Setup should install into WOW64. This command line argument is not supported on an IA64 or a 32-bit system.
X86="False"
; Specifies the root installation directory for shared components.  This directory remains unchanged after shared components are already installed.
INSTALLSHAREDDIR="C:\Program Files\Microsoft SQL Server"
; Specifies the root installation directory for the WOW64 shared components.  This directory remains unchanged after WOW64 shared components are already installed.
INSTALLSHAREDWOWDIR="C:\Program Files (x86)\Microsoft SQL Server"
; Specifies the Instance ID for the SQL Server features you have specified. SQL Server directory structure, registry structure, and service names will incorporate the instance ID of the SQL Server instance.
INSTANCEID="MSSQLSERVER"
; Specifies the installation directory.
INSTANCEDIR="C:\Program Files\Microsoft SQL Server"

There are two steps in this process:

  1. Copy the ConfigurationFile.ini file (from your local PC) to the same location as the SQL Server installation media (i.e. c:\sql) on the base VM.
  2. Run SQL Server setup to prepare an image.  From a command prompt (on the base VM), navigate to the C:\sql folder and then execute the following command:
Setup.exe /ConfigurationFile=ConfigurationFile.ini /IAcceptSQLServerLicenseTerms=true

4. Configure Windows to complete the installation of SQL Server

At this point the base VM should have an “installation” of SQL Server that is not fully completed. The SQL Server bits are in place, but they’re not configured for a full server install . . . at least not yet. The final configuration of SQL Server will take place when the VM instance (of which this template/image is the base) is provisioned and boots up for the first time. This is accomplished by using a CMD file with the following content:

REM All commands will be executed during first Virtual Machine boot

  1. On your local PC, save the file as SetupComplete2.cmd
  2. RDP / log into the base VM
  3. Copy the SetupComplete2.cmd file from your local PC to the C:\Windows\OEM folder on the base VM
  4. Change the SQLSYSADMINACCOUNTS value to the administrative account created on the VM (or better yet – the local Administrators group account)
  5. If needed, supply the SQL Server product ID (PID) value.

When Windows starts on the new VM instance for the first time, the SetupComplete2.cmd file should automatically run.  It is invoked by the SetupComplete.cmd file already on the machine.
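The REM line shown earlier is just the first line of the file.  A sketch of what the rest of SetupComplete2.cmd might contain follows – the CompleteImage action is the documented counterpart to PrepareImage, but the instance name, service account, and sysadmin account values here are example placeholders:

```
REM All commands will be executed during first Virtual Machine boot
REM Complete the prepared SQL Server installation (account and instance values are examples)
C:\sql\Setup.exe /q /ACTION=CompleteImage /INSTANCENAME=MSSQLSERVER /INSTANCEID=MSSQLSERVER ^
  /SQLSVCACCOUNT="NT AUTHORITY\NETWORK SERVICE" /SQLSYSADMINACCOUNTS="BUILTIN\Administrators" ^
  /IACCEPTSQLSERVERLICENSETERMS
```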


5. Capture the image and add it to the Azure VM image gallery

At this point a base SQL Server VM has been created and the groundwork laid to complete the install. Now it is time to create the VM image from the base VM, and to do that you sysprep and capture the base VM.  Please follow the guide on How to Capture a Windows Virtual Machine to Use as a Template.
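Once sysprep (with the generalize and shutdown options) has run and the base VM has stopped, the capture itself can be sketched with a single cmdlet.  The service and VM names are from the earlier example; the image name is one I made up:

```
# Capture the generalized base VM as a reusable image
Save-AzureVMImage -ServiceName "SQLServerVMTemplate" -Name "sql-1" `
    -ImageName "sql2012-base-image" -OSState "Generalized"
```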

6. Create a new VM using the custom SQL Server image

With a new custom VM image template available in the VM image gallery, you can provision a new VM instance using that custom template.  Upon first boot, the newly provisioned VM should complete the full SQL Server installation as laid out in your SetupComplete2.cmd file.  Please follow the guide on How to Create a Custom Virtual Machine for more information on creating the VM from the template.
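Provisioning the VM instance looks just like the earlier PowerShell, only pointing at the custom image.  The image, VM, and service names below are the assumed placeholders from the capture step:

```
# Create a new VM instance from the captured custom image
$vmConfig = New-AzureVMConfig -Name "sql-vm-1" -InstanceSize Small -ImageName "sql2012-base-image" |
    Add-AzureProvisioningConfig -Windows -AdminUsername "[admin-username-here]" -Password "[admin-password-here]"

New-AzureVM -ServiceName "MySqlService" -VMs $vmConfig -Location "East US" -WaitForBoot
```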


Closing Thoughts

One of the quirks I noticed when preparing the base SQL Server image is that it was not possible to prepare the image with SQL Server Management Studio (SSMS).  I would have to install SSMS after the newly provisioned VM instance was created. Not hard, but time consuming (and annoying if doing this on multiple VM instances).  I later learned that SQL Server 2012 Cumulative Update 1 does allow for preparing a SQL Server image with SSMS installed.  I’ve included a link below that describes the process for creating a SQL Server image with CU1.

In the end, this process really is not all that hard.  Time consuming?  Yes!  The worst part (at least for me) was really just understanding how the SQL Server installation and sysprep process works.  Once I wrapped my head around that, the process was a lot smoother.


Helpful Resources

While I was learning how to create a custom SQL Server VM image, the following resources were very helpful:


I would like to thank Scott Klein for his assistance in verifying these steps.  His help was extremely valuable to ensure I was doing this the right way.

Inside Azure Diagnostics – Pittsburgh Tech Fest (June 2014)

Today I had the pleasure to present at Pittsburgh Tech Fest.  I presented a new session on how to best utilize Azure diagnostics with Cloud Services (web/worker roles).  This was my second time presenting at Pittsburgh Tech Fest, and again it was a great time!

I had several people ask for my presentation.  Cool – no problem.  You can get my slides from my account.

BONUS: If you want to know more about Azure diagnostics, please be sure to check out my new series at

BUILD 2014 Downloads

I don’t know about you, but I’m still trying to catch up on all the Azure hotness Microsoft dropped at BUILD a few weeks ago.  On day 2 I was sitting a few rows from the front in the keynote hall.  As Scott Guthrie continued with his keynote address, I just kept thinking that the hits keep coming (in a good way).  Plus, it’s going to take a while to digest and really learn all these great new Azure features.  Exciting!


I love the fact that Microsoft makes pretty much all the BUILD sessions available to watch via Channel 9.  This is just a great resource for those that either couldn’t make it to BUILD, or if there, couldn’t attend all the sessions in person (you’d have to clone yourself to make that happen).

I travel a lot and being connected to the internet to watch all the sessions sometimes just isn’t possible.  For me, it is very handy to have the session videos and related PowerPoint slides downloaded to my local PC.  Downloading video and slides for each session would be quite painful.  PowerShell to the rescue!

The PowerShell script below is one that I’ve been using for a few years now to download material from BUILD, TechEd, and other major Microsoft events that have content hosted on Channel 9.  I made a few recent tweaks to handle some quirks with the way BUILD 2014 slides were being shared. There’s still some room for improvement, but for the most part it gets the job done.  And that job is downloading all the Azure related sessions and supporting videos to my local computer.
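The original script is not reproduced here, but the core idea can be sketched as follows: read the event’s RSS feed and download each enclosure.  The feed URL, keyword filter, and output path are assumptions you would adjust:

```
# Sketch: download session videos from a Channel 9 event RSS feed
# (feed URL, keyword filter, and output path are assumptions)
$feedUrl = "http://channel9.msdn.com/Events/Build/2014/RSS/mp4"
$outDir  = "C:\BUILD2014"
New-Item -ItemType Directory -Path $outDir -Force | Out-Null

[xml]$feed = (New-Object System.Net.WebClient).DownloadString($feedUrl)
foreach ($item in $feed.rss.channel.item) {
    if ($item.title -notmatch "Azure") { continue }   # only grab Azure sessions
    $url  = $item.enclosure.url
    $file = Join-Path $outDir ($url.Split('/')[-1])
    if (-not (Test-Path $file)) {
        (New-Object System.Net.WebClient).DownloadFile($url, $file)
    }
}
```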



Getting the Windows Azure PowerShell Cmdlets

I’m a big fan of using the Windows Azure PowerShell cmdlets for managing my Windows Azure resources.  They’re incredibly helpful, especially when I need to repeatedly take care of multiple tasks.

There are two ways to get the Windows Azure PowerShell cmdlets – the heavy way and the light way.

Get All the Things!

I would assume most people get the Windows Azure PowerShell cmdlets by taking the path of least resistance.  They go to the Windows Azure downloads page, click the link to “Install” the cmdlets, and let the Web Platform Installer (WebPI) do its thing.  That’s all fine, but it does a lot.  What’s a lot?  Let’s see . . .


Trying to install the cmdlets via WebPI will also attempt to install a bunch of other software you might not expect, such as:

  • Windows Azure Emulator
  • Microsoft SQL Express LocalDB Edition
  • IIS 8.0 Express
  • Windows Azure Storage Tools
  • Windows Azure Authoring Tools
  • Windows Azure SDK

If you’re an IT Pro / systems admin type of person, that’s probably more than you would expect, or need.

Get Just the One Thing!

If you want just the Windows Azure PowerShell cmdlets, the path of absolute least resistance is to download the latest release from GitHub.  From there you can find links to download via Web PI (we already know what that will bring with it), or get the MSI by itself.


Download and run the MSI and you start off with something pretty familiar and basic.


Windows Azure Cache Options

This week (January 27th – 31st) is Windows Azure week.  As part of Windows Azure week, I was honored to publish a blog post for Microsoft Press where I highlighted the various cache options available in Windows Azure.  In the post I discuss both In-Role Cache for Cloud Services (web and worker roles) and the new Cache Service for use with Cloud Services, Web Sites, and VMs.  The post goes into detail on how to set up both options, along with some suggestions on which to select for your scenario.  I had a lot of fun writing the post.  I hope you’ll enjoy reading it.

You can read the post on the Microsoft Press blog at

Windows Azure Week

During this week Microsoft is hosting a series of events to help developers and architects understand the features available in Windows Azure, and techniques for how to use them effectively.

  • Monday, January 27th – Getting Started with Windows Azure Today
  • Tuesday, January 28th – Designing Applications for Windows Azure
  • Wednesday, January 29th – Building Modern Web Applications
  • Thursday, January 30th – Windows Azure IT Pro IaaS
  • Friday, January 31st – Mobile Apps to IoT: Connected Devices with Windows Azure

This is sure to be a great series of webinars lead by some of Microsoft’s top cloud experts.  Learn more and register today at

CodeMash 2014

Last Friday wrapped up the 8th CodeMash conference. CodeMash is probably . . . no, it is . . . my favorite conference to attend each year. CodeMash is packed with the most passionate technologists I know. Many attendees are from the Midwest, but an increasing number are from all over the U.S. and even the world! CodeMash is a great way to re-energize and kick off a new year.

I was fortunate this year to again be a speaker at CodeMash. I gave two presentations – “More Cache with Less Cash” and “Windows Azure Mobile Services – The Perfect Partner”. I would like to thank everyone that attended the sessions. I know I had a good time, and I hope you did as well – and hopefully learned a few things along the way. A few people have asked for my slides. You can find slides for both sessions below.

Azure Management Studio and Blob Container ACLs

Here’s a quick tip I recently learned from my friend, and Cerebrata evangelist, Mike Wood . . .

When using Cerebrata’s Azure Management Studio to work with Windows Azure blob storage, you need to explicitly tell AMS to retrieve the access permissions for blob containers.  By default, AMS will only list the containers.


To make this tool even more useful, instruct AMS to retrieve the permissions when listing the containers.

  1. Under the “Tools” menu at the top, select “Options”.
  2. In the “Options” window, go to the “Blob” tab.
  3. In the “Blob behaviour” section, check the box next to “Fetch Blob Container ACL when listing Blob Containers”.


This will instruct AMS to get the permissions and thus color highlight the containers.  Ooooh . . . pretty colors!


How is AMS getting this info to know what the permission is for the containers?  Simple – it’s just calling into the Windows Azure storage API.  The Get Container ACL operation retrieves the blob container permissions.  If we open Fiddler and take a look at the requests AMS is making, we can see 4 requests for my example – one to retrieve the list of containers and one for each of the containers.

Note: I instructed AMS to not use HTTPS for connections.  I did this only to make visualizing the requests/responses in Fiddler easier.  I would recommend HTTPS for most production scenarios.


Let’s look at just the request to get the ACL for the “myimages” container.
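A sketch of what that exchange looks like on the wire – the account name, signature, and date values are placeholders, and only the interesting headers are shown:

```
GET http://[account].blob.core.windows.net/myimages?restype=container&comp=acl HTTP/1.1
x-ms-version: 2012-02-12
x-ms-date: [request-date-here]
Authorization: SharedKey [account]:[signature]

HTTP/1.1 200 OK
x-ms-blob-public-access: container
```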


Notice the “x-ms-blob-public-access” response header.  It is set to “Container”.  This indicates full public read access for the container and blob data.  Therefore, AMS colors this container a lovely shade of green.

A word to the wise – there is a reason why this option is not enabled in Azure Management Studio by default.  If the storage account contains many containers, this could result in many extra requests to Windows Azure to determine the ACL – one roundtrip per container.  This could slow down your time to view all the containers.  If you just have a few containers, it’s probably not a big deal as this request/response dance happens pretty fast.

So that didn’t exactly turn out to be a quick tip.  But, still hopefully useful.  Enjoy!