Office Add-In Development: The Need for Office.js

Recently I had the opportunity to do a little (and I mean “little” in the most literal sense) Office add-in development. If you follow me on this blog or Twitter, you know that I work primarily in the Azure space. Building an Office add-in is certainly new to me, but I figured it would be kind of fun to learn about something new. This journey started as a result of a partner that I worked with who was having problems getting their add-in to load correctly in both Word Online and Word (on their desktop).  Odd . . . right?  It should work the same when hosted in both Office Online apps and Office.

I knew from prior reading that the “new” Office add-in model was (to likely oversimplify) essentially creating a web application that would be hosted within an Office desktop (e.g. Word 2016) or Office Online (e.g. Word Online) application. If the add-in needs to interact with the Word, Excel, or Outlook environment in any way, there is a JavaScript API to do so. Seems easy enough.

To get started, I did a little Binging (just sounds odd . . . but whatever) for Office add-in development. I wanted to start with “official” Microsoft documentation, so I ignored results from blogs (wait . . . what if it is my blog . . . this blog . . .). One of the top results was the “Office Add-ins platform overview” page. Sweeta!

Reading . . .reading . . . reading

Finally, I get to something that looks like code . . . well, at least a picture of code. This section states that the minimal version of an add-in is a “static HTML page that is displayed inside an Office application, but doesn’t interact with either the Office document or any other Internet resource.”  I can do that!  I’ll simply write a “Hello World” app just to see it work, and then go from there.  The page includes a picture of the two components needed for a “Hello World” Office add-in: the manifest file and the web page.  Bam!


I created the most basic HTML page . . . which just happened to look exactly like the one in the picture. I then proceeded to read up on how to debug the add-in by sideloading it in Office and Office Online.  This is easy!

I started by sideloading the add-in via Word (desktop). Hot damn . . . it worked!!


I then decided to push my luck and try it on Office Online.  FAIL!


Ok . . . so hit the “RETRY” button? Wait for the spinny icon to go away . . . . FAIL, again!


Ok . . . . um. . . . WTF?  Hit “START”?  Sure . . what’s the worst that could happen?  It worked!!


This is where, as a developer, I start to question my life choices. I also realize there must be something else going on causing this funkiness. I tried the normal browser rotation game – Chrome, Edge, IE – all produced the same results. Hmmmm.

I decided to do some more reading. I found a page which promised to give me a working sample. I followed it, and it worked. So, what was different (besides being a whole lot more code)?  As I dug around the generated code, I noticed the home.html file contained two JavaScript references, one for jQuery and one for office.js. I could understand the jQuery one, but what’s this office.js thing?  It must be the API used to interact with Office apps. Ok . . . but my “Hello World” app isn’t doing anything with Office apps.  What gives?

This led me to discover the “Understanding the JavaScript API for Office” page.  I noticed this block of seemingly normal text:

“All pages within an Office Add-ins are required to assign an event handler to the initialize event, Office.initialize. If you fail to assign an event handler, your add-in may raise an error when it starts. Also, if a user attempts to use your add-in with an Office Online web client, such as Excel Online, PowerPoint Online, or Outlook Web App, it will fail to run. If you don’t need any initialization code, then the body of the function you assign to Office.initialize can be empty, as it is in the first example above.”

YES! YES!!  That would have been good information a few hours ago . . . such as on that “Hello World” page with the simple HTML example code.  Put some big, red blinky arrows around this, for Pete’s sake!

I proceeded to add the most basic office.js initialization code to my sample “Hello World” page (which just happens to be hosted on Azure Web Apps, with a local Git continuous deployment option . . . see, I had to get Azure in here somewhere).
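For reference, a minimal page along these lines might look like the sketch below. This is my reconstruction, not the exact file from the post; the office.js CDN reference is the standard one from Microsoft’s documentation, and the console message is my own.

```html
<!DOCTYPE html>
<html>
<head>
    <title>Hello World</title>
    <!-- Reference office.js so the add-in can bootstrap inside the Office host -->
    <script src="https://appsforoffice.microsoft.com/lib/1/hosted/office.js"></script>
    <script>
        // Required even if the body is empty: assign a handler to Office.initialize.
        // Without this, the add-in may fail to load in Office Online.
        Office.initialize = function (reason) {
            console.log('Office.initialize fired; reason: ' + reason);
        };
    </script>
</head>
<body>
    <p>Hello World!</p>
</body>
</html>
```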


I proceeded to reload my add-in in Word online. It worked!  The simple console.log() statements showed up in the JavaScript console too. Yippie!!



That’s a lovely story . . .what’s the point?

If the documentation example seems too simple to be true, it probably is?  Eh . . . sometimes.  The point here is that for Office add-ins, you are absolutely required to reference office.js and to set up at least an empty initialization handler.  Doing so lets Office set up some of the logical infrastructure needed for the add-in environment. It’s like magic. Only magic isn’t real. Mostly.

Connect to Azure SQL Database by Using Azure AD Authentication

One of the great features recently added to Azure SQL Database is the ability to authenticate to Azure SQL Database using Azure Active Directory. This provides an alternative to exclusively using SQL credentials. By leveraging Azure AD authentication, you can greatly simplify management of database permissions by continuing to use existing identities, as well as leveraging AD groups.

The article here does a decent job of explaining the basics of how Azure AD authentication with SQL Database works, and the steps needed to do so. One area that it doesn’t yet cover is obtaining an Azure AD authentication token and using that token to authenticate with SQL Database.  Actually . . . it does, sort of. The article assumes a certificate will be used. That isn’t always the case. In the following sections, I will show you how to obtain an Azure AD authentication token for a user (in the Azure AD directory), and use that token for authentication with SQL Database.


Before we get started, be sure to follow steps 1 through 6 in the Connecting to SQL Database or SQL Data Warehouse By Using Azure Active Directory Authentication article.  I consider these the prerequisite steps. You just have to do it.


The process to use an Azure AD authentication token with SQL Database can be broken down into several distinct steps:

  1. Register an application in Azure AD
  2. Add Azure SQL Database to the list of APIs which will require permission from your application
  3. Consent to allow your application to access Azure SQL Database
  4. Create a .NET 4.6 console application
  5. Add the Active Directory Authentication Library (ADAL) to the project via NuGet
  6. Add code to obtain an Azure AD authentication token
  7. Add code which uses the Azure AD authentication token to authenticate with SQL Database

Let’s review each of these in a bit more detail.

Register an application in Azure AD

For the purposes of this example, let’s keep it simple and use a native (console) application. We’ll use the (new) Azure Portal here. Similar steps can be done in the classic Azure portal as well.

  1. Navigate to the Azure Active Directory section
  2. Select App registrations, and then the + Add button
  3. On the resulting Create blade, provide a friendly name, select an application type of Native, and provide a redirect URL (which is largely irrelevant in this scenario for a native console application)


Add Azure SQL Database to the list of APIs which will require permission from your application

This step is very important. You will need to add Azure SQL Database to the list of APIs / applications which will be granted delegated permission via your application. Failure to do this will result in an error message similar to the following:

“AADSTS65001: The user or administrator has not consented to use the application with ID ‘{your-application-id-here}’. Send an interactive authorization request for this user and resource.”

Before you can continue, you need to have followed the prerequisite steps stated at the top of this post. You especially need to be sure you have created an Azure AD contained database user. If you fail to do that, you will not see “Azure SQL Database” in the list (as specified below).

Setting the permission is fairly easy via the Azure portal.

  1. Select the newly created application (in this case, it was ContosoConsole6)
  2. On the Settings blade, select Required permissions.
  3. Add a new required permission and select Azure SQL Database as the API. You’ll want to search for “azure” to get “Azure SQL Database” to appear in the list.


Be sure to select the checkbox for “Access Azure SQL DB and Data Warehouse”.


I should point out that even after adding the delegated permission as shown above, you will still get the previously mentioned error. We’ll solve that next.

Consent to allow your application to access Azure SQL Database

That error you received before about the administrator having not consented to use the application is something we need to get past. To do so, you can force a one-time consent dialog so you can consent to your application delegating access to Azure SQL Database. You’ll need to craft a URL in the following format (wrapped for readability):
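The exact URL from the original post isn’t reproduced above. As a hedged sketch only: with the Azure AD v1 endpoints, an admin-consent URL generally takes the shape below. The tenant, client ID, and redirect URI values are placeholders of my own, not values from the post.

```javascript
// Hypothetical values -- substitute your Azure AD tenant and the Application ID
// from your app registration. This sketch assumes the Azure AD v1 authorize
// endpoint, whose prompt=admin_consent parameter forces the one-time consent dialog.
const tenant = 'yourtenant.onmicrosoft.com';
const clientId = '{your-application-id-here}';
const redirectUri = 'https://localhost';

const consentUrl = 'https://login.microsoftonline.com/' + tenant + '/oauth2/authorize' +
  '?client_id=' + clientId +
  '&response_type=code' +
  '&redirect_uri=' + encodeURIComponent(redirectUri) +
  '&prompt=admin_consent';

console.log(consentUrl);
```

Opening the printed URL in a browser is what triggers the consent prompt described below.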

Paste that URL into your favorite browser window and go. You should be prompted to log into your Azure subscription. Do so as a Global Administrator for your Azure AD tenant.


With the above steps complete, you should now be able to write the code for the sample app to obtain and use the Azure AD authentication token with Azure SQL Database.

Create a .NET 4.6 console application

Make sure your console project targets .NET Framework 4.6. The SqlConnection.AccessToken property used to set the Azure AD authentication token is available in .NET 4.6 only.


Add the Active Directory Authentication Library (ADAL) to the project via NuGet

Since you’ll be working with Azure AD, you’ll want to use ADAL to make getting the Azure AD authentication token easy.


Add code to obtain an Azure AD authentication token

Finally! Some code! First, in my example, I set up a few constants which represent information about Azure AD and the resource for which I want to obtain an authentication token. For SQL Database, the resource is https://database.windows.net/.

private const string AadInstance = "https://login.microsoftonline.com/{0}";
private const string ResourceId = "https://database.windows.net/";

You’ll need a client ID as part of the call to AcquireTokenAsync(). The client ID is the Application ID for your app registered in Azure AD.  The Azure AD tenant value is the friendly name for your Azure AD tenant (e.g. yourtenant.onmicrosoft.com). Thanks to using ADAL, the code to get the authentication token is very easy – just a few lines of code:

string clientId = ConfigurationManager.AppSettings["ClientId"];
string aadTenantId = ConfigurationManager.AppSettings["Tenant"];
AuthenticationContext authenticationContext =
  new AuthenticationContext(string.Format(AadInstance, aadTenantId));

AuthenticationResult authenticationResult =
  await authenticationContext.AcquireTokenAsync(ResourceId, clientId, GetUserCredential());
You’ll notice the above code calls a method GetUserCredential()  to obtain a UserCredential object. This represents the Azure AD user for which to obtain the authentication token. In this example, the user details are hard coded. Yes, this is bad. I’m showing it here strictly as an example. In future posts, I’ll show a few other ways to (better) handle the user details.

private static UserCredential GetUserCredential()
{
   string pwd = ConfigurationManager.AppSettings["UserPassword"];
   string userId = ConfigurationManager.AppSettings["UserId"];

   SecureString securePassword = new SecureString();

   foreach (char c in pwd) { securePassword.AppendChar(c); }

   var userCredential = new UserPasswordCredential(userId, securePassword);

   return userCredential;
}
If all goes well, after executing the AcquireTokenAsync() method you should receive an Azure AD (JWT) authentication token as part of the resulting AuthenticationResult object. For fun, paste it into a JWT decoder to decode it. You should get something similar to the screenshot below.


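If you’d rather inspect the token in code instead of pasting it into a website, the payload is just base64url-encoded JSON. A small sketch follows; the sample token and its claim values are fabricated purely for illustration (a real token comes back from AcquireTokenAsync()).

```javascript
// Decode the payload (middle segment) of a JWT without any library.
// A JWT is three base64url-encoded segments joined by dots: header.payload.signature.
function decodeJwtPayload(jwt) {
  let b64 = jwt.split('.')[1]
    .replace(/-/g, '+')   // base64url -> base64
    .replace(/_/g, '/');
  while (b64.length % 4 !== 0) b64 += '='; // restore padding
  return JSON.parse(Buffer.from(b64, 'base64').toString('utf8'));
}

// Fabricated token for illustration only.
const sampleJwt = [
  Buffer.from(JSON.stringify({ alg: 'RS256', typ: 'JWT' })).toString('base64'),
  Buffer.from(JSON.stringify({ aud: 'https://database.windows.net/', upn: 'user@contoso.com' })).toString('base64'),
  'fake-signature'
].join('.');

console.log(decodeJwtPayload(sampleJwt).upn); // user@contoso.com
```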
Add code which uses Azure AD authentication token to authenticate with SQL Database

This is the easy part. You just need some code which gets a basic database connection string, and then sets the SQL connection to use the previously obtained authentication token.

The database connection string is going to be very basic, containing nothing more than the data source (your Azure SQL Database server name), the database name, and a connection timeout.

Data Source=[your-server-name-here];
Initial Catalog=[your-db-name-here];Connect Timeout=30

The following code to query the database is ultra-simple. I just want to make sure the connection is successful and I can execute a basic command. This command will return the name of the connected user . . . which should correspond to that of the specified Azure AD user.

var sqlConnectionString = ConfigurationManager.ConnectionStrings["MyDatabase"].ConnectionString;
using (SqlConnection conn = new SqlConnection(sqlConnectionString))
{
  conn.AccessToken = authenticationResult.AccessToken;
  conn.Open();

  using (SqlCommand cmd = new SqlCommand("SELECT SUSER_SNAME()", conn))
  {
    var result = cmd.ExecuteScalar();
    Console.WriteLine(result);
  }
}
That’s it. Done.


As you can hopefully see, the steps to using an Azure AD authentication token with Azure SQL Database are not especially complicated. There are a few minor hoops to jump through, but nothing too serious.

It is worth pointing out that this solution does take a dependency on Azure AD. In the unlikely event that Azure AD is unavailable, then you may be unable to access your database resources using your Azure AD credential. I think it is a wise strategy to also keep a few SQL identities in place and have a process ready to use those  – just in case. After all, designing robust applications for the cloud is about designing for failure. Always be prepared.

You can find the full source code for this project on my personal GitHub repository.

My New Azure Book

Last week Microsoft Press released the latest update to my book, Microsoft Azure Essentials: Fundamentals of Azure, Second Edition.   As the name indicates, this is an update to Microsoft Azure Essentials: Fundamentals of Azure, released in February 2015. I worked on both editions with my friend Robin Shahan.

The first edition of the book was relatively popular, with over 200,000 copies either downloaded or distributed in print. Thank you for the amazing support! I’m very happy so many people found the book to be helpful.

When the Microsoft Press editor that worked on the first book initially approached me about working on an update, I was honestly a bit hesitant. Writing a technical book is a lot of work . . . especially a book on a platform such as Azure that seems to be constantly evolving. However, after many discussions with Robin, and my wife (getting spousal approval was very important as this effort would require many nights and weekends away from my family, which added a new baby to the party earlier this year), I decided to write the update. Thankfully Robin agreed to join in on the fun too. She’s a lot of fun to work with!

My thinking going into this process was there would be a few significant updates, but mostly it would be changing a few screenshots, updating a product name or two (i.e. Azure Web Sites became Azure Web Apps), and maybe including a bit of content around new services like Azure Service Fabric. It wouldn’t be that much work. Wrong!

Azure is a fast-moving platform. In the nearly 13 months since the first edition of the book was released to when I got started on the second edition, many things had changed. The second edition contains quite a few significant updates. The sections on Azure Web Apps, Azure Virtual Machines, Virtual Networks, Storage, and management tooling all received major edits. Key concepts such as Azure Resource Manager, use of the (new) Azure Portal, and the (almost entire) removal of Azure Cloud Services also received increased focus in the second edition. We also added a new chapter on Additional Azure Services which provides a brief introduction to several other Azure services Robin and I felt were important to understand. There are so many valuable services in the Azure platform, but we couldn’t discuss all of them in the book (or else we still might be working on the book, and it would be huge). These are services we personally felt were of significance for the broad majority of people. (Robin adds more info on the changes in her comment here.)

I would be remiss if I didn’t once again thank the Azure experts that volunteered their precious time to help review the book. Many of these people helped on the first edition too. Their expertise and honest feedback was critical in writing this book. Thank you!

I hope you find the second edition of the book to be a valuable asset to your journey to the public cloud with Microsoft Azure!



Work on Your ARM Strength

Last month I had the privilege to speak at one of the many excellent technology conferences in the Midwest, StirTrek. It’s always fun to speak, and attend, StirTrek. Not only are the sessions great, but afterwards there is a showing of one of the summer’s biggest movies. This year the movie was Captain America: Civil War.

My session this year was “Work on Your ARM Strength”. The goal with the session was to provide some guidance on how to write Azure Resource Manager (ARM) templates. ARM templates are a key part of working with Azure, and it is important to understand how to write, debug, and deploy them.

StirTrek did an excellent job of recording this year’s sessions. You can view all the sessions on StirTrek’s YouTube channel. You can watch my “Work on Your ARM Strength” session here, or via the embedded video below. Enjoy!

Retrieving Resource Metrics via the Azure Insights API


There are many options for configuring monitoring and alerting for Azure resources. The Azure portal will show some default metrics. It is also possible to enable more advanced or custom diagnostic settings. The links below provide additional details on enabling diagnostics and monitoring for two popular Azure compute resources, Azure Web Apps and Azure Virtual Machines. Visual Studio Application Insights can also be leveraged to monitor the usage and performance of applications.

As previously mentioned, the Azure portal will show some default metrics for many Azure resources. For example, the screenshot below shows the monitoring tile for an Azure Web App, which has been configured to display three key values: Average Response Time, CPU Time, and Requests.



The metric values displayed on this tile are not retained indefinitely. For an Azure Web App, the retention policy is as follows:

  • Minute granularity – 24 hour retention
  • Hour granularity – 7 day retention
  • Daily granularity – 30 day retention

By using the Azure Insights API it is possible to programmatically retrieve the available default metric definitions (the type of metric such as CPU Time, Requests, etc.), granularity, and metric values. With the ability to programmatically retrieve the data comes the ability to save the data in a data store of your choosing. For example, that data could be persisted to Azure SQL Database, DocumentDB, or Azure Data Lake. From there you could perform whatever additional analysis is desired.

It should be noted that the Azure Insights API is not the same as Application Insights.

Besides working with various metric data points, the Insights API allows you to manage things like alerts, autoscale settings, usage quotas, and more. Check out the full list via the Azure Insights REST API Reference documentation.

The remainder of this post will discuss using the Insights API to learn more about the default metrics available for Azure resources.

Investigating Available Metrics via the Insights REST API

There are three basic steps for working with the Insights REST API:

  1. Authenticate the Azure Insights request
  2. Retrieve the available metric definitions
  3. Retrieve the metric values

The first step is to authenticate the Azure Insights API request. As the Azure Insights API is an Azure Resource Manager based API, it requires authentication via Azure Active Directory (Azure AD). The easiest way (in my opinion at least) to set up authentication is to create an Azure AD service principal and retrieve the authentication (JWT) token. The sample script below demonstrates creating an Azure AD service principal via PowerShell. For a more detailed walkthrough, please reference the official documentation on creating a service principal. It is also possible to create a service principal via the Azure portal.

Create a Service Principal

$pwd = "[your-service-principal-password]"
$subscriptionId = "[your-azure-subscription-id]"

Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId $subscriptionId

$azureAdApplication = New-AzureRmADApplication `
                        -DisplayName "Collier Web Metrics Demo" `
                        -HomePage "https://localhost/webmetricdemo" `
                        -IdentifierUris "https://localhost/webmetricdemo" `
                        -Password $pwd

New-AzureRmADServicePrincipal -ApplicationId $azureAdApplication.ApplicationId
New-AzureRmRoleAssignment -RoleDefinitionName Reader -ServicePrincipalName $azureAdApplication.ApplicationId

$subscription = Get-AzureRmSubscription -SubscriptionId $subscriptionId
$creds = Get-Credential -UserName $azureAdApplication.ApplicationId -Message "Please use your service principal credentials"
Login-AzureRmAccount -Credential $creds -ServicePrincipal -TenantId $subscription.TenantId

Once the authentication setup step is complete, it is possible to execute queries against the Azure Insights REST API. There are two helpful queries:

  1. List the metric definitions for a resource
  2. Retrieve the metric values

Details on listing the metric definitions for a resource are documented in the Azure Insights REST API reference. For an Azure Web App, the metric definitions should look similar to the example screenshot below.



Once the available metric definitions are known, it is easy to retrieve the required metric values. Use the metric’s name ‘value’ (not the ‘localizedValue’) for any filtering requests (e.g. retrieve the ‘CpuTime’ and ‘Requests’ metric data points). The request / response information for this API call does not currently appear in the REST API reference. However, it is possible to make the call, and the request URI is very similar to that of listing the metric definitions.


For example, to retrieve just the Average Response Time and Requests metric data points for an Azure Web App for the 1 hour period from 2016-02-18 20:26:00 to 2016-02-18 21:26:00, with a granularity of 1 minute, the request URI would be as follows:

https://management.azure.com/subscriptions/{subscription-id}/resourceGroups/collierwebapi2016/providers/Microsoft.Web/sites/collierwebapi2016/metrics?api-version=2014-04-01&$filter=%28name.value%20eq%20%27AverageResponseTime%27%20or%20name.value%20eq%20%27Requests%27%29%20and%20timeGrain%20eq%20duration%27PT1M%27%20and%20startTime%20eq%202016-02-18T20%3A26%3A00.0000000Z%20and%20endTime%20eq%202016-02-18T21%3A26%3A00.0000000Z
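That encoded blob is easier to reason about if you build it up yourself. The sketch below is mine, not from the original post; it assembles the same $filter expression and percent-encodes it. Note that encodeURIComponent leaves quotes and parentheses alone, so a few extra replaces are needed to match the fully encoded form.

```javascript
// Build the $filter expression for the metrics request, then percent-encode it.
const names = ['AverageResponseTime', 'Requests'];
const nameFilter = '(' + names.map(n => "name.value eq '" + n + "'").join(' or ') + ')';

const filter = [
  nameFilter,
  "timeGrain eq duration'PT1M'",
  'startTime eq 2016-02-18T20:26:00.0000000Z',
  'endTime eq 2016-02-18T21:26:00.0000000Z'
].join(' and ');

// encodeURIComponent does not escape quotes or parentheses, so fix those up too.
const encoded = encodeURIComponent(filter)
  .replace(/'/g, '%27')
  .replace(/\(/g, '%28')
  .replace(/\)/g, '%29');

console.log(encoded);
```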

The result should be similar to that of the screenshot below.



Using the REST API is very helpful in terms of understanding the available metric definitions, granularity, and related values. This information can be very helpful when using the Azure Insights Management Library.

Retrieving Metrics via the Insights Management Library

Just like working with the REST API, there are three basic steps for working with the Insights Management Library:

  1. Authenticate the Azure Insights request
  2. Retrieve the available metric definitions
  3. Retrieve the metric values

The first step is to authenticate by retrieving the JWT token from Azure AD. Assuming the Azure AD service principal is already configured, retrieving the token can be as simple as shown in the code sample below.

Get Auth Token

private static string GetAccessToken()
{
    var authenticationContext = new AuthenticationContext(string.Format("https://login.microsoftonline.com/{0}", _tenantId));
    var credential = new ClientCredential(clientId: _applicationId, clientSecret: _applicationPwd);
    var result = authenticationContext.AcquireToken(resource: "https://management.core.windows.net/", clientCredential: credential);

    if (result == null)
    {
        throw new InvalidOperationException("Failed to obtain the JWT token");
    }

    string token = result.AccessToken;
    return token;
}


The primary class for working with the Insights API is the InsightsClient. This class exposes functionality to retrieve the available metric definitions and metric values, as seen in the sample code below:

Get Metric Data

private static MetricListResponse GetResourceMetrics(TokenCloudCredentials credentials, string resourceUri, string filter, TimeSpan period, string duration)
{
    var dateTimeFormat = "yyyy-MM-ddTHH:mmZ";
    string start = DateTime.UtcNow.Subtract(period).ToString(dateTimeFormat);
    string end = DateTime.UtcNow.ToString(dateTimeFormat);

    StringBuilder sb = new StringBuilder(filter);

    if (!string.IsNullOrEmpty(filter))
    {
        sb.Append(" and ");
    }

    sb.AppendFormat("startTime eq {0} and endTime eq {1}", start, end);
    sb.AppendFormat(" and timeGrain eq duration'{0}'", duration);

    using (var client = new InsightsClient(credentials))
    {
        return client.MetricOperations.GetMetrics(resourceUri, sb.ToString());
    }
}

private static MetricDefinitionListResponse GetAvailableMetricDefinitions(TokenCloudCredentials credentials, string resourceUri)
{
    using (var client = new InsightsClient(credentials))
    {
        return client.MetricDefinitionOperations.GetMetricDefinitions(resourceUri, null);
    }
}


For the above code, the resource URI to use is the full path to the desired Azure resource. For example, to query against an Azure Web App, the resource URI would be:

/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Web/sites/{site-name}

It is also possible to query the metric data for a classic Azure Virtual Machine – just change the resource URI to be appropriate for the classic VM:

/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.ClassicCompute/virtualMachines/{vm-name}

To find the resource URI for a desired resource, one approach is to use the Azure Resource Explorer (resources.azure.com) tool. Simply browse to the desired resource and then look at the URI shown, as in the screenshot below.



For the full code sample, please see my GitHub repository.

Thanks to Yossi Dahan’s blog post for the inspiration.


Deploy a WordPress Azure Web App with an Alternative MySQL Database

I was recently presented with an interesting question about Azure Web Apps, WordPress, and MySQL. While not necessarily a “hard” question, the answer wasn’t as readily available as I first anticipated. I thought I would share my experience here in hopes of helping others.

The Question

How can you deploy a WordPress site using Azure Web Apps that uses a MySQL database instance that is not a ClearDB database available in the Azure subscription?


Normally when you create a WordPress site using Azure Web Apps you are presented with an option to select an existing ClearDB MySQL database, or create a new one. But what if you don’t want to use an existing instance or create a new one? What if you want to use a MySQL database instance deployed to an Azure VM or you have a ClearDB MySQL database that doesn’t show in the Azure Portal (e.g. one of the ClearDB Basic offerings)?


The Answer(s)

Like most technology related questions (or life in general), there are a few ways to solve this challenge. There is the “easy” way, and there is the more powerful (yet slightly more complicated) way that some would argue is the “right” way.

The Easy Way

The easiest approach is to create a WordPress site with Azure Web Apps and select either an existing ClearDB database or create a new ClearDB database. Once the WordPress site is deployed, you can then change the database connection string in the wp-config.php file to be the database you want (e.g. a ClearDB Basic instance or a MySQL instance on an Azure VM).

  1. Let the WordPress site be deployed, but do not complete the installation. In other words, once the site is deployed, browsing to the site’s URL should result in the standard WordPress default installation prompt.
  2. Open the Kudu console. If you’re already signed into the Azure Portal, you should proceed through without any authentication challenge. Otherwise you’ll be challenged for your authentication credentials.
  3. Navigate to the Debug console (via the menu on the top). Browse to the \site\wwwroot\ directory.
  4. Edit the wp-config.php file by clicking on the pencil icon to the left of the file name. Doing so will switch to an edit view for the file. Don’t click on the delete icon . . . that’d be a bad thing.
  5. Within the wp-config.php file, change the DB_NAME, DB_USER, DB_PASSWORD, and DB_HOST values to be that of the desired database. Save the file.
  6. Now reload your site. This should load the default WordPress installation page prompting you to complete the WordPress installation.
  7. Complete the installation. This should use the database settings as configured in the wp-config.php file to finish the WordPress installation.
  8. If you created a free ClearDB database to start with, feel free to delete that ClearDB database.
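For reference, the DB_NAME, DB_USER, DB_PASSWORD, and DB_HOST values mentioned above live in wp-config.php as standard PHP defines. A sketch with placeholder values (substitute your own database host and credentials):

```php
/** Hypothetical values -- point these at the MySQL database you actually want to use. */
define('DB_NAME', 'your-database-name');
define('DB_USER', 'your-database-user');
define('DB_PASSWORD', 'your-database-password');
define('DB_HOST', 'your-mysql-host.example.com');
```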

The Alternative

And now the real fun begins! In this alternative approach, an Azure Resource Manager (ARM) template can be used to create the WordPress site on Azure Web Apps and wire up a database of your choosing. To make this happen you will need the ARM template and a MySQL database of your choosing.

To get the ARM template, my first thought was that I could download the template that the Azure Portal is using and simply modify the database connection details to be what I wanted. Wrong. The templates I tried turned out to be a bit more complicated than I wanted. However, they did provide a good start and helped me understand what I needed to do.

If you’re curious, you can get the templates by invoking the PowerShell script below.

# Retrieve all available items
$allGalleryItems = Invoke-WebRequest -Uri "" | ConvertFrom-Json

# Get all items published by WordPress
$allGalleryItems | Where-Object { $_.PublisherDisplayName -eq "WordPress" }
$allGalleryItems | Where-Object { $_.Identity -eq "WordPress.WordPress.1.0.0" }

# Save the default template for all items under the directory "C:\templates"
$allGalleryItems | Foreach-Object {
    $path = Join-Path -Path "C:\templates" -ChildPath $_.Identity
    New-Item -Type Directory -Path $path -Force

    $_.Artifacts | Where-Object { $_.Type -eq "template" } | ForEach-Object {
        $templatePath = Join-Path -Path $path -ChildPath ( $_.Name + ".json" )

        (Invoke-WebRequest -Uri $_.Uri).Content | Out-File -FilePath $templatePath
    }
}

(adapted from an original PowerShell sample)

Using the ARM template obtained from the gallery sample as inspiration, I created a new ARM template. You can get the full sample on my GitHub repo.

"resources": [
  {
    "apiVersion": "2014-06-01",
    "name": "[parameters('hostingPlanName')]",
    "type": "Microsoft.Web/serverfarms",
    "location": "[resourceGroup().location]",
    "tags": {
      "displayName": "HostingPlan"
    },
    "properties": {
      "name": "[parameters('hostingPlanName')]",
      "sku": "[parameters('sku')]",
      "workerSize": "[parameters('workerSize')]",
      "numberOfWorkers": 1
    }
  },
  {
    "apiVersion": "2014-06-01",
    "name": "[variables('webSiteName')]",
    "type": "Microsoft.Web/sites",
    "location": "[resourceGroup().location]",
    "tags": {
      "[concat('hidden-related:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]": "Resource",
      "displayName": "Website"
    },
    "dependsOn": [
      "[concat('Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]"
    ],
    "properties": {
      "name": "[variables('webSiteName')]",
      "serverFarm": "[parameters('hostingPlanName')]"
    },
    "resources": [
      {
        "apiVersion": "2014-11-01",
        "name": "connectionstrings",
        "type": "config",
        "dependsOn": [
          "[concat('Microsoft.Web/sites/', variables('webSiteName'))]"
        ],
        "properties": {
          "defaultConnection": {
            "value": "[variables('dbConnectionString')]",
            "type": 0
          }
        }
      },
      {
        "apiVersion": "2014-06-01",
        "name": "web",
        "type": "config",
        "dependsOn": [
          "[concat('Microsoft.Web/sites/', variables('webSiteName'))]"
        ],
        "properties": {
          "phpVersion": "5.6"
        }
      },
      {
        "name": "MSDeploy",
        "type": "extensions",
        "location": "[resourceGroup().location]",
        "apiVersion": "2014-06-01",
        "dependsOn": [
          "[concat('Microsoft.Web/sites/', variables('webSiteName'))]",
          "[concat('Microsoft.Web/Sites/', variables('webSiteName'), '/config/web')]"
        ],
        "tags": {
          "displayName": "WordPressDeploy"
        },
        "properties": {
          "packageUri": "",
          "dbType": "MySQL",
          "connectionString": "[variables('dbConnectionString')]",
          "setParameters": {
            "AppPath": "[variables('webSiteName')]",
            "DbServer": "[parameters('databaseServerName')]",
            "DbName": "[parameters('databaseName')]",
            "DbUsername": "[parameters('databaseUsername')]",
            "DbPassword": "[parameters('databasePassword')]",
            "DbAdminUsername": "[parameters('databaseUsername')]",
            "DbAdminPassword": "[parameters('databasePassword')]"
          }
        }
      }
    ]
  }
]
The most relevant section is the MSDeploy resource extension (around line 60 in the full template). It is this extension that deploys WordPress and sets up the default database connection string. You provide the database server name, database name, database username, and database password as input parameters to the ARM template. The template uses those parameters to construct a database connection string in the proper format, which is set in a variable in the template.
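The connection string variable is built with ARM's concat() function. The sketch below is illustrative rather than a copy from my template; it assumes the standard MySQL connection string format that the WordPress Web Deploy package expects, and the webSiteName variable shown here is just one common way to generate a unique site name. See the full sample in the GitHub repo for the authoritative version.

```json
"variables": {
    "webSiteName": "[concat('wordpress-', uniqueString(resourceGroup().id))]",
    "dbConnectionString": "[concat('Database=', parameters('databaseName'), ';Data Source=', parameters('databaseServerName'), ';User Id=', parameters('databaseUsername'), ';Password=', parameters('databasePassword'))]"
}
```

Because the MSDeploy extension and the connectionstrings config resource both reference the same variable, the WordPress package and the Web App end up pointing at the same database without any manual wiring.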

Once the template is created, it can be deployed with a few lines of PowerShell:


#NOTE - Ensure the correct Azure subscription is current before continuing. View all via Get-AzureRmSubscription -All
#Select-AzureRmSubscription -SubscriptionId "[your-id-goes-here]" -TenantId "[your-azure-ad-tenant-id-goes-here]"

$ResourceGroupName = "dg-wordpress-001"
$ResourceGroupLocation = "East US"
$TemplateFile = "azuredeploy.json"
$TemplateParametersFile = "azuredeploy.parameters.json"

# Create or update the resource group using the specified template file and template parameters file
New-AzureRmResourceGroup -Name $ResourceGroupName -Location $ResourceGroupLocation -Verbose -Force -ErrorAction Stop

# Validate the template against the resource group before deploying
Test-AzureRmResourceGroupDeployment -ResourceGroupName $ResourceGroupName `
    -TemplateFile $TemplateFile `
    -TemplateParameterFile $TemplateParametersFile

New-AzureRmResourceGroupDeployment -Name ((Get-ChildItem $TemplateFile).BaseName + '-' + ((Get-Date).ToUniversalTime()).ToString('MMdd-HHmm')) `
    -ResourceGroupName $ResourceGroupName `
    -TemplateFile $TemplateFile `
    -TemplateParameterFile $TemplateParametersFile `
    -Force -Verbose

The reason I like this approach is that it is very clear what is being deployed. I can customize the template however I like, adding or removing additional resources as needed. Plus, I don’t have to go through that “create a database just to delete it” dance.

For instance, I can envision a version of this ARM template that may optionally set up a MySQL database on an Azure VM. Oh . . . look here, someone already did mostly just that! That template could be modified with options allowing the creation of a database in a few different configurations. Thanks for saving me some work. Naturally, I found this after I went through all the work above. Go figure!  🙂

Join me as a Cloud Solution Architect

Earlier this year I started a new chapter in my career as I joined Microsoft as a Cloud Solution Architect (CSA). Working for Microsoft has been a goal of mine for a long time (since sometime in high school, probably). Working for Microsoft in a role that allows me to continue working with Azure and solving interesting problems . . . well, that’s just a lot of fun!

It is no secret that Microsoft is moving full steam ahead with Azure. The pace of innovation is mind blowing!  Part of my job as a CSA is to keep up on all the latest technologies in Azure – understanding how they really work and how to best leverage the various services/features to solve problems. It is a tough job, but someone has to do it. It is a fun job too!

The Azure platform is incredibly large. There are so many different services that it is, in reality, impossible to be an expert in all of them. In that light, Microsoft has recently created a new role, the Data Solution Architect (DSA), which complements the Cloud Solution Architect role. The DSA role focuses on data platform technologies such as Azure SQL Database, Machine Learning, HDInsight, etc. That allows the CSA role to focus more on the compute and infrastructure services. Naturally, there is a bit of crossover in some areas . . . that is OK.

If you have ever wanted to work in Azure – in any capacity – now is a great time to join Microsoft. There are currently 292 job openings showing on the Microsoft Careers site.

My team is currently hiring for both the Cloud Solution Architect and Data Solution Architect roles. I currently work in SMS&P (that’s Microsoft speak for ‘Small and Midmarket Solutions & Partners’) in the Central region for Microsoft. Current open positions include:

  • DSA – Austin, TX
  • CSA – Austin, TX
  • CSA – Minneapolis, MN
  • DSA – St. Louis, MO
  • DSA – Nashville, TN
  • CSA – Houston, TX
  • DSA – Chicago, IL
  • DSA – Milwaukee, WI
  • DSA – Indianapolis, IN

If you want to see the official job description for a CSA or DSA, check out the links below:

If a CSA or DSA role sounds like something you would be interested in, please contact me.

Being a CSA or DSA at Microsoft is a pretty cool job. But don’t just take my word for it; check out Walter Myer’s video below where he talks about his experience as a CSA.