Winterdom


by dæmons be driven - a site by Tomas Restrepo


Creating Event Grid Subscriptions

Mon, 21 Aug 2017 00:00:00 +0000

A few days ago, I wrote about using Azure Resource Manager (ARM) templates to deploy Azure Event Grid. That sample showed how to create a new Event Grid Topic resource, which basically gives you a URL you can publish custom events to and have them routed to one or more event subscribers. However, one of the most powerful features in Event Grid is not custom topics, but subscribing to events published by the Azure fabric itself; that is, events published by Resource Manager providers. As of this writing, only a few providers support Event Grid, but this number is sure to grow in the coming months.

Supported Event Publishers

Which Azure Resource Manager providers support Event Grid? An easy way to find out is to ask Azure itself. To do this, we can leverage the excellent ArmClient tool. Resource providers that support publishing events through Event Grid are called Topic Types, and we can query these:

```
armclient get /providers/Microsoft.EventGrid/topicTypes?api-version=2017-06-15-preview
```

If the command succeeds, we should see something like this:

```json
{
  "value": [
    {
      "properties": {
        "provider": "Microsoft.Eventhub",
        "displayName": "EventHubs Namespace",
        "description": "Microsoft EventHubs service events.",
        "resourceRegionType": "RegionalResource",
        "provisioningState": "Succeeded"
      },
      "id": "providers/Microsoft.EventGrid/topicTypes/Microsoft.Eventhub.Namespaces",
      "name": "Microsoft.Eventhub.Namespaces",
      "type": "Microsoft.EventGrid/topicTypes"
    },
    ...
  ]
}
```

You can also use the Azure CLI command az eventgrid topic-type list on version 2.0.14 or later.

Knowing which event publishers exist is only half the story, though. We also want to know what types of events a publisher supports. These are called Event Types in Event Grid, and we can query those as well.
For example, let's say we want to find out which events the Microsoft.Resources.ResourceGroups topic type supports:

```
armclient get /providers/Microsoft.EventGrid/topicTypes/Microsoft.Resources.ResourceGroups/eventTypes?api-version=2017-06-15-preview
```

If the command succeeds, we should see output similar to the following:

```json
{
  "value": [
    {
      "properties": {
        "displayName": "Resource Write Success",
        "description": "Raised when a resource create or update operation succeeds.",
        "schemaUrl": "TBD"
      },
      "id": "providers/Microsoft.EventGrid/topicTypes/Microsoft.Resources.ResourceGroups/eventTypes/Microsoft.Resources.ResourceWriteSuccess",
      "name": "Microsoft.Resources.ResourceWriteSuccess",
      "type": "Microsoft.EventGrid/topicTypes/eventTypes"
    },
    ...
  ]
}
```

The equivalent Azure CLI command would be az eventgrid topic-type list-event-types --name Microsoft.Resources.ResourceGroups.

Now let's see how we can subscribe to events published by the Azure fabric.

Event Hub Namespaces

Currently, you can only subscribe to events published at the Event Hub Namespace level, not to an individual Event Hub. For this we'd use the Microsoft.EventHub.Namespaces topic type to create a nested resource of type Microsoft.EventGrid/eventSubscriptions:

```json
{
  "apiVersion": "2017-06-17-preview",
  "name": "[concat(parameters('eventHubNamespaceName'), '/Microsoft.EventGrid/', parameters('subscriptionName'))]",
  "type": "Microsoft.EventHub/namespaces/providers/eventSubscriptions",
  "tags": {
    "displayName": "Webhook Subscription"
  },
  "dependsOn": [
    "[concat('Microsoft.EventHub/Namespaces/', parameters('eventHubNamespaceName'))]"
  ],
  "properties": {
    "destination": {
      "endpointType": "WebHook",
      "properties": {
        "endpointUrl": "[parameters('webhookUrl')]"
      }
    },
    "filter": {
      "includedEventTypes": [ "All" ],
      "subjectBeginsWith": "",
      "subjectEndsWith": "",
      "subjectIsCaseSensitive": false
    }
  }
}
```

Notice that the type property has the format /providers/&[...]
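The nested-resource name used for the subscription follows a fixed pattern; as a quick sketch, it can be composed like this (the helper function and sample names are mine, not from the post):

```python
def event_subscription_name(namespace_name: str, subscription_name: str) -> str:
    """Mirrors the ARM expression:
    [concat(parameters('eventHubNamespaceName'), '/Microsoft.EventGrid/', parameters('subscriptionName'))]
    """
    return f"{namespace_name}/Microsoft.EventGrid/{subscription_name}"

print(event_subscription_name("myhubns", "mysub"))
# myhubns/Microsoft.EventGrid/mysub
```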

Deploying an Event Grid + WebHook with ARM

Thu, 17 Aug 2017 00:00:00 +0000

Azure Event Grid was announced a couple of days ago for building event-driven architectures. While the initial preview is a bit limited in the number of supported event publishers, it has tons of promise and I was immediately intrigued by the possibilities! Being a fan of Azure Resource Manager templates, I soon tried to figure out how you could automate the creation of Event Grid resources. After some trial and error and some research, I was able to come up with an initial template to create:

  • An Event Grid topic resource
  • A WebHook-based event subscription

You can find the complete sample template here. The sample template first declares some simple parameters and variables:

```json
{
  "parameters": {
    "eventGridName": { "type": "string", "minLength": 1 },
    "webhookName": { "type": "string", "minLength": 1 },
    "webhookUrl": { "type": "string", "minLength": 1 },
    "webhookEventTypes": { "type": "array", "defaultValue": [ "All" ] },
    "webhookPrefixFilter": { "type": "string", "defaultValue": "" },
    "webhookSuffixFilter": { "type": "string", "defaultValue": "" },
    "webhookCaseSensitive": { "type": "bool", "defaultValue": false },
    "webhookLabels": { "type": "array", "defaultValue": [ "" ] }
  },
  "variables": {
    "apiVersion": "2017-06-15-preview"
  }
}
```

Creating the Event Grid topic is relatively simple; it's a resource of type Microsoft.EventGrid/topics. We just need to supply the right API version (2017-06-15-preview in this release), the grid name, and one of the supported Azure regions:

```json
{
  "apiVersion": "[variables('apiVersion')]",
  "name": "[parameters('eventGridName')]",
  "type": "Microsoft.EventGrid/topics",
  "location": "[resourceGroup().location]",
  "tags": {
    "displayName": "EventGrid"
  },
  "properties": {
    "name": "[parameters('eventGridName')]"
  }
}
```

Creating the WebHook subscription took a bit of work to figure out.
I eventually realized this needed to be a nested resource of type Microsoft.EventGrid/topics/providers/eventSubscriptions, and that it needed a special name in the pattern /Microsoft.EventGrid/. You can then specify the WebHook URL, and a filter for which events you want delivered to it:

```json
{
  "apiVersion": "[variables('apiVersion')]",
  "name": "[concat(parameters('eventGridName'), '/Microsoft.EventGrid/', parameters('webhookName'))]",
  "type": "Microsoft.EventGrid/topics/providers/eventSubscriptions",
  "tags": {
    "displayName": "Webhook Subscription"
  },
  "dependsOn": [
    "[concat('Microsoft.EventGrid/topics/', parameters('eventGridName'))]"
  ],
  "properties": {
    "destination": {
      "endpointType": "WebHook",
      "properties": {
        "endpointUrl": "[parameters('webhookUrl')]"
      }
    },
    "filter": {
      "includedEventTypes": "[parameters('webhookEventTypes')]",
      "subjectBeginsWith": "[parameters('webhookPrefixFilter')]",
      "subjectEndsWith": "[parameters('webhookSuffixFilter')]",
      "subjectIsCaseSensitive": "[parameters('webhookCaseSensitive')]"
    },
    "labels": "[parameters('webhookLabels')]"
  }
}
```

The end result of applying the template, as shown in the portal, is the webhook created correctly:

I also added some outputs to the ARM template so that it returns the topic endpoint URL of the created Event Grid, as well as the first access key:

```json
"outputs": {
  "eventGridUrl": {
    "type": "string",
    "value": "[reference(parameters('eventGridName')).endpoint]"
  },
  "eventGridKey": {
    "type": "string",
    "value": "[listKeys(resourceId('Microsoft.EventGrid/topics', parameters('eventGridName')), variables('api[...]
```
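With the endpoint URL and access key returned by those outputs, publishing a custom event to the topic comes down to a single authenticated HTTP request. A minimal Python sketch of building that request — the aeg-sas-key header name and the event schema are my assumptions based on the Event Grid preview documentation, and the endpoint, key, and event type values are placeholders:

```python
import json
from datetime import datetime, timezone

def build_publish_request(topic_endpoint: str, topic_key: str, subject: str, data: dict):
    """Builds the URL, headers, and body for publishing one custom event
    to an Event Grid topic. Header name and schema are assumptions, not
    taken from the post itself."""
    event = {
        "id": "1",
        "eventType": "myApp.sampleEvent",  # hypothetical event type
        "subject": subject,
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "data": data,
        "dataVersion": "1.0",
    }
    headers = {"aeg-sas-key": topic_key, "Content-Type": "application/json"}
    # Event Grid expects an array of events in the request body
    return topic_endpoint, headers, json.dumps([event])

endpoint, headers, body = build_publish_request(
    "https://mytopic.westus2-1.eventgrid.azure.net/api/events",  # hypothetical
    "<topic-key>", "myapp/items/1", {"value": 42})
```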

VSTS Build hanging with XUnit tests

Tue, 15 Aug 2017 00:00:00 +0000

I was setting up a simple demo on Visual Studio Team Services (VSTS) today and ran into an oddity. The sample project I was using had a mixture of MSTest and XUnit-based tests. It would run just fine in Visual Studio, but after setting up a hosted build in VSTS, I noticed that the build would seem to hang after apparently running all tests, so I had to cancel builds.

Looking at the logs, I eventually found this line, which was not expected:

2017-08-15T16:48:44.4074176Z Information: [ 00:00:00.8781599]   Starting:    Microsoft.VisualStudio.QualityTools.UnitTestFramework

Suspecting this could be related, I modified the build definition. In the Test Assemblies task, I modified the Test assemblies property to include the following line:


This tells the test runner not to attempt to discover or run tests in this DLL.

Surprisingly, this worked and allowed builds to complete normally after running all tests.
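The exclusion line itself did not survive in this copy. As a sketch, a minimatch-style filter of roughly this shape — the exact mask is my assumption, inferred from the assembly named in the log line above, not taken from the post — would keep the MSTest framework DLL out of test discovery:

```
!**\Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll
```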


Trying out PowerShell 6.0 in containers

Mon, 14 Aug 2017 00:00:00 +0000

I’ve been meaning to give the PowerShell Core 6.0 builds on Linux a try recently. The PowerShell team offers some nice pre-built Docker images you can use to test it, so that’s great. I thought this would be a simple but cool scenario to try out the new Azure Container Instances service! Of course, you cannot use the pre-built image directly for this, as you wouldn’t have a direct way to connect to the hosted container, so I decided to create a new image that not only had PowerShell Core 6.0 in it, but also had SSH built in. This would be useful to quickly spin up a PowerShell-on-Linux environment for demos and simple tests.

So the first thing to do is to create a new empty directory for our new container files:

```
mkdir powershell
cd powershell
```

Then, using a text editor (Vim is my poison of choice), I created a new dockerfile in this directory, with the following contents:

```dockerfile
FROM microsoft/powershell
RUN apt-get update && apt-get install -y openssh-server
RUN mkdir /var/run/sshd

ENV newuser tomasr

# add new user
RUN useradd -m ${newuser}
RUN usermod -aG sudo ${newuser}

# copy SSH keys
RUN mkdir -p /home/${newuser}/.ssh
ADD authorized_keys /home/${newuser}/.ssh/authorized_keys

# set right permissions
RUN chown ${newuser}:${newuser} /home/${newuser}/.ssh
RUN chown ${newuser}:${newuser} /home/${newuser}/.ssh/authorized_keys
RUN chmod 700 /home/${newuser}/.ssh/authorized_keys

EXPOSE 22
CMD ["/usr/sbin/sshd", "-D"]
```

This dockerfile does a few simple things:

  • Inherits from the official PowerShell docker image
  • Installs SSHD
  • Creates a new user called tomasr
  • Copies my authorized_keys file to the new user dir and sets the right permissions
  • Starts SSHD

Now, let’s build ourselves a new docker image:

```
docker build -t powershell .
```

If everything works, you should see a message similar to Successfully built 34f996c46a23. Now, before I can use this image with Azure Container Instances, I need to push it to an image repository.
I’ll use Azure Container Registry, which I had already provisioned on my Azure account by this point. So let’s provide docker with the credentials to my registry, tag the image, and then push it to the registry (the angle-bracket values are placeholders for the registry name and credentials):

```
sudo docker login -u <username> -p <password> <registry>
sudo docker tag powershell <registry>/powershell
sudo docker push <registry>/powershell
```

Once this is complete, we can use the Azure CLI to create a new container instance called p4 using our new image. We’ll give it a public IP address listening on port 22:

```
az container create --name p4 --image <registry>/powershell --resource-group blogtopics \
  --ip-address public --port 22 --registry-username <username> --registry-password <password>
```

After a few minutes, we can check if our deployment is complete and our container is running, using the following command:

```
az container show --name p4 --resource-group blogtopics
```

If the container is ready, we should see something like:

```json
"ipAddress": {
  "ip": "",
  "ports": [
    { "port": 22, "protocol": "TCP" }
  ]
},
"location": "eastus",
"name": "p4",
"osType": "Linux",
"provisioningState": "Succeeded",
```

When the container is fully deployed, we just need to use an SSH agent with the right private key to connect and run powershell on it.

Once we’re done playing with our environment, we can just delete the container instance:

```
az container delete --name p4 --resource-group blogtopics
```

Conclusion

Using containers, we’ve built a simple way to quickly provision a new machine to do simple PowerShell demonstrations. It’s not terribly useful, but still a fun use of technology!

Using Azure Functions to create AppInsights release annotations

Tue, 08 Aug 2017 00:00:00 +0000

I recently had the opportunity to discuss with a customer one cool feature in Application Insights: Release Annotations. As the article above shows, you can easily create Release Annotations using Visual Studio Team Services release pipelines. In this post, I’d like to provide an alternative way to implement release annotations for applications hosted on Azure App Service, by using the WebHooks support in Kudu and Azure Functions.

Context

Let’s assume that you already have a Web application deployed to an Azure App Service, and have hooked up some sort of automated deployment story. For example, you may be using the local git repository to deploy a new version of your app through a git push command. Let’s also assume you already have an Application Insights resource you’re using to monitor this application.

Step 1: Obtain an Application Key

The first step in automating the creation of release annotations is to prepare our Application Insights resource by obtaining a new Application Key. We can do this from the Azure Portal, by finding our Application Insights resource and selecting the API Access option on the left-hand menu:

  • Press the Create API key option at the top-left.
  • Provide a description for your new key (such as “Create Annotations”).
  • Select the “Write annotations” permission.
  • Press the “Generate key” button.

Once the key has been generated, copy the value. If you don’t, you’ll have to create a brand new key! Also, note the Application ID value for your Application Insights resource, displayed alongside your new API key. We will need both to create a new annotation later on.

Step 2: Create the Function App

Create a brand new Function App in your Azure subscription. For this demo, I’ll create a new one in a Consumption Plan:

Now, create a new Function. For simplicity, I’m going to use the “Webhook + API” template in JavaScript:

The request sent by Kudu is going to contain some useful properties for our new function:

  • id: the deployment ID.
  • siteName: the name of our Web App. You could use this, for example, to have a single function support deployments for multiple applications.
  • message: the deployment message. For a Git-based deployment, this could be part of the commit message.
  • authorEmail: for a Git-based deployment, this would be the author of the last commit.

You can find a complete list of the properties here.

Creating the release annotation requires a few things besides the Application Insights application ID and key:

  • The name you want to give to the release annotation. We’ll use a combination of the site name and the deployment ID.
  • The time the release happened. We’ll just use the current date/time.
  • A list of arbitrary properties: we’ll store the commit message, as well as the author.

Our function code would look something like this:

```javascript
var request = require('request');

module.exports = function (context, req) {
    var deploymentId = req.body.id;
    var siteName = req.body.siteName;
    var releaseName = siteName + '_' + deploymentId;
    var appId = process.env.APPINSIGHTS_APPID;
    var apiKey = process.env.APPINSIGHTS_APIKEY;
    var releaseProperties = {
        ReleaseName: releaseName,
        Message: req.body.message.trim(),
        By: req.body.authorEmail
    };
    context.log('Creating a new release annotation: ' + releaseName);
    var body = {
        Id: deploymentId,
        AnnotationName: releaseName,
        EventTime: (new Date()).toISOString(),
        Category: 'Deployment',
        // the Properties part contains a JSON object as a string
        Properties: JSON.stringify(releaseProperties)
    };
    var options = {
        url: '' + appId + '/Annotations?api-version=2015-11',
        method: 'PUT',
        headers: {
            'X-AIAPIKEY': apiKey
        },
        body: body,
        json: true
    };
    // [...]
```
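The annotation payload the function builds can be sketched independently of the runtime; here is a minimal Python rendering of the same fields described above (the sample deployment values are mine):

```python
import json
from datetime import datetime, timezone

def build_annotation(deployment_id: str, site_name: str, message: str, author_email: str) -> dict:
    """Builds the release-annotation body described in the post.
    Field names mirror the JavaScript snippet; values are examples."""
    release_name = f"{site_name}_{deployment_id}"
    properties = {
        "ReleaseName": release_name,
        "Message": message.strip(),
        "By": author_email,
    }
    return {
        "Id": deployment_id,
        "AnnotationName": release_name,
        "EventTime": datetime.now(timezone.utc).isoformat(),
        "Category": "Deployment",
        # the Properties part contains a JSON object serialized as a string
        "Properties": json.dumps(properties),
    }

body = build_annotation("abc123", "mysite", " deploy v2 ", "someone@example.com")
```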

Viasfora v3.6 Released

Sun, 06 Aug 2017 00:00:00 +0000

Today I pushed to the Visual Studio Gallery version v3.6 of my Viasfora Visual Studio Extension.

This version includes a few minor bugfixes, and some new features, such as:

  • Export/Import settings
  • Export/Import color themes
  • Support for JavaScript template literals

If you run into any issues, or have any feature suggestions, please create a new issue on GitHub!


Creating an Azure WebApp through ARM and Node

Fri, 04 Aug 2017 00:00:00 +0000

I recently posted an article on how to create an Azure WebApp on AppService with an associated Application Insights resource using Azure Resource Manager (ARM) templates. I’ve been playing the past couple of days with the Azure SDK for Node, and thought I’d write up how to accomplish the same thing through code. Let’s recap what we want to create from our previous article:

  • Create a new resource group
  • Create an Application Insights resource
  • Create a new App Service Plan
  • Create a new WebApp on this ASP
  • Configure the AppInsights InstrumentationKey as an appSetting
  • Deploy the Application Insights site extension

Our first step using the SDK is to import the ms-rest-azure package and log in to our Azure account using an interactive login:

```javascript
const MsRest = require('ms-rest-azure');
// ...
login() {
    return MsRest.interactiveLogin({ domain: this.tenantId });
}
```

This will give us a credentials object we can use later for the rest of the calls. Once this is successful, we can use the azure-arm-resource package to create the resource group. All we need to specify is the name of the new resource group, and the Azure region we want to use:

```javascript
const ResourceManagement = require('azure-arm-resource');
// ...
createResourceGroup(resourceInfo) {
    let group = { location: this.location };
    let rgmanagement = new ResourceManagement.ResourceManagementClient(resourceInfo.credentials, this.subscriptionId);
    return rgmanagement.resourceGroups.createOrUpdate(this.resourceGroupName, group);
}
```

The next step is to create a new Application Insights resource in our resource group. As in our ARM template sample, we want to provision a tag that will later link it to the WebApp once we create it.
The azure-arm-insights package, unfortunately, does not support managing component resources, so we need to use the generic azure-arm-resource package to create a new resource of type microsoft.insights/components:

```javascript
createAppInsights(resourceInfo) {
    let envelope = {
        location: this.location,
        properties: {},
        tags: { }
    };
    // the resource group id expression was elided in this copy of the post
    let webAppId = `${}/providers/Microsoft.Web/sites/${this.webAppName}`;
    envelope.tags[`hidden-link:${webAppId}`] = 'Resource';
    let management = new ResourceManagement.ResourceManagementClient(resourceInfo.credentials, this.subscriptionId);
    return management.resources.createOrUpdate(
        this.resourceGroupName,   // resource group
        'microsoft.insights',     // provider namespace
        '',                       // parent resource
        'components',             // resource type
        this.appInsightsName,     // resource name
        '2014-04-01',             // api version
        envelope);
}
```

We can now create the App Service Plan and the WebApp using the azure-arm-website package. Creating the ASP is relatively straightforward; we just need the name, location, tier, and capacity. Creating the application takes a little bit more work, and for this we will need:

  • The ID of the App Service Plan created
  • The object returned by our previous step, to get the InstrumentationKey of the Application Insights resource

```javascript
const WebAppManagementClient = require('azure-arm-website');
// ...
createHostingPlan(resourceInfo) {
    var info = {
        location: this.location,
        sku: {
            name: this.webPlanTier,
            capacity: this.webPlanCapacity
        }
    };
    let wam = new WebAppManagementClient(resourceInfo.credentials, this.subscriptionId);
    return wam.appServicePlans.createOrUpdate(this.resourceGroupName, this.webPlanName, info);
}

createWebApp(resourceInfo) {
    var envelope = {
        name: this.webAppName,
        location: this.location,
        kind: 'web',
        serverFarmId:,   // the plan id expression was elided in this copy
        properties: { },
        siteConfig: {
            appSettings: [ {
// [...]
```

Updating Viasfora Themes

Wed, 02 Aug 2017 00:00:00 +0000

A few days ago, I wrote about Themes coming to Viasfora in v3.6. One of the issues that some users will face is that I had to make a breaking change to fix something I was unhappy about for a long time: The classification names used by Viasfora were very inconsistent.

Renaming classification names means that when you update from v3.5 to v3.6, you will lose any customizations made to editor colors used by Viasfora. You can customize everything by hand again, but this can be very annoying. Ewen Wallace noticed this yesterday.

So to minimize the impact of having to do this, I made a simple PowerShell script that can help you when upgrading. You can find the full code for the script in a GitHub gist.

To use this script and migrate your custom colors, do the following:

  • In Visual Studio, use the Tools -> Import and Export Settings menu option.
  • Select the default Export selected environment settings option.
  • Deselect everything, and only mark Options / Environment / Fonts and Colors
  • Select the destination file name and export your settings.

Once you’ve exported your settings to a .vssettings file, download the PowerShell Script, and save it as a .PS1 file.

  • Open a new PowerShell console
  • Execute the script like this:
.\Convert-ViasforaTheme.ps1 -VSSettings "path_to_vssettings_file" -ThemeFile "path_to_new_theme_file"

The result will be a JSON file that you can then import into Viasfora from the Tools -> Options -> Viasfora -> Import/Export page.


Deploying a WebApp with Application Insights using ARM

Tue, 01 Aug 2017 00:00:00 +0000

I saw a question yesterday about deploying a WebApp on Azure App Service with Application Insights using an ARM template, and thought that would make for a good sample. The sample ARM template can be found here as a Visual Studio 2017 project.

Creating both the WebApp and the Application Insights resources independently is no problem, and should be relatively straightforward for anyone familiar with ARM. However, creating them fully integrated takes just a little bit more work.

Creating the Application Insights Resource

The first step we want to take when creating the template should be to create the AppInsights resource. If you use the Visual Studio wizard for creating an ARM template, you’ll notice that it does this backwards: it forces the AppInsights resource to be dependent on the WebApp being created. However, this is not necessary, and in fact we want to do it the other way around. In just a moment, it will become obvious why. So let’s create the Application Insights resource:

```json
{
  "apiVersion": "2014-04-01",
  "name": "[parameters('appInsightsName')]",
  "type": "Microsoft.Insights/components",
  "location": "East US",
  "tags": {
    "[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/sites/', parameters('webSiteName'))]": "Resource",
    "displayName": "AppInsightsComponent"
  },
  "properties": {
    "applicationId": "[parameters('appInsightsName')]"
  }
}
```

Here we’re creating a new resource of type microsoft.insights/components. The only interesting bit is the tags: notice that we add a new tag with the name matching the resource ID of the WebApp we’re going to create. We can do this even if the WebApp has not been created yet, because we already know what the resource ID is going to look like, and also because the platform will not validate that this represents a valid resource. The reason we add this tag is that it creates the link between the WebApp and Application Insights resources.
This allows the portal experience to work so that when you navigate to the Application Insights option in the WebApp’s left-side menu, it shows you the data from the right AppInsights resource.

Creating the WebApp

The second step is creating the App Service Plan and the Web App. There are plenty of good examples of this already, so I won’t go into all the details. There are, however, three interesting aspects I want to highlight.

Setting the right order

We want to make sure that the WebApp is created after the AppInsights resource, which we do by adding an explicit dependency next to the ASP:

```json
"dependsOn": [
  "[resourceId('Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]",
  "[resourceId('microsoft.insights/components/', parameters('appInsightsName'))]"
],
```

The reason we want to do this is that we want to capture the InstrumentationKey for the brand new AppInsights resource, and create the APPINSIGHTS_INSTRUMENTATIONKEY appSetting:

```json
"resources": [
  {
    "apiVersion": "2015-08-01",
    "name": "appsettings",
    "type": "config",
    "dependsOn": [
      "[resourceId('Microsoft.Web/Sites', parameters('webSiteName'))]"
    ],
    "properties": {
      "APPINSIGHTS_INSTRUMENTATIONKEY": "[reference(concat('microsoft.insights/components/', parameters('appInsightsName'))).InstrumentationKey]"
    }
  },
  ...
]
```

Adding the Application Insights site extension

If our WebApp is going to host a .NET application, we also want to make sure that the Application Insights site extension is deployed in Kudu for our app. This adds the necessary profiler so that we get full dependency traces. We can do this with yet another nested resource in our ARM template:

```json
{
  "apiVersion": "2015-08-01",
  "name": "Microsoft.ApplicationInsights.AzureWebSites",
  "type": "siteextensions",
  "de[...]
```
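The hidden-link tag name built by the concat() expression above can be sketched as follows; the helper function and sample resource IDs are hypothetical:

```python
def hidden_link_tag(resource_group_id: str, site_name: str) -> str:
    """Mirrors the ARM expression:
    [concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/sites/', parameters('webSiteName'))]
    """
    web_app_id = f"{resource_group_id}/providers/Microsoft.Web/sites/{site_name}"
    return f"hidden-link:{web_app_id}"

tag = hidden_link_tag("/subscriptions/0000/resourceGroups/demo-rg", "mysite")
# The template then sets tags[tag] = "Resource"
```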

Using Application Insights Cohorts

Mon, 31 Jul 2017 00:00:00 +0000

A while ago I added some basic telemetry to Viasfora based on Azure Application Insights. This telemetry contains no Personally Identifiable Information (PII) of note (*), and it is only used to track:

  • Viasfora getting loaded into Visual Studio
  • Some features being used

All of this is done by emitting custom Events using AppInsights. You can see the full code here.

One of the things I use this telemetry for is finding out which Visual Studio versions are seeing the most use amongst Viasfora users. One feature that was recently added to Application Insights can make this even easier to analyze: User Cohorts.

User cohorts allow you to define groups of users based on PageView or custom Events in the telemetry, and their properties. In our case, every custom event we generate contains a HostVersion property, with the major Visual Studio version:


Based on this, we can easily use Cohorts to create a group for users running Visual Studio 2017, for example:


We can save this cohort definition with a name (such as ‘VS2017 Users’), and then use it in other analysis. For example, I can now go to the Users page, and easily create a report of users running VS2017:


Notice we’re filtering the report to show which Viasfora versions are being used by VS2017 users in the last 30 days. The results are quite interesting:


Here we can see that, at the time of this writing, most VS2017 users were indeed using the most recent release (3.5.139). We can also see that a few users are running older versions, which is somewhat unexpected, since Visual Studio 2017 will update extensions automatically by default.

Repeating the same analysis for VS2013 becomes a snap, as we already have a cohort defined for them:


Notice how much more varied the Viasfora versions are; VS2013 users do not seem to update extensions all that often!


Cohorts are a useful feature in Application Insights that can make analysis simpler and faster, by grouping users based on some common feature.


(*) Telemetry can also be disabled from the Tools -> Options page for Viasfora.