
Winterdom



by dæmons be driven - a site by Tomas Restrepo






Logic Apps KeyVault Connector - Part 1

Sun, 15 Oct 2017 00:00:00 +0000

Azure Logic Apps now supports writing custom connectors, which are just custom REST APIs for which you can customize the experience so that they feel like the built-in Logic Apps connectors. I wanted to try my hand at writing one, so I decided on a simple use case: writing a connector that provides a way to retrieve a secret stored in Azure Key Vault. This is part 1 in a series of articles on my experience writing it.

Update 2017-10-17: A few details in the post have been updated because I had confused summary and x-ms-summary.

Connector Implementation

In this post, I'd like to talk a bit about the connector implementation. I decided to implement this as an ASP.NET Core 2.0 WebAPI application that uses the Microsoft.Azure.KeyVault library to query the secret information. For my initial test, all I implemented was two simple operations:

  • List the available secrets in the selected Key Vault store
  • Get the value of a specified secret (either the latest version, or a specific one)

For the most part, the implementation is fairly straightforward, but there are a couple of interesting things worth discussing.

OpenAPI Extensions

The connector uses the Swashbuckle library to generate the OpenAPI (Swagger) description for the API. A Logic Apps custom connector has an extended OpenAPI definition that includes things such as:

  • Operation descriptions
  • Additional metadata for object members and operations (such as descriptions)
  • Dynamic parameters and schemas

You can either write the connector OpenAPI description by hand and upload it, or start with a standard definition and then customize it in the Azure Portal. I quickly found out that pointing Logic Apps at your updated OpenAPI definition after making changes overwrites all your customizations, so you'd need to redo them every time. This makes sense, since the idea is that you'd normally upload an OpenAPI definition that already contains the necessary extensions, but if yours doesn't, it quickly gets really annoying.
What I did to simplify this process somewhat was to write a couple of Swashbuckle extensions that produce at least a working definition right from the start. Every operation in your connector should have at least a description and an x-ms-summary field. You'll also want to name your operation as you want it to appear in the Logic Apps designer by setting the operationId. I ended up writing my operations like this:

[HttpGet()]
[SwaggerOperation(OperationId = "List Secrets")]
[Summary("Lists all secrets")]
[Description("Lists the secrets stored in Key Vault")]
public async Task<IEnumerable<string>> Get(String vaultName)

Notice how I use the DescriptionAttribute to specify the longer operation description, and the SummaryAttribute to add the value for summary. Both are processed by custom IOperationFilter extensions for Swashbuckle. Here's the code for handling the summary:

[AttributeUsage(AttributeTargets.Method | AttributeTargets.Parameter)]
public class SummaryAttribute : Attribute
{
    public String Summary { get; private set; }

    public SummaryAttribute(String summary)
    {
        this.Summary = summary;
    }
}

public class SummaryFilter : IOperationFilter
{
    public void Apply(Operation operation, OperationFilterContext context)
    {
        var apiDesc = context.ApiDescription;
        var summary = apiDesc.ActionAttributes()
                             .OfType<SummaryAttribute>()
                             .FirstOrDefault();
        if ( summary != null )
        {
            operation.Summary = summary.Summary;
        }
    }
}

The code for handling the description is very similar. All that remained was registering these extensions during startup, like this:

services.AddSwaggerGen(c =>
{
    c.SwaggerDoc("v1", new Info { Title = "KeyVaultController", Version = "v1" });
    c.OperationFilter<SummaryFilter>();
    c.OperationFilter<DescriptionFilter>();
});

I've been looking at a similar mechanism for adding support for x-ms-dynamic-values, but there are two reasons I haven't done so yet: It'[...]
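With filters like these registered, the generated definition should end up with per-operation entries roughly like the following sketch (the /secrets path and the field values here are illustrative placeholders, not taken from the actual connector):

```json
{
  "paths": {
    "/secrets": {
      "get": {
        "operationId": "List Secrets",
        "summary": "Lists all secrets",
        "description": "Lists the secrets stored in Key Vault"
      }
    }
  }
}
```

Note that the x-ms-summary extension plays a similar display-name role for individual parameters and object members in the Logic Apps designer, which is what makes it easy to confuse with the standard summary field.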



Azure Managed Service Identity - Querying in ARM Template

Wed, 04 Oct 2017 00:00:00 +0000

In a previous post I was lamenting not having a way to obtain the managed service identity generated for an Azure resource, such as an Azure SQL logical server or a Web App, from the Azure Resource Manager (ARM) template itself.

The issue was that the reference() function in an ARM template only returns the properties part of the resource definition, and the identity property is defined outside of that (at the same level as the resource id or the location).

It is now possible to do this thanks to a new parameter introduced in the reference() function in ARM. The new definition is:

reference(resourceName | resourceIdentifier, [apiVersion], ['Full'])

Notice the new, optional 'Full' parameter. When this is specified, the reference() function returns the complete resource definition, and not just the properties section. So we can obtain the generated identity easily using something like this:

{
    "outputs": {
        "sqlIdentity": {
            "type":"string",
            "value": "[reference(concat('Microsoft.Sql/servers/', parameters('sqlServerName')), '2015-05-01-preview', 'Full').identity.principalId]"
        }
    }
}
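For reference, the object returned by the 'Full' variant carries the top-level resource envelope rather than just the properties body; it looks roughly like this sketch (the values shown are placeholders, and the properties section is elided):

```json
{
  "apiVersion": "2015-05-01-preview",
  "location": "westus2",
  "identity": {
    "type": "SystemAssigned",
    "principalId": "00000000-0000-0000-0000-000000000000",
    "tenantId": "00000000-0000-0000-0000-000000000000"
  },
  "properties": { }
}
```

This is why the output expression can navigate to .identity.principalId, which is not reachable through a plain reference() call.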



Unable to locate registry entry for adalsql.dll file path

Wed, 27 Sep 2017 00:00:00 +0000

A couple of days ago I was testing with a customer using Azure Active Directory integrated authentication to Azure SQL Database through the SQL Server ODBC drivers.

On one test machine, we kept getting an error similar to this:

Microsoft ODBC Driver 13 for SQL Server : SQL Server Network Interfaces: Unable to locate the registry entry for adalsql.dll file path. Verify that Active Directory Authentication Library for SQL Server is properly installed.

We checked, and adalsql.dll was present in both C:\Windows\System32 and C:\Windows\SysWOW64, as expected. We also tried downloading the standalone installer for the library, but it would not install, since the library was already on the machine.

Looking around, I realized the problem was not that the library was missing, but that it had somehow been installed without being registered correctly in the Windows Registry.

To fix this, we created the following registry keys:

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSADALSQL]
"TargetDir"="C:\\WINDOWS\\system32\\adalsql.dll"

[HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\MSADALSQL]
"TargetDir"="C:\\WINDOWS\\system32\\adalsql.dll"

If your system drive is anything other than C:, replace the path accordingly.

After we added these missing keys, the error went away and authentication worked correctly.




Permissions needed to create a Web App on an ASE

Tue, 26 Sep 2017 00:00:00 +0000

Documenting this here in case I run into it again. While working with a customer that is taking advantage of App Service Environment to host internal applications on Azure with connectivity to their ExpressRoute connection, we ran into an issue when trying to set up permissions so that selected users could create new Web Apps on an existing App Service Plan on the ASE.

For App Service Environment v1, it appears the easiest way to set up the right permissions is:

  • Grant the Reader role on the App Service Environment itself.
  • Grant the Web Plan Contributor role on the App Service Plan.
  • Grant the Web Site Contributor role on the resource group you will create the Web App on.

I was a little concerned about granting the Web Plan Contributor role as it grants way too many permissions on the App Service Plan (ASP), but on v1, this is not so much of an issue as there is no direct monetary impact from the ASP.

I did try setting up a custom role instead; the smallest set of permissions I was able to test that still worked was:

Microsoft.Authorization/*/read
Microsoft.Insights/alertRules/*
Microsoft.Insights/components/*
Microsoft.ResourceHealth/availabilityStatuses/read
Microsoft.Resources/deployments/*
Microsoft.Resources/subscriptions/resourceGroups/read
Microsoft.Support/*
Microsoft.Web/certificates/*
Microsoft.Web/listSitesAssignedToHostName/read
Microsoft.Web/deploymentLocations/read
Microsoft.Web/serverFarms/read
Microsoft.Web/serverFarms/*/read
Microsoft.Web/serverFarms/write
Microsoft.Web/sites/*

This is not much of an improvement, since it appears you still require write permissions on the App Service Plan, but it was still interesting to test.
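For completeness, that permission set could be packaged as an ARM custom role definition along these lines (the role name, description, and assignable scope are placeholders, not from the original engagement):

```json
{
  "Name": "ASE v1 Web App Creator",
  "IsCustom": true,
  "Description": "Can create Web Apps on an existing App Service Plan",
  "Actions": [
    "Microsoft.Authorization/*/read",
    "Microsoft.Insights/alertRules/*",
    "Microsoft.Insights/components/*",
    "Microsoft.ResourceHealth/availabilityStatuses/read",
    "Microsoft.Resources/deployments/*",
    "Microsoft.Resources/subscriptions/resourceGroups/read",
    "Microsoft.Support/*",
    "Microsoft.Web/certificates/*",
    "Microsoft.Web/listSitesAssignedToHostName/read",
    "Microsoft.Web/deploymentLocations/read",
    "Microsoft.Web/serverFarms/read",
    "Microsoft.Web/serverFarms/*/read",
    "Microsoft.Web/serverFarms/write",
    "Microsoft.Web/sites/*"
  ],
  "NotActions": [],
  "AssignableScopes": [ "/subscriptions/{subscription-id}" ]
}
```

A definition like this can be registered with New-AzureRmRoleDefinition or az role definition create and then assigned like any built-in role.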

This has changed for App Service Environment v2, so I can’t comment on that.




Azure Managed Service Identity Library

Tue, 19 Sep 2017 00:00:00 +0000

A few days ago, the preview of Managed Service Identity for Azure was released, opening up some interesting possibilities to access other Azure resources from your application in a secure manner. App Service is one of the services adding support for managed service identity, including a nice library to make it easier to use. You can find some documentation for the Microsoft.Azure.Services.AppAuthentication library here. I spent some time today looking at it, and would like to share some thoughts about it.

The public surface of the library is fairly simple, and it is easy to use for the purpose it was built for. On App Service, it is very straightforward: it will automatically pick up the MSI_ENDPOINT and MSI_SECRET environment variables and get the token for the resource you specify. While testing, I noticed that the MSI_ENDPOINT variable pointed to http://127.0.0.1:[port]/MSI/token/, where the [port] seems to be dynamically assigned for each app.

One interesting question that came up was how to support developing and debugging the application on your local dev workstation when using this library, and it is supported. The basis of this is that the library can be configured to use a mechanism other than MSI to generate the token. You can modify the default behavior either by explicitly passing a connection string to the AzureServiceTokenProvider constructor, or by storing it in the AzureServicesAuthConnectionString environment variable. Several authentication methods are supported:

Service Principal Id + Secret: To use this method, set the connection string to something like:

RunAs=App; TenantId={tenantId}; AppId={appId}; AppKey={appKey}

While this is probably the easiest method to implement, it's not very secure, as it requires you to have your service principal credentials in plain text somewhere, so it should not be the preferred method for development.

Service Principal Id + X509 Certificate: To use this method, set the connection string to something like:

RunAs=App; TenantId={tenantId}; AppId={appId}; CertificateStoreLocation=[CurrentUser | LocalMachine]; [CertificateThumbprint={thumbprint} | CertificateSubjectName={subjectName}]

This adds a little bit of complexity to the process. You now need to:

  • Install the authentication certificate in the certificate store
  • Redeploy when the certificate expires

It is somewhat more secure than using application keys, as you don't need to have credentials in plain text. To mitigate the risk of losing control of the certificate, you could use a different Service Principal per team member, but that adds quite a bit of work to the process.

Your Azure CLI credentials: This is an interesting alternative, based on using whatever credentials you're using on the Azure CLI to access your Azure subscription. To use this method, set the connection string to:

RunAs=Developer; DeveloperTool=AzureCLI

Internally, this will call az account get-access-token to acquire the token for the specified resource, using the credentials (access token) stored by the Azure CLI for your subscription. This has the advantage of not having to store more credentials in plain text, but it's only as secure as your Azure CLI token. When I first looked into this, I thought one disadvantage would be that you'd end up accessing the resources under your own account, but then I remembered that the Azure CLI does allow you to sign in using a service principal, so this is not really an issue.

There is, however, one disadvantage I can see: the Azure CLI lets you have tokens for multiple accounts stored at once, so which one is used by this library depends on what your active subscription is (i.e. whatever you configured with az account set). That can easily cause your application to break if you change the active account and forget to set it back to the correct one later on. [...]
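For local development with the Azure CLI option, the connection string can simply be exported as an environment variable before launching the application; a minimal sketch:

```shell
# Point AzureServiceTokenProvider at the Azure CLI's cached credentials.
# AzureServicesAuthConnectionString is the variable the library reads.
export AzureServicesAuthConnectionString="RunAs=Developer; DeveloperTool=AzureCLI"
echo "$AzureServicesAuthConnectionString"
```

With this set, the same AzureServiceTokenProvider code runs unchanged locally and on App Service, which is the main appeal of the library.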



Deploying a Key Vault-based TDE protector for Azure SQL

Thu, 07 Sep 2017 00:00:00 +0000

Azure SQL now supports setting up Transparent Data Encryption while bringing your own encryption keys. This is easy to set up in the Azure Portal, but I wanted to try setting it up in an automated manner, preferably leveraging ARM templates as much as possible. This turned out to be a bit more complex than I expected, but totally doable. Here are the necessary steps:

  • Creating the Key
  • Deploying the Server
  • Granting Key Permissions
  • Creating the Protector

The ARM templates for this post can be found on GitHub.

Creating the Key

When using the BYOK (bring your own key) feature, you need to store your encryption key in an Azure Key Vault. So let's first create one:

  • In the Azure Portal, locate your Key Vault
  • Click on the Keys icon
  • Click the + Add button
  • Select the Generate option, the name, and other required properties
  • Click the Create button to complete the process

You could do the same in a script with an Azure CLI command like the following:

az keyvault key create --vault-name $vaultName --name $keyName --size 2048 --protection software

Once you've created the key, make sure to grab the name of the current version of the key. You can do this in the portal, or by using the command line:

az keyvault key show --vault-name $vaultName --name $keyName --query key.kid

The URI returned will be something similar to https://{vaultName}.vault.azure.net/keys/{keyName}/{version}. We're interested in the last segment.

Deploying the Server

The next step is to use an ARM template to deploy the Azure SQL Server resource. At this point, we are not actually creating the TDE protector, as we need to do something else first.
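Extracting that last segment of the key URI can be scripted with plain shell parameter expansion; a quick sketch using a made-up key id (not a real vault):

```shell
# Hypothetical key id, as returned by 'az keyvault key show --query key.kid'
kid="https://myvault.vault.azure.net/keys/tdekey/0123456789abcdef0123456789abcdef"

# Strip everything up to the last '/' to get the key version segment
version="${kid##*/}"
echo "$version"
```

This prints the version segment (0123456789abcdef0123456789abcdef in this example), which is the value needed later when wiring up the protector.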
Here's the core of the template to create the server:

{
    "name": "[parameters('sqlServerName')]",
    "type": "Microsoft.Sql/servers",
    "location": "[resourceGroup().location]",
    "apiVersion": "2015-05-01-preview",
    "dependsOn": [],
    "tags": { "displayName": "SQL Logical Server" },
    "identity": { "type": "SystemAssigned" },
    "properties": {
        "administratorLogin": "[parameters('sqlServerAdminLogin')]",
        "administratorLoginPassword": "[parameters('sqlServerAdminLoginPassword')]"
    }
}

The interesting bit here is the identity property. When we set it to SystemAssigned, SQL will go to the Azure Active Directory tenant associated with the Azure subscription and create a new Service Principal with a name starting with RN_. This principal will be set up with X509 certificate credentials for authentication. I'm unsure at this point if it's possible to create this Service Principal manually, which would simplify things somewhat. Once the server is created, we'll see this identity reflected in the resource properties.

While we're at it, let's create a new database on the server, and enable TDE. The latter is done by creating a nested resource of type transparentDataEncryption:

{
    "name": "[parameters('sqlDbName')]",
    "type": "databases",
    "location": "[resourceGroup().location]",
    "apiVersion": "2014-04-01-preview",
    "dependsOn": [
        "[resourceId('Microsoft.Sql/servers', parameters('sqlServerName'))]"
    ],
    "tags": { "displayName": "SQL DB" },
    "properties": {
        "collation": "[parameters('sqlDbCollation')]",
        "edition": "[parameters('sqlDbEdition')]",
        "maxSizeBytes": "1073741824",
        "requestedServiceObjectiveName": "[parameters('sqlDbRequestedServiceObjectiveName')]"
    },
    "resources": [
        {
            "comments": "Transparent Data Encryption",
            "name": "current",
            "type": "transparentDataEncryption",
            "apiVersion": "2014-04-01-preview",
            "properties": { "status": "Enabled" },
            "dependsOn": [ "[parameters('sqlDbName')]" ]
        }
    ]
}

At this point, TDE w[...]



Authenticating to SQL Azure with delegated tokens

Thu, 31 Aug 2017 00:00:00 +0000

In a previous post, I discussed how to authenticate to an Azure SQL database from a Web Application (running in Azure App Service) using an Azure Active Directory Service Principal. For this I used a certificate stored in Key Vault to authenticate the principal and obtain a token I could present to SQL. You can find the updated code for this post on GitHub.

In this post, let's expand this a bit further: I will add authentication support to the Web Application by federating through OpenId Connect to Azure Active Directory, and then delegate the user credentials all the way to the database.

Configure new users in SQL

Since I want to allow other Azure AD users to connect to the database, I need to grant them permissions to it. To do this, I just follow the steps from my previous post, using these T-SQL commands:

  • Create the user: CREATE USER [user@tenant] FROM EXTERNAL PROVIDER
  • Grant permissions: ALTER ROLE

Setting up the AD Application

So far, I've been using an Azure Active Directory Service Principal to authenticate to SQL. In order to set up OpenId Connect on our Web App, I also need an Azure AD application to go with it. So what I want is to configure my existing Service Principal so that I can use it for this purpose as well. Note: using a separate AD application registration won't work, because the On_Behalf_Of delegation would fail.

To do this, I'm going to open the Azure AD portal and find the Service Principal in the App Registrations section. Here, I open the Reply URLs page and add the application host (i.e. https://{app-name}.azurewebsites.net/).

Granting Sign-In permission

For users to be able to sign in, I need to give the application the right permissions. I do this by opening the Required permissions page, and clicking on the + Add button.
Here, I select Windows Azure Active Directory (Microsoft.Azure.ActiveDirectory) as the API, and check the "Sign in and read user profile" option under "Delegated Permissions". Then I click save and confirm all changes.

Granting Access to SQL

There is one other permission that I need to grant to the application, so that we can delegate credentials to SQL Server. Again, click on the + Add button, and select Azure SQL Database as the API. Then, check the "Access Azure SQL DB and Data Warehouse" option under the "Delegated Permissions" section, and save all the changes.

As a final step, I locate the Application Id property in the application properties; we'll need this value in a moment.

Enabling Authentication

Now I can enable federated authentication. I could do this by leveraging the Easy Authentication feature in App Service, but since I'll need to change the application code later on, it's just as easy to do it in code. First, let's add references to the following NuGet packages to the project:

  • Microsoft.Owin.Host.SystemWeb
  • Microsoft.Owin.Security.OpenIdConnect
  • Microsoft.Owin.Security.Cookies

Now I can write the startup code to enable authentication:

using Microsoft.Owin.Security;
using Microsoft.Owin.Security.Cookies;
using Microsoft.Owin.Security.OpenIdConnect;
using Owin;
using System;
using System.Configuration;

namespace UsingTokenAuthApp
{
    public class AuthConfig
    {
        public static void RegisterAuth(IAppBuilder app)
        {
            var clientId = ConfigurationManager.AppSettings["APP_CLIENT_ID"];
            var tenant = ConfigurationManager.AppSettings["AAD_TENANT_ID"];
            var authority = $"https://login.microsoftonline.com/{tenant}";
            app.SetDefaultSignInAsAuthenticationType(CookieAuthenticationDefaults.AuthenticationType);
            app.UseCookieAuthentication(new CookieAuthenticationOptions());
            var options = new OpenIdConnectAuthenticationOptions
            {
                Authority = authority,
                ClientId = cl[...]
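The startup code reads its settings from the application configuration via ConfigurationManager.AppSettings. In a classic ASP.NET Web.config, that would look something like this sketch (the values are placeholders you'd replace with your own Application Id and tenant):

```xml
<appSettings>
  <!-- Application Id of the AD application / Service Principal -->
  <add key="APP_CLIENT_ID" value="{application-id}" />
  <!-- Azure AD tenant (directory) id -->
  <add key="AAD_TENANT_ID" value="{tenant-id}" />
</appSettings>
```

In App Service, the same keys can instead be supplied as App Settings so the values never live in source control.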



Token authentication to SQL Azure with a Key Vault Certificate

Tue, 29 Aug 2017 00:00:00 +0000

In a previous post, I presented a PowerShell script to create a new Service Principal in Azure Active Directory, using a self-signed certificate generated directly in Azure Key Vault for authentication. Now, let's try using it for something useful. All the code and samples for this article can be found on GitHub.

We can use the Key Vault certificate in a Web Application deployed to Azure App Service to authenticate to Azure Active Directory using our Service Principal, and then obtain a token to connect to SQL Azure. This saves us from having to store passwords anywhere in our configuration, since Key Vault and App Service give us easy, secure access to our authentication certificate. In order to do this, we need a few things:

  • Our Service Principal identity in AAD, with the Key Vault certificate
  • A SQL Azure database
  • A SQL Server (Azure) login based on our AAD Service Principal, with permissions on the database in question
  • A Web App deployed with our Key Vault certificate
  • An ASP.NET app that will use the certificate to authenticate to AAD, then use the token to connect to SQL

Let's get started!

Creating the Database

For this sample, I'm going to create a new Azure SQL logical server, then deploy a new, blank database on it. We'll also set up the server firewall to allow connections from other Azure resources. Since we want to use Azure Active Directory authentication, we also need to set up our new server with an Azure AD admin user. For this we need both the username (user@domain) and the object id of the account in the domain.
As usual, let's use Azure Resource Manager (ARM) templates for this, by creating a nested resource of type Microsoft.Sql/servers/administrators:

{
    "name": "[parameters('sqlServerName')]",
    "type": "Microsoft.Sql/servers",
    "location": "[resourceGroup().location]",
    "apiVersion": "2014-04-01-preview",
    "dependsOn": [],
    "tags": { "displayName": "SQL Logical Server" },
    "properties": {
        "administratorLogin": "[parameters('sqlServerAdminLogin')]",
        "administratorLoginPassword": "[parameters('sqlServerAdminLoginPassword')]"
    },
    "resources": [
        {
            "name": "activedirectory",
            "type": "administrators",
            "location": "[resourceGroup().location]",
            "apiVersion": "2014-04-01-preview",
            "dependsOn": [
                "[resourceId('Microsoft.Sql/servers', parameters('sqlServerName'))]"
            ],
            "properties": {
                "administratorType": "ActiveDirectory",
                "login": "[parameters('sqlServerAdAdmin')]",
                "sid": "[parameters('sqlServerAdAdminObjectId')]",
                "tenantId": "[subscription().tenantId]"
            }
        },
        {
            "name": "AllowAllWindowsAzureIps",
            "type": "firewallrules",
            "location": "[resourceGroup().location]",
            "apiVersion": "2014-04-01-preview",
            "dependsOn": [
                "[resourceId('Microsoft.Sql/servers', parameters('sqlServerName'))]"
            ],
            "properties": {
                "startIpAddress": "0.0.0.0",
                "endIpAddress": "0.0.0.0"
            }
        },
        {
            "name": "[parameters('sqlDbName')]",
            "type": "databases",
            "location": "[resourceGroup().location]",
            "apiVersion": "2014-04-01-preview",
            "dependsOn": [
                "[resourceId('Microsoft.Sql/servers', parameters('sqlServerName'))]"
            ],
            "tags": { "displayName": "SQL DB" },
            "properties": {
                "collation": "[parameters('sqlDbCollation')]",
                "edition": "[parameters('sqlDbEdition')]",
                "maxSizeBytes": "1073741824",
                "requestedServiceObjec[...]



Azure AD Service Principal with a Key Vault Certificate

Mon, 28 Aug 2017 00:00:00 +0000

It is often useful to create Azure Active Directory Service Principal objects for authenticating applications and automating tasks in Azure. While you can authenticate a Service Principal using a password (client secret), it might be better to use an X509 certificate as an alternative. You still need to find a way to keep the certificate secure, though. That’s where Azure Key Vault comes in, allowing you to store the authentication certificate in a secure manner. An application could then obtain the certificate from Key Vault as needed, or if it’s running in Azure, there might be ways to provision the certificate automatically so that we don’t need to copy stuff around. You could obtain a certificate from any valid certification authority and store it safely in Key Vault. However, Key Vault can also generate self-signed certificates, which might be good enough for many scenarios. Here is a useful PowerShell script that will create a new self-signed certificate directly in Key Vault. Then it will create a new service principal in the subscription tenant, with the new certificate for authentication. 
[CmdletBinding()]
param(
    [Parameter(Mandatory = $true)]
    [String]$keyVaultName,
    [Parameter(Mandatory = $true)]
    [String]$principalName,
    [Parameter()]
    [int]$validityInMonths = 12
)

function New-KeyVaultSelfSignedCert {
    param($keyVault, $certificateName, $subjectName, $validityInMonths, $renewDaysBefore)

    $policy = New-AzureKeyVaultCertificatePolicy `
        -SubjectName $subjectName `
        -ReuseKeyOnRenewal `
        -IssuerName 'Self' `
        -ValidityInMonths $validityInMonths `
        -RenewAtNumberOfDaysBeforeExpiry $renewDaysBefore
    $op = Add-AzureKeyVaultCertificate `
        -VaultName $keyVault `
        -CertificatePolicy $policy `
        -Name $certificateName
    while ( $op.Status -ne 'completed' ) {
        Start-Sleep -Seconds 1
        $op = Get-AzureKeyVaultCertificateOperation -VaultName $keyVault -Name $certificateName
    }
    (Get-AzureKeyVaultCertificate -VaultName $keyVault -Name $certificateName).Certificate
}

$certName = "SPCert-$principalName"
$cert = New-KeyVaultSelfSignedCert -keyVault $keyVaultName `
    -certificateName $certName `
    -subjectName "CN=$principalName" `
    -validityInMonths $validityInMonths `
    -renewDaysBefore 1
Write-Verbose "Certificate generated $($cert.Thumbprint)"

$certString = [Convert]::ToBase64String($cert.GetRawCertData())
New-AzureRmADServicePrincipal -DisplayName $principalName `
    -CertValue $certString `
    -EndDate $cert.NotAfter.AddDays(-1)

The script assumes you've already signed in to your Azure subscription using Login-AzureRMAccount. Let's try executing the script. If we go into the Key Vault in the Azure Portal, we can see the new certificate that was generated. We can also query the new Service Principal and verify that it is indeed set up with certificate-based authentication:

Get-AzureRmADSpCredential -ObjectId 37800b1f-5d17-461b-80a3-c4a8df10b319

StartDate            EndDate              KeyId                                Type
---------            -------              -----                                ----
8/28/2017 2:29:54 AM 8/27/2018 2:29:49 AM 8a3250a4-4383-4a5f-acdc-4deafd930e6d AsymmetricX509Cert

I'll show some useful scenarios for this in a follow-up post. [...]



Creating Event Grid Subscriptions

Mon, 21 Aug 2017 00:00:00 +0000

A few days ago, I wrote about using Azure Resource Manager (ARM) templates to deploy Azure Event Grid. That sample showed how to create a new Event Grid Topic resource. This basically gives you a URL you can publish custom events to and have them routed to one or more event subscribers. However, one of the very powerful features in Event Grid is not custom topics, but subscribing to events published by the Azure fabric itself; that is, events published by Resource Manager providers. As of this writing, only a few providers support Event Grid, but this number is sure to grow in the coming months.

Supported Event Publishers

Which Azure Resource Manager providers support Event Grid? An easy way to find out is to ask Azure itself. To do this, we can leverage the excellent ArmClient tool. Resource Manager providers that support publishing events through Event Grid are called Topic Types, and we can query them:

armclient get /providers/Microsoft.EventGrid/topicTypes?api-version=2017-06-15-preview

If the command succeeds, we should see something like this:

{
  "value": [
    {
      "properties": {
        "provider": "Microsoft.Eventhub",
        "displayName": "EventHubs Namespace",
        "description": "Microsoft EventHubs service events.",
        "resourceRegionType": "RegionalResource",
        "provisioningState": "Succeeded"
      },
      "id": "providers/Microsoft.EventGrid/topicTypes/Microsoft.Eventhub.Namespaces",
      "name": "Microsoft.Eventhub.Namespaces",
      "type": "Microsoft.EventGrid/topicTypes"
    },
    ...
  ]
}

You can also use the Azure CLI command az eventgrid topic-type list on version 2.0.14 or later. Knowing what event publishers exist is only half the story, though. We also want to know what types of events a publisher supports. These are called Event Types in Event Grid, and we can query those as well.
For example, let's say we want to find out the events supported by the Microsoft.Resources.ResourceGroups topic type:

armclient get /providers/Microsoft.EventGrid/topicTypes/Microsoft.Resources.ResourceGroups/eventTypes?api-version=2017-06-15-preview

If the command succeeds, we should see output similar to the following:

{
  "value": [
    {
      "properties": {
        "displayName": "Resource Write Success",
        "description": "Raised when a resource create or update operation succeeds.",
        "schemaUrl": "TBD"
      },
      "id": "providers/Microsoft.EventGrid/topicTypes/Microsoft.Resources.ResourceGroups/eventTypes/Microsoft.Resources.ResourceWriteSuccess",
      "name": "Microsoft.Resources.ResourceWriteSuccess",
      "type": "Microsoft.EventGrid/topicTypes/eventTypes"
    },
    ...
  ]
}

The equivalent Azure CLI command would be az eventgrid topic-type list-event-types --name Microsoft.Resources.ResourceGroups.

Now let's see how we can subscribe to events published by the Azure fabric.

Event Hub Namespaces

Currently, you can only subscribe to events published at the Event Hub Namespace level, not to an individual Event Hub. For this we'd use the Microsoft.EventHub.Namespaces topic type to create a nested resource of type Microsoft.EventGrid/eventSubscriptions:

{
    "apiVersion": "2017-06-17-preview",
    "name": "[concat(parameters('eventHubNamespaceName'), '/Microsoft.EventGrid/', parameters('subscriptionName'))]",
    "type": "Microsoft.EventHub/namespaces/providers/eventSubscriptions",
    "tags": { "displayName": "Webhook Subscription" },
    "dependsOn": [
        "[concat('Microsoft.EventHub/Namespaces/', parameters('eventHubNamespaceName'))]"
    ],
    "properties": {
        "destination": {
            "endpointType": "WebHook",
            "properties": {
                "endpointUrl": "[parameters('webhookUrl')]"
            }
        },
        "filter": {
            "included[...]
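For context on what a WebHook destination ends up receiving: Event Grid delivers events as a JSON array following its event schema. A resource-write event would look roughly like this sketch (all ids, names, and timestamps here are illustrative placeholders):

```json
[
  {
    "topic": "/subscriptions/{subscription-id}/resourceGroups/myGroup",
    "subject": "/subscriptions/{subscription-id}/resourceGroups/myGroup/providers/Microsoft.Web/sites/mySite",
    "eventType": "Microsoft.Resources.ResourceWriteSuccess",
    "eventTime": "2017-08-20T22:05:20.000Z",
    "id": "00000000-0000-0000-0000-000000000000",
    "data": { },
    "dataVersion": "1",
    "metadataVersion": "1"
  }
]
```

The eventType field is what the subscription's filter section matches against, which is why knowing the Event Types for a topic type matters when authoring the template.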