
Winterdom



by dæmons be driven - a site by Tomas Restrepo



 



Using Azure AD B2C with API Management

Fri, 17 Nov 2017 00:00:00 +0000

In a previous post, I discussed how to set up OAuth2 authorization in API Management using Azure Active Directory. This time I'd like to show something very similar, but using Azure AD B2C instead. Once again, I'll assume you already have an API implemented and configured in API Management. I'll use the same PQR service I used last time as an example.

Step 1: Creating the B2C Sign-in Policy

Before we can integrate with Azure AD B2C, we need to create a new sign-in policy that we can use to obtain a token later on. Using the Azure Portal AAD B2C module, I'll create a new Sign-in policy named b2c-apim-pqr supporting local accounts, as well as Facebook. I also enable basic application claims to include in the token, such as first/last name and email addresses.

Step 2: Creating the applications

Like in the previous post, we need to create two applications of type Web App / Web Api: one for our PQR API, and another for the API Management Portal. I'll create the PQR API app first, and then the portal application. As usual, we'll update the Reply URLs of the portal application once we create the authorization server in API Management. Next, I need to grant the portal application permissions to the API one. I'll also generate a new key for the portal application, and make note of both the Application Id and the new key.

Step 3: Creating the OAuth2 Authorization Server

Back in API Management, it's time to create our OAuth2 authorization server. This will be pretty much the same as last time, with a few minor changes. I'll call this one aad-b2c-oauth2-pqr. I'll configure it to support both the Authorization code and Implicit grant types, and will point the authorization/token endpoint URLs at our new B2C sign-in policy.
Notice that both the authorization and token endpoint URLs use the same format as the normal Azure AD OAuth2 flow, but with the sign-in policy name in the p query string parameter. I'll also add the resource parameter to point to the apim-pqr application we created in step 2. Finally, I need to configure the client id and secret based on the application id and key of the apim-portal application. With B2C, we also need to provide a default scope, otherwise obtaining the token will fail; here we use the scope to ask for user_impersonation (delegation) to the apim-pqr application. By this point, I'll also have the redirect_uri for our API Management OAuth2 service, so I'll copy this value and add it as a valid Reply URL in the apim-portal application.

Step 4: Configure the API

Now I'll set up my PQR API in API Management to require authorization using the new OAuth2 configuration. Again, I'll add a policy to the API that validates the token: f498336e-d99f-xxxx-xxxx-22e3f7d87e56. The only interesting bits here are that the openid-config element should point to the OpenId metadata endpoint for our sign-in policy, and that the audience of the generated token will use the Application Id rather than the App ID URI. Now, I should be able to obtain a token from the Developer Portal and test the API: [...]
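The endpoint format described in step 3 above can be sketched as a small helper. This is illustrative only (the tenant name is a placeholder; the policy name is the one from the post), showing how the standard Azure AD OAuth2 endpoints gain the sign-in policy in the p query string parameter:

```python
def b2c_endpoint(tenant, policy, kind):
    # kind is "authorize" or "token"; the B2C endpoints follow the regular
    # Azure AD OAuth2 URL format, plus the sign-in policy name in "p".
    return ("https://login.microsoftonline.com/{0}/oauth2/{1}?p={2}"
            .format(tenant, kind, policy))

# The post's example policy, on a hypothetical tenant:
authorize_url = b2c_endpoint("mytenant.onmicrosoft.com", "b2c-apim-pqr", "authorize")
token_url = b2c_endpoint("mytenant.onmicrosoft.com", "b2c-apim-pqr", "token")
```

These two URLs are what goes into the authorization server's authorization endpoint and token endpoint fields.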



Protecting APIs with OpenId Connect in API Management

Sat, 11 Nov 2017 00:00:00 +0000

In my last post, I outlined a customer scenario for protecting an API with OAuth2 in Azure API Management. I mentioned that I had been unsuccessful at using OpenId Connect rather than raw OAuth2. After some more testing, and some help, I was able to get this working, and wanted to share how I did it. Once again, I'll assume you already have an API implemented and configured in API Management. I'll use the same PQR service I used last time as an example.

Step 1: Creating the Azure AD Application

The first step is to create the Azure AD application. In this case, we will not be creating 2 separate applications like last time; we only need one. In the Azure Portal, I'll go over to my Azure AD instance and add a new application registration. I'll call this one aad-oidc-pqr. After creating the application, there are a few things we need to change:

We need to mark the application as multi-tenant. Otherwise, the developer portal in API Management will not work correctly, and you will get an error similar to AADSTS70001: Application 'xxxxx-xxxxx-xxxxx-xxxxx' is not supported for this API version. (Optional) Enable the OAuth2 implicit flow.

We can use the application properties window to mark the application as multi-tenant. To enable the OAuth2 implicit flow, we need to edit the application manifest. Now, copy the application id, and generate a new key for the application; make a note of both, as we will need them in a moment. We'll come back to the application later to configure the reply URLs.

Note: another alternative is creating the Azure AD app as a converged application, but I was only able to make that work with the implicit grant flow.

Step 2: Configure OpenId Connect Authorization

Back in API Management, we can configure a new OpenId Connect Authorization service. In the Azure Portal, we will find this under the OpenId Connect option; in the Publisher Portal, it is under Security -> OpenId Connect.
For this part, we'll need: the OpenId Connect metadata URL for our Azure AD tenant, which will be in the form https://login.microsoftonline.com/{tenant}/.well-known/openid-configuration; the id of the Azure AD application we created in step 1; and the matching key for the application. Once we've created the OpenId Connect Authorization service in API Management, we need to go back to the Azure AD application, and add both the authorization code grant and implicit grant redirect URIs to the Reply URLs collection of our application.

Step 3: Configure API

Just like in my previous post, I need to configure my PQR API to require OpenId Connect authorization. And again, I want to set up a policy on my API to validate the JWT token: 24f98265-c230-4668-a40b-11aa1b02c29c. This is exactly the same as last time, except that when using OpenId Connect, the audience in the token will contain the Application Id rather than the App ID URI of the Azure AD application.

Step 4: Test!

At this point, we should be able to use the API Management Developer Portal to test that OpenId Connect works with our API: [...]
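The feed stripped the policy XML from the post above, leaving only the audience GUID. A minimal sketch of the validate-jwt policy it describes would look roughly like this (reconstructed, not the author's exact policy; replace {tenant} with your own tenant, and note the audience is the Application Id GUID, not the App ID URI):

```xml
<!-- Sketch of the JWT validation policy described above; verify the exact
     attributes against your API Management instance. -->
<validate-jwt header-name="Authorization" failed-validation-httpcode="401"
              failed-validation-error-message="Unauthorized">
    <openid-config url="https://login.microsoftonline.com/{tenant}/.well-known/openid-configuration" />
    <audiences>
        <audience>24f98265-c230-4668-a40b-11aa1b02c29c</audience>
    </audiences>
</validate-jwt>
```

The openid-config element is what lets API Management discover the token signing keys, and the audiences list is where the Application Id goes.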



Protecting APIs with OAuth2 in API Management

Thu, 09 Nov 2017 00:00:00 +0000

I've been playing a lot lately with Azure API Management. Recently, a customer asked me about the following scenario: they wanted to expose a Web API through API Management; API Management should enforce and validate that an OAuth2 token was provided by the caller; and the underlying API did not know (or care) about the OAuth2 token. There is an article in the API Management documentation about this very topic, but it assumes that the Web API itself is set up to accept OAuth2 tokens, which is a more complex scenario. While good, I found the article a bit confusing to follow, so I thought I'd document the steps I followed to test the customer scenario. I'll assume we already have an API implemented and published in API Management, and that we want to use Azure Active Directory as the OAuth2 provider. For this article, I'll use an API I called PQR in API Management.

Step 1: Register the Azure AD applications

The first step is to register a new Azure AD application to represent our API. Next, create a second application, which we'll call apim-portal. This one will represent the API Management Developer Portal, so that we can test our APIs from it. Once the application is created, we need to generate a new key for it. Make a note of both the application id and the new key, as we'll need these in a moment. Also, grant permissions to apim-portal to call the apim-pqr application. You could also create the applications easily using the az ad sp create command in the Azure CLI.

Step 2: Adding the OAuth2 authorization server

Now we can configure a new OAuth2 authorization server in our API Management instance. We'll do this in the new experience in the Azure Portal. First, go into the OAuth 2.0 section in the portal, and click the + Add button.
I will call this instance aad-oauth2-pqr. We don't need the client registration URL for now, and besides, there doesn't seem to be a clean way to configure it for AAD. I'll leave only the Authorization code grant type enabled for now, and configure the authorization endpoint URL for my AAD tenant; it will be in the form https://login.microsoftonline.com/{tenantid}/oauth2/authorize. Then we can configure the token endpoint URL, as well as the value of the resource parameter. The latter should be configured with the value of the App ID URI field of the apim-pqr application we created in the first step; the token URL will be in the form https://login.microsoftonline.com/{tenantid}/oauth2/token. We'll leave the default scope empty for now, and configure the Client ID/Secret that we copied after creating the apim-portal application in step 1. After this, just click the Create button to save the changes. Note that you can also do this in the Publisher Portal, with almost the exact same user experience.

Before moving on to the next step, there is something missing. In the screenshot above, notice there's a field with the redirect_uri to be used with API Management. We need to copy this URL and add it to the Reply URLs of our apim-portal application in Azure Active Directory; otherwise, authentication will fail later on.

Step 3: Configure the API to use OAuth2 authorization

The next step is to configure our PQR API so that API Management knows that invoking the API requires an OAuth2 token. In the Publisher Portal, we can modify this from the Security tab of the API properties; in the Azure Portal, we'd configure this from the API settings under the Security headline. Notice that doing this doesn't actually cause API Management to enforce that an OAuth2 token is provided at all.
Instead, it adjusts the Developer Portal experience so that you can acquire the OAuth2 token from the authorization server when trying out one of the API's operations.

Step 4: Validating the OAuth2 token

At this point, you can still invoke the API through API Management just [...]
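The validation step this post is leading into boils down to checking claims inside the JWT. As a rough illustration (not a real validator: a production check must also verify the token's signature against the provider's signing keys), here is what comparing the aud claim looks like:

```python
import base64
import json

def jwt_claims(token):
    """Decode the (unverified) payload of a JWT, which is the middle of
    the three dot-separated base64url segments: header.payload.signature."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

def audience_matches(token, expected_audience):
    # Mirrors the audience comparison the posts describe: the token's "aud"
    # claim must equal the configured audience value.
    return jwt_claims(token).get("aud") == expected_audience
```

This also makes concrete the point repeated in these posts: whether the expected audience is the App ID URI or the Application Id GUID depends on how the token was obtained.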



Decoding Application Gateway Certificates

Thu, 02 Nov 2017 00:00:00 +0000

Recently, I wanted to write a PowerShell script that would check expiration of the certificates assigned for SSL/TLS on Azure Application Gateway resources. Obtaining the certificates is easy through the SslCertificates property of the Application Gateway instance. However, it took me a while to figure out how to extract the base64-encoded data into an X509Certificate2 instance. It turns out the certificate is returned in PKCS7 format (also known as P7B), so you need to use the SignedCms class to decode it. Some sample code:

function Test-CertExpiresSoon($cert) {
    $span = [TimeSpan]::FromDays(30)
    $today = [DateTime]::Today
    return ($cert.NotAfter - $today) -lt $span
}

function Decode-Certificate($certBytes) {
    $p7b = New-Object System.Security.Cryptography.Pkcs.SignedCms
    $p7b.Decode($certBytes)
    return $p7b.Certificates[0]
}

$gateways = Get-AzureRmApplicationGateway
foreach ($gw in $gateways) {
    foreach ($cert in $gw.SslCertificates) {
        $certBytes = [Convert]::FromBase64String($cert.PublicCertData)
        $x509 = Decode-Certificate $certBytes
        if (Test-CertExpiresSoon $x509) {
            [PSCustomObject] @{
                ResourceGroup = $gw.ResourceGroupName
                AppGateway = $gw.Name
                CertSubject = $x509.Subject
                CertThumbprint = $x509.Thumbprint
                CertExpiration = $x509.NotAfter
            }
        }
    }
}

A PKCS7 envelope can contain multiple certificates, so I might have to revisit this later in case that is relevant, but it was not an issue in the original scenario. [...]
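The 30-day expiry check in Test-CertExpiresSoon is just date arithmetic; the same logic in a small Python sketch (note that, like the original, it also flags certificates that have already expired, since the difference is then negative):

```python
import datetime

def expires_soon(not_after, today=None, days=30):
    # Mirrors Test-CertExpiresSoon above: flag a certificate whose
    # NotAfter date is less than `days` away from today.
    today = today or datetime.date.today()
    return (not_after - today) < datetime.timedelta(days=days)
```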



Logic Apps KeyVault Connector - Part 3

Thu, 26 Oct 2017 00:00:00 +0000

This is part 3 of my series of articles on implementing a custom Logic Apps connector. Read part 1 and part 2. By this point, I've implemented and deployed the WebApi application that implements the custom connector, and configured the necessary applications and permissions in Azure Active Directory. It's time to create the custom connector itself. A Logic Apps custom connector is an Azure resource of type Microsoft.Web/customApis. I'm unsure how much of the creation of such a resource can be automated through ARM or other tools, so I'll create it manually for now.

Step 1: Create the resource

The first step is to create the new resource of type Logic Apps Connector and provide the basic information. A couple of details are important here: the name of the custom connector resource matters, since by default it is the name you will see in the Logic Apps designer, and I have not found a way, so far, to customize it after creation. The Azure region you choose also matters: you will only be able to use the connector from Logic Apps created in the same region.

Step 2: Configuring the connector

Once the custom connector is created, open the resource and click the Edit button at the top. That will open the editor where we can define the connector information, security configuration, and the connector operations. The easiest way to start is by uploading an existing OpenApi (Swagger) definition for your connector. As I mentioned in part 1 of the series, I already customized the Swagger generation in the connector code so that it can be used quickly, so I'll start by pointing the connector to the Swagger definition. This will initialize the connector operations and other information. Then, we can add an icon, background color, description, and the base URL where the connector WebApi is located. After clicking the Continue button, we'll move to the security section. Here, Logic Apps recognizes that our connector needs OAuth2 authentication.
We need to click the Edit button at the bottom and customize it by adding: the Client Id (the Application Id of the KeyVault Connector Client application we created in Azure AD), the Client secret (the key we generated for that application), and the Scope (the App ID URI of our KeyVault Connector application, which the PowerShell script I presented in part 2 registers as https://[tenant-domain]/[appname]). Notice the Redirect URL field: it will be empty until we update the connector. We'll discuss this in a moment. Again, we click the Continue button to move to the final tab (Definition). Here, you can customize the definition of the operations implemented by the connector. We don't need to customize anything for now, as our original Swagger definition already works as is. To complete creation of the connector, press the Update Connector button at the top.

Step 3: Update the redirect URI

Now that the connector is fully defined, go back to the Security tab and find the value of the Redirect URL field. Since I created the connector in the East US region, it ends up being https://logic-apis-eastus.consent.azure-apim.net/redirect. We copy this value and use it to update the Reply URL of the KeyVault Connector Client application in Azure AD. Notice that we update the URL on the client application used by Logic Apps to initiate the authentication flow, not on the connector application, as the latter should always point to the deployment URL of the WebApi. This step is very important; otherwise authentication will not succeed for our application.

Step 4: Test it

Now we're ready to test our connector! Let's create a new Logic App and use it. We'll select the 'Get the value of a secret' operation, and we'll get the big button. This will open the popup window to log in to an Azure AD account that has been granted permissions to ac[...]



Logic Apps KeyVault Connector - Part 2

Tue, 24 Oct 2017 00:00:00 +0000

In part 1 of this series of posts, I introduced the idea of implementing a custom Logic Apps connector as a way to get familiar with the challenges involved. The initial part revolved around the ASP.NET Core WebApi implementation. In this second part, I'd like to discuss setting up the Azure AD credentials. The first step is to create the two Azure Active Directory applications we need. For both applications, we want to create an application key (secret) for authentication. The first application is for our WebApi application, so it should be set up with the URL of the WebApi as the reply URL; I also wanted to mark it as multi-tenant. The second application is what Logic Apps will use to authenticate the user. It should also be a Web / WebApi application. The docs say that the reply URL for this one should be set to https://msmanaged-na.consent.azure-apim.net/redirect; however, this is not correct. We won't know the right reply URL until we actually configure the custom connector resource later on, so for now we'll leave it at this default value and update it later.
To simplify creation of these two applications, I built a simple PowerShell script:

[CmdletBinding()]
param(
    [Parameter(Mandatory=$true)]
    [String]$applicationName,
    [Parameter(Mandatory=$true)]
    [String]$apiAppName,
    [Parameter(Mandatory=$true)]
    [String]$tenantDomain
)

Import-Module AzureRm

function New-Password {
    $rng = New-Object System.Security.Cryptography.RNGCryptoServiceProvider
    [byte[]]$buffer = New-Object byte[] 32
    $rng.GetBytes($buffer, 0, $buffer.Length)
    return [Convert]::ToBase64String($buffer)
}

# Create the Web Api application
$baseUrl = "https://${apiAppName}.azurewebsites.net/"
$appId = "https://${tenantDomain}/$applicationName"
$app = New-AzureRmADApplication -DisplayName "$applicationName Connector" `
    -HomePage $baseUrl `
    -IdentifierUris $appId `
    -ReplyUrls $baseUrl `
    -AvailableToOtherTenants $true
$applicationKey = New-Password
$credential = New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId -Password $applicationKey
[PSCustomObject]@{
    Application = "WebApi Connector"
    ApplicationId = $app.ApplicationId
    ApplicationKey = $applicationKey
}

# Create the client application that will be used by Logic Apps
$clientAppId = "https://${tenantDomain}/${applicationName}-connector"
$clientKey = New-Password
$clientApp = New-AzureRmADApplication -DisplayName "$applicationName Connector Client" `
    -HomePage "https://login.windows.net" `
    -IdentifierUris $clientAppId `
    -ReplyUrls "https://msmanaged-na.consent.azure-apim.net/redirect"
$clientCredential = New-AzureRmADAppCredential -ObjectId $clientApp.ObjectId -Password $clientKey
[PSCustomObject]@{
    Application = "Client"
    ApplicationId = $clientApp.ApplicationId
    ApplicationKey = $clientKey
}

The script takes the following arguments:

applicationName: the base name you want to use for both applications. If I pass KeyVault here, I'll end up with two applications called KeyVault Connector (for the WebApi) and KeyVault Connector Client (for the Logic Apps client).
apiAppName: the name of the API App in App Service that hosts the connector. I use this to configure the reply URL of the WebApi application.
tenantDomain: the name of your Azure AD tenant, such as mydomain.onmicrosoft.com.

Output of the script would be something like this:

Application      ApplicationId         ApplicationKey
-----------      -------------         --------------
WebApi Connector 06086c51-09a1-48b3-...
Client           681d64cf-0d99-4f5c-...

Now we have all the data we need to configure this in our API App and the Logic Apps connector. [...]
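For reference, the New-Password helper in the script above (32 cryptographically random bytes, base64-encoded) is equivalent to this small Python sketch:

```python
import base64
import os

def new_password():
    # 32 cryptographically secure random bytes, base64-encoded, mirroring
    # the New-Password function in the PowerShell script above.
    return base64.b64encode(os.urandom(32)).decode("ascii")
```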



Azure SQL authentication with a Managed Service Identity

Thu, 19 Oct 2017 00:00:00 +0000

In a previous article, I discussed how to use a certificate stored in Key Vault to authenticate to Azure Active Directory from a Web Application deployed in App Service, so that we could authenticate to an Azure SQL database. With the introduction of Managed Service Identity, this becomes even easier, as we can get rid of the complexity of deploying the Key Vault certificate. Let's see how we can use MSI to authenticate the application to a SQL database.

Enabling Managed Service Identity

The first step is creating the necessary Azure resources for this post. As usual, I'll use Azure Resource Manager (ARM) templates for this. I'll create a new SQL Server, SQL Database, and a new Web Application. The only difference here is that we'll ask Azure to create and assign a service principal to our Web Application resource:

{
    "name": "[parameters('webAppName')]",
    "type": "Microsoft.Web/sites",
    "location": "[resourceGroup().location]",
    "apiVersion": "2015-08-01",
    "dependsOn": [
        "[resourceId('Microsoft.Web/serverfarms', parameters('webAppPlanName'))]"
    ],
    "tags": {
        "[concat('hidden-related:', resourceId('Microsoft.Web/serverfarms', parameters('webAppPlanName')))]": "Resource",
        "displayName": "Web Application"
    },
    "identity": {
        "type": "SystemAssigned"
    },
    "properties": {
        "name": "[parameters('webAppName')]",
        "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('webAppPlanName'))]",
        "siteConfig": {
            "connectionStrings": [
                {
                    "name": "SqlDb",
                    "connectionString": "[concat('Data Source=tcp:', parameters('sqlServerName'), '.database.windows.net,1433; Initial Catalog=', parameters('sqlDbName'))]"
                }
            ]
        }
    },
    "resources": [
        {
            "name": "appsettings",
            "type": "config",
            "apiVersion": "2015-08-01",
            "dependsOn": [
                "[resourceId('Microsoft.Web/sites', parameters('webAppName'))]"
            ],
            "properties": {
                "AAD_TENANT_ID": "[subscription().tenantId]"
            }
        }
    ]
}

The key bit in the template above is this fragment:

"identity": {
    "type": "SystemAssigned"
},

Note: you can also enable MSI from the Azure Portal for an existing Web App. Once the web application resource has been created, we can query the identity information from the resource:

az resource show -n $webApp -g $resourceGroup --resource-type Microsoft.Web/sites --query identity

We should see something like this as output:

{
  "principalId": "f76495ad-d682-xxxx-xxxx-bc70710ebf0e",
  "tenantId": "8305b292-c023-xxxx-xxxx-a042eb5bceb5",
  "type": null
}

With the principalId, we can query AAD for the full details of the principal using az ad sp show --id $principalId, which should print something like this:

{
  "appId": "09b89d60-1c0f-xxxx-xxxx-e009833f478f",
  "displayName": "msitr2app",
  "objectId": "f76495ad-d682-xxxx-xxxx-bc70710ebf0e",
  "objectType": "ServicePrincipal",
  "servicePrincipalNames": [
    "09b89d60-1c0f-xxxx-xxxx-e009833f478f",
    "https://identity.azure.net/R1arAxq7+EKpM2wyumvvaZ0n+9ICN6YkZB/sse/1VtI="
  ]
}

Note: remember that to use AAD users in Azure SQL, the SQL Server should have an AAD administrator, which the template provided does.

Creating SQL Users

Azure SQL Database does not support creating logins or users from service principals created through Managed Service Identity. The only way to provide access to one is to add it to an AAD group, and then grant the group access to the database. We can use the Azure CLI to create the group and add our MSI to it:

az ad group create --display-name SQLUsers --mail-nickname 'NotSet'
az ad group member add -g SQLUsers --member-id f76495ad-d682-xxxx-xxxx-bc70710ebf0e

[...]



Creating an Event Hub destination using Event Grid in ARM

Wed, 18 Oct 2017 00:00:00 +0000

In a previous post, I presented a few ways to create Azure Event Grid resources using Azure Resource Manager (ARM) templates. Today, support for sending grid events to Azure Event Hubs rather than to an HTTP-based WebHook was announced. Obviously, I wanted to try this out as quickly as possible! Since using ARM to automate deployment is something I find very useful, I looked at how an Event Hubs destination would be created through this mechanism. Here's a sample ARM template that creates a new event subscription on an Azure resource group (to get resource events) and routes all events to an existing Azure Event Hub:

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "subscriptionName": { "type": "string", "minLength": 1 },
        "eventHubNamespace": { "type": "string", "minLength": 1 },
        "eventHubName": { "type": "string", "minLength": 1 },
        "eventHubResourceGroup": { "type": "string", "minLength": 1 }
    },
    "variables": {
        "eventHubId": "[resourceId(parameters('eventHubResourceGroup'), 'Microsoft.EventHub/namespaces/eventHubs', parameters('eventHubNamespace'), parameters('eventHubName'))]"
    },
    "resources": [
        {
            "apiVersion": "2017-09-15-preview",
            "name": "[parameters('subscriptionName')]",
            "type": "Microsoft.EventGrid/eventSubscriptions",
            "tags": { "displayName": "Hub Subscription" },
            "properties": {
                "destination": {
                    "endpointType": "EventHub",
                    "properties": {
                        "resourceId": "[variables('eventHubId')]"
                    }
                },
                "filter": {
                    "includedEventTypes": [ "All" ],
                    "subjectBeginsWith": "",
                    "subjectEndsWith": "",
                    "subjectIsCaseSensitive": false
                }
            }
        }
    ],
    "outputs": { }
}

This is almost the same as creating a WebHook-based subscription, with two minor differences: the endpointType property is set to EventHub rather than WebHook, and the destination's properties contain a resourceId property holding the id of the Event Hub you want to send events to, rather than the usual endpointUrl used for WebHooks.
This is a very useful addition to the Event Grid service, which enables some interesting scenarios such as:

Archiving of events: for example, if I had an Event Grid custom topic, I could add a new Event Hub destination and enable the Capture feature on the hub to easily archive every event coming into the topic at scale.
Delayed/batch processing of events.
Stream processing of events, by using Stream Analytics to process events through the hub.

[...]



Logic Apps KeyVault Connector - Part 1

Sun, 15 Oct 2017 00:00:00 +0000

Azure Logic Apps now supports writing custom connectors, which are just custom REST APIs for which you can customize the experience so that they feel like the built-in Logic Apps connectors. I wanted to try my hand at writing one, so I decided on a simple use case: a connector that provides a way to retrieve a secret stored in Azure Key Vault. This is part 1 in a series of articles on my experience writing it. Update 2017-10-17: a few details in the post have been updated due to me confusing summary and x-ms-summary.

Connector Implementation

In this post, I'd like to talk a bit about the connector implementation. I decided to implement this as an ASP.NET Core 2.0 WebAPI application that uses the Microsoft.Azure.KeyVault library to query the secret information. For my initial test, all I implemented was two simple operations: list the available secrets in the selected Key Vault store, and get the value of a specified secret (for the latest version, or for a specific one). For the most part, the implementation is fairly straightforward, but there are a couple of interesting things worth discussing.

OpenAPI Extensions

The connector uses the Swashbuckle library to generate the OpenAPI (Swagger) description for the API. A Logic Apps custom connector has an extended OpenAPI definition that includes things such as: operation descriptions, additional metadata for object members and operations (such as descriptions), and dynamic parameters and schemas. You can either write the connector OpenAPI description by hand and upload it, or start with a standard definition and then customize it in the Azure Portal. I quickly found out that every time you point Logic Apps to your updated OpenAPI definition, it overwrites all your customizations, so you'd need to do them all over again. This makes sense, since the idea is that you'd normally upload an OpenAPI definition that already contains the necessary extensions, but if not, it quickly gets really annoying.
What I did to simplify this process somewhat was write a couple of Swashbuckle extensions to produce at least a working definition right from the start. For every operation in your connector, you should at least have a description and an x-ms-summary field. You'll also want to correctly name each operation as you want it to appear in the Logic Apps designer by setting the operationId. I ended up writing my operations like this:

[HttpGet()]
[SwaggerOperation(OperationId = "ListSecrets")]
[Summary("Lists all secrets")]
[Description("Lists the secrets stored in Key Vault")]
public async Task<...> Get(String vaultName)

Notice how I use the DescriptionAttribute to specify the longer operation description, and the SummaryAttribute to add the value for the summary. Both are processed by custom IOperationFilter extensions for Swashbuckle. Here's the code for handling the summary:

[AttributeUsage(AttributeTargets.Method | AttributeTargets.Parameter)]
public class SummaryAttribute : Attribute
{
    public String Summary { get; private set; }

    public SummaryAttribute(String summary)
    {
        this.Summary = summary;
    }
}

public class SummaryFilter : IOperationFilter
{
    public void Apply(Operation operation, OperationFilterContext context)
    {
        var apiDesc = context.ApiDescription;
        var summary = apiDesc.ActionAttributes()
            .OfType<SummaryAttribute>()
            .FirstOrDefault();
        if ( summary != null )
        {
            operation.Summary = summary.Summary;
        }
    }
}

The code for handling the description is very similar. All that remained was registering these extensions during startup, like this:

services.AddSwaggerGen(c =>
{
    c.SwaggerDoc("v1", new Info { Title = "KeyVaultController", Version = "v1" });
    c.OperationFilter



Azure Managed Service Identity - Querying in ARM Template

Wed, 04 Oct 2017 00:00:00 +0000

In a previous post, I was lamenting not having a way to obtain the managed service identity generated for an Azure resource, such as an Azure SQL logical server or a Web App, from the Azure Resource Manager (ARM) template itself.

The issue was that the reference() function in an ARM template only returns the properties part of the resource definition, and the identity property is defined outside of that (at the same level as the resource id or the location).

It is now possible to do this thanks to a new parameter introduced in the reference() function in ARM. The new definition is:

reference(resourceName | resourceIdentifier, [apiVersion], ['Full'])

Notice the new, optional 'Full' parameter. When this is specified, the reference() function returns the complete resource definition, and not just the properties section. So we can obtain the generated identity easily using something like this:

{
    "outputs": {
        "sqlIdentity": {
            "type":"string",
            "value": "[reference(concat('Microsoft.Sql/servers/', parameters('sqlServerName')), '2015-05-01-preview', 'Full').identity.principalId]"
        }
    }
}