
Winterdom



by dæmons be driven - a site by Tomas Restrepo



 



Azure API Management - Getting Query String Values in set-body

Thu, 14 Dec 2017 00:00:00 +0000

Ran into a question recently that was a bit tricky to solve with Azure API Management: how do you get a value passed in the URL query string to your API operation from a policy in a set-body statement?

For example, let’s assume that the query string value we want is called userId. If you’re using a Liquid template, it would look something like this:

 template="liquid">
{
    "userId": "{{context.Request.OriginalUrl.Query.userId}}"
}

Notice how we use OriginalUrl rather than Url. The former is the URL the consumer used to call into API Management, while the latter is the URL of the backend service.

If you’re not using a Liquid template, then you just need to make sure that your set-body expression explicitly returns a string object:

 template="none">
    @("The user id is: " + context.Request.OriginalUrl.Query.GetValueOrDefault("userId"))

The set-body reference documentation provides some useful information on figuring stuff like this out, particularly when combined with the policy expressions documentation.




Azure API Management - SOAP-to-REST date/time handling

Sat, 25 Nov 2017 00:00:00 +0000

I’ve been spending some time recently helping customers get started with Azure API Management, and ran into a small issue with the SOAP-to-REST feature that might trip others up.

The issue in question came up because the request message on the SOAP service had a field of type xsd:dateTime. When the Import API wizard imports the API WSDL, it does not appear to keep much track of the data types used, so it generates a simple Liquid template like this:

{{body.request.sentOn}}

We noticed that this was not working as expected: the service was returning a SAX error because it couldn’t parse the request. Using the API Inspector, we quickly realized the issue was the date/time format. The XML being sent to the service looked something like this:

9/2/2010 9:23:00 AM

This was clearly not in ISO 8601 format; the value was being formatted in the U.S. English locale, which is not a valid xsd:dateTime representation. To work around this, we edited the default policy generated by the Import API wizard to use this:

{{body.request.sentOn | Date: 'o'}}

This forces the Liquid template to format the date/time value in the round-trip format, which is ISO 8601 compliant. Unfortunately, this won’t quite solve the issue in every case. I’ve noticed that in some scenarios (depending on what the input format looks like), API Management won’t maintain the time zone information in the original value when using the o format (and sometimes seems to round the time in unexpected ways). I presume this might be because API Management is using DateTime rather than DateTimeOffset underneath, but I have no way to know for sure.

In the specific case I was looking at with the customer, we ended up using a custom format with no time zone information, since both the consumer and the service were in the same time zone and there is no daylight saving time to deal with. If you’re in a different scenario, then ensuring all date/time info is in UTC might be worth considering.
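For illustration, a custom format with no time zone component could look like the following (the exact .NET format string here is an assumption; use whatever your service contract expects):

{{body.request.sentOn | Date: 'yyyy-MM-ddTHH:mm:ss'}}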




Using Azure AD B2C with API Management

Fri, 17 Nov 2017 00:00:00 +0000

In a previous post, I discussed how to set up OAuth2 authorization in API Management using Azure Active Directory. This time I’d like to show something very similar, but using Azure AD B2C instead. Once again, I’ll assume you already have an API implemented and configured in API Management. I’ll use the same PQR service I used last time as an example.

Step 1: Creating the B2C Sign-in Policy

Before we can integrate with Azure AD B2C, we need to create a new sign-in policy that we can use to obtain a token later on. Using the Azure Portal AAD B2C module, I’ll create a new Sign-in policy named b2c-apim-pqr supporting local accounts, as well as Facebook. I also enable basic application claims to include in the token, such as first/last name and email addresses.

Step 2: Creating the applications

Like in the previous post, we need to create two applications of type Web App / Web Api: one for our PQR API, and another for the API Management portal. I’ll create the PQR API app first, and then the portal application. As usual, we’ll update the Reply URLs of the portal application once we create the authorization server in API Management. Next, I need to grant the portal application permissions to the API one. I’ll also generate a new key for the portal application, and make note of both the Application Id and the new key.

Step 3: Creating the OAuth2 Authorization Server

Back in API Management, it’s time to create our OAuth2 authorization server. This will be pretty much the same as last time, with a few minor changes. I’ll call this one aad-b2c-oauth2-pqr. I’ll configure it to support both the Authorization code and Implicit grant types, and will point the authorization/token endpoint URLs to our new B2C sign-in policy. Notice that both the authorization and token endpoint URLs use the same format as the normal Azure AD OAuth2 flow, but with the sign-in policy name in the p query string parameter (see the example URLs at the end of this post). I’ll also add the resource parameter to point to the apim-pqr application we created in step 2. Finally, I need to configure the client id and secret based on the application id and key of the apim-portal application created in step 2. With B2C, we also need to provide a default scope, otherwise obtaining the token will fail. Here we use the scope to ask for user_impersonation (delegation) to the apim-pqr application. By this point, I’ll also have the redirect_uri for our API Management OAuth2 service, so I’ll copy this value and add it as a valid Reply URL in the apim-portal application.

Step 4: Configure the API

Now I’ll set up my PQR API in API Management to require authorization using the new OAuth2 configuration. Again, I’ll add a policy to the API that validates the token, using the Application Id of apim-pqr (f498336e-d99f-xxxx-xxxx-22e3f7d87e56) as the audience. The only interesting bits here are that the openid-config element should point to the OpenId metadata endpoint for our Sign-in policy, and that the audience of the generated token will use the Application Id rather than the App ID URI. Now, I should be able to obtain a token from the Developer Portal for my API and test the API: [...]
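To make the endpoint formats concrete, here is a sketch of the three URLs involved for the b2c-apim-pqr policy (mytenant is a placeholder, and the metadata URL format is an assumption based on the B2C conventions of the time):

https://login.microsoftonline.com/mytenant.onmicrosoft.com/oauth2/authorize?p=b2c-apim-pqr
https://login.microsoftonline.com/mytenant.onmicrosoft.com/oauth2/token?p=b2c-apim-pqr
https://login.microsoftonline.com/mytenant.onmicrosoft.com/v2.0/.well-known/openid-configuration?p=b2c-apim-pqr

The first two go into the authorization server configuration in step 3; the third is the sign-in policy metadata endpoint that the openid-config element in step 4 should point to.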



Protecting APIs with OpenId Connect in API Management

Sat, 11 Nov 2017 00:00:00 +0000

In my last post, I outlined a customer scenario for protecting an API through OAuth2 in Azure API Management. I mentioned in it that I had been unsuccessful at using OpenId Connect, rather than raw OAuth2. After some more testing, and some help, I was able to get this working, and wanted to share how I did it. Once again, I’ll assume you already have an API implemented and configured in API Management. I’ll use the same PQR service I used last time as an example.

Step 1: Creating the Azure AD Application

The first step is to create the Azure AD application. In this case, we will not be creating 2 separate applications like last time; we only need one. In the Azure Portal, I’ll go over to my Azure AD instance and add a new application registration. I’ll call this one aad-oidc-pqr. After creating the application, there are a few things we need to change:

- We need to mark the application as multi-tenant. Otherwise, the developer portal in API Management will not work correctly, and you will get an error similar to AADSTS70001: Application 'xxxxx-xxxxx-xxxxx-xxxxx' is not supported for this API version.
- (Optional) Enable the OAuth2 implicit flow.

We can use the application properties window to mark the application as multi-tenant. To enable the OAuth2 implicit flow, we need to edit the application manifest (see the manifest excerpt at the end of this post). Now, copy the application ID, and generate a new key for the application. Make a note of both, as we will need them in a moment. We’ll come back to the application later to configure the reply URLs.

Note: another alternative is creating the Azure AD app as a converged application, but I was only able to make that work with the implicit grant flow.

Step 2: Configure OpenId Connect Authorization

Back in API Management, we can configure a new OpenId Connect Authorization service. Using the Azure Portal, we will find this under the OpenId Connect option; in the Publisher Portal, it is under Security -> OpenId Connect. For this part, we’ll need:

- The OpenId Connect metadata for our Azure AD tenant, which will be in the form https://login.microsoftonline.com/{tenant}/.well-known/openid-configuration.
- The id of the Azure AD application we created in step 1.
- The matching key for the application.

Once we’ve created the OpenId Connect Authorization service in API Management, we need to go back to the Azure AD application, and add both the authorization code grant and implicit grant redirect URIs to the Reply URLs collection of our application.

Step 3: Configure API

Just like in my previous post, I need to configure my PQR API to require OpenId Connect authorization. And again, I want to set up a policy on my API to validate the JWT token, using the Application Id (24f98265-c230-4668-a40b-11aa1b02c29c) as the audience. This is exactly the same as last time, except that when using OpenId Connect, the audience in the token will contain the Application Id, rather than the App ID URI of the Azure AD application.

Step 4: Test!

At this point, we should be able to use the API Management Developer portal to test that OpenId Connect works with our API: [...]
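For reference, the two changes from step 1 look roughly like this in the application manifest (an excerpt only; the property names come from the classic Azure AD application manifest):

{
    "availableToOtherTenants": true,
    "oauth2AllowImplicitFlow": true
}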



Protecting APIs with OAuth2 in API Management

Thu, 09 Nov 2017 00:00:00 +0000

I’ve been playing a lot lately with Azure API Management. Recently, a customer asked me about the following scenario:

- They wanted to expose a Web API through API Management.
- API Management should enforce and validate that an OAuth2 token was provided by the caller.
- The underlying API did not know (or care) about the OAuth2 token.

There is an article in the API Management documentation about this very topic, but it assumes that the Web API itself is set up to accept OAuth2 tokens, which is a somewhat more complex scenario. While good, I found the article a bit confusing to follow, so I thought I’d document here the steps I followed to test the customer scenario. I’ll assume we already have an API implemented and published in API Management, and that we want to use Azure Active Directory as the OAuth2 provider. For this article, I’ll use an API I called PQR in API Management.

Step 1: Register the Azure AD applications

The first step is to register a new Azure AD application to represent our API; I’ll call this one apim-pqr. Next, create a second application, which we’ll call apim-portal. This one will be used to represent the API Management Developer Portal, so that we can test our APIs from it. Once the portal application is created, we need to generate a new key for it. Make a note of both the application id and the new key, as we’ll need them in a moment. Also, grant apim-portal permissions to call the apim-pqr application. You could also create the applications easily using the az ad sp create command in the Azure CLI.

Step 2: Adding the OAuth2 authorization server

Now we can configure a new OAuth2 authorization server in our API Management instance. We’ll do this in the new experience in the Azure Portal. First, go into the OAuth 2.0 section in the portal, and click the + Add button. I will call this instance aad-oauth2-pqr. We don’t need the client registration URL for now, and besides, there doesn’t seem to be a pretty way to configure it for AAD. I’ll leave only the Authorization code grant type enabled for now, and I’ll configure the authorization endpoint URL for my AAD tenant; the authorization URL will be in the form https://login.microsoftonline.com/{tenantid}/oauth2/authorize.

Then we can configure the Token endpoint URL, as well as the value of the resource parameter. The latter should be set to the value of the App ID URI field of the apim-pqr application we created in the first step; the token URL will be in the form https://login.microsoftonline.com/{tenantid}/oauth2/token. We’ll leave the default scope empty for now, and configure the Client ID/Secret that we copied after creating the apim-portal application in step 1. After this, just click the Create button to save the changes. Note that you can also do this using the Publisher Portal, with almost the exact same user experience.

Before moving on to the next step, there is something missing: there is a field with the redirect_uri to be used with API Management. We need to copy this URL and add it to the Reply URLs of our apim-portal application in Azure Active Directory; otherwise, authentication will fail later on.

Step 3: Configure the API to use OAuth2 authorization

The next step is to configure our PQR API so that API Management knows that invoking it requires an OAuth2 token. On the Publisher Portal, we can modify this from the Security tab of the API properties. On the Azure Portal, we’d configure this from the API settings under the Security headline. Notice that doing this doesn’t actually cause API Management to enforce that an OAuth2 token is provided at all. Instead, it adjusts the Developer Portal experience so that you can acquire the OAuth2 token from the authorization server when trying out one of the API’s operations.

Step 4: Validating the OAuth2 token

At this point, yo[...]
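A minimal sketch of the kind of validate-jwt policy this last step sets up might look like this (the tenant id and App ID URI are placeholders; the audience must match the resource value configured in step 2):

<validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized">
    <openid-config url="https://login.microsoftonline.com/{tenantid}/.well-known/openid-configuration" />
    <audiences>
        <audience>https://mytenant.onmicrosoft.com/apim-pqr</audience>
    </audiences>
</validate-jwt>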



Decoding Application Gateway Certificates

Thu, 02 Nov 2017 00:00:00 +0000

Recently, I wanted to write a PowerShell script that would check expiration on the certificates assigned for SSL/TLS on Azure Application Gateway resources. Obtaining the certificates is easy through the SslCertificates property of the Application Gateway instance. However, it took me a while to figure out how to actually extract the base64-encoded data into an X509Certificate2 instance. Turns out that the certificate is returned in PKCS7 format (also known as P7B), so you need to use the SignedCms class to decode it. Some sample code:

function Test-CertExpiresSoon($cert) {
    $span = [TimeSpan]::FromDays(30)
    $today = [DateTime]::Today
    return ($cert.NotAfter - $today) -lt $span
}

function Decode-Certificate($certBytes) {
    $p7b = New-Object System.Security.Cryptography.Pkcs.SignedCms
    $p7b.Decode($certBytes)
    return $p7b.Certificates[0]
}

$gateways = Get-AzureRmApplicationGateway
foreach ($gw in $gateways) {
    foreach ($cert in $gw.SslCertificates) {
        $certBytes = [Convert]::FromBase64String($cert.PublicCertData)
        $x509 = Decode-Certificate $certBytes
        if (Test-CertExpiresSoon $x509) {
            [PSCustomObject] @{
                ResourceGroup = $gw.ResourceGroupName;
                AppGateway = $gw.Name;
                CertSubject = $x509.Subject;
                CertThumbprint = $x509.Thumbprint;
                CertExpiration = $x509.NotAfter;
            }
        }
    }
}

A PKCS7 envelope can contain multiple certificates, so I might have to revisit this later on in case that is relevant, but it was not an issue in the original scenario. [...]
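If the envelope does turn out to carry more than one certificate, a small variant of the decode function could return the whole collection instead. A sketch (SignedCms.Certificates is an X509Certificate2Collection):

function Decode-Certificates($certBytes) {
    $p7b = New-Object System.Security.Cryptography.Pkcs.SignedCms
    $p7b.Decode($certBytes)
    # Return every certificate in the PKCS7 envelope, not just the first
    return $p7b.Certificates
}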



Logic Apps KeyVault Connector - Part 3

Thu, 26 Oct 2017 00:00:00 +0000

This is part 3 of my series of articles on implementing a custom Logic Apps connector. Read part 1 and part 2. By this point, I’ve implemented and deployed the WebApi application that implements the custom connector, and configured the necessary applications and permissions in Azure Active Directory. It’s time to create the custom connector itself. A Logic Apps custom connector is an Azure resource of type Microsoft.Web/customApis. I’m unsure how much of the creation of such a resource can be automated through ARM or other tools, so I’ll create it manually for now.

Step 1: Create the resource

The first step is to create the new resource of type Logic Apps Connector, then provide the required information. A couple of details are important here:

- The name of the custom connector resource is important. By default, this is the name you will see in the Logic Apps designer when you try to use it. I have not found a way, so far, of customizing it after creation.
- The Azure region you choose is important. You will only be able to use the connector from Logic Apps created in the same region.

Step 2: Configuring the connector

Once the custom connector is created, open the resource and click the Edit button at the top. That will open the editor where we can define the connector information, security configuration, and the connector operations. The easiest way to start is by uploading an existing OpenApi (Swagger) definition for your connector. As I mentioned in part 1 of the series, I already customized the Swagger generation in the connector code so that it can be used quickly, so I’ll start by pointing the connector to the Swagger definition. This will initialize the connector operations and other information. Then, we can add an icon, background color, description, and the base URL where the connector WebApi is located.

After clicking the Continue button, we’ll move to the security section. Here, Logic Apps recognizes that our connector needs OAuth2 authentication. We need to click the Edit button at the bottom and customize it by adding:

- Client Id: the Application Id of the KeyVault Connector Client application we created in Azure AD.
- Client secret: the key we generated for the application.
- Scope: the App ID URI of our KeyVault Connector application. The PowerShell script I presented in part 2 will register this as https://[tenant-domain]/[appname].

Notice the Redirect URL field here. This will be empty until we update the connector; we’ll discuss it in a moment. Again, we click the Continue button to move to the final tab (Definition). Here, you can customize the definition of the operations that are implemented by the connector. We don’t need to customize anything for now, as our original Swagger definition already works as is. To complete creation of the connector, press the Update Connector button at the top!

Step 3: Update the redirect URI

Now that the connector is fully defined, go back to the Security tab and find the value of the Redirect URL field. Since I created the connector in the East US region, it ends up being https://logic-apis-eastus.consent.azure-apim.net/redirect. We will copy this value and use it to update the Reply URL for the KeyVault Connector Client application in Azure AD (see the CLI sketch below). Notice that we update the URL on the client application used by Logic Apps to initiate the authentication flow, not on the connector application, as the latter should always point to the deployment URL of the WebApi. This step is very important, because otherwise authentication will not succeed for our application.

Step 4: Test it

Now we’re ready to test our connector! Let’s create a new Logic App and use it. Let’s select the ‘Get the value of a secret’ operation, and we’ll get the big button. This will open the popup window to login to an Az[...]
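If you would rather script the reply URL update from step 3 than click through the portal, something like this should work with the Azure CLI (a sketch; substitute the Application Id of the KeyVault Connector Client application):

az ad app update --id <client-application-id> --reply-urls "https://logic-apis-eastus.consent.azure-apim.net/redirect"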



Logic Apps KeyVault Connector - Part 2

Tue, 24 Oct 2017 00:00:00 +0000

In part 1 of this series of posts, I introduced the idea of implementing a custom Logic Apps connector as a way to get familiar with the challenges involved. The initial part revolved around the ASP.NET Core WebApi implementation. In this second part, I’d like to discuss a bit about setting up the Azure AD credentials.

The first step is to create the two Azure Active Directory applications we need. For both applications, we want to create an application key (secret) for authentication. The first application is for our WebApi application, so it should be set up with the URL of the WebApi as the reply URL, and I also wanted to mark it as multi-tenant. The second application is what Logic Apps will use to authenticate the user. It should also be a Web / WebApi application. The docs say that the reply URL for this one should be set to https://msmanaged-na.consent.azure-apim.net/redirect; however, this is not correct. We won’t know the right reply URL until we actually configure the custom connector resource later on. For now, we’ll leave it at this default value and update it later.

To simplify creation of these two applications, I built a simple PowerShell script:

[CmdletBinding()]
param(
    [Parameter(Mandatory=$true)]
    [String]$applicationName,
    [Parameter(Mandatory=$true)]
    [String]$apiAppName,
    [Parameter(Mandatory=$true)]
    [String]$tenantDomain
)

Import-Module AzureRm

function New-Password {
    $rng = New-Object System.Security.Cryptography.RNGCryptoServiceProvider
    [byte[]]$buffer = New-Object byte[] 32
    $rng.GetBytes($buffer, 0, $buffer.Length)
    return [Convert]::ToBase64String($buffer)
}

# Create the Web Api application
$baseUrl = "https://${apiAppName}.azurewebsites.net/"
$appId = "https://${tenantDomain}/$applicationName"
$app = New-AzureRmADApplication -DisplayName "$applicationName Connector" `
    -HomePage $baseUrl `
    -IdentifierUris $appId `
    -ReplyUrls $baseUrl `
    -AvailableToOtherTenants $true
$applicationKey = New-Password
$credential = New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId -Password $applicationKey

[PSCustomObject]@{
    Application = "WebApi Connector";
    ApplicationId = $app.ApplicationId;
    ApplicationKey = $applicationKey
}

# Create the client application that will be used by LogicApps
$clientAppId = "https://${tenantDomain}/${applicationName}-connector"
$clientKey = New-Password
$clientApp = New-AzureRmADApplication -DisplayName "$applicationName Connector Client" `
    -HomePage "https://login.windows.net" `
    -IdentifierUris $clientAppId `
    -ReplyUrls "https://msmanaged-na.consent.azure-apim.net/redirect"
$clientCredential = New-AzureRmADAppCredential -ObjectId $clientApp.ObjectId -Password $clientKey

[PSCustomObject]@{
    Application = "Client";
    ApplicationId = $clientApp.ApplicationId;
    ApplicationKey = $clientKey
}

The script takes the following arguments:

- applicationName: the base name you want to use for both applications. If I pass KeyVault here, I’ll end up with two applications called KeyVault Connector (for the WebApi) and KeyVault Connector Client (for the Logic Apps client).
- apiAppName: the name of the API App in App Service that hosts the connector. I use this to configure the reply URL of the WebApi application.
- tenantDomain: the name of your Azure AD tenant, such as mydomain.onmicrosoft.com.

Output of the script would be something like this:

Application      ApplicationId          ApplicationKey
-----------      -------------          --------------
WebApi Connector 06086c51-09a1-48b3-...
Client           681d64cf-0d99-4f5c-...

Now we have all the data we need to confi[...]
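Invoking the script for the KeyVault connector described in this series would look something like this (the script file name is hypothetical):

.\New-ConnectorApps.ps1 -applicationName KeyVault -apiAppName kvconnectorapi -tenantDomain mydomain.onmicrosoft.com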



Azure SQL authentication with a Managed Service Identity

Thu, 19 Oct 2017 00:00:00 +0000

In a previous article, I discussed how to use a certificate stored in Key Vault to provide authentication to Azure Active Directory from a Web Application deployed in App Service, so that we could authenticate to an Azure SQL database. With the introduction of Managed Service Identity, this becomes even easier, as we can just get rid of the complexity of deploying the Key Vault certificate. Let’s see how we could use MSI to authenticate the application to a SQL Database.

Enabling Managed Service Identity

The first step is creating the necessary Azure resources for this post. As usual, I’ll use Azure Resource Manager (ARM) templates for this. I’ll create a new SQL Server, SQL Database, and a new Web Application. The only difference here is we’ll ask Azure to create and assign a service principal to our Web Application resource:

{
    "name": "[parameters('webAppName')]",
    "type": "Microsoft.Web/sites",
    "location": "[resourceGroup().location]",
    "apiVersion": "2015-08-01",
    "dependsOn": [
        "[resourceId('Microsoft.Web/serverfarms', parameters('webAppPlanName'))]"
    ],
    "tags": {
        "[concat('hidden-related:', resourceId('Microsoft.Web/serverfarms', parameters('webAppPlanName')))]": "Resource",
        "displayName": "Web Application"
    },
    "identity": {
        "type": "SystemAssigned"
    },
    "properties": {
        "name": "[parameters('webAppName')]",
        "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('webAppPlanName'))]",
        "siteConfig": {
            "connectionStrings": [
                {
                    "name": "SqlDb",
                    "connectionString": "[concat('Data Source=tcp:', parameters('sqlServerName'), '.database.windows.net,1433; Initial Catalog=', parameters('sqlDbName'))]"
                }
            ]
        }
    },
    "resources": [
        {
            "name": "appsettings",
            "type": "config",
            "apiVersion": "2015-08-01",
            "dependsOn": [
                "[resourceId('Microsoft.Web/sites', parameters('webAppName'))]"
            ],
            "properties": {
                "AAD_TENANT_ID": "[subscription().tenantId]"
            }
        }
    ]
}

The key bit in the template above is this fragment:

"identity": {
    "type": "SystemAssigned"
},

Note: you can also enable MSI from the Azure Portal for an existing Web App.

Once the web application resource has been created, we can query the identity information from the resource:

az resource show -n $webApp -g $resourceGroup --resource-type Microsoft.Web/sites --query identity

We should see something like this as output:

{
  "principalId": "f76495ad-d682-xxxx-xxxx-bc70710ebf0e",
  "tenantId": "8305b292-c023-xxxx-xxxx-a042eb5bceb5",
  "type": null
}

With the principalId, we can query AAD to get the full details of the principal using az ad sp show --id $principalId, which should print something like this:

{
  "appId": "09b89d60-1c0f-xxxx-xxxx-e009833f478f",
  "displayName": "msitr2app",
  "objectId": "f76495ad-d682-xxxx-xxxx-bc70710ebf0e",
  "objectType": "ServicePrincipal",
  "servicePrincipalNames": [
    "09b89d60-1c0f-xxxx-xxxx-e009833f478f",
    "https://identity.azure.net/R1arAxq7+EKpM2wyumvvaZ0n+9ICN6YkZB/sse/1VtI="
  ]
}

Note: remember that to use AAD users in Azure SQL, the SQL Server should have an AAD administrator, which the template provided does.

Creating SQL Users

Azure SQL Database does not support creating logins or users from service principals created through Managed Service Identity. The only way to provide access to one is to add it to an AAD group, and then grant the group access to the database (see the T-SQL sketch below). We can use the Azure CLI to create the group and add our MSI to it:

az ad group create --display-name SQLUsers --mail-nickname 'NotSet'
az ad group member add -g SQL[...]
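Once the group exists and the MSI principal is a member, the usual pattern for granting the group database access is a contained user created from the external provider. A sketch (run against the target database as the AAD administrator; the role grant is just an example):

CREATE USER [SQLUsers] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [SQLUsers];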



Creating an Event Hub destination using Event Grid in ARM

Wed, 18 Oct 2017 00:00:00 +0000

In a previous post, I presented a few ways to create Azure Event Grid resources using Azure Resource Manager (ARM) templates. Today, support for sending grid events to Azure Event Hubs, rather than to an HTTP-based WebHook, was announced. Obviously, I wanted to try this out as quickly as possible! Since using ARM to automate deployment is something I find very useful, I looked at how an Event Hubs destination would be created through this mechanism.

Here’s a sample ARM template that creates a new event subscription on an Azure Resource Group (to get resource events) and routes all events to an existing Azure Event Hub (see the deployment sketch at the end of this post):

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "subscriptionName": { "type": "string", "minLength": 1 },
        "eventHubNamespace": { "type": "string", "minLength": 1 },
        "eventHubName": { "type": "string", "minLength": 1 },
        "eventHubResourceGroup": { "type": "string", "minLength": 1 }
    },
    "variables": {
        "eventHubId": "[resourceId(parameters('eventHubResourceGroup'), 'Microsoft.EventHub/namespaces/eventHubs', parameters('eventHubNamespace'), parameters('eventHubName'))]"
    },
    "resources": [
        {
            "apiVersion": "2017-09-15-preview",
            "name": "[parameters('subscriptionName')]",
            "type": "Microsoft.EventGrid/eventSubscriptions",
            "tags": { "displayName": "Hub Subscription" },
            "properties": {
                "destination": {
                    "endpointType": "EventHub",
                    "properties": {
                        "resourceId": "[variables('eventHubId')]"
                    }
                },
                "filter": {
                    "includedEventTypes": [ "All" ],
                    "subjectBeginsWith": "",
                    "subjectEndsWith": "",
                    "subjectIsCaseSensitive": false
                }
            }
        }
    ],
    "outputs": { }
}

This is almost the same as creating a WebHook-based subscription, with two minor differences:

- The endpointType property is set to EventHub rather than Webhook.
- The properties object of the destination contains a resourceId property with the id of the Event Hub you want to send events to, rather than the usual endpointUrl used for WebHooks.

This is a very useful addition to the Event Grid service, which enables some interesting scenarios, such as:

- Archiving of events: for example, if I had an Event Grid custom topic, I could add a new Event Hub destination and enable the Capture feature on the hub to easily archive every event coming into the topic at scale.
- Delayed/batch processing of events.
- Stream processing of events, by using Stream Analytics to process events through the hub.

[...]
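Deploying this template is an ordinary resource-group-scoped deployment. With the Azure CLI of the time, it might look something like this (the file names and resource group are placeholders, and params.json would supply the four template parameters):

az group deployment create -g my-rg --template-file eventgrid-to-eventhub.json --parameters @params.json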