
Winterdom



by dæmons be driven - a site by Tomas Restrepo



 



API Management Groups

Wed, 07 Feb 2018 00:00:00 +0000

Azure API Management supports the concept of User Groups to manage the visibility of Products to users. They are somewhat interesting, in that it is not obvious how to leverage them right away. As the documentation states, there are three built-in, system groups:

  • Administrators
  • Developers
  • Guests (anonymous, unauthenticated users)

These groups are immutable, meaning you cannot add or remove members. This sounds a bit strange at first, but it is not a big deal:

  • The Administrators group is a bit of a left-over from the Publisher Portal. Since that is soon going away, it is not that big of a deal.
  • The Developers group contains every user that has signed up for the Developer Portal, so all external users are added to it automatically.
  • The Guests group is special, as it really doesn’t have any members: it represents non-users, so it fills a special need.

Custom Groups, however, are quite interesting if you take a bit of time to explore how they work. To do that, let’s consider the following scenario: you have a single API Management instance on which you expose both internal and external APIs. Internal APIs are only for consumption within your organization, while external APIs will be consumed by third parties.

The right way to do this is to ensure that internal and external APIs are assigned to different Products. So let’s say we have the following:

  • Product Order Fulfillment, for internal use only, has the following APIs:
  • Product Customer Service, exposed to third-parties, has the following APIs:

Normally, when we add a new Product, users in the Administrators group get access to it automatically. The usual procedure would be to grant the Developers group access to each product as well, but that would mean everyone would be able to see both products.
Instead, what we want is to create two separate groups representing internal and external users, and grant each the right access to the right product:

  • Group Organizational Users only has access to product Order Fulfillment
  • Group Third Parties only has access to product Customer Service

So now, only internal users will be able to subscribe to the Order Fulfillment product, while external users will only be able to subscribe to the Customer Service product.

The interesting part comes from the fact that permissions on APIs are transitive from Products. This means that when you log into the Developer Portal, you will only be able to see APIs that belong to Products you have permissions for, and not the rest! So in this example, if I log into the portal with a user who is only a member of the Organizational Users group, I only see APIs in the Order Fulfillment product. If I sign in to the portal with a user who is a member of the Third Parties group, then I only see APIs in the Customer Service product.

Something to keep in mind

There are a few things worth keeping in mind about API Management Groups:

  • Any APIs associated to products to which the Developers or Guests groups have access will be pretty much visible to everyone.
  • If you decide to change your Product / Group structure later on, you will find that if a user has a subscription to a Product, she will be able to see all APIs included in the product, even if the groups the user is a member of do not grant permissions over said product. This will be the case until all said subscriptions are removed. [...]
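The group and product assignments described in this scenario can also be scripted. Here is a minimal sketch using the AzureRM cmdlets of the era; the group name comes from the scenario above, while the product ID 'order-fulfillment' and the $resourceGroup/$serviceName variables are assumptions for illustration:

```powershell
# Sketch: create a custom group and grant it access to a product.
# Assumes the APIM instance is identified by $resourceGroup/$serviceName,
# and that a product with ID 'order-fulfillment' already exists.
$apim = New-AzureRmApiManagementContext -ResourceGroupName $resourceGroup `
                                        -ServiceName $serviceName

$internal = New-AzureRmApiManagementGroup -Context $apim `
    -Name 'Organizational Users' `
    -Description 'Internal users with access to internal products'

Add-AzureRmApiManagementProductToGroup -Context $apim `
    -GroupId $internal.GroupId `
    -ProductId 'order-fulfillment'
```

The same pair of calls, repeated with a Third Parties group and the external product, covers the rest of the scenario.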



Azure API Management - Changing the Subscription Key header or query string names

Tue, 23 Jan 2018 00:00:00 +0000

By default, there are two ways a consumer can specify the Subscription Key on a call to API Management:

  • Using the Ocp-Apim-Subscription-Key HTTP header
  • Using the subscription-key query string value in the URL

These are just the default names for both. While they can be customized, it can be non-obvious how to do this, since it is not exposed directly in the user interface of the Publisher or Azure Portals. Both of these options are part of the definition of the ARM resource for the API (yes, APIs, Products, and other API Management entities are essentially nested ARM resources within the API Management instance). There are a few easy ways you can customize this:

Using PowerShell

If you’re comfortable using the PowerShell Azure cmdlets, then it’s possible to change these properties using the Set-AzureRmApiManagementApi cmdlet. For example, let’s change the header name to ApiKey and the query string value name to api-key:

$apim = New-AzureRmApiManagementContext -ResourceGroupName $resourceGroup -ServiceName $resourceName
$echoApi = Get-AzureRmApiManagementApi -Context $apim -ApiId 'echo-api'
Set-AzureRmApiManagementApi -Context $apim `
    -ApiId $echoApi.ApiId `
    -Name $echoApi.Name `
    -ServiceUrl $echoApi.ServiceUrl `
    -Protocols $echoApi.Protocols `
    -SubscriptionKeyHeaderName 'ApiKey' `
    -SubscriptionKeyQueryParamName 'api-key'

This is a bit more annoying than it should be, because Set-AzureRmApiManagementApi has the ApiId, Name, ServiceUrl, and Protocols parameters marked as mandatory, so we need to provide a value even if we’re not trying to change any of them. The simplest way around this is to first query the current values and pass them back when modifying the API definition.

Using the REST API

Another option is to use any tool that can call REST APIs, provided you have the management REST API enabled in API Management. To do this, first get an access token for the service using the Azure Portal or the publisher portal.
Here, I’ll use Postman to change the properties:

Using the Azure Portal?

At this point, there does not appear to be a way to change these options from the Azure Portal. The new experience for editing APIs is quite nice, and you can check the current values using the following process:

  • Select the API you want to modify
  • Make sure the Design tab is selected
  • Click on the pencil icon next to the Frontend section to edit the raw API definition

You will see a YAML definition for the API, including a securityDefinitions section where the names of the header and query string value are defined. While you can modify these settings here and then save the API definition, the changes appear to be ignored, so it’s not currently a viable alternative. [...]
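The Postman request mentioned above would look roughly like the following sketch in PowerShell terms. The subscriptionKeyParameterNames property name comes from the ARM contract for the API resource; the service name, SAS token, and api-version value are assumptions for illustration:

```powershell
# Sketch: PATCH the API definition through the direct management REST API.
# $serviceName and $sasToken (generated in the portal) are placeholders.
$url = "https://$serviceName.management.azure-api.net/apis/echo-api?api-version=2017-03-01"
$body = @{
    subscriptionKeyParameterNames = @{
        header = 'ApiKey'
        query  = 'api-key'
    }
} | ConvertTo-Json

Invoke-RestMethod -Method Patch -Uri $url `
    -Headers @{ Authorization = "SharedAccessSignature $sasToken"; 'If-Match' = '*' } `
    -ContentType 'application/json' `
    -Body $body
```

The If-Match header is needed because the management API requires an ETag (or wildcard) on modifications.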



ARM Extensions for Visual Studio Code

Sat, 13 Jan 2018 00:00:00 +0000

I’ve mentioned before that Visual Studio Code has been my tool of choice lately for writing Azure Resource Manager (ARM) templates.

I’d like to mention some reasons I’ve found this a great combination:

  • VSCode is far more lightweight than the full Visual Studio.
  • I’ve always found the deployment experience for ARM templates in Visual Studio to be a bit clunky.
  • Deploying ARM templates from the command line using the Azure CLI provides a great experience, and it is now my option of choice when testing.
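The CLI flow referenced above goes something along these lines (a sketch: resource group name, location, and file names are placeholders; az group deployment create was the command name at the time):

```shell
# Sketch: deploy an ARM template from the Azure CLI.
az group create --name my-test-rg --location eastus2
az group deployment create \
    --resource-group my-test-rg \
    --template-file azuredeploy.json \
    --parameters @azuredeploy.parameters.json
```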

There are some specific extensions that I’ve found useful for editing ARM templates:

  • Of course, the Azure Resource Manager Tools is mandatory, as it provides all the base editing experience.
  • The Azure Resource Manager Snippets extension by Sam Cogan is pretty good, and provides tons of useful snippets for a lot of common resource types.
  • The Azure ARM Template Helper extension by Ed Elliot, which I just very recently discovered, provides functionality close to the bit I miss the most from Visual Studio: The JSON tree view of the template. Only Ed’s version goes beyond resources, and provides more extensive navigation:

    (image)

    My only comment on this excellent extension is that it would be useful if you didn’t need to explicitly run the ARM: Start Outliner command to enable it.

And of course, JSON is unreadable without some nice Rainbow Braces support. I’m currently using the Bracket Pair Colorizer extension for this.

(image)



Azure API Management - Getting Query String Values in set-body

Thu, 14 Dec 2017 00:00:00 +0000

I ran into a question recently that was a bit tricky to solve with Azure API Management: how do you get a value passed in the URL query string to your API operation from a set-body statement in a policy?

For example, let’s assume that the query string value we want is called userId. If you’re using a Liquid template, it would look something like this:

<set-body template="liquid">
{
    "userId": "{{context.Request.OriginalUrl.Query.userId}}"
}
</set-body>

Notice how we use OriginalUrl rather than Url. The former is the URL the consumer used to call into API Management, while the latter is the URL of the backend service.

If you’re not using a Liquid template, then you just need to make sure that your set-body expression explicitly returns a string object:

<set-body template="none">
    @("The user id is: " + context.Request.OriginalUrl.Query.GetValueOrDefault("userId"))
</set-body>

The set-body reference documentation provides some useful information on figuring stuff like this out, particularly when combined with the policy expressions documentation.
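One detail worth knowing about GetValueOrDefault is that it accepts an optional second argument with a fallback value, which is handy when the query string parameter might be absent. The "anonymous" default here is just for illustration:

```xml
<set-body template="none">
    @("The user id is: " + context.Request.OriginalUrl.Query.GetValueOrDefault("userId", "anonymous"))
</set-body>
```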

(image)



Azure API Management - SOAP-to-REST date/time handling

Sat, 25 Nov 2017 00:00:00 +0000

I’ve been spending some time recently helping customers get started with Azure API Management, and recently ran into a small issue with the SOAP-to-REST feature that might trip up others.

The issue in question came up because the request message on the SOAP service had a field of type xsd:dateTime. When the Import API wizard imports the API WSDL, it does not appear to keep track of the data types used, and so it generates a simple Liquid template like this:

{{body.request.sentOn}}

We noticed that this was not working as expected: the service was returning a SAX error because it couldn’t parse the request. Using the API Inspector, we quickly realized the issue was the date/time format. The XML being sent to the service looked something like this:

<sentOn>9/2/2010 9:23:00 AM</sentOn>

This clearly was not in ISO 8601 format; the value was being formatted in the U.S. English locale, which is invalid. To work around this, we edited the default policy generated by the Import API wizard to use this:

{{body.request.sentOn | Date: 'o'}}

This forces the Liquid template to format the date/time value in the round-trip format, which is ISO 8601 compliant. Unfortunately, this won’t quite solve the issue in every case. I’ve noticed that in some scenarios (depending on what the input format looks like), API Management won’t maintain the TimeZone information in the original value when using the o format (and sometimes seems to round the time in unexpected ways). I presume this might be because API Management is using DateTime rather than DateTimeOffset underneath, but I have no way to know for sure.

In the specific case I was looking at with the customer, we ended up using a custom format with no TimeZone information, since both consumer/service were in the same Time Zone and there’s no daylight savings time to deal with. If you’re in a different scenario, then ensuring all date/time info is in UTC might be worth considering.
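The custom format we ended up with looked something along these lines (a sketch: the exact format string is illustrative, and the field name is taken from the earlier example; the point is that it carries no time zone designator):

```liquid
{{body.request.sentOn | Date: 'yyyy-MM-ddTHH:mm:ss'}}
```

The Liquid implementation accepts .NET date format strings here, so any format the backend expects can be produced the same way.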

(image)



Using Azure AD B2C with API Management

Fri, 17 Nov 2017 00:00:00 +0000

In a previous post, I discussed how to set up OAuth2 authorization in API Management using Azure Active Directory. This time I’d like to show something very similar, but using Azure AD B2C instead. Once again, I’ll assume you already have an API implemented and configured in API Management. I’ll use the same PQR service I used last time as an example.

Step 1: Creating the B2C Sign-in Policy

Before we can integrate with Azure AD B2C, we need to create a new sign-in policy that we can use to obtain a token later on. Using the Azure Portal AAD B2C module, I’ll create a new Sign-in policy named b2c-apim-pqr supporting local accounts, as well as Facebook. I also enable basic application claims to include in the token, such as first/last name and email addresses.

Step 2: Creating the applications

Like in the previous post, we need to create two applications of type Web App / Web API: one for our PQR API, and another for the API Management Portal. I’ll create the PQR API app first, and then the portal application. As usual, we’ll update the Reply URLs of the portal application once we create the authorization server in API Management. Next, I need to grant the portal application permissions to the API one. I’ll also generate a new key for the portal application, and make note of both the Application Id and the new key.

Step 3: Creating the OAuth2 Authorization Server

Back in API Management, it’s time to create our OAuth2 authorization server. This will be pretty much the same as last time, with a few minor changes. I’ll call this one aad-b2c-oauth2-pqr. I’ll configure it to support both the Authorization code and Implicit grant types, and will configure the authorization/token endpoint URLs to point to our new B2C sign-in policy.
Notice that both the authorization and token endpoint URLs use the same format as the normal Azure AD OAuth2 flow, but with the sign-in policy name in the p query string parameter. I’ll also add the resource parameter to point to the apim-pqr application we created in step 2. Finally, I need to configure the client id and secret based on the application id and key of the apim-portal application created in step 2. With B2C, we also need to provide a default scope, otherwise obtaining the token will fail. Here we use the scope to ask for user_impersonation (delegation) to the apim-pqr application.

By this point, I’ll also have the redirect_uri for our API Management OAuth2 service, so I’ll copy this value and add it as a valid Reply URL in the apim-portal application.

Step 4: Configure the API

Now I’ll set up my PQR API in API Management to require authorization using the new OAuth2 configuration. Again, I’ll add a policy to the API that validates the token, using the application id (f498336e-d99f-xxxx-xxxx-22e3f7d87e56) as the audience. The only interesting bits here are that the openid-config element should point to the OpenId metadata endpoint for our Sign-in policy, and that the audience of the generated token will be the Application Id rather than the App ID URI. Now, I should be able to obtain a token from the Developer Portal for my A[...]
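The token validation policy for the B2C case would look roughly like this sketch. The tenant name and the exact B2C metadata URL format (with the policy name in the p parameter) are assumptions; the audience GUID is the application id quoted in the post:

```xml
<validate-jwt header-name="Authorization" failed-validation-httpcode="401">
    <openid-config url="https://login.microsoftonline.com/mytenant.onmicrosoft.com/v2.0/.well-known/openid-configuration?p=b2c-apim-pqr" />
    <audiences>
        <audience>f498336e-d99f-xxxx-xxxx-22e3f7d87e56</audience>
    </audiences>
</validate-jwt>
```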



Protecting APIs with OpenId Connect in API Management

Sat, 11 Nov 2017 00:00:00 +0000

In my last post, I outlined a customer scenario for protecting an API through OAuth2 in Azure API Management. I mentioned in it that I had been unsuccessful at using OpenId Connect, rather than raw OAuth2. After some more testing, and some help, I was able to get this working, and wanted to share how I did it. Once again, I’ll assume you already have an API implemented and configured in API Management. I’ll use the same PQR service I used last time as an example.

Step 1: Creating the Azure AD Application

The first step is to create the Azure AD application. In this case, we will not be creating two separate applications like last time; we only need one. In the Azure Portal, I’ll go over to my Azure AD instance and add a new application registration. I’ll call this one aad-oidc-pqr. After creating the application, there are a few things we need to change:

  • We need to mark the application as multi-tenant. Otherwise, the developer portal in API Management will not work correctly, and you will get an error similar to AADSTS70001: Application 'xxxxx-xxxxx-xxxxx-xxxxx' is not supported for this API version.
  • (Optional) Enable the OAuth2 implicit flow.

We can use the application properties window to mark the application as multi-tenant. To enable the OAuth2 implicit flow, we need to edit the application manifest. Now, copy the application ID, and generate a new Key for the application. Make a note of both, as we will need them in a moment. We’ll come back to the application later to configure the reply URLs.

Note: Another alternative is creating the Azure AD app as a converged application, but I was only able to make it work with the implicit grant flow.

Step 2: Configure OpenId Connect Authorization

Back in API Management, we can configure a new OpenId Connect Authorization service. Using the Azure Portal, we will find this under the OpenId Connect option, and in the Publisher Portal it will be under Security -> OpenId Connect.
For this part, we’ll need:

  • The OpenId Connect metadata for our Azure AD tenant, which will be in the form https://login.microsoftonline.com/{tenant}/.well-known/openid-configuration
  • The id of the Azure AD application we created in step 1
  • The matching key for the application

Once we’ve created the OpenId Connect Authorization Service in API Management, we need to go back to the Azure AD application, and add both the authorization code grant and implicit grant redirect URIs to the Reply URLs collection of our application.

Step 3: Configure API

Just like in my previous post, I need to configure my PQR API to require OpenId Connect authorization. And again, I want to set up a policy on my API to validate the JWT token, using the application id (24f98265-c230-4668-a40b-11aa1b02c29c) as the audience. This is exactly the same as last time, only that when using OpenId Connect, the audience in the token will contain the Application Id, rather than the App ID URI of the Azure AD application.

Step 4: Test!

At this point, we should be able to use the API Management Developer portal to test that OpenId Connect works with our API. [...]
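The JWT validation policy described above would look roughly like the following sketch. The metadata URL format is the one quoted earlier in the post ({tenant} left as a placeholder), and the audience is the application id mentioned above:

```xml
<validate-jwt header-name="Authorization" failed-validation-httpcode="401">
    <openid-config url="https://login.microsoftonline.com/{tenant}/.well-known/openid-configuration" />
    <audiences>
        <audience>24f98265-c230-4668-a40b-11aa1b02c29c</audience>
    </audiences>
</validate-jwt>
```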



Protecting APIs with OAuth2 in API Management

Thu, 09 Nov 2017 00:00:00 +0000

I’ve been playing a lot lately with Azure API Management. Recently, a customer asked me about the following scenario:

  • They wanted to expose a Web API through API Management
  • API Management should enforce and validate that an OAuth2 token was provided by the caller
  • The underlying API did not know (or care) about the OAuth2 token

There is an article in the API Management documentation about this very topic, but it assumes that the Web API itself is set up to accept OAuth2 tokens, which is a somewhat more complex scenario. While good, I found the article a bit confusing to follow, so I thought I’d document here the steps I followed to test the customer scenario. I’ll assume we already have an API implemented and published in API Management, and that we want to use Azure Active Directory as the OAuth2 provider. For this article, I’ll use an API I called PQR in API Management.

Step 1: Register the Azure AD applications

The first step is to register a new Azure AD application to represent our API. I’ll create a new application like this one. Next, create a second application, which we’ll call apim-portal. This one will be used to represent the API Management Developer Portal, so that we can test our APIs from it. Once the application is created, we need to generate a new key for it. Make a note of both the application id and the new key, as we’ll need them in a moment. Also, grant permissions to apim-portal to call the apim-pqr application. You could also create the applications easily using the az ad sp create command in the Azure CLI.

Step 2: Adding the OAuth2 authorization server

Now we can configure a new OAuth2 authorization server in our API Management instance. We’ll do this in the new experience in the Azure Portal. First, go into the OAuth 2.0 section in the portal, and click the + Add button.
I will call this instance aad-oauth2-pqr. We don’t need the client registration URL for now, and besides, there doesn’t seem to be a pretty way to configure it for AAD. I’ll leave only the Authorization code grant type enabled for now, and I’ll configure the authorization endpoint URL for my AAD tenant. The authorization URL will be in the form https://login.microsoftonline.com/{tenantid}/oauth2/authorize.

Then we can configure the Token endpoint URL, as well as adding the value of the resource parameter. The latter should be configured with the value of the App ID URI field of the apim-pqr application we created in the first step. The token URL will be in the form https://login.microsoftonline.com/{tenantid}/oauth2/token. We’ll leave the default scope empty for now, and configure the Client ID/Secret that we copied after creating the apim-portal application in step 1. After this, just click the Create button to save the changes. Note that you can also do this using the Publisher Portal, with almost the exact same user experience.

Before moving on to the next step, there is something missing. In the screenshot above, notice there’s a field with the redirect_uri to be used with API Management. We need to copy this URL, and add it to the Reply URLs of our apim-portal application in Azure Active Directory. Otherwise, authentication will fail later on.

Step 3: Configure the API to use OAuth2 authorization

The next step is to configure our PQR API so that API Management knows that invoking the API requires an OAuth2 token. On the Publisher Portal, we can modify this from the Security tab of the API properties. On the Azure Portal, we’d configure this from the API settings under the Security headline. Notice that doing this doesn’t actually cause API Management to enforce that an OAuth2 token is provided at all. Instead, it adjusts the Developer Portal experience so that you can acquire the OAuth2 token from the authorization server when trying out o[...]
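Scripting the application registrations from the CLI would look something along these lines. This is a sketch: the display names and identifier URIs are placeholders, and depending on the CLI version you may register the app with az ad app create and then create its service principal with az ad sp create:

```shell
# Sketch: register the two AAD applications from the Azure CLI.
# Display names and identifier URIs are placeholders for illustration.
az ad app create --display-name apim-pqr \
    --identifier-uris https://mytenant.onmicrosoft.com/apim-pqr
az ad app create --display-name apim-portal \
    --identifier-uris https://mytenant.onmicrosoft.com/apim-portal
```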



Decoding Application Gateway Certificates

Thu, 02 Nov 2017 00:00:00 +0000

Recently, I wanted to write a PowerShell script that would check expiration on the certificates assigned for SSL/TLS on Azure Application Gateway resources. Obtaining the certificates is easy through the SslCertificates property of the Application Gateway instance. However, it took me a while to figure out how to actually extract the base64-encoded data into an X509Certificate2 instance. Turns out that the certificate is returned in PKCS7 format (also known as P7B), so you need to use the SignedCms class to decode it. Some sample code:

function Test-CertExpiresSoon($cert) {
    $span = [TimeSpan]::FromDays(30)
    $today = [DateTime]::Today
    return ($cert.NotAfter - $today) -lt $span
}

function Decode-Certificate($certBytes) {
    $p7b = New-Object System.Security.Cryptography.Pkcs.SignedCms
    $p7b.Decode($certBytes)
    return $p7b.Certificates[0]
}

$gateways = Get-AzureRmApplicationGateway
foreach ($gw in $gateways) {
    foreach ($cert in $gw.SslCertificates) {
        $certBytes = [Convert]::FromBase64String($cert.PublicCertData)
        $x509 = Decode-Certificate $certBytes
        if (Test-CertExpiresSoon $x509) {
            [PSCustomObject] @{
                ResourceGroup = $gw.ResourceGroupName;
                AppGateway = $gw.Name;
                CertSubject = $x509.Subject;
                CertThumbprint = $x509.Thumbprint;
                CertExpiration = $x509.NotAfter;
            }
        }
    }
}

A PKCS7 envelope can contain multiple certificates, so I might have to revisit this later on in case that is relevant, but it was not an issue in the original scenario. [...]



Logic Apps KeyVault Connector - Part 3

Thu, 26 Oct 2017 00:00:00 +0000

This is part 3 of my series of articles on implementing a custom Logic Apps connector. Read part 1 and part 2. By this point, I’ve implemented and deployed the WebApi application that implements the custom connector, and configured the necessary applications and permissions in Azure Active Directory. It’s time to create the custom connector itself. A Logic Apps custom connector is an Azure resource of type Microsoft.Web/customApis. I’m unsure how much of the creation of such a resource can be automated through ARM or other tools, so I’ll create it manually for now.

Step 1: Create the resource

The first step is to create the new resource of type Logic Apps Connector, then provide the required information. A couple of details are important here:

  • The name of the custom connector resource matters. By default, this is the name you will see in the Logic Apps designer when you try to use it. I have not found a way, so far, of customizing it after creation.
  • The Azure region you choose matters. You will only be able to use the connector from Logic Apps created in the same region.

Step 2: Configuring the connector

Once the custom connector is created, open the resource and click the Edit button at the top. That will open the editor where we can define the connector information, security configuration, and the connector operations. The easiest way to start is by uploading an existing OpenApi (Swagger) definition for your connector. As I mentioned in part 1 of the series, I already customized the Swagger generation in the connector code so that it can be used quickly, so I’ll start by pointing the connector to the Swagger definition. This will initialize the connector operations and other information. Then, we can add an icon, background color, description, and the base URL where the connector WebApi is located. After clicking the Continue button, we’ll move to the security section. Here, Logic Apps recognizes that our connector needs OAuth2 authentication.
We need to click the Edit button at the bottom and customize it by adding:

  • Client Id: the Application Id of the KeyVault Connector Client application we created in Azure AD.
  • Client secret: the key we generated for the application.
  • Scope: the App ID URI of our KeyVault Connector application. The PowerShell script I presented in part 2 will register this as https://[tenant-domain]/[appname].

Notice the Redirect URL field here. It will be empty until we update the connector; we’ll discuss it in a moment. Again, we click the Continue button to move to the final tab (Definition). Here, you can customize the definition of the operations implemented by the connector. We don’t need to customize anything for now, as our original Swagger definition already works as is. To complete creation of the connector, press the Update Connector button at the top!

Step 3: Update the redirect URI

Now that the connector is fully defined, go back to the Security tab and find the value of the Redirect URL field. Since I created the connector in the East US region, it ends up being https://logic-apis-eastus.consent.azure-apim.net/redirect. We will copy this value, and use it to update the Reply URL for the KeyVault Connector Client application in Azure AD. Notice that we update the URL on the client application used by Logic Apps to initiate the authentication flow, not on the connector application, as the latter should always point to the deployment URL of the WebApi. This step is very important, because otherwise authentication will not succeed for our application.

Step 4: Test it

Now we’re ready to test our connector! Let’s create a new Logic App and use it. Let’s select the ‘Get the value of a secret’ operation, an[...]