Changing the App Service Plan of an Azure App Service

To allow a number of App Services to scale independently I needed to pull one of them out of an App Service Plan, where it had lived with three others, and into its own plan – experience had shown me that its scaling characteristics are quite different from those of the other App Services.

You can do this straightforwardly and pretty much instantly either in the Portal (there’s a Change App Service Plan option in Settings) or with PowerShell (with the Set-AzureRmAppServicePlan cmdlet).

Super simple – but I did encounter one gotcha. This doesn’t move any deployment slots you might have created, so you end up with the main App Service sat in one App Service Plan and its deployment slots in another. That’s probably not what you want and, in any case, Azure won’t let you swap slots that sit in different plans.

The solution is simple: deployment slots can be moved between App Service Plans in exactly the same way.


Serving Static Markdown Content from ASP.Net MVC to JavaScript

I recently moved a bunch of documentation into Markdown format as I wanted to render it into multiple output formats and inside multiple hosting technologies – including an AngularJS based single page application.

While doing this I decided it would make sense to have a single source of truth for these files and so placed them as content inside my MVC 5 based website, which is entirely public access. After dropping them into the website the first step is to configure ASP.Net to serve the content, which means adding a mimeMap entry for the .md extension to the staticContent section of web.config.

This enables the Markdown to be served to a browser, but if you try to download it from JavaScript you’ll find it blocked by CORS. I solved this with a small ASP.Net HTTP module, registered in the same web.config, that adds the CORS headers to responses for the Markdown content.
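
A sketch of the sort of module I mean is below. The class name is whatever you register in web.config; the important parts are the .md check and the Access-Control-Allow-Origin header:

using System;
using System.Web;

// Sketch of a CORS module for the Markdown content - illustrative rather than the
// exact code from my site. It adds an Access-Control-Allow-Origin header to any
// response for a .md file so that JavaScript clients on other origins can fetch it.
public class MarkdownCorsModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        application.PreSendRequestHeaders += (sender, e) =>
        {
            var app = (HttpApplication)sender;
            if (app.Request.Path.EndsWith(".md", StringComparison.OrdinalIgnoreCase))
            {
                // the content is entirely public, so a wildcard origin is fine here
                app.Response.Headers["Access-Control-Allow-Origin"] = "*";
            }
        };
    }

    public void Dispose()
    {
    }
}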

Hope it’s useful to you.

Signalling API Unavailability

When upgrading websites it’s often useful to be able to signal to users that your service is offline for maintenance, either in part or in its entirety. That’s quite straightforward to implement unless you’ve got something like an AngularJS or React app, one that could well be cached in the browser, that actually wants to respond to 503 status codes returned from a web based API. Then CORS has a habit of getting in the way.

To help with that I’ve just pushed this super simple and lightweight ASP.Net website to GitHub. It responds with a 503 status code to any request made of it while ensuring that the CORS protocol still succeeds, meaning that the 503 makes its way through to your own error handling.
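
The whole thing boils down to something like the sketch below (illustrative rather than a copy of the repository code): preflight requests succeed, everything else gets a 503, and permissive CORS headers go onto every response:

using System.Web;

// Illustrative sketch of the approach: a global.asax handler that lets CORS
// preflights succeed and answers everything else with a 503 so the status code
// reaches the calling JavaScript.
public class MaintenanceApplication : HttpApplication
{
    protected void Application_BeginRequest()
    {
        Response.Headers["Access-Control-Allow-Origin"] = "*";
        Response.Headers["Access-Control-Allow-Methods"] = "GET, POST, PUT, DELETE, OPTIONS";
        Response.Headers["Access-Control-Allow-Headers"] = "Content-Type, Authorization";

        if (Request.HttpMethod == "OPTIONS")
        {
            Response.StatusCode = 200; // let the preflight succeed
        }
        else
        {
            Response.StatusCode = 503; // service unavailable for the real request
        }

        CompleteRequest(); // short-circuit the rest of the pipeline
    }
}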

It’s ideal for hosting in an Azure deployment slot during upgrades that swap slots.

Note: an alternative approach would be to use the URL rewriter in web.config. It’s not particularly intuitive or, to my taste, readable but I believe it can be configured to perform the same task.

Accidental Fish Application Support v3.3.0 Release

Last night I published a minor update to this framework to GitHub and NuGet that adds new timer capabilities via the new ITimerFactory interface.

An interval timer is available that runs an action, pauses for the specified interval on completion, and then runs the action again until cancelled (either by the action itself or by a cancellation token). A regular, metronomic timer is also available that runs an action every n seconds irrespective of how long the action takes. Importantly, in the latter case, if the action takes longer than the timer interval to complete it will be cancelled to prevent the problems that come with compounding, overlapping actions (excessive CPU usage, running out of memory and so on).
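
To make the distinction concrete, here’s a behavioural sketch of the two timers. This is just the pattern, not the framework’s implementation:

using System;
using System.Threading;
using System.Threading.Tasks;

// Behavioural sketch only - not the framework's code.
public static class TimerSketch
{
    // Interval timer: run the action, pause for the interval after it completes, repeat.
    public static async Task IntervalAsync(Func<CancellationToken, Task> action, TimeSpan interval, CancellationToken ct)
    {
        while (!ct.IsCancellationRequested)
        {
            await action(ct);
            await Task.Delay(interval, ct);
        }
    }

    // Metronomic timer: tick every interval regardless of how long the action takes;
    // an action still running when its interval expires is cancelled rather than
    // being allowed to overlap the next one.
    public static async Task MetronomeAsync(Func<CancellationToken, Task> action, TimeSpan interval, CancellationToken ct)
    {
        while (!ct.IsCancellationRequested)
        {
            Task pause = Task.Delay(interval, ct);
            using (var tickCts = CancellationTokenSource.CreateLinkedTokenSource(ct))
            {
                tickCts.CancelAfter(interval);
                try
                {
                    await action(tickCts.Token);
                }
                catch (OperationCanceledException)
                {
                    // the action overran its interval and was cancelled
                }
            }
            try
            {
                await pause; // wait out the remainder of the interval
            }
            catch (OperationCanceledException)
            {
                // outer cancellation while waiting - the loop condition will exit
            }
        }
    }
}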

I’ve also added an IBackoffFactory interface that allows the backoff policies to be created with custom backoff timings. The backoff policies continue to be directly injectable with their default timings.

I hope these additions are useful. Feedback is, as ever, welcome.

Microservice Analytics – Update

For those who’ve got in touch asking for a beta code – thank you for your interest, it’s very much appreciated and I’m nearly ready to start issuing them.

Before releasing them I wanted to make sure I was capturing user and session data properly and that it was integrated deeply with the other measures and data I capture. Happily this is all working and I’ve just deployed it to my live site.

In this initial release, unless you configure data capture otherwise, magic numbers will be generated for users and sessions (you can override this to provide your own IDs if you choose) and these are correlated with everything else that is happening, so it’s possible to look at the data from both perspectives:

  • What are your users doing and what system activity is that generating in your applications?
  • Given a system activity (say a SQL command or Web API call), which user initiated it and/or under which session?

There are some specific views to help with the above but in addition you can tag any user or session and the whole user interface is then filtered by that tag, allowing you to explore that subset of data quite organically.

I’ve attached a couple of screenshots below (taken immediately after deploying the new code to live – so not many users and sessions captured yet!) and, as I mentioned earlier, I’m nearly ready to share those invite codes. Thanks for your patience.

[Screenshot: sessions view]

[Screenshot: pinned view]

Microservice Analytics – New Analytic / Diagnostic Solution

I spend a lot of time working with highly distributed systems that use an SOA / microservice type architecture and are increasingly accessed through clients such as native apps, Cordova apps, or JavaScript SPAs. It can be challenging to understand what is going on across multiple servers and devices when faults occur or when you simply need to understand what the system is doing.

I wanted a tool that was fairly non-invasive, had a highly interactive user interface, captured a wide breadth of data, and was reasonably priced. There were solutions that did bits of what I wanted (particularly at the high end in terms of price) but nothing that really hit the sweet spot for me, particularly in terms of being able to drill through and across data to form a coherent picture.

This sort of thing really is my cup of tea as a developer and so I went off and built it coming at this from two directions:

  1. When errors occur in the system I want to know everything that was happening around them across devices and servers and to be able to drill into it all and find related items.
  2. To understand how users are behaving and experiencing the system and to be able to jump off and drill into areas of interest.

I’ve been dogfooding it for a while now; the first part is pretty much ready to roll and the second has most of the plumbing in place and will be expanded out over the next few weeks. It’s tentatively called Microservice Analytics and there are some screenshots below to whet your appetite.

It’s one of those projects that is almost endless in terms of where you might take it (I’ve got a list of enhancements as long as my arm just from my own usage) but at some point I need some feedback and to figure out if it’s useful to anyone but me, and so I’m just starting to issue beta invite codes.

If you’re interested get in touch with a comment below.

[Screenshots: overview, related summary, related timeline and http views]

Azure Resource Manager and Powershell 1.0

Microsoft recently updated Azure PowerShell to 1.0, in the process introducing a large number of breaking changes to the cmdlets. Essentially they’ve removed Switch-AzureMode and instead the Azure Resource Manager cmdlets have all had an Rm added to their names, so for example:

New-AzureResourceGroupDeployment

becomes

New-AzureRmResourceGroupDeployment

For the most part the errors on upgrading a PowerShell script are obvious and a rename will get you going. However, somewhat confusingly, there are now pairs of cmdlets that appear to do the same thing but each only affects its own cmdlet set.

An example of one that caught me out is Select-AzureSubscription. I’ve got numerous Azure subscriptions and scripts that deploy into different subscriptions depending on what I’m doing. Previously I used Select-AzureSubscription along with Switch-AzureMode and this worked. The problem is that Select-AzureSubscription still works – but it has no effect on the cmdlets that use the Rm prefix.

This led, confusingly, to a resource group being created in a different subscription to the one I intended, and to the deployment then attempting to create resources in that new resource group with the same names as resources that already existed in my intended resource group. Those creations failed because the names were already in use elsewhere.

The fix was to use Select-AzureRmSubscription.


Accidental Fish Application Support v3.1.0 Release

This week I’ve published a couple of updates to this framework taking it to v3.1.0.

Release notes for v3.0.0 are here and for v3.1.0 here.

There are breaking changes moving to the v3.x series as I’ve completely overhauled the logging framework to allow robust logging providers to be plugged deeply into the framework, including a new NuGet package for Serilog. There is a sample showing how to use this package and the new logging framework.

In addition I’ve upgraded the queueing system with some new features.

Firstly there is brand new support for large message queues – as in queues with message sizes in the megabytes or gigabytes. This is surfaced through the ILargeMessageQueueFactory and ILargeMessageQueue interfaces and combines a standard queue with a blob repository. The ILargeMessageQueue interface derives from the standard IAsynchronousQueue and so can be used interchangeably and transparently with existing queue consumers.
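
Under the covers this is essentially the claim check pattern. The sketch below illustrates the idea using simplified stand-in interfaces rather than the framework’s actual types:

using System;
using System.Threading.Tasks;

// Illustrative stand-ins - the framework's real interfaces are richer than this.
public interface IBlobStore
{
    Task UploadAsync(string name, byte[] payload);
    Task<byte[]> DownloadAsync(string name);
}

public interface ISimpleQueue
{
    Task EnqueueAsync(string message);
    Task<string> DequeueAsync();
}

public class LargeMessageQueueSketch
{
    private readonly IBlobStore _blobs;
    private readonly ISimpleQueue _queue;

    public LargeMessageQueueSketch(IBlobStore blobs, ISimpleQueue queue)
    {
        _blobs = blobs;
        _queue = queue;
    }

    // The large payload goes into blob storage and only a small reference
    // (the blob name) travels on the underlying queue.
    public async Task EnqueueAsync(byte[] payload)
    {
        string blobName = Guid.NewGuid().ToString();
        await _blobs.UploadAsync(blobName, payload);
        await _queue.EnqueueAsync(blobName);
    }

    // Dequeue the reference and pull the payload back out of blob storage.
    public async Task<byte[]> DequeueAsync()
    {
        string blobName = await _queue.DequeueAsync();
        return await _blobs.DownloadAsync(blobName);
    }
}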

Finally the queues now provide access to message properties where the underlying queue supports this feature – for example Azure Service Bus queues, topics and subscriptions.

I’m planning to add support for DNX and Entity Framework 7 in the near future (weeks, not months).

As ever if you have any issues or feedback then you can get in touch here or over on GitHub.

Multi-Tenanted Authentication with Azure AD and Office 365 (and IdentityServer3)

With solutions such as Azure AD and Office 365 becoming more common as a source of an organisation’s identity on the Internet it can be useful to have an application offer authentication against them. As a typical scenario let’s imagine you’ve developed a time tracking SaaS solution and you have a number of customers who want to use it but log on with their Office 365 identities.

It’s a bit of a lengthy process but not difficult once you know how, and hopefully by way of this worked example I can show you how to add this functionality yourself.

As it’s fairly commonplace to want to offer other external identities alongside Office 365 I’m going to integrate this into the excellent IdentityServer3 OpenID Connect provider. However nothing here really depends on IdentityServer3, and if you’ve got a basic familiarity with OWIN it should be fairly obvious how to decouple what I’ve done here.

The source code for all the below can be found in GitHub.

Set Up the Visual Studio Project

Firstly open up Visual Studio (I’m using 2015) and create a new ASP.Net Web Application. In the New ASP.Net Project dialog select to build an MVC application and ensure that the authentication method is set to No Authentication:

[Screenshot: the New ASP.Net Project dialog with MVC selected and No Authentication]

Now you need to add a set of NuGet packages to the project:

Install-Package Thinktecture.IdentityServer3
Install-Package Microsoft.Owin.Host.SystemWeb
Install-Package Microsoft.Owin.Security.Cookies
Install-Package Microsoft.Owin.Security.OpenIdConnect

Now in the properties pane for the project enable SSL and note the port number:

[Screenshot: the project properties pane with SSL enabled]

And in the project properties (I love how this is spread liberally about the place!) set the start page to be the https site:

[Screenshot: project properties with the start page set to the https URL]

Now add a class called Startup to the root of the project as below. As we’ve added the Microsoft.Owin.Host.SystemWeb package it’s going to look for this on startup and will throw an exception if it can’t find it – we’ll fill it in later.
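
An empty skeleton is all that’s needed for now (use your own project’s namespace):

using Owin;

namespace MultiTenantExample // substitute your own project's namespace
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // IdentityServer3 and the OWIN authentication middleware will be configured here
        }
    }
}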


Now add the following to your web.config file:

[Code listing: the web.config additions]

At this point you should be able to run your solution and the boilerplate MVC website should appear, secured with a self-signed SSL certificate generated by Visual Studio.

Set Up an Azure AD

We’re going to use two domains for the following – one for our application to authenticate against and federate with other directories, and a second that we’re going to use to log in with.

If you’ve got an Azure subscription you should already have a default domain so we’ll use that for logging in.

To enable the multi-tenanted scenario create a second Azure AD in the (old) management portal. This AD is used to host an Azure AD Application and handles the federation between the other domains.

Go to the Applications tab for your directory and add an application, selecting “Add an application my organisation is developing”. Give the application an appropriate customer-facing name (they will be able to see this in their own domain configuration and during sign up) and set this to be a web application:

[Screenshot: adding a web application in the Azure AD Applications tab]

On the next page set the sign-on URL to be the root of your web project (as you noted earlier) extended with the path /identity/signin-azuread, giving you a URL such as:

https://localhost:44300/identity/signin-azuread

In the second box you need to set a unique URI for your application. The best way to do this is based on your own domain in the form http://mydomain.onmicrosoft.com/myapplication:

[Screenshot: the sign-on URL and App ID URI settings]

Azure will now whirr away for a second or two before showing you the application dashboard. Click the Configure tab, move down to the middle of the page, set Application is Multi-Tenant to on and create a new key, then hit save. The key is only displayed after you’ve saved, so note down both the key and the client ID as you will need them later.

[Screenshot: the Configure tab with Application is Multi-Tenant enabled and a new key]

Configure IdentityServer3 and Azure AD / Office 365 as an Identity Provider

Back in Visual Studio edit the Startup class we created earlier. Firstly add a folder to the root of the project called Configuration, download the IdentityServer3 test certificate, and add it to this folder setting it to Copy if Newer in the properties panel:

[Screenshot: the test certificate in the Configuration folder]

IdentityServer3 will use this certificate (when configured below) to sign the tokens it issues. However it’s important to be clear that in a production environment this certificate needs to be generated and kept securely; the certificate used here comes from the IdentityServer3 tests and is stored and loaded in a manner designed for brevity in the source code rather than for security. In a production environment you’d keep it in your certificate store.

Now configure IdentityServer3 as an endpoint within this solution by adding the following to the Startup class:
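
In outline it looks like the sketch below. Treat it as a guide rather than a copy and paste: it’s written against the current IdentityServer3 packages (the older Thinktecture-branded package has slightly different namespaces and in-memory helpers), and the site name, client details and certificate handling are placeholders for your own values.

using System.Collections.Generic;
using System.Security.Cryptography.X509Certificates;
using System.Web.Hosting;
using IdentityServer3.Core.Configuration;
using IdentityServer3.Core.Models;
using IdentityServer3.Core.Services.InMemory;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        app.Map("/identity", idsrvApp =>
        {
            idsrvApp.UseIdentityServer(new IdentityServerOptions
            {
                SiteName = "Multi-tenant example",
                SigningCertificate = LoadCertificate(),
                Factory = new IdentityServerServiceFactory()
                    .UseInMemoryUsers(new List<InMemoryUser>())
                    .UseInMemoryClients(new[]
                    {
                        new Client
                        {
                            ClientName = "MVC client",
                            ClientId = "mvc",
                            Flow = Flows.Implicit,
                            RedirectUris = new List<string> { "https://localhost:44300/" },
                            AllowedScopes = new List<string> { "openid", "profile" }
                        }
                    })
                    .UseInMemoryScopes(StandardScopes.All),
                AuthenticationOptions = new AuthenticationOptions
                {
                    // hands external identity provider configuration off to the method below
                    IdentityProviders = ConfigureIdentityProviders
                }
            });
        });
    }

    private void ConfigureIdentityProviders(IAppBuilder app, string signInAsType)
    {
        // filled in in the next section
    }

    private static X509Certificate2 LoadCertificate()
    {
        // the IdentityServer3 test certificate dropped into the Configuration folder;
        // adjust the file name and password to match the certificate you downloaded
        return new X509Certificate2(
            HostingEnvironment.MapPath("~/Configuration/idsrv3test.pfx"), "idsrv3test");
    }
}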


We’ve set IdentityServer3 up to use its in-memory stores and created a client that we’ll later authenticate against (see the IdentityServer3 docs for more details on this).

We now need to configure Azure AD / Office 365 as an identity provider by filling in the ConfigureIdentityProviders method:
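
Again a sketch; the client ID and redirect URI are placeholders for the values from your Azure AD application, and the authority is Azure AD’s multi-tenant common endpoint:

// requires: using Microsoft.Owin.Security.OpenIdConnect; and using System.IdentityModel.Tokens;
private void ConfigureIdentityProviders(IAppBuilder app, string signInAsType)
{
    app.UseOpenIdConnectAuthentication(new OpenIdConnectAuthenticationOptions
    {
        AuthenticationType = "aad",
        Caption = "Azure AD / Office 365",
        SignInAsAuthenticationType = signInAsType,

        // Azure AD's multi-tenant "common" endpoint - note the trailing slash
        Authority = "https://login.microsoftonline.com/common/",

        // these must exactly match the Azure AD application created earlier
        ClientId = "your-azure-ad-application-client-id",
        RedirectUri = "https://localhost:44300/identity/signin-azuread",

        TokenValidationParameters = new TokenValidationParameters
        {
            // with a multi-tenant application every directory issues tokens with its
            // own issuer value, so standard issuer validation is switched off here
            // for the purposes of this example
            ValidateIssuer = false
        }
    });
}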


It’s vital that the client ID and redirect URI supplied to the OpenID Connect options exactly match those in the Azure AD application you created earlier, and that the Authority string keeps its trailing /.

At this point things will build but there’s nothing in our solution requiring authorization. However you can navigate to the published configuration endpoint and should see some JSON describing our service. My endpoint is at:

https://localhost:44300/identity/.well-known/openid-configuration

Authorizing and Viewing Claims

We’re going to use an Authorize attribute on an MVC action to test our work so far. To begin with we need to get our MVC project to use the token endpoint we’ve created above, which means adding the following to the bottom of our Configuration method:
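
Something like the sketch below does it; the port, client ID and redirect URI need to match the IdentityServer client configured earlier:

// appended to the bottom of the Configuration method; requires
// using Microsoft.Owin.Security.Cookies; and using Microsoft.Owin.Security.OpenIdConnect;
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    AuthenticationType = "Cookies"
});

app.UseOpenIdConnectAuthentication(new OpenIdConnectAuthenticationOptions
{
    AuthenticationType = "oidc",
    Authority = "https://localhost:44300/identity",
    ClientId = "mvc",
    RedirectUri = "https://localhost:44300/",
    ResponseType = "id_token",
    Scope = "openid profile",
    SignInAsAuthenticationType = "Cookies"
});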


You need to ensure you use the port your website is running on locally. Mine is 44300.

Modify the About action in HomeController.cs to look like the following:
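
Something along these lines does the job; the [Authorize] attribute is what triggers the authentication flow:

// in HomeController.cs; requires using System.Security.Claims;
[Authorize]
public ActionResult About()
{
    // hand the authenticated user's claims to the view so we can see what came back
    var user = (ClaimsPrincipal)User;
    return View(user.Claims);
}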


And the corresponding View to:

[Code listing: the About view iterating over the claims]

Now run the project and if you’ve got all the magic numbers and URLs right you should be navigated to the Azure AD logon page:

[Screenshot: the Azure AD sign-in page]

Log in using a different AD to the one we set the application up in earlier and you should end up back at the About page with a set of claims showing:

[Screenshot: the About page showing the returned claims]

If you now head back into the Azure Management Portal and inspect the Applications tab for the Azure AD you used to log in with, you should find that the application we created earlier has been added there:

[Screenshot: the application listed in the login domain’s Applications tab]

If all that worked then great – that’s the happy, easy path dealt with! If not, I suggest checking out my number one tip for dealing with IdentityServer3 issues.

The Administrator Consent Workflow

Depending on your domain configuration (the one you logged in with) you might have hit the error page shown below:

[Screenshot: the Azure AD error page shown when administrator consent is required]

Many domains, particularly those of larger organisations, will have enabled Administrator Consent. This prevents users from using their domain account to access resources they haven’t been authorized to use – unfortunately the error above isn’t very helpful to end users.

To allow users in a domain configured this way to be able to access your application you need to implement a workflow that allows the Administrator to grant consent for users. Before continuing, if you’ve followed the steps above, then in Azure delete the application from your login domain – select it in the management portal and click Manage Access, then click Remove Access.

Firstly let’s turn on Administrator Consent in our login domain – select the domain in the management portal and click the Configure tab. When Azure has finished whirring away scroll down to the Integrated Applications section and set “USERS MAY GIVE APPLICATIONS PERMISSION TO ACCESS THEIR DATA” and “USERS MAY ADD INTEGRATED APPLICATIONS” to no. Then press save.

[Screenshot: the integrated applications settings on the directory’s Configure tab]

Now in the users section of your domain add a new user and give them only the User role:

[Screenshot: adding a new user with the User role]

Next head back to the Azure AD application we created at the start of this post, click the Configure tab and scroll down to the single sign-on section. Enter a URL for the root of your website and click save:

[Screenshot: the single sign-on URL setting on the application’s Configure tab]

Ensure the browser is closed, then run the Visual Studio solution again and attempt to log in with this new user; you should see the error message shown earlier. If not, check the configuration in Azure once more.

At this point we’ve got a domain that requires administrator consent for users to attach to our application. Begin by adding a new NuGet package:

Install-Package Microsoft.IdentityModel.Clients.ActiveDirectory

Then add a class called AzureTenant as shown below:
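
A simple POCO is enough for this example; the property names below are illustrative rather than the exact class from the repository:

// requires using System;
public class AzureTenant
{
    public Guid Id { get; set; }                             // our own persistent ID, also used as the OAuth state
    public string AzureActiveDirectoryTenantId { get; set; } // populated once association completes
    public bool IsConfirmed { get; set; }
}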


We now need to be able to manage our tenants. Normally you’d use some form of persistent store for this but I’m going to use a very basic in-memory service by way of an example. The full service code is in GitHub here, but there are two important sections. Firstly our controller (which we’ll work on later) will create a temporary tenant before handing off to Azure, and later, after association is complete, we convert this to a permanent tenant. This two-stage process can be useful as in a more realistic example you may want to track some additional state across the request; all you really need is some way of attaching data to a persistent unique ID, and reusing the tenant like this keeps the example simple. The relevant code can be seen below:
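
An illustrative sketch of it, using the AzureTenant class above rather than the exact repository code:

// requires using System; and using System.Collections.Concurrent;
public class InMemoryTenantService
{
    private readonly ConcurrentDictionary<Guid, AzureTenant> _tenants =
        new ConcurrentDictionary<Guid, AzureTenant>();

    // called by the controller before redirecting to Azure
    public AzureTenant CreateTemporaryTenant()
    {
        var tenant = new AzureTenant { Id = Guid.NewGuid() };
        _tenants[tenant.Id] = tenant;
        return tenant;
    }

    // called from the Associate action once Azure has called back to us
    public AzureTenant ConfirmTenant(Guid id, string azureTenantId)
    {
        AzureTenant tenant;
        if (!_tenants.TryGetValue(id, out tenant))
        {
            return null; // unknown state - treat the association as failed
        }
        tenant.AzureActiveDirectoryTenantId = azureTenantId;
        tenant.IsConfirmed = true;
        return tenant;
    }
}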


When our controller hands off to Azure to handle the domain association it does so by forming up a URL, and how that is built can be seen below:
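
A sketch of that URL (the client ID and redirect URI are placeholders; the redirect URI needs to be a reply URL your Azure AD application will accept and the route of your Associate action):

// requires using System;
private static string BuildAdminConsentUrl(Guid temporaryTenantId)
{
    // placeholders - substitute your own application's values
    const string clientId = "your-azure-ad-application-client-id";
    const string redirectUri = "https://localhost:44300/AzureAdmin/Associate";

    return "https://login.microsoftonline.com/common/oauth2/authorize" +
           "?response_type=code" +
           "&client_id=" + Uri.EscapeDataString(clientId) +
           "&redirect_uri=" + Uri.EscapeDataString(redirectUri) +
           "&prompt=admin_consent" +      // triggers the administrator consent experience
           "&state=" + temporaryTenantId; // our temporary tenant ID comes back to us as state
}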


And finally, following the association, we need to retrieve the tenant ID for the newly associated tenant (this is where the client key you created at the beginning of this piece is required):
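
This can be done with ADAL (the Microsoft.IdentityModel.Clients.ActiveDirectory package added above) by exchanging the authorization code Azure sends back for a token and reading the tenant ID from the result. A sketch with placeholder values:

// requires: using System; using System.Threading.Tasks; and
// using Microsoft.IdentityModel.Clients.ActiveDirectory;
private static async Task<string> GetTenantIdFromAuthorizationCodeAsync(string authorizationCode)
{
    var authenticationContext = new AuthenticationContext("https://login.microsoftonline.com/common");
    var credential = new ClientCredential(
        "your-azure-ad-application-client-id",
        "your-azure-ad-application-key"); // the key you noted down earlier

    AuthenticationResult result = await authenticationContext.AcquireTokenByAuthorizationCodeAsync(
        authorizationCode,
        new Uri("https://localhost:44300/AzureAdmin/Associate"), // must match the redirect URI used above
        credential);

    return result.TenantId;
}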

Now we need to add an MVC controller that is going to present this functionality to the user. There is some boilerplate at the bottom but I want to focus here on the AdminConsent and Associate methods as shown below:

[Code listing: the AdminConsent and Associate controller actions]

The default admin consent view presents a button to the user and a paragraph confirming what they are about to do – when the postback hits the controller it does two things. Firstly it creates a temporary tenant in our data store, and then it redirects to the Azure authorization endpoint using the temporary ID of the tenant as the state.

When the user confirms the association in the Azure AD management page presented during the redirect, Azure will call back to the Associate action passing in the temporary tenant ID as the state. In more complex examples tracking the state in this way allows us to tie the Azure callback back to any information we’ve gathered before confirmation. The Associate action then uses our in-memory service to confirm the association and handles routing off to the appropriate page based on success or failure.

Finally having done all of the above we’re going to put a link to our admin consent page on the home page. I modified the Jumbotron to point at the AdminConsent view.

At this point you should be able to run the solution (and this is the complete solution in GitHub) and, if things are wired up correctly, you should be able to click your admin consent link. Try this with your test user and it will fail, but if you then try it with your admin user you should see a screen like this:

[Screenshot: the Azure AD administrator consent page]

If you’ve got everything right then, after granting consent, you should end up back at the success screen on your website and can now navigate into the About section.

Hopefully that helps anyone trying to figure out how to do this themselves. Feel free to ask me questions in the comments or on Twitter.

Finally, worth repeating – the code is all available on GitHub.