Author: James

React Sample App

Having built a number of reasonably large apps using Angular (v1.x), I began to grow uncomfortable with some of its pain points and started to look for alternatives. As a result I’ve spent a lot of time over the last 12 months with various JavaScript frameworks, and in particular I found myself settling on React combined with Redux.

I find I’m an increasingly functional programmer and the idea of the user interface as a clean expression of state appeals to me massively. I’ve also found that when using React I feel like I’m using the JavaScript language itself, particularly ES6, rather than alternatives to what are becoming core language features.

However, although the documentation is pretty good, I started to struggle to find good patterns and practices once I moved beyond simple ToDo-type apps. As an experienced developer, what I really wanted was some guidance illustrated by code in a working app.

To that end I recently built a sample app that illustrates solutions to typical problems you’ll encounter when moving beyond a ToDo app, and I’ve started working on a short book to go along with it. Initially I’d intended to do this as a series of blog posts, but as I started to write them I realised the structure was not well suited to the blog format. My intention is still to push this out, at least largely, as open source, probably under a Creative Commons licence.

As a first step I’ve published an early version of the sample app on GitHub and you can find it here. Constructive feedback would be very welcome and the best way to get in touch with me is on Twitter. I’ll have more details on the book as the content develops further.

[Image: segmentcomparison]

A Simple Voxel Engine – Christmas 2016 Project

Happy New Year – I hope everyone reading this had a great end-of-year break and has a fantastic 2017.

I suspect I’m not alone in that I like to spend a little time over the Christmas break working (in the fun sense of the word) on something a bit different from the day job. This year I took about 4 days out to work on a really simple voxel engine as both the look and constructible / destructible feel of voxels have always appealed to me, perhaps as a result of a childhood playing with Lego. Though bizarrely I’ve never played Minecraft.

The code for the engine along with a couple of demonstrations can, as ever, be found over on GitHub.

I didn’t want to feel like I was writing glue code, so rather than use an existing library I started with a blank C++ file, OpenGL, and an awful lot of ignorance. Now to be fair I do have a reasonable background in C, C++, graphics and systems programming – before this web thing came along I was working with embedded systems and 2D graphics engines using C, C++ and 68000 assembly – but that was back in 1994. I continued as a commercial C++ developer for 4 more years but then found myself working with C# and the first .Net beta via a brief foray into Delphi.

So I was rusty. Very rusty. However I had enthusiasm on my side and persevered until I had a couple of demos and some code that I wasn’t too embarrassed to share. The video below is me navigating a voxel “landscape” created using a black and white self-portrait as a height map:

The first step was some early research, and I quickly discovered some really useful resources:

  • GLFW – a great library for abstracting away the underlying hardware and OS and getting you a window to draw in (a minimal bootstrap sketch using GLFW and Glad follows this list)
  • Glad – OpenGL is a strange, strange place and chances are you will need to use a “loader” to get access to the full API. This web service makes it really easy.
  • GLM – a header-only mathematics library based on the OpenGL shader specification
  • Learn OpenGL – tutorial series
  • OpenGL Tutorial – another OpenGL tutorial series
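
For anyone who hasn’t used them, the sort of bootstrap these two libraries give you looks roughly like this – a minimal sketch, not the engine’s actual startup code, with the window size and title just placeholders:

    // Minimal GLFW + Glad bootstrap sketch: create a window, load the OpenGL
    // function pointers, and run a clear/swap loop ready for drawing.
    #include <glad/glad.h>
    #include <GLFW/glfw3.h>

    int main()
    {
        if (!glfwInit()) return -1;

        // Ask for a modern core profile context
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

        GLFWwindow* window = glfwCreateWindow(1280, 720, "Voxel Engine", nullptr, nullptr);
        if (!window) { glfwTerminate(); return -1; }
        glfwMakeContextCurrent(window);

        // Glad pulls in the full OpenGL API using GLFW's loader function
        if (!gladLoadGLLoader(reinterpret_cast<GLADloadproc>(glfwGetProcAddress))) return -1;

        while (!glfwWindowShouldClose(window))
        {
            glClearColor(0.1f, 0.1f, 0.15f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            // ... render chunks here ...
            glfwSwapBuffers(window);
            glfwPollEvents();
        }

        glfwTerminate();
        return 0;
    }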

Being conscious that C++ had changed a fair bit since my youthful glory days, I also skim-read Accelerated C++ and Effective Modern C++. I’ll be returning to the latter and reading it in more depth as I’m sure my code is still pretty stinky – but it works!

Day 1 – I Know Nothing

Armed with my new-found expertise I set myself the goal of generating and rendering a landscape based on Perlin noise, but given I couldn’t yet draw a cube I thought I’d better bootstrap my way up. This first day was the toughest and involved a fair amount of staring at a blank screen, cursing, and feeling extremely thankful for the safety of managed environments. However, by the end of the day I had managed to get not only a single cube on the screen but a whole chunk of voxels rendering (a chunk is an organisational concept in a voxel engine – a collection of voxels), as you can see below:

[Screenshot: 2016_12_29_1 – a single chunk of voxels rendered]

I didn’t yet have any lighting so to help me process what I was looking at I did a simple adjustment of the colours around each vertex.

The next step was to display several chunks together. My first effort at this went… badly. Only one chunk was ever displayed. After spending an hour looking for obscure OpenGL problems I realised I’d done something more basically dumb with C++ (we’ve all copied something we thought we were passing by reference, right?), pounded my head into the desk a few times, corrected it and hurrah: 4×4 chunks!

[Screenshot: 2016_12_29_2 – a 4×4 grid of chunks]

Flush with success wine may have been consumed.

Day 2 – Algorithms and Structures

Sitting at the desk with a slightly groggy head – which the wine had nothing to do with, nothing, not a damn thing – I started to ponder how to arrange and manage chunks. The memory requirements for voxels can be huge. Imagine that each of your voxels takes 16 bytes to store and that your landscape is a modestly sized 1024 voxels wide, 32 voxels high, and 1024 voxels deep – that’s 536,870,912 bytes, better known as 512MB, of storage needed before you even account for rendering geometry.
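
The 16 bytes is just an illustrative figure, but a voxel of roughly that size – the field layout here is mine, not the repository’s – might look like this, and the arithmetic falls out as above:

    #include <cstdint>

    // Illustrative only - a voxel type of roughly the 16 bytes assumed above.
    struct Voxel
    {
        std::uint8_t type;      // material / block type
        std::uint8_t flags;     // active, visible, etc.
        std::uint8_t lighting;  // baked light level
        std::uint8_t padding;
        float r, g, b;          // per-voxel colour (12 bytes)
    };
    static_assert(sizeof(Voxel) == 16, "expected a 16 byte voxel");

    // 1024 * 32 * 1024 voxels * 16 bytes = 536,870,912 bytes = 512MB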

It’s clear you’re going to need some way of managing that and, for practical usage, of compressing and/or virtualising it.

I came up with – and by “came up with” I mean read somewhere during my research and recalled – the concept of a chunk manager, along with chunk factories (these didn’t come up in my research but seemed a logical extension) and a chunk optimiser (not yet implemented!).

The main decision here was how low to go – so to speak. The optimal route likely involves getting the STL out of the way, pre-allocating memory and organising it in a very specific manner. However, that also seemed the route most likely to feel like having your teeth pulled. In the end I decided to go with the path of least resistance and mostly used the std::vector class with pre-allocated sizes where known.
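
To make the shape of that concrete, here is a sketch of the chunk manager / chunk factory split – the class and member names are mine for illustration, not the repository’s actual API:

    #include <memory>
    #include <vector>

    const int CHUNK_SIZE = 16;  // illustrative chunk dimension

    struct Chunk
    {
        std::vector<Voxel> voxels;   // Voxel as sketched earlier, reserved to CHUNK_SIZE^3
        int chunkX, chunkY, chunkZ;  // position in chunk coordinates
    };

    // A factory knows how to populate a single chunk (Perlin noise, height map, ...)
    class IChunkFactory
    {
    public:
        virtual ~IChunkFactory() = default;
        virtual std::unique_ptr<Chunk> createChunk(int chunkX, int chunkY, int chunkZ) = 0;
    };

    // The manager owns the chunks and asks a factory to build them
    class ChunkManager
    {
    public:
        explicit ChunkManager(IChunkFactory& factory) : _factory(factory) {}

        void buildWorld(int widthInChunks, int heightInChunks, int depthInChunks)
        {
            _chunks.reserve(static_cast<size_t>(widthInChunks) * heightInChunks * depthInChunks);
            for (int x = 0; x < widthInChunks; x++)
                for (int y = 0; y < heightInChunks; y++)
                    for (int z = 0; z < depthInChunks; z++)
                        _chunks.push_back(_factory.createChunk(x, y, z));
        }

    private:
        IChunkFactory& _factory;
        std::vector<std::unique_ptr<Chunk>> _chunks;
    };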

Nothing new to see from my efforts yet – the cube still showed, just rendered through the new code – which, while not worth a screenshot, was a result!

With that out of the way I needed a Perlin noise generator to implement as part of one of my new-fangled chunk factories. I’d written one before, for a lava-type effect, back when you had to find the algorithm in a book and figure out how to write it yourself. Having no wish to repeat that painful experience I quickly found a great post and gist on GitHub written in C# and converted it to C++.

I hit F5, Visual Studio whirred away for a few seconds, and to my utter amazement my good run of luck continued and I had an ugly but working voxel landscape.

[Screenshot: 2016_12_30_2 – the first Perlin noise generated landscape]

Good times. And good times when you’re on your Christmas break deserve wine.

Day 3 – This Is Far From Light Speed

Without any lighting it was hard to get a real sense of the terrain in the landscape, but before I could do anything about that I needed to address some performance issues. Rendering was fine – I was cruising at a vertical-refresh-locked 60 frames per second – once rendering started. The problem was that even on my 4.5GHz 6700K i7 it took about 90 seconds to generate the voxels and the initial geometry.

Normally I’d advocate measuring before going any further, but the issue was self-evidently (famous last words… but not this time) in a fairly narrow section of code, so step one was a code review, which quickly highlighted a moment of stupidity from the previous day: I was calling the Perlin noise function 16 times more often than I needed to.
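
I’m reconstructing the exact shape of that mistake from memory, but with chunks 16 voxels high it amounts to sampling the noise once per voxel instead of once per column – the fix is simply to hoist the call out of the inner loop, along these lines (perlinNoise and activate are stand-ins for the real functions, not the repository’s names):

    // Sketch of the fix, not the repository's code: sample the height once per
    // (x, z) column rather than once per voxel in that column.
    void fillChunkColumns(double chunkWorldX, double chunkWorldZ,
                          double (*perlinNoise)(double, double),
                          void (*activate)(int x, int y, int z))
    {
        const int CHUNK_SIZE = 16;
        for (int x = 0; x < CHUNK_SIZE; x++)
        {
            for (int z = 0; z < CHUNK_SIZE; z++)
            {
                // The original bug (roughly): this call sat inside the y loop
                // below, recomputing the same column height 16 times over.
                double height = perlinNoise(chunkWorldX + x, chunkWorldZ + z);

                for (int y = 0; y < CHUNK_SIZE; y++)
                {
                    if (y <= height)
                    {
                        activate(x, y, z);
                    }
                }
            }
        }
    }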

I could see from Visual Studio that memory allocation was also sub-optimal, as it was occurring on an “as needed” basis, but as previously noted I’m favouring clarity and simplicity over gnarly optimisation. However, Visual Studio was also highlighting another issue – my CPU was only being used at around 15%. The code was single threaded and I’ve got 4 cores capable of supporting 8 threads.

Haha, thinks me with my C# head on. This is eminently parallelisable (the clues being in the names “chunk” and Perlin noise “function”) and C++ must have a decent task / parallel library by now. Oh yes. Futures. They’ll do it.

Oh boy. Oh boy oh boy. That hurt. Converting the code to use lambdas and futures was straightforward but then the issues began – it turns out there is not yet an equivalent of something like Task.WhenAll in the C++ standard library. I got there in the end after a bit of a deep dive into modern C++ threading concepts (I really didn’t want to use the OS primitives directly as I want to make this multi-platform) and solved it with locks and condition_variables.
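
The pattern I ended up with looks roughly like this – a sketch rather than the engine’s exact code – with each job decrementing a shared counter under a mutex and notifying a condition_variable that the calling thread waits on:

    #include <condition_variable>
    #include <functional>
    #include <future>
    #include <mutex>
    #include <vector>

    // Sketch of a Task.WhenAll-style wait built from std::async, a mutex and a
    // condition_variable.
    void runAllAndWait(const std::vector<std::function<void()>>& chunkJobs)
    {
        std::mutex mutex;
        std::condition_variable allDone;
        std::size_t remaining = chunkJobs.size();

        std::vector<std::future<void>> futures;
        futures.reserve(chunkJobs.size());

        for (const auto& job : chunkJobs)
        {
            futures.push_back(std::async(std::launch::async, [&, job]()
            {
                job();  // generate this chunk's voxels / geometry

                std::lock_guard<std::mutex> lock(mutex);
                if (--remaining == 0)
                {
                    allDone.notify_one();
                }
            }));
        }

        // Block until every job has signalled completion (the predicate also
        // covers the case where everything finished before we got here).
        std::unique_lock<std::mutex> lock(mutex);
        allDone.wait(lock, [&] { return remaining == 0; });
    }

(With plain std::async you could also just call get() on each future in turn; the counter-and-condition_variable shape matters more once the work is queued onto threads you manage yourself.)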

I’ve tested it on a variety of systems (with different performance characteristics and core counts) and it hasn’t crashed or done anything unpredictable, but I’m sure there is something wrong with this code, so if any C++ experts read this I’d love some pointers.

So now it’s quicker, about 6 or 7 seconds to generate a respectably sized landscape, but it looked the same and most of the day had gone.

Fortunately adding basic lighting proved much simpler. Simple voxels – cubes – are easy to light, as the normals are obvious and there are plenty of documented examples of using vertex normals inside vertex and fragment shaders to apply the appropriate colouring. A couple of hours later I had ambient, diffuse, and specular lighting up and running and could turn off my dodgy half-shading.
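
For anyone curious, the fragment shader side of that is essentially the standard Phong-style calculation the tutorials above walk through – this is a cut-down sketch embedded as a C++ raw string, not the engine’s exact shader, and the uniform and varying names are just placeholders:

    // Cut-down Phong-style lighting: ambient + diffuse + specular applied to the
    // per-voxel colour, using the (obvious) cube face normals.
    const char* fragmentShaderSource = R"glsl(
    #version 330 core
    in vec3 FragPos;      // world-space position from the vertex shader
    in vec3 Normal;       // face normal - trivial for cubes
    in vec3 VoxelColour;  // per-voxel colour

    out vec4 FragColor;

    uniform vec3 lightPos;
    uniform vec3 viewPos;
    uniform vec3 lightColour;

    void main()
    {
        vec3 ambient = 0.15 * lightColour;

        vec3 norm = normalize(Normal);
        vec3 lightDir = normalize(lightPos - FragPos);
        vec3 diffuse = max(dot(norm, lightDir), 0.0) * lightColour;

        vec3 viewDir = normalize(viewPos - FragPos);
        vec3 reflectDir = reflect(-lightDir, norm);
        vec3 specular = 0.5 * pow(max(dot(viewDir, reflectDir), 0.0), 32.0) * lightColour;

        FragColor = vec4((ambient + diffuse + specular) * VoxelColour, 1.0);
    }
    )glsl";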

[Screenshot: 2016_12_31_2 – the landscape with ambient, diffuse, and specular lighting]

A great way to go into New Year’s Eve and definitely deserving of a celebratory brandy. Slightly warmed and in an appropriate glass, of course.

Day 4 – This Code Stinks

Old Man Randall (me) fell asleep long before midnight and so I woke up fresh and eager on New Year’s Day to do some more work. I’d had a brainwave a few days earlier: use photographs as the source for landscape height maps, using the brightness (whiteness in black and white photographs, calculated perceived brightness in colour photographs) of each pixel to build an appropriately tall stack of voxels.

However, before I could do anything else the code really needed some work. It had the distinctive whiff of a codebase in which many new things had been tried with little idea of what would work or not. Structure had become scrambled, inconsistencies were everywhere, and commented-out attempts at different approaches abounded. Not to mention I’d built everything into a single executable with no attempt to separate the engine from its usage.

A few hours later, although still not perfect, I had something workable and set about implementing my voxel photographs, starting with black and white. Being a supreme narcissist I selected a photograph of myself from my Flickr library and busied myself trying to load it. This proved to be the hardest part – I’d forgotten how painful C++ libraries and their chains of dependencies can be without a package manager (is there a package manager for C++?).

In the end I realised that CImg could load BMP files without any further dependencies and implemented a new chunk factory that used it to scan the pixels and create voxels based on their whiteness.
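
Roughly what that factory does is the following – a sketch using CImg’s basic pixel access, where the helper names and the height scaling are mine and the luma weights are the usual Rec. 601 ones (for a black and white image all three channels are equal, so it reduces to plain whiteness):

    #define cimg_display 0   // no display window needed, just pixel access
    #include "CImg.h"

    #include <algorithm>
    #include <string>

    // Sketch of the photograph-to-heightmap idea: the brightness of each pixel
    // maps to the height of a voxel column.
    void buildLandscapeFromPhoto(const std::string& bmpPath,
                                 int maxHeightInVoxels,
                                 void (*setColumnHeight)(int x, int z, int height))
    {
        cimg_library::CImg<unsigned char> image(bmpPath.c_str());

        for (int x = 0; x < image.width(); x++)
        {
            for (int z = 0; z < image.height(); z++)
            {
                double r = image(x, z, 0, 0);
                double g = image.spectrum() > 1 ? image(x, z, 0, 1) : r;
                double b = image.spectrum() > 2 ? image(x, z, 0, 2) : r;

                // Perceived brightness in the 0..1 range
                double brightness = (0.299 * r + 0.587 * g + 0.114 * b) / 255.0;

                int height = std::max(1, static_cast<int>(brightness * maxHeightInVoxels));
                setColumnHeight(x, z, height);
            }
        }
    }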

This worked a treat and I captured the video of me moving around the “landscape” that appears at the top of this post.

[Screenshot: 2017_01_01_1 – the self-portrait height map rendered as voxels]

A great way to finish off and in a burst of enthusiasm I quickly sent the link to my fans. Or fan. By which I mean my mum. No celebratory brandy sadly as structured cycling training has restarted but happiness abounds.

Conclusions and What Next?

A few things struck me throughout this short project that I think are worth collecting and reflecting on:

  • The modern Internet truly is a wonderful thing. The last time I was coding in C++ (about 1998) it wasn’t the Internet of today and things were so much harder. It’s an amazing resource.
  • The development community is amazing. There are so many people giving their time and expertise away for free in the form of blog posts, articles, and code that unless you’re doing something really on the bleeding edge there is someone, somewhere, who can help you.
  • Wow C++ has changed – for the better. It’s still complicated but there’s a lot more available out of the box to help you not make simple mistakes.
  • GPUs are incredible number-crunching machines. I’m fortunate enough to have a pretty top-notch GPU in my desktop (a 980Ti) but the sheer number of computations being performed to render, at 60 fps, some of the scenes I threw at this code is astounding. I knew in theory the capabilities of GPUs but didn’t “feel” it – I’m now interested in where I can apply this power elsewhere.
  • Iterative approaches are great for tackling something you don’t know much about. I just kept setting myself mini-milestones and evolving the code, not being afraid to do something really badly at first. By setting small milestones you get a series of positive kicks that keeps you motivated through the hard times.
  • Coding, particularly when it involves learning, is fun. It really is. I still love it. I’ve been coding for around 34 years now, since I was 6 or 7, and it still gives me an enormous thrill. My passion for it at its core is undiminished.

As for this project and what I might do next: I’ve got a couple of simple projects in mind I’d like to use it with, and a small shopping list of things I think will be interesting to implement:

  • Mac / *nix support
  • General optimisation. I’ve only done the really low hanging fruit so far and there is scope for improvement everywhere.
  • Voxel sprite support
  • Shadows
  • Skybox / fog support
  • Paging / virtualisation – this is one of the main reasons that scenes are constructed through the chunk factories
  • Level of detail support
  • Simple physics

Whether or when I’ll get time to do this I’d rather not say – I have a big product launch on the horizon and a React book to write – but I’ve had so much fun that I’m likely to keep tinkering when I can.

 

Debugging the Visual Studio Team Services Build and Release System

I’ve been doing a lot of work with the new task based build and release system in VSTS which, thankfully, is immeasurably saner than the XAML abomination Microsoft have subjected us to in the past.

Mostly things have “just worked” but on a couple of occasions, normally while deploying to Azure, I’ve had inscrutable summary error messages presented to me and needed to know more about what exactly was run.

It turns out it’s really easy to flip the whole system into verbose logging mode by adding a variable to your build or release called System.Debug and setting its value to True.

Super useful – on each occasion I’ve had an issue this has helped me get to the bottom of it.

PowerShell, Binding Redirects, and Visual Studio Team Services

I’ve blogged previously about setting up binding redirects for PowerShell, with Newtonsoft.Json being a particularly troublesome package – it’s such a common dependency for NuGet packages that on a complex project you’ll almost certainly need a redirect in your app/web.config files to get things to play ball, and if you use the Azure cmdlets alongside others (such as your own) you’re likely to face the same problem in PowerShell.

I’ve recently moved my projects into Visual Studio Team Services using the new (vastly improved!) scriptable build system, where I often make use of the PowerShell script task to perform custom actions. If you hit a dependency issue that requires a binding redirect to resolve, my previous approach of creating a PowerShell.exe.config file won’t work in VSTS: unless you build a custom build agent you don’t have access to the machine at that level.

After a bit of head-scratching I came up with an alternative solution that in many ways is neater and more generally portable, as it doesn’t require any special machine setup. My revised approach is to hook the AssemblyResolve event and return a preloaded target assembly, as shown in the example below:
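
Something along these lines – a sketch of the approach rather than my exact script, with the assembly path and the Newtonsoft.Json name standing in for whatever you need to redirect:

    # Preload the version of the assembly we actually want to use.
    $targetAssembly = [System.Reflection.Assembly]::LoadFrom("$PSScriptRoot\Newtonsoft.Json.dll")

    # A ResolveEventHandler can return a value, which is why Register-ObjectEvent
    # won't do here (see the note below).
    $onAssemblyResolve = [System.ResolveEventHandler] {
        param($sender, $eventArgs)

        if ($eventArgs.Name.StartsWith("Newtonsoft.Json,")) {
            return $targetAssembly
        }

        # Fall back to anything already loaded in the app domain
        foreach ($assembly in [System.AppDomain]::CurrentDomain.GetAssemblies()) {
            if ($assembly.FullName -eq $eventArgs.Name) {
                return $assembly
            }
        }

        return $null
    }

    [System.AppDomain]::CurrentDomain.add_AssemblyResolve($onAssemblyResolve)

    # ... run the cmdlets / code that needed the redirect ...

    [System.AppDomain]::CurrentDomain.remove_AssemblyResolve($onAssemblyResolve)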

Note that you can’t use the more common Register-ObjectEvent method of subscribing to events as this will balk at the need for a return value.

You can of course use this technique to deal with other assemblies that might be giving you issues.

Capturing and Tracing All HTTP Requests in C# and .Net

Modern applications are complex and often rely on a large number of external resources, increasingly accessed over HTTP – for example, most Azure services are accessed using the HTTP protocol.

That being the case, it can be useful to get a view of the requests your application is making, and while this can be done with a tool like Fiddler that’s not always convenient in a production environment.

If you’re using the HttpClient class another option is to pass a custom message handler to its constructor, but this relies on you being in direct control of all the code making HTTP requests, and that’s unlikely.
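
For completeness, this is the kind of handler I mean – a sketch of a logging DelegatingHandler, which only helps for HttpClient instances you construct yourself:

    using System.Diagnostics;
    using System.Net.Http;
    using System.Threading;
    using System.Threading.Tasks;

    // Logs each request and its response status to the trace output.
    public class LoggingHandler : DelegatingHandler
    {
        public LoggingHandler(HttpMessageHandler innerHandler) : base(innerHandler) { }

        protected override async Task<HttpResponseMessage> SendAsync(
            HttpRequestMessage request, CancellationToken cancellationToken)
        {
            Trace.WriteLine($"HTTP {request.Method} {request.RequestUri}");
            HttpResponseMessage response = await base.SendAsync(request, cancellationToken);
            Trace.WriteLine($"HTTP {(int)response.StatusCode} {request.RequestUri}");
            return response;
        }
    }

    // Usage: var client = new HttpClient(new LoggingHandler(new HttpClientHandler()));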

A simple way of capturing this information without getting into all the unpleasantness of writing a TCP listener or HTTP proxy is to use the System.Diagnostics namespace. From .Net 4.5 onwards the framework has been writing HTTP events to the System.Diagnostics.Eventing.FrameworkEventSource source. This isn’t well documented and I found the easiest way to figure out what events are available, and their Event IDs, is to read the source.

Once you’ve found the HTTP events it’s quite straightforward to write an event listener that listens to this source. The class below will do this and output the details to the trace writer (so you can view them in the Visual Studio Debug Output window), but you can easily send them to a file, table storage, or any other output of your choosing.
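
A sketch of such a listener is below – the keyword mask I enable here is my reading of the framework source (the network events sit behind a NetClient keyword), so treat it as an assumption and widen it if you want more events:

    using System;
    using System.Diagnostics;
    using System.Diagnostics.Tracing;

    // Listens to FrameworkEventSource and writes each event's ID and payload to
    // the trace output.
    public class HttpTraceEventListener : EventListener
    {
        // Assumption: 0x0004 is the NetClient keyword in FrameworkEventSource.
        private const EventKeywords NetClient = (EventKeywords)0x0004;

        protected override void OnEventSourceCreated(EventSource eventSource)
        {
            if (eventSource.Name == "System.Diagnostics.Eventing.FrameworkEventSource")
            {
                EnableEvents(eventSource, EventLevel.Informational, NetClient);
            }
        }

        protected override void OnEventWritten(EventWrittenEventArgs eventData)
        {
            string payload = eventData.Payload == null
                ? string.Empty
                : string.Join(", ", eventData.Payload);

            Trace.WriteLine($"FrameworkEventSource event {eventData.EventId}: {payload}");
        }
    }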

To set it running all you need to do is instantiate the class.

If you’d like to see this kind of data and much more, collected, correlated and analysed then you might want to check out my project Hub Analytics that is currently running a free beta.

Changing the App Service Plan of an Azure App Service

To allow a number of App Services to scale independently I needed to pull one of them out of an App Service Plan, where it had lived with 3 others, to sit in its own plan – experience had shown me that its scaling characteristics are really quite different from those of the other App Services.

You can do this straightforwardly and pretty much instantly either in the Portal (there’s a Change App Service Plan option in Settings) or with PowerShell (with the Set-AzureRmAppServicePlan cmdlet).

Super simple – but I did encounter one gotcha. This doesn’t move any deployment slots you might have created, so you end up with the main App Service sat in one App Service Plan and its deployment slots in another, which probably isn’t what you want and, in any case, Azure won’t let you swap slots that are in different service plans.

The solution is simple: you can also move them between App Service Plans in the same way.

 

Serving Static Markdown Content from ASP.Net MVC to JavaScript

I recently moved a bunch of documentation into the Markdown format as I wanted to render it into multiple output formats and inside multiple hosting technologies – including an AngularJS-based single page application.

While doing this I decided it would make sense to have a single source of truth for these files and so placed them as content inside my MVC 5 based website, which is entirely public access. After dropping them into the website the first step is to configure ASP.Net to serve the content, which involves adding the below to a web.config file:
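
Something along these lines – a sketch rather than my exact configuration, with the module registration pointing at the CORS module shown in the next snippet (the type name is a placeholder, and the mime type could equally be text/plain):

    <!-- Merge into the existing system.webServer section of web.config -->
    <system.webServer>
      <staticContent>
        <!-- Let IIS serve .md files as static content -->
        <mimeMap fileExtension=".md" mimeType="text/markdown" />
      </staticContent>
      <!-- runAllManagedModulesForAllRequests ensures the managed CORS module
           below also runs for static file requests -->
      <modules runAllManagedModulesForAllRequests="true">
        <add name="MarkdownCorsModule" type="MyWebsite.Modules.MarkdownCorsModule" />
      </modules>
    </system.webServer>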

This enables the Markdown to be served to a browser, but if you try to download it from JavaScript you’ll find it blocked by CORS. I solved this with a small ASP.Net module that is installed in the web.config above and whose code is below:
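
The module amounts to adding a permissive Access-Control-Allow-Origin header to markdown requests – again a sketch, with the class and namespace names matching the placeholder registration above:

    using System;
    using System.Web;

    namespace MyWebsite.Modules
    {
        // Adds a permissive CORS header to any request for a .md file.
        public class MarkdownCorsModule : IHttpModule
        {
            public void Init(HttpApplication context)
            {
                context.PreSendRequestHeaders += (sender, e) =>
                {
                    var application = (HttpApplication)sender;
                    if (application.Request.Path.EndsWith(".md", StringComparison.OrdinalIgnoreCase))
                    {
                        // The content is public anyway, so allow any origin
                        application.Response.AppendHeader("Access-Control-Allow-Origin", "*");
                    }
                };
            }

            public void Dispose()
            {
            }
        }
    }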

Hope it’s useful to you.

Signalling API Unavailability

When upgrading websites it’s often useful to be able to signal to users that your service is offline for maintenance, either in part or in its entirety. That’s quite straightforward to implement unless you’ve got something like an AngularJS or React app – which could well be cached in the browser – that actually wants to respond to 503 status codes returned from a web-based API. Then CORS has a habit of getting in the way.

To help with that I’ve just pushed a super simple and lightweight ASP.Net website to GitHub that responds with a 503 status code to any request made of it while ensuring that the CORS protocol succeeds, meaning the 503 makes its way through to your own error handling.
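
The repository has the details, but the shape of it is roughly this – a sketch of a catch-all handler that lets the CORS preflight succeed and answers everything else with a 503, not the repository’s exact code:

    using System;
    using System.Web;

    // Global.asax code-behind sketch: every request gets CORS headers, OPTIONS
    // preflights succeed, and everything else receives a 503.
    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // Echo the origin (or allow any) so the browser surfaces the 503 to
            // the calling JavaScript rather than masking it as a CORS failure.
            string origin = Request.Headers["Origin"];
            Response.AppendHeader("Access-Control-Allow-Origin", string.IsNullOrEmpty(origin) ? "*" : origin);
            Response.AppendHeader("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS");
            Response.AppendHeader("Access-Control-Allow-Headers", Request.Headers["Access-Control-Request-Headers"] ?? "Content-Type");

            if (string.Equals(Request.HttpMethod, "OPTIONS", StringComparison.OrdinalIgnoreCase))
            {
                Response.StatusCode = 200;  // the preflight itself must succeed
            }
            else
            {
                Response.StatusCode = 503;  // service unavailable for the real request
            }

            Response.End();
        }
    }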

It’s ideal for hosting in an Azure deployment slot during upgrades that swap slots.

Note: an alternative approach would be to use the URL rewriter in web.config. It’s not particularly intuitive or, to my taste, readable, but I believe it can be configured to perform the same task.

