
16 August 2015

Update #1 to Google Maps Directions to GPX data

I've applied several major updates to the tool discussed in this article: Update #1, Update #2, Update #3, Update #4 and the original post.

In a previous post I shared an experimental tool I had been tinkering with that converts Google directions to GPX format.

After quite a few interesting suggestions and discussions with readers I want to present Update #1 to this tool.

Try it


Major changes

  1. New and more user friendly UI with error checking.
  2. More control over output format and output file types (the tool now defaults to generating a file that has the broadest compatibility with the GPX standard).
  3. Auto-generation of unique route point names (format "RPxxx").
  4. Option to include the full direction text with each route point in the <desc> field.
  5. Option to include a brief next turn information with each route point in the <cmt> field.
  6. Optional JSON response for future web development.
The new UI with all options
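For illustration, a single generated route point in the output might look roughly like this (a sketch only; the coordinates and direction text are made up, but the element layout follows the options above):

<rtept lat="48.5839" lon="7.7455">
  <name>RP001</name>
  <cmt>Turn right onto Rue du Dôme</cmt>
  <desc>Turn right onto Rue du Dôme and continue for 350 m</desc>
</rtept>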

Try it

11 July 2015

Updating the ESA Small & Medium Sized Enterprises map (#3)

Since I created the ESA contractor map in November, I've been surprised to see that it gets regular traffic.

I must admit that when I first built this I wasn't concerned with tracking how often the underlying SME database changes on the ESA side, and the project doesn't really have any built-in stats tracking that (yet).

Neither have I found time to fully automate the extraction and parsing of the data from the SME database. The code that produces and prepares the background data lives in a C# application that I currently need to run manually to get the updated lists of companies and fields.

Out of curiosity I decided to run an update on the underlying data today and was surprised at the number of new and updated entries that were located.

View Latest

The Changes

All in all there were 39 major changes to the registry, including new, deleted and updated companies.

I've updated the map markers and fixed a few minor bugs relating to mobile clients.

I also introduced a change list section. This section is shown in the lower right hand corner of the map and lists the changes that I have made to the map source data.

I should do this for all of my applications, actually; it is really handy to track what has changed and when.


A version of this article is also available on LinkedIn
https://www.linkedin.com/pulse/esa-small-medium-sized-enterprises-interactive-map-sigmundarson

7 July 2015

Météo France, converting between different wind speed units (#4)

The small Météo France site I created has been updated with a new parameter that controls the unit in which wind speed is presented.

The u= parameter

The new parameter that controls the wind speed unit is ?u= and it accepts the following values:

Value    Comment
u=bf     Beaufort scale of wind force, between 0 and 12. Unit "F". More information about conversions. Try it
u=ms     Meters per second. Unit "m/s". Try it
u=kn     Knots. Unit "kn". Try it
u=kmh    Kilometers per hour. Unit "km/h". This is the current default. Try it


More information on conversion and meaning of these different scales can be found here and on Wikipedia.

This parameter works for both the website and the weather API service.
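For reference, the conversions behind these units are simple; here is a minimal Python sketch (the Beaufort mapping uses the standard empirical relation v = 0.836 · B^1.5 m/s, not necessarily the site's exact implementation):

def kmh_to_ms(kmh):
    """Kilometers per hour to meters per second."""
    return kmh / 3.6

def kmh_to_kn(kmh):
    """Kilometers per hour to knots (1 kn = 1.852 km/h)."""
    return kmh / 1.852

def ms_to_beaufort(ms):
    """Meters per second to Beaufort force, capped at 12."""
    return min(12, int(round((ms / 0.836) ** (2.0 / 3.0))))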

Examples

Example output for each unit: knots ("kn"), Beaufort force ("F"), kilometers per hour ("km/h") and meters per second ("m/s").

5 July 2015

Expanding the Météo weather forecast site (#3)

I've discussed the work I've done to build a simpler weather UI for the French météo service in earlier posts (first entry and second entry).

We've been having exceptionally hot weather, and I decided to expand the current web app with more of the information provided by the French weather service.

Try it

Severe weather warnings

The Météo service publishes a separate list of areas that have been issued with severe weather warnings (vigilance météorologique).  I thought it might be helpful to have this information readily available to the user.

After fruitlessly attempting to scrape the HTML on this website, and being unable to locate the same elements that the browser debugger indicated were rendered, I discovered to my dismay that most of the content on these pages is rendered by client-side JavaScript. Why, I will probably never understand, as the code is quite a mess to look at.

But luckily I noticed that the whole rendering depended on a single XHR call to an external XML service that didn't require any sort of cross-site authentication. So I simply went straight to the source and saved countless hours of debugging and fiddling with bad HTML. (http://vigilance.meteofrance.com/data/NXFR33_LFPW_.xml)

Unfortunately the source doesn't really represent the data in an easily understandable way, so considerable digging around in the JavaScript code was needed to fully understand all the numerical codes presented in the XML.
The raw XML data from the Météo service
The rest of the JavaScript code and HTML gave me the pieces to understand the colour coding and images to render for each risk type and severity level.
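As a sketch, fetching and walking that XML takes only a few lines of Python (using the requests library; this simply dumps the elements and their attributes, since the numeric codes still need the hand-built mapping described above):

import requests
import xml.etree.ElementTree as ET

URL = "http://vigilance.meteofrance.com/data/NXFR33_LFPW_.xml"

response = requests.get(URL, timeout=10)
response.raise_for_status()
root = ET.fromstring(response.content)

# Dump every element that carries attributes; the numeric codes for
# risk type and severity level still need to be mapped by hand
for element in root.iter():
    if element.attrib:
        print element.tag, element.attrib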

Severe gales / strong wind

While digging around trying to figure out the weather warnings, I realized that my earlier design did not account for displaying warnings and wind speed numbers for any forecast wind gales. This information was already available from my earlier parsing project and needed only minimal tweaking to get right.

This information is now presented either within the day or as a whole day event, depending on which is applicable.

The new information elements.

Try it

This article is also available on LinkedIn:
https://www.linkedin.com/pulse/expanding-météo-weather-forecast-site-sverrir-sigmundarson

16 June 2015

Upgrading the UI for the Météo weather forecast site (#2)

After building the initial version of the Météo weather page and confirming that the data was all correctly handled I went about sprucing the UI up a bit.

Try it

This project has been updated, click here for part three (data enhancements) of this article.

Not only was the original design purely functional and not particularly attractive or user friendly, it was also missing a few key data points, such as intra-day weather and the likelihood of rain.

The Old UI

The tools

With the help of jQuery and jQuery UI I managed to wrestle it into a usable and aesthetically pleasing look and feel.

I ended up using the jQuery UI Accordion without too much customization. This widget suits the website quite well, as I wanted to show an overview of the next 10 days at a glance but then allow the user to drill down into each day to see the intra-day forecast in more detail.

The final UI design

The accordion is closed upon first loading the website, but the user can tap each day to drill down and see more detailed data.
The initial view of the website
Accordion is closed

User has expanded a day and intra-day forecast is visible
Accordion is open

For longer term forecasts additional information is available on how confident the Météo weather model is about the prediction as well as a rough prediction of the likelihood of precipitation.
Extra data points are available for days further into the future

Customizing the forecast area

A new addition was the setup screen. This screen can be used to change the forecast area to a different city or area. This page is available by either clicking the name of the current area at the top of the page or scrolling all the way to the bottom of the page and clicking the "Configure" link.

Try it

The Setup Screen
The user can type into one of the three textboxes

Suggestions are offered as the user types.
Both place names and zip code entries are supported



Feel free to try this forecasting site out and even bookmark it for future use.

Try it


This article is also available on LinkedIn
https://www.linkedin.com/pulse/ui-upgrade-météo-mobile-weather-forecast-site-sverrir-sigmundarson

1 June 2015

Getting hassle free and fast(er) weather forecasts in France (#1)

The Météo-France Android app has been annoying me for the past 6 months with its excessive battery usage and frustrating UI navigation. Finally, this week I'd had enough and decided to come up with something simple that still gives me the same information on my mobile that the rather excellent French meteorological office prepares and publishes.

Try it

This project has been updated, click here for part two (that discusses the UI redesign) and part three (data enhancements) of this article.

Design

As the French Météo does not publish any of its data in an easily programmable way, I decided to do simple screen scraping of their existing forecast website. This is straightforward enough to do in PHP, and it is made considerably easier by the SimpleHTMLDOM library. I highly recommend it; it is the closest I've come to having BeautifulSoup in PHP.
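To give a flavour of the approach (the actual site uses PHP and SimpleHTMLDOM; this is a roughly equivalent sketch in Python with BeautifulSoup, and both the URL and the CSS class are purely illustrative):

import requests
from bs4 import BeautifulSoup

# Illustrative forecast page URL; the real scraper targets the
# Météo-France pages for the configured area and zip code
url = "http://www.meteofrance.com/previsions-meteo-france/strasbourg/67000"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Hypothetical CSS class; the real markup has its own structure
for day in soup.select(".day-summary"):
    print day.get_text(" ", strip=True)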

The service is split into two pages: 

API

Scrapes and re-renders the Météo data into either JSON or JSONP format.

Try it

Supports the following URL parameters:
Parameter    Description
&area=       Lowercase name of the region or area you're interested in, e.g. strasbourg, eckbolsheim or saverne. Defaults to "strasbourg".
&zip=        The zip code for the area. This should correspond to a zip code available in the area used. Defaults to "67000".
&callback=   Optional; if used, the call becomes a JSONP response and this value holds the name of the client-side JavaScript function that should be called when the call returns.


Relies on support data from the following resources:
http://labs.coruscantconsulting.co.uk/strasbourg/meteo/img/spriteCarte40Uvs.png
http://labs.coruscantconsulting.co.uk/strasbourg/meteo/img/spriteCarte40Temps.png
http://labs.coruscantconsulting.co.uk/strasbourg/meteo/legend.css

Website presentation

A simple HTML/JavaScript front end on top of the aforementioned API functionality. Supports both the area and zip parameters described above.

Try it

Examples

Default call in HTML format:
http://labs.coruscantconsulting.co.uk/strasbourg/meteo/index.php

Eckbolsheim weather info in HTML format:
http://labs.coruscantconsulting.co.uk/strasbourg/meteo/index.php?area=eckbolsheim&zip=67201

Saverne API response:
http://labs.coruscantconsulting.co.uk/strasbourg/meteo/api.php?area=saverne&zip=67700

Default API response with JSONP callback:
http://labs.coruscantconsulting.co.uk/strasbourg/meteo/api.php?callback=weatherdatafunction
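Consuming the API from a script is a one-call affair; a minimal sketch using the Python requests library (the URL and parameters are the ones listed above):

import requests

response = requests.get(
    "http://labs.coruscantconsulting.co.uk/strasbourg/meteo/api.php",
    params={"area": "saverne", "zip": "67700"},
    timeout=10,
)
forecast = response.json()
print forecast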



This article is also available on LinkedIn:
https://www.linkedin.com/pulse/getting-hassle-free-faster-weather-forecasts-france-sigmundarson

15 January 2015

Filtering the ESA Small & Medium Sized Enterprises on a map (#2)

The project discussed in this article has now been updated with newer data see post 3.

As a follow-up to the work I did in an earlier post, I decided to expand my initial implementation quite a bit in an attempt to make it more useful for people looking for ESA-vetted contracting firms.

Try it

So I implemented two major enhancements:

1. Richer company info

Now when you click on a marker it will show you much richer details about each company, specifically its primary fields of expertise and contact information (in case you're interested in working there).



2. Filtering of companies

You can now hover over a little blue icon in the lower right-hand corner to open a filter menu. By selecting individual filters in this menu you can limit the markers shown on the map to only those companies that specialize in the selected fields.



Super useful. Wow!

24 November 2014

ESA Small & Medium Sized Enterprises on a map

This article has been expanded and updated in post 2 and post 3.

Recently I had some business with the European Space Agency (ESA) SME Database. This database lists small and medium-sized companies that have been cleared or granted privileges to work as contractors for the ESA.

The primary public access to this database is a long tabular list of company names and addresses. This data can be better presented today by overlaying it on an interactive map instead of a list. Using such a format can also make it easier for people to browse and investigate opportunities in specific areas.

So, in the spirit of being the change you want to see, I set out to create an offline parser (.NET) to source the list of companies from the SME database and then a simple PHP website to render the results onto a nice modern map (in this case Google Maps).

The change

First version is done and you are welcome to use it:

Try it


This first version is very basic. Currently the following improvements are planned:

  1. More details about companies shown when the markers are clicked
         2015-01-11: Page now shows company fields of expertise as well as contact information 
  2. Filter companies by their sector and project types
  3. Select companies and print a list for future reference (in case you're job hunting)



19 August 2014

My ongoing list

My totally unoriginal list of things I'm still learning:

  1. It is easier to ask for forgiveness than permission. Just do it!
      
  2. It's better to fix problems than to try to prevent errors. So try to fail as fast and as early as you possibly can, and learn from it.
      
  3. It's not your baby, so leave your ego at the door. Your work is not "yours" but your employer's. So as soon as it's been typed, it can and should be criticized and worked on by anyone.
      
  4. Everyone does things slightly differently. If things are correct, avoid judging other people's work purely on style or on differences from how "you would do it".

30 July 2014

Improving the TFL Open Data Feeds and APIs #2

So, following up from my earlier post about the TFL Data APIs, I don't think I was being fair towards the TFL dev team. The API that is currently provided, called TrackerNet, was made available in August 2011 and is now going on its 3rd year. The API is still quite horrendous by modern (and 2011) standards, but granted, it is the first step in publishing the vast network of disparate data sources in use within the TFL after its merger in 2000.

Little did I know that there is indeed an impressive brand new API (api.beta.tfl.gov.uk) powering the newly redesigned tfl.gov.uk website that was launched in late March 2014. I discovered this by accident when investigating the apparent difference in information between station disruptions in the TrackerNet API and what was displayed on the TFL website.

Below are images showing the difference in information provided for the same tube station "Paddington" between the old TrackerNet API and what is shown on the TFL website itself.

TrackerNet results for Paddington station

TFL website information for Paddington station
(only the last paragraph is available in TrackerNet)

The new API

api.beta.tfl.gov.uk offers information about seemingly every aspect of the TFL network and facilities. The developers seem to have approached the new TFL site as simply one presentation layer on top of this vast and rather well-performing API.

The guidelines ask that developers who want to leverage the API sign up for a key and an app-id that should be submitted with every query. But all API calls seem to work just fine without supplying these values.
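A quick sketch of such a call in Python (using the requests library; the BikePoint endpoint and the commonName field are illustrative guesses at the beta API's layout, so treat them as assumptions):

import requests

# Illustrative endpoint; app_id/app_key parameters can be added,
# although calls appear to work without them
response = requests.get("https://api.beta.tfl.gov.uk/BikePoint", timeout=10)
response.raise_for_status()

# Print the first few bike point names from the (large) JSON response
for point in response.json()[:5]:
    print point.get("commonName")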

One minor gripe that I have with the API is that it is perhaps overly verbose, with average result sizes close to 1MB (uncompressed). Comparing the size of the Boris bikes feed between the APIs:

              Old API    New API
Compressed    31.8KB     110KB
Uncompressed  266KB      1.6MB

But this is a small price to pay today for a much richer and consolidated API experience.

Unfortunately the data published by the old API is not compatible with the new one; most notably, the station and tube line names are different. Luckily, developers such as Matt Hinchliffe have already gathered some really useful support data to make the data and its use easier.

Now I guess I should update my bike and bus websites to leverage this newfound API gold :)

3 June 2014

Programming Garmin Nüvi 1690 SatNav using C#

Update Oct 2015
I've now written a web-tool that automagically converts
Google Maps directions to GPX format. It's pretty awesome!

Take me to the new article
 

Oh dear, this is going to be one of those things that make you go d'oh.

If you just want the tech bits and want to skip the narrative, you can jump to the Solution section below.

I have a now rather ancient Garmin SatNav, a Nüvi 1690. I bought it a few years back when I realized that renting a GPS device with my rental car would be more expensive after the second time than buying a new device.

This little device has paid for itself many times over by now. However, the process of getting data points onto the device prior to a trip has always been a very complicated and roundabout one, which I've vowed to make smoother every time I go through it.


My Process

I like to plan my medium to long trips using either Google Maps or Google Earth. These are excellent tools for finding hotels, parking and places of interest, and for planning which roads to take. However, neither of these two tools can handle the necessary GPS data formats (GPX being the most common) that the Garmin tools do. The Google tools (understandably, if you remember their original acquisition source) only deal with the KML and KMZ data formats.

This has required me to take the following steps:
  1. Plan points and routes in Google Earth/Maps
  2. Save waypoint data from Google Earth to KML file
  3. Convert KML file to GPX file (excellent tool for this is GPS Visualizer)
  4. Handle any data format errors
  5. Use any of the many Garmin tools to import the data to the device
    1. Sometimes I've used Google Maps' "Send to GPS" feature for single points. This feature seems to have been removed from the new Google Maps.
    2. BaseCamp has very good support for importing waypoints and routes. It regretfully lacks support to delete said points from the Nüvi device though.
    3. myGarmin web page used to have an import feature with their GarminConnector plugin (but this is gone now)
    4. MapSource had some support for this (but this tool is discontinued now and cannot be downloaded)
    5. Garmin Express is simply useless when it comes to uploading waypoints (it is atrociously simple)
This time the only tool available to me was the rather nice BaseCamp tool. However, I ran into problems as I was unable to delete some previously loaded waypoints off my device. BaseCamp could not delete anything, and even after deleting everything I could from the device menus there were still some points lingering in there.

Deleting Everything

On Windows 7 and newer the Garmin device shows up as an external USB storage drive. So after spending a very unsuccessful hour trying to clean my previous data off the device, I finally decided to open up Windows Explorer and go hunting through the mounted drive.

I finally found that the GPX folder actually held an archive of my waypoints and trips that were somehow read by the device. Deleting every gpx file in this folder removed all custom points from the device (I also removed the Archive folder for good measure).

The Garmin USB Protocol

My perception of the device's communication protocol has always been the thing that scared me away from actually diving in and creating a helper application. This time I decided to embark on trying to understand and leverage this protocol to automate my process as much as I could.

Cutting a very long story short: after numerous attempts using their Garmin SDK library and reverse engineering their .NET libraries, I had made no real progress. The SDK samples I had strangely just indicated that there was no compatible GPS device connected to my machine. This was weird, as I could easily verify that the device was there by using their BaseCamp tool and the web-based GarminConnector (both showed it connected and accessible).

Solution: USB Mass Storage Mode

Embarrassingly, in desperation I went back to Windows Explorer and scoured the device that was mounted as my G:\ drive, trying to find any hints as to how to access it.

I found a rather interesting file under the Garmin\ folder named GarminDevice.xml. Looking through this file, it wasn't long until the sheer stupidity of what I had been trying to do hit me.

I was already accessing the device! I had previously removed waypoints by deleting files from the GPX directory; I could just as well add a file in there with my new data!

And sure enough, this is indeed the case for a large range of Garmin devices. Below is an explanation of the relevant GarminDevice.xml section:
The GpsData DataType element also contains a File element with TransferDirection=InputToUnit which specifies a file type, extension and file path. The application places a GPS Exchange (.gpx) file containing routes, tracks and waypoints in the directory path specified. This file will be processed by the GPS device upon exit from mass storage mode.
This has reduced the problem to a simple FileIO and XML parsing exercise which is trivial to implement using any modern programming language (e.g. C# or Python).
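As a rough Python sketch of that exercise (the element names follow the GarminDevice.xml excerpt above, but the exact nesting of the path element may vary per device, so treat this as illustrative rather than definitive):

import os
import shutil
import xml.etree.ElementTree as ET

device_root = "G:\\"   # the Nüvi mounts as a USB mass-storage drive

tree = ET.parse(os.path.join(device_root, "Garmin", "GarminDevice.xml"))

# Strip XML namespaces so we can match on bare tag names
for element in tree.iter():
    if "}" in element.tag:
        element.tag = element.tag.split("}", 1)[1]

# Locate the GpsData DataType and its InputToUnit file path
gpx_dir = None
for datatype in tree.iter("DataType"):
    if datatype.findtext("Name") == "GpsData":
        for f in datatype.iter("File"):
            if f.findtext("TransferDirection") == "InputToUnit":
                gpx_dir = f.findtext(".//Path")

# Drop a GPX file into the advertised directory; the device processes
# it when it exits mass-storage mode
if gpx_dir is not None:
    shutil.copy("myroute.gpx", os.path.join(device_root, gpx_dir))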

Stay tuned for my new app :)

26 May 2014

The Extra Effort


“Perfection is achieved, not when there is nothing more to add, 
but when there is nothing left to take away.” 
– Antoine de Saint-Exupery

There is an interesting piece of research published recently in Social Psychological and Personality Science, "Worth Keeping but Not Exceeding: Asymmetric Consequences of Breaking Versus Exceeding Promises".

It describes research into how promises and contracts are viewed by people, especially how their completion is perceived and how people are likely to react to under- and over-delivering on promises as compared to delivering on exactly what was agreed upon.
"Businesses may work hard to exceed their promises to customers or employees, but our research suggests that this hard work may not produce the desired consequences beyond those obtained by simply keeping promises. [...] The results of our experiments suggest that it is wise to invest effort in keeping a promise because breaking it can be costly, but it may be unwise to invest additional effort to exceed one’s promises. When companies, friends, or coworkers put forth the effort to keep a promise, their effort is likely to be rewarded. But when they expend extra effort in order to exceed those promises, their effort appears likely to be overlooked."

This somewhat confirms what I have experienced during my career when it comes to delivering software. It is the single most important thing to deliver what was originally agreed upon in a working state. A solution with the minimal level of features that each does only exactly what was agreed upon is in most cases much better received than a solution that has a bunch of bells and whistles even though both solutions deliver equally reliable software on the exact same promise.

Pro-Tip

Deliver only what was agreed on and at the right time. Nothing more, nothing less. Never mention any bonus features but consider saving any that may have been implemented as a free surprise a few weeks after a successful delivery on the initial contract.

Publication information:

Worth Keeping but Not Exceeding: Asymmetric Consequences of Breaking Versus Exceeding Promises Ayelet Gneezy and Nicholas Epley
Social Psychological and Personality Science published online 8 May 2014
DOI: 10.1177/1948550614533134

Abstract
Promises are social contracts that can be broken, kept, or exceeded. Breaking one’s promise is evaluated more negatively than keeping one’s promise. Does expending more effort to exceed a promise lead to equivalently more positive evaluations? Although linear in their outcomes, we expected an asymmetry in evaluations of broken, kept, and exceeded promises. Whereas breaking one’s promise is obviously negative compared to keeping a promise, we predicted that exceeding one’s promise would not be evaluated more positively than merely keeping a promise. Three sets of experiments involving hypothetical, recalled, and actual promises support these predictions. A final experiment suggests this asymmetry comes from overvaluing kept promises rather than undervaluing exceeded promises. We suggest this pattern may reflect a general tendency in social systems to discourage selfishness and reward cooperation. Breaking one’s promise is costly, but exceeding it does not appear worth the effort.

20 February 2014

Failure resiliency and retry delay planning

When dealing with networked or external data sources, I've learned the hard way that all code should be designed to expect failures. It is significantly cheaper and easier to bake graceful error handling in from the start than to attempt to add it later on. A common first step in combating failures and providing resiliency is to have your logic retry an operation in case of a failure.

But how should you retry?

Simple retries

The most simplistic retry mode is to simply surround your code with a loop that executes your block a predefined number of times, like so:

int retry = 0;
do
{
   // Attempt the operation; stop as soon as it succeeds
   if( true == MyFlakyOperation() )
       break;
}
while ( ++retry < 6 );

The problem with this approach is that it completely ignores the most likely underlying reason for the failure. Congestion or resource load on the remote end could be causing your calls (and many others) to intermittently fail as the server cannot handle the incoming requests. In this case your naive implementation might actually be contributing to making the situation even worse.

So how do we solve this?

Spacing out retries

One common approach to spacing out retries is called Exponential backoff. This algorithm uses a predefined feedback (e.g. retry count) to systematically increase wait times between repeated executions of the same code to avoid congestion.

Example of exponential spacing based on 4sec base wait time.
The vertical bars indicate retry points.
The idea is that with every additional retry that is required it is more likely that the system we're communicating with is heavily congested and needs more "breathing" space to resolve its current backlog of requests.

Example of backoff retries

Below is a very simple C++ snippet that performs this kind of exponential backoff based on 4-second intervals:

#include <chrono>
#include <cmath>
#include <thread>

int success = OK;
int retry = 0;
do
{
   // Operation
   success = MyFlakyOperation();

   // Back off before the next attempt, but only if we will actually retry
   if (success != OK && retry + 1 < 6)
   {
       int sec = static_cast<int>(std::pow(4, retry + 1));
       std::this_thread::sleep_for(std::chrono::seconds(sec));
   }
}
while ( ++retry < 6 && success != OK );

In this example the algorithm has a maximum running time, with the full retry count exhausted, of a whopping 22 minutes and 44 seconds! (4+16+64+256+1024 = 1364 sec)

How much does the waiting time increase?

Care must be taken when choosing the base interval when using a naive approach like my example above. Below is a table listing the waiting time in seconds at each retry, for base intervals of 2-7 seconds.

Remember that your maximum running time is the cumulative sum of the waiting times across all retries!

Retry#    2sec       3sec       4sec         5sec         6sec          7sec
1         2          3          4            5            6             7
2         4          9          16           25           36            49
3         8          27         64           125          216           343
4         16         81         256          625          1,296         2,401
5         32         243        1,024        3,125        7,776         16,807
6         64         729        4,096        15,625       46,656        117,649
7         128        2,187      16,384       78,125       279,936       823,543
8         256        6,561      65,536       390,625      1,679,616     5,764,801
9         512        19,683     262,144      1,953,125    10,077,696    40,353,607
10        1,024      59,049     1,048,576    9,765,625    60,466,176    282,475,249

* So, using 7 sec as a base and allowing up to 10 retries, the total maximum waiting time will be just shy of 10.5 years!
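That figure is easy to sanity-check with a Python one-liner:

# Cumulative wait for a base of 7 seconds over 10 retries
total = sum(7 ** r for r in range(1, 11))
print total, "seconds, or roughly", total / (3600.0 * 24 * 365.25), "years"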


13 May 2013

Base64 encoding image files using Python

In an earlier entry I mentioned that I was building a little app to display a bunch of static information. Sometimes I find it immensely useful to embed all resources into a single data file. Not only does it make handling the data simpler, it also simplifies retrieving it over the internet.

My data is a bunch of text with a few images to go with it. I wrote the code below (in Python) to automate embedding the image data directly in with the other text that I will be serving.

# Required imports (module level)
import base64
import os
import urllib

@staticmethod
def encode_image_as_base64( image_path, base_path ):
    """ 
        Loads an image from the supplied path (using the base path as a 
        reference) and returns it as a base64 encoded string
    """
    
    # The image path can be a URI with a query string. 
    # Remove any query string elements, basically everything following 
    # a question (?) mark
    image_path = image_path.split("?")[0]
    
    file_name = os.path.join( base_path, image_path )
    file_name = urllib.unquote( file_name )
    
    print "Encoding image: " + file_name
    
    # Read the raw bytes and return them base64 encoded
    with open( file_name, "rb" ) as image_file:
        return base64.b64encode( image_file.read() )
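Usage then looks something like this (the container class name and paths are hypothetical, for illustration only):

# Embed an image alongside text in a single data record
encoded = ResourceBundler.encode_image_as_base64( "img/logo.png?v=2", "/srv/app/data" )
record = { "title": "My screen", "image_b64": encoded }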


It may not compress all that well but it makes data handling clean and simple.

11 May 2013

Calculating File Change List using SHA256

I realised today that I needed a good way to create a change-list document for a small app that I am writing. The app has a few hundred screens, each of which is backed by a single data document containing the information that the screen shows.

This data is downloaded in individual JSON files from the web the first time the user launches the application. Each of these JSON files can be amended after the app is released. I wanted to be able to provide a quick and easy way for the app to download changelist information to detect if anything needed updating.

I wanted the data separated in such a way that if only one file changes the user only needs to download that one piece of data, not the rest. This is crucial, as the entire dataset is just shy of 100MB whereas each individual file averages around 400K.

Python to the rescue

I wrote a little script that creates a single changelist.json file covering all of my data files, using the built-in SHA256 hashing (hashlib) that ships with Python. So simple and quick:

import hashlib
import json
import os

out_file_path = r"C:\outfiles"
jdata = {}
for path, subdirs, files in os.walk(r"C:\myfiles"):
    for name in files:
        # Skip non-JSON files and anything already in the output directory
        if( not name.endswith(".json") or path == out_file_path ):
            continue
            
        json_file_path = os.path.join(path, name)
        
        # Hash the file contents; key the entry by file name sans extension
        jdata[name.replace(".json", "")] = hashlib.sha256(open(json_file_path, 'rb').read()).hexdigest()
        
out_file_name = os.path.join(out_file_path, "changelist.json")
print "Writing "+str(len(jdata))+" change list items to output file '"+str(out_file_name)+"'"

# Now write the file to the out location and quit
with open(out_file_name, 'w') as outfile:
    json.dump(jdata, outfile, indent=4, separators=(',', ': '))
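On the client side, detecting what needs re-downloading is then just a dictionary comparison; a sketch (the helper name is hypothetical):

def changed_screens(local_changelist, remote_changelist):
    """Return the screen names whose remote hash differs from the local one."""
    return [ name for name, digest in remote_changelist.items()
             if local_changelist.get(name) != digest ]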

Python, gotta love it :)

20 April 2013

Layouts in Android not switching correctly between portrait and landscape

After having followed the examples on the Android developer site to the letter about supplying multiple layouts for different screen sizes and orientations, I was still having issues with them on extra-large screens (10" tablets).

Basically the layout would not load when I rotated the tablet while the application was running.

If I started the application in portrait then that layout was correctly selected, and when I started the app in landscape that layout was likewise correctly loaded. But after one layout was loaded, an orientation change did not trigger Android to switch automatically.

The culprit was in my AndroidManifest.xml file. For some reason, when I first created my activity the configChanges attribute had been set automatically to:
android:configChanges="keyboardHidden|orientation|screenSize"
Removing screenSize solved the issue:
android:configChanges="keyboardHidden|orientation"
In hindsight this makes sense: declaring a configuration change in configChanges tells Android the activity will handle it itself, and on API 13+ a rotation also counts as a screenSize change, so with screenSize declared the activity was never restarted and the alternative layout was never reloaded.
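For context, the resulting activity declaration ends up looking like this (the activity name is illustrative):

<activity
    android:name=".MainActivity"
    android:configChanges="keyboardHidden|orientation" />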

Obvious eh...

Edit: This behavior is actually documented in the Android development guide.