Tin foil hats or a new way of life?

Twenty years ago the prime time television news started reporting financial highlights to the masses: the “Aussie dollar is up against the greenback and the ASX is down 300 points”. Up until then there was no general consciousness in the broader population of the everyday status of the financial markets and systems.

That changed with the rise of compulsory superannuation and the float of Telstra in the mid-90s. All of a sudden ‘mum and dad’ investors emerged; through their super funds or Telstra shares they had a stake in listed Australian companies. Whilst the complexity of the financial system is really the purview of only a few, the masses felt a connection, and the media filled that demand – although I’d challenge you to find more than a small percentage of people on a suburban street who could explain the long term bond market, or the difference between going long or short on a stock.

Over the past few months there’s been a sense of deja vu with blockchain and cryptocurrencies. Recently some media outlets have been adding the price of Bitcoin to their financial reports – sparked almost certainly by the stratospheric rise (and subsequent fall) of the Bitcoin price. This despite only a tiny fraction of the population even owning cryptocurrency, and an even smaller percentage being able to explain cryptocurrencies or, more importantly, the technology which enables those currencies to exist.

So what is blockchain?

Blockchain is a technology concept enabling the creation and maintenance of a robust audit trail of transactions or data. By robust I mean it is practically impossible to compromise the integrity of the information, and the identity of anyone altering the data is definitive.

Here’s a simple accounting analogy. I’m old enough that my first management role was with a business that still kept its accounts on paper (yes, gasp). We didn’t have Quickbooks or Xero – in fact we didn’t even have computers on everyone’s desks when I commenced.

The accounts were maintained in a paper ledger, where we recorded credits and debits and a running balance. If we later discovered a mistake, we could go back up the columns of numbers, alter the incorrect item, then update each subsequent number that relied on the amended item. The obvious flaw here is that there are absolutely no controls:

  • anyone could change the numbers
  • there was no way of being sure numbers had or had not changed
  • nobody knew who had changed the numbers
  • if the building burnt down, or we were burgled, or I spilt my morning coffee on my desk, we would lose all records of our transactions.

Blockchain resolves all of those failings by introducing integrity, identity and secure storage into the process.

Blockchain enables you to maintain a register of data, where each change or transaction occurs in sequential order, and each change is constructed in such a way it relies on the preceding transaction to remain valid – hence the idea of a ‘chain’.

Blockchain means that if I go back up the chain of transactions and alter the data, every subsequent transaction that flowed from that transaction I amended will be rendered invalid.

Blockchain uses strong cryptography – a hashing algorithm called SHA-256, for those in the know – to link each entry to the one before it, and private ‘keys’ to authenticate each person who adds to or amends the chain.

Blockchain uses a ‘distributed’ model, whereby copies of your chain are held on many computers around the world, with changes synchronising across the network, so even if a few computers fail, your data remains safe.
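The chaining idea is easy to demonstrate. Here’s a toy sketch using the everyday sha256sum tool – each ledger entry’s hash covers both its own data and the previous entry’s hash, so altering an early entry changes every hash after it (the ledger entries themselves are made up, of course):

```shell
# Compute the hash of one ledger entry: previous hash + this entry's data.
chain_hash() {
  printf '%s%s' "$1" "$2" | sha256sum | cut -d' ' -f1
}

# Build a three-entry chain.
h0=$(chain_hash "" "genesis")
h1=$(chain_hash "$h0" "credit 100")
h2=$(chain_hash "$h1" "debit 40")

# Tamper with the second entry and recompute the rest of the chain:
# the final hash no longer matches, so the alteration is detectable.
t1=$(chain_hash "$h0" "credit 999")
t2=$(chain_hash "$t1" "debit 40")

if [ "$h2" != "$t2" ]; then
  echo "tampering detected"
fi
```

Run it with sh and it prints “tampering detected” – and that, in miniature, is why you can’t quietly go back up a blockchain and fiddle the numbers.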

Who invented blockchain?

Pundits will tell you blockchain is going to revolutionise society as we know it, eventually touching our lives in a myriad of ways. So it’s rather astounding that the identity of the person who kicked all this furore off is not known. The origins of blockchain lie in a paper published under the name Satoshi Nakamoto. Problem is, we have no idea who Satoshi Nakamoto is. At one stage some overenthusiastic media outlets tracked down a possible candidate in California, camped outside his home and plastered TV screens with his picture. Turns out the poor chap was not Satoshi Nakamoto, and had no idea what blockchain was. There’s also a perhaps opportunistic Australian character, Craig Wright, who has endeavoured to claim the credit.

What’s the difference between cryptocurrency and blockchain?

Cryptocurrency is the first widespread, publicly visible use of blockchain, because blockchain is very well suited to maintaining financial transaction information, as I’ve highlighted above. But they are not the same thing. I can use Quickbooks to maintain my accounts in many different currencies; likewise, blockchain is the ‘ledger’ system which enables a currency to exist.

At the end of the day any currency is simply a promise of an exchange of value: I give you $2, you give me a Mars Bar. Ever since the Knights Templar acted as the world’s first Western Union, we’ve used representations of value to enable transactions. Cryptocurrencies are just the next phase. Bitcoin is the most well known, but there are actually hundreds of cryptocurrencies which have popped up. And there are many other uses for blockchain beyond currencies.

Should I retire to my fallout shelter?

No. Much as the emergence of the internet has fundamentally changed how we live our lives (although it’s tough to explain to my children what life was like before the net), and sequencing the genome is generating new understandings of human biology affecting many branches of medicine, blockchain has the potential to be another fundamental change to the way many things in our lives work – even if it’s not immediately obvious to the average person on the street. You can skip the tin foil hats.

Testing web sites in multiple versions of Internet Explorer


There’s nothing more frustrating than running a web site only to receive complaints that it “doesn’t work” in some specific version of a browser. The major culprit almost always seems to be Internet Explorer – I’ll leave the philosophical debate about why this might be the case for another day.

The issue came up for me because I was messing with a little personal project the other night that uses the Bootstrap 3 framework. I tried the page in IE11 with IE8 emulation, and the pages didn’t work correctly – despite being on a full size screen, the pages were acting all ‘responsive’ on me and rendering the mobile versions. I then read somewhere that IE8 emulation on IE11 is not accurate – pretty useful! It seems IE11 in IE8 emulation mode does not respect conditional comments, which was what was messing with my responsive design: it was not loading the media queries include that’s needed to correctly render modern CSS in an IE8 browser.
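For reference, the include in question for Bootstrap 3 is the standard snippet from its documentation – html5shiv plus the Respond.js media-query polyfill, wrapped in a conditional comment so only IE older than version 9 downloads it (the CDN URLs below are the commonly published ones; check them against the current Bootstrap docs):

```html
<!--[if lt IE 9]>
  <script src="https://oss.maxcdn.com/html5shiv/3.7.2/html5shiv.min.js"></script>
  <script src="https://oss.maxcdn.com/respond/1.4.2/respond.min.js"></script>
<![endif]-->
```

A real IE8 honours the conditional comment and loads the polyfills; IE11 in emulation mode apparently does not, hence the mobile rendering.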

Of course the problem here is I don’t have IE8 on a real machine available to me anywhere.

However, help is at hand, thanks to Microsoft, which has made available pre-packaged virtual machines for many combinations of Internet Explorer and Windows.

Provided you have virtual machine software – for example Virtual PC on Windows or Parallels on Mac – you are in business. I use Parallels, and the range of VM configurations available to me is pretty wide: there’s even IE6 on XP, as well as IE8, IE9, IE10 and IE11 on Windows versions including Vista, Windows 7 and Windows 8.

[Screenshot: available IE virtual machine configurations in Parallels]

Downloading and setting up the VMs is a snap – within a few minutes I had a fully functional version of IE8 running on Windows 7 in a window on my Mac, and 10 minutes after that had diagnosed the problem with my web site and applied a fix. Note: if you are using Parallels on Mac there is some helpful information on the Parallels web site about configuring the VMs, starting with the fact that the file you download has the wrong file extension – you need to manually update the file name before Parallels will recognise it as a VM.

Copy Azure SQL Database to local machine (SQL 2008 R2)

You would have thought in these days of point and click a simple task like copying a database from one place to another would be a walk in the park. So when I wanted to take a copy of a SQL database from Azure and move it to my local machine, I thought it surely would be straightforward. It’s not. Here’s what I came up with:

In Azure make sure you have a Storage account set up, with a container inside. My container is called ‘files’.


In Azure go to your database. Click the Export button down the bottom.


A new window will pop up. The file name will probably be pre-filled. Select your blob storage account (if you only have one then it’s already selected). Select a container (mine is ‘files’). Enter your SQL Server log in name and password – these are the credentials you set up when you added the database. Click OK and the export should start – you’ll see the status down the bottom.


Once the export is done, you can click to view your Storage container, and you should see the export file there.


The trick now is to get the file from Azure to your local machine. I use Azure Storage Explorer, it’s a free download. Download the file from your container.


If you have SQL Server 2012, then you are home free, because you can import the BACPAC file in SQL Management Studio: right click on the Databases folder under a server in the Object Explorer and choose “Import Data-tier Application”.

If like me you are on SQL Server 2008 R2, you need some help.

Create a new blank database on your local machine.

Go to http://msdn.microsoft.com/en-us/jj650014 and install the SQL Server Data Tools for Visual Studio 2010. After installing, find SqlPackage.exe – I located it in C:\Program Files\Microsoft SQL Server\110\DAC\bin. Open up a DOS prompt and use the following command:

SqlPackage.exe /a:Import /sf:c:\Desktop\FILENAME.bacpac /tdn:DATABASENAME /tsn:SQLSERVERNAME
  • FILENAME.bacpac is the file you downloaded from Azure, wherever it is located on your machine.
  • DATABASENAME is the name of the new blank database you created a minute ago.
  • SQLSERVERNAME is the name of the SQL server on your local machine.

I followed these steps, albeit for a fairly small file, and it worked perfectly.

PuTTY SSH connections dropping when on wifi instead of ethernet


I use PuTTY to create SSH tunnel connections to servers, and usually have the connection up all day. Recently I’ve been plagued by PuTTY losing the connection every 15 or 20 minutes – all because I moved to wifi instead of a wired connection. It’s because I bought a new MacBook Pro, which doesn’t have a standard ethernet port – a small fact I didn’t realise, so I didn’t buy an adaptor. Plus my family was probably becoming a little bored with the blue network cable hanging down the stairs from our cable router to my study – it makes the place look untidy.

I already had keepalives configured in PuTTY:

[Screenshot: PuTTY keepalive settings]

After some research I came across the post ‘Why Windows 7 / PuTTY drop TCP connections even on very brief outages?’, which seemed to mirror my own experience to some extent.

I took a look at my Windows registry and did not see the TcpMaxDataRetransmissions parameter, so I added it with a value of 15 (the Super User post suggests 15, the linked FAQ on the PuTTY web site says 10).
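If you’d rather script the change than click through regedit, the value lives under the standard TCP/IP parameters key and can be added from an elevated command prompt (a reboot is typically needed before it takes effect):

```shell
reg add "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters" /v TcpMaxDataRetransmissions /t REG_DWORD /d 15
```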


With this change my connection remained up for a couple of hours and then died – a darn sight better than the previous habit of dropping a couple of times every hour. Except this time PuTTY didn’t report the connection as dropped; it looked fine, I just couldn’t reach anything at the other end of the tunnel.

I decided to fiddle with the PuTTY keepalive and changed it from 30 seconds to 15 seconds; over the past day or two I’m seeing 3 or 4 hours of uptime. Interestingly, I only notice when I try to connect to a server and receive a timeout message – previously PuTTY would say it was disconnected, now it seems to think the session is still current.

All in all, good progress. When I have some time I’ll play with the configuration some more. I haven’t even got to looking at whether there is a setting on my wifi router that might be to blame.

Create Twitter-like REST API endpoints with IIS URL rewrite

Twitter’s REST API uses endpoints that look like this:

https://api.twitter.com/1.1/statuses/show.json

In addition, for GET requests you can append parameters like this:

?id=123456

So for example a call to the Twitter REST API might look like this:

https://api.twitter.com/1.1/statuses/show.json?id=123456
I wanted to try and reproduce this format with a REST API I’ve been playing with, so I spent some time figuring out how to configure IIS URL rewrite to handle these URLs and deliver the parameters to an ASP page.

You’ll need URL Rewrite installed; in IIS Manager you should see the URL Rewrite icon if you click on a web site, for example:

[Screenshot: URL Rewrite icon in IIS Manager]

If the icon is not there, download and install the URL Rewrite extension from the Microsoft web site, then open URL Rewrite and add a new Inbound Rule.

Here’s the regular expression pattern I came up with:


With the Rewrite URL of:


This means that a URL like this:


Will pass the parameters into my ASP page like this:

objectTypeName = myObject
objectMethod = myMethod
param1 = a
param2 = b

You need to make sure you have the “Append query string” option checked for the rule so the additional parameters (e.g. param1 and param2) are passed through the rewrite.
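If you’d rather define the rule directly in web.config than through the IIS Manager dialog, a rule along these lines does the job – the regex pattern, rule name and ASP page name here are illustrative assumptions, so substitute your own:

```xml
<rewrite>
  <rules>
    <rule name="RestApi" stopProcessing="true">
      <!-- two path segments: the object type, then the method -->
      <match url="^([^/]+)/([^/]+)$" />
      <!-- appendQueryString passes extra parameters such as param1 and param2 through -->
      <action type="Rewrite" url="api.asp?objectTypeName={R:1}&amp;objectMethod={R:2}"
              appendQueryString="true" />
    </rule>
  </rules>
</rewrite>
```

With a rule like this, a request for /myObject/myMethod?param1=a&param2=b is rewritten to api.asp carrying the four parameters shown above.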

Time Machine: The problem may be temporary. Try again later to back up. If the problem persists, use Disk Utility to repair your backup disk


Yesterday Time Machine on my MacBook started to fail with the message:

“The problem may be temporary. Try again later to back up. If the problem persists, use Disk Utility to repair your backup disk”

It kept erroring after each hourly backup attempt. So I tried the obvious: unmounted the external HDD, turned it off and on, remounted, ran the backup again – no dice.

I ran Disk Utility to repair the disk, no dice.

I then found this forum post, which led me to the absolutely fantastic Pondini site (the link in the forum post is old, you need to go here), and smacked my head because I’m an idiot – I already have the Time Machine Buddy widget on my Desktop, it’s just been so long I had forgotten about it.

That showed me:

“Starting standard backup
Backing up to: /Volumes/2TB/Backups.backupdb
Waiting for index to be ready (101)
2.52 GB required (including padding), 1.18 TB available
Indexing a file failed. Returned 1 for: /Library/Spotlight, /Volumes/2TB/Backups.backupdb/Apple’s MacBook Pro (2)/2013-09-03-071009.inProgress/C981E713-FC42-4A7D-BDD5-C83709DF0D5B/Macintosh HD/Library/Spotlight
Aborting backup”

This shows that Time Machine has been choking on a Spotlight file, which in turn led me to notice that my Spotlight seemed to be taking forever to complete an index (you can tell Spotlight is indexing – there’ll be a tiny black dot in the circle of the Spotlight magnifying glass at the top right of your screen).

[Screenshot: Spotlight indexing in progress]

In fact, at one point the time remaining said “About 11 MONTHS”. Not encouraging. I know there’s a way to force Spotlight to reindex your Mac:

  • Open System Preferences
  • Click Spotlight icon
  • Click the Privacy button
  • Add your HDD to the list “Prevent Spotlight from searching these locations”
  • Close System Preferences
  • Re-open, click Spotlight icon, click Privacy button, remove your HDD from the list, close System Preferences

This should kick off a fresh index – running sudo mdutil -E / from Terminal achieves the same thing – and it’s much easier than a method I figured out a while ago where you manually delete the Spotlight index file (at one stage my Spotlight decided it didn’t want to index my local hard drive, just my external drives).

So far so good. Time Machine successfully completed its last backup. Spotlight has been sporadically re-indexing, but appears to be returning correct search results when I try a few tests.

If I have both ethernet and wifi enabled on my Apple Mac, which connection is used?

I’ve long pondered this – if I have my MacBook connected both to ethernet and WiFi, which internet connection is it actually using?

In general, sitting at my desk I want my Mac on ethernet, because speed-wise I see anything up to 100Mbps down our cable connection. But I like having the WiFi on, because then my iPhone and iPad sync in the background and I don’t have to remember to turn the WiFi on and off as I move to and fro. However, the connection over WiFi is slower, and I tend to see disconnects from my company’s VPNs – although I could probably solve that with a bit of fiddling with the cable router configuration.

Finally I’ve tracked down the answer, and it’s simple.

Open System Preferences > Network. You’ll see a list of your connections on the left hand side. The active ones are green.

[Screenshot: Network preferences showing the connection list, active connections in green]

The answer is, your Mac will use the connections in the order they are listed – so if Ethernet is active and at the top of the list, that’s the connection used.

By default the order looks good to me – but you can change it. Click the little ‘gear’ icon at the bottom of the list; there’s an option to set the service order.

[Screenshot: the service order option under the gear menu]