Archive for November, 2009

To move or not to move… why is it always a nightmare when hardware is involved?

Monday, November 9th, 2009

This is actually my first ever post on the WAYN Founders forum, not because I have wanted to stay the silent one, but really because I just did not think anyone would be interested in listening to the mundane ramblings of a day in the life of a Social Network’s techy problems on a generally business-focused blog! Besides, Pete & Jerome just love talking more than I do, so I am happy to let them 😉

However, a few months back we had to think fast and hard about what to do with our ever-soaring hosting costs in the Big Smoke (that is London, for those of you not from the UK), and I thought it might be interesting to share some of the thought processes we went through in combating the seemingly “recession-immune” Data Centre industry in London.

I say immune really only because, even with the banks pulling out loads of servers and cutting IT budgets left, right and centre, there are plenty of people all too willing to quickly fill the gaps they leave – meaning, naturally, that colocation costs have not stopped increasing in many areas.

Full Cab

First, a little history of WAYN Tech. Being a London-based business, we have always kept our IT infrastructure close by, thinking that the niceties of having it nearby should a major failure occur would outweigh the increase in cost. Add to this the fact that we started in 2003, when the “Bubble” that burst around the millennium had not yet been repaired, and there was actually a ton of space available, quite often going for a song.

We started out life hosting our entire site on a single IBM rack-mounted server with a small ISP based in the Docklands. We did not hang around too long, though, as they seemed to enjoy downtime a lot more than we did. OK, we did not have many users at the time, but nonetheless it is not the way you want to go, so we eventually migrated our hardware to another small ISP based down in Biggin Hill of all places. In fact, it was the same ISP where FriendsReunited was born – that is how I got to know about them, as in my previous incarnation I was the Sysadmin for FR. I had an excellent relationship with the owner, which I have maintained to this day, so things worked out well for us there overall. We continually hosted equipment with them from circa 2004 right up until a couple of months ago; although we had migrated the vast majority of our kit to another, more traditional Tier 1 Data Centre based in the City of London in 2007, we kept some equipment with our original provider for off-site backups. Unfortunately, we had simply outgrown them.

When the contracts with our main ISP in the City were coming up for renewal earlier this year, we thought it would be wise to reconsider things. Especially given the tough economy, it made sense to try to find a way to cut our costs without impacting the service. Of course, the ideal scenario was to simply negotiate a cut in price with our current ISP – as I am sure anyone in the business will tell you, moving Data Centres is an absolute nightmare, and usually a huge impact in downtime. However, since they had not raised their prices in the previous two years, there really was not very far we could go with this, so we had to move our feet and start looking around for a better deal before it was too late and the contracts renewed!

We looked at a lot of sites – I would go as far as saying we probably looked at every single Data Centre in the UK that was big enough and well connected enough for our needs. We also looked into Europe; however, being very IBM-focused in our hardware, we faced the dilemma of losing the warranties on all our existing equipment if we took it out of the UK. Given that we have paid a premium for top-quality hardware, we did not really want to shoot ourselves in the foot and write off a large chunk of that value! Next time we have to think about buying a large chunk of new kit, though, it is something we will definitely reconsider, although the Euro’s strength against the pound is diminishing the savings rapidly.

Eventually, we narrowed our options down to two sites in London, one near Birmingham and two in Manchester. We set up a bunch of tours, and I jumped in the car and did a tour of the sites to check them out. I did not really need to visit the ones in London, as I had been in them before and pretty much knew what they were all about. To cut a very long story short, we eventually had to choose between a site in London, which was cheaper but not by much, and a site in Manchester which was attractively priced – but four hours up the road.

The data centre in Manchester was run by a company called UKGrid, which had recently filled up its data floor but was in the middle of building out its second. I liked this site because it was based very close to Telecity, which meant we could pick up internet feeds from any of the big providers we liked; the infrastructure in the building was of a high standard, and the Data Centre seemed professionally run, with a couple of respectable clients on site that gave it more credibility. We were also able to spec out our cage exactly how we liked it, which made life a lot easier for us to migrate things like for like.


One of the biggest things I noticed is that there is a significant difference in price from one Data Centre to the next: some were over 100% more expensive than our current facility, and some were as much as 40% cheaper, yet I could see little difference between them to justify the different price points. In the end, I think it is clear that some facilities are simply able to demand a higher price and get away with it – perhaps this is why our banks have no money left!

We migrated our equipment in early September, with the help of a specialist removals company. We had no choice but to shut down the service for 24 hours, which would have been more like 72 hours if we had not used the removal firm. Being the paranoid type, we decided to move our core database server ourselves in the back of my car, and sneaked a Google Latitude-enabled smartphone with GPS into their van to track them – this was a brilliant idea, as we could keep a really close eye on progress and make sure we met them on site bang on time.

Crazy Switches

We had some fun and games getting our kit up to the new suite on the first floor, firstly because there was no lift (no joke!), and we eventually had to rope our empty racks up through the window! It sounds a bit cowboy, I know, but it was 100x easier than trying to squeeze them up the stairs and a lot faster than dismantling them! The data centre would normally provide a forklift for the job, but given we were moving in at 7am on a Sunday, the forklift firm was not interested in getting out of bed.

Overall, the move was very smooth because it was well planned and well executed, with a lot of manpower on hand. Moving equipment from one building to another is always tricky, but moving it from one end of the country to the other is another level of complication. However, with good planning this can be overcome, and the savings to be had simply cannot be ignored. We managed to get our kit moved and back online within 24 hours, which we felt was a great success.

My top tips for migrating datacentres, especially when they are far apart:

  • Invest in a thermal label printer – you do not want to be deciphering people’s handwriting mid-migration!
  • Label EVERYTHING: every cable, every server, every plug, every cabinet and anything else you can think of. It’s a godsend when it comes time to put it all back together again; even if one network cable is the same as another, it’s just a great way of ensuring you connect everything right first time, especially when you’re severely sleep-deprived!
  • Create a good decommission/recommission plan, and label any boxes of cables or equipment clearly so you know what’s what and don’t start opening every box searching for a specific cable. Any special cables or connectors should be packed separately so they don’t get lost. You’re not going to be able to replace special connectors in the middle of the night – so anything that isn’t available from your high street computer shop needs extra TLC!
  • Get a specialist removals firm to do the moving; they’re not always cheap, but the cost can easily be offset by the reduced downtime and the reduction in risk.
  • Have as many people on site as possible when it comes time to put things back together again – the more hands the better, as it is easy to underestimate the amount of work involved.
  • Prepare a backup plan – what if an important server fails?
  • Servers do not like being turned off and on; they really like to break when they cool down after having been switched on for months (or years), so make sure you shut them off a good hour before you remove them.
  • Get lots of food, energy drinks, warm clothes!
  • If you move ALL your hardware, make sure you have somewhere to host a holding page. If you host your own DNS and mail servers, you should have temporary replacements for them as well; you can use a cloud (Amazon S3 etc.) and/or open-source software for that, or pay someone to take care of it. Because we own our own IP ranges, we were able to switch them over to temporary servers at the new site in seconds, but not everyone can do that – and remember, DNS can take a long time to update around the world. Your customers will already be upset that you’re offline, but they will be even more upset if they get blank pages and no explanation!
  • Plan ahead – decide what should be racked and launched first. You do not want to end up in a situation where all the hardware is up and ready, but you really cannot remember where you put the main firewall 😉
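To make the labelling tip concrete, here is a small illustrative sketch (the manifest format and names are my own invention, not what we actually used): generating every label from a single inventory file means labels are printed, consistent and legible, rather than scrawled by hand at 3am.

```python
# Illustrative sketch: produce one printable label per server/cable
# from a simple CSV manifest. The columns and example rows below are
# hypothetical - adapt them to your own inventory.
import csv
import io

MANIFEST = """\
kind,name,location,detail
server,db01,RACK1-U12,core database
cable,net-db01-eth0,RACK1-U12,db01/eth0 -> sw01/port4
cable,pwr-db01-psu1,RACK1-U12,db01/PSU1 -> PDU-A/outlet7
"""

def make_labels(manifest_csv):
    """Return one label string per manifest row, ready for the printer."""
    rows = csv.DictReader(io.StringIO(manifest_csv))
    return ["%s | %s | %s" % (r["name"], r["location"], r["detail"])
            for r in rows]

for label in make_labels(MANIFEST):
    print(label)
```

The same manifest doubles as your decommission/recommission checklist: tick each row off as it comes out of the old rack and again as it goes back into the new one.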
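For the holding-page tip above, a minimal sketch of a temporary maintenance server (the page text and port are illustrative assumptions, not our production setup) could look like this, using nothing but the Python standard library:

```python
# Minimal holding-page sketch: a stdlib-only HTTP server that answers
# every request with a temporary maintenance notice. Page wording,
# port and downtime estimate are placeholders - adjust to taste.
from http.server import BaseHTTPRequestHandler, HTTPServer

HOLDING_PAGE = (
    b"<!doctype html><html><head><title>Back soon</title></head>"
    b"<body><h1>We are moving data centres</h1>"
    b"<p>The site should be back within 24 hours. "
    b"Thanks for your patience.</p></body></html>"
)

class HoldingPageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 + Retry-After tells browsers and well-behaved crawlers
        # the outage is temporary, rather than caching a 200 page.
        self.send_response(503)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Retry-After", "86400")  # expected downtime, seconds
        self.send_header("Content-Length", str(len(HOLDING_PAGE)))
        self.end_headers()
        self.wfile.write(HOLDING_PAGE)

def run(port=8080):
    # Call run() to serve; port 80 needs root, 8080 keeps it runnable as-is.
    HTTPServer(("0.0.0.0", port), HoldingPageHandler).serve_forever()
```

Serving a 503 rather than a 200 matters here: a 200 maintenance page can end up cached and indexed, while a 503 with Retry-After says "come back later" without upsetting search rankings.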

If you plan well in advance you should be able to migrate without too much hassle. There is good, reasonably priced colocation space out there, but it won’t always be on your doorstep, so don’t be afraid to move further afield.

Mike Lines
Chief Technology Officer