Keeping safe the large volumes of confidential data held about their tenants and communities is a major headache for most housing providers.
There are two major factors affecting the secure storage and management of housing providers’ data – the sheer amount of it and the pressure on costs. According to a recent article in The Guardian, social landlords in England alone hold sensitive information on around eight million people. And with budgets stretched, housing costs going up and housing benefit cuts causing extra pain, housing providers continue to feel the pinch.
In the July 2011 issue of Housing Technology, I looked at how you can save significant amounts of money through flexible data management – categorising your data into mission-critical, important and legacy – especially if you outsource its storage. In this issue, I am going to look at a much more fundamental aspect: who you use to host your servers and data.
Choosing the right provider
If I had to offer one simple piece of advice about choosing a third-party datacentre, it would be: beware of cheap headline prices. Particularly in the current economy, it’s very tempting to look only at price, but as with most things in life, you get what you pay for and buying cheap can be counter-productive and costly for your company in the long run. The small savings you might make at first are insignificant once you’ve lost contact with your datacentre for 12 hours.
So what should you look for?
Choosing a data-hosting provider can be difficult, especially when many are little more than property agents. The most important thing is to establish ownership – ownership of the property and ownership of the services and the platforms on which they run. Whether you’re buying co-location services, hosted physical and/or virtual servers, or investing in a full range of managed data services such as online backup, you don’t want to be trusting your data to a company which is basically nothing more than a rack reseller.
Companies running these types of white-labelled datacentres operate by sub-contracting equipment and services from other companies and largely washing their hands of maintenance, management and support responsibilities. There is also a danger that they will sell you poorly-engineered services, only to start charging for hidden extras later.
When data-hosting providers own the datacentre themselves, they will be much more committed to a continuing programme of investment. They will have complete control over the equipment, and will regularly test, service, upgrade and replace their UPS, generators and switchgear, guaranteeing a much more secure and reliable service. In addition, the best will offer round-the-clock manned support and proactive monitoring of systems, both onsite and remotely.
Ageing datacentres could put your business at risk. Just as you wouldn’t run your business on 10-year-old laptops, why would you even consider putting your data in a 10-year-old datacentre – especially on a five-year contract? It’s simply asking for trouble.
Datacentres are classified in tiers, with tier one the lowest standard. Although many legacy datacentres haven’t moved beyond tier two, you should be looking for a datacentre built to at least tier three standards.
Not only will this offer high levels of redundancy and ensure that service levels remain uninterrupted even during maintenance operations but, perhaps more importantly, it will be significantly more energy-efficient. With energy costs rising steeply, 40 per cent or more of datacentre costs can be down to fuel, electricity and cooling, so an old datacentre running legacy equipment will be eating up money and adding to your overall bill.
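To see why efficiency feeds straight into the bill, the trade-off can be sketched with a rough Power Usage Effectiveness (PUE) calculation. All the figures below – the rack load, electricity price and PUE values – are illustrative assumptions for the sketch, not quotes from any provider:

```python
# Rough annual energy-cost comparison between an older datacentre and a
# modern, free-air-cooled one. PUE = total facility power / IT load power,
# so total energy consumed is the IT load scaled up by the PUE.

def annual_energy_cost(it_load_kw, pue, price_per_kwh):
    """Estimated yearly electricity cost for a given IT load and PUE."""
    hours_per_year = 24 * 365
    return it_load_kw * pue * hours_per_year * price_per_kwh

# Assumed figures: a 10 kW IT load and electricity at £0.15/kWh.
legacy = annual_energy_cost(10, 2.0, 0.15)   # older facility, assumed PUE ~2.0
modern = annual_energy_cost(10, 1.4, 0.15)   # newer facility, assumed PUE ~1.4

print(f"Legacy: £{legacy:,.0f}/yr, modern: £{modern:,.0f}/yr, "
      f"difference: £{legacy - modern:,.0f}/yr")
```

On these assumed numbers the older facility costs several thousand pounds a year more for the same IT load – a gap that, under a hosting contract, ultimately lands in the customer's bill one way or another.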
So to make the most of your investment, you need to be looking for a datacentre that uses the latest cooling technologies such as cold-aisle containment and free-air cooling, as well as industry-leading building management and monitoring systems.
Availability and resilience
No matter how good the datacentre you choose, it’s no good if you can’t connect to it, so it’s crucial that you consider the standard of network being used to transport your data. If the network and datacentre are owned by the same company, you’re off to a good start: you won’t incur access circuit costs, and if a problem occurs, you won’t become embroiled in arguments about who should fix it.
Again, a low headline price will almost certainly prove to be a false economy. For example, a cheap network will not be able to withstand multiple concurrent failures, and if you’re being offered cheap bandwidth, be very careful: it almost certainly means you’ll be sharing it with other users.
The question of resilience is particularly important. If a company tells you that their network is resilient, what exactly do they mean? Fixing a break as quickly as possible isn’t resilience. Ask them if the network is multi-meshed, i.e. does it offer alternative routes if there’s a break? How many concurrent breaks can it withstand? Are most faults fixed automatically? Do they own and maintain their own routers, etc?
All of this is important in keeping your data accessible to you round the clock. Essentially, to make sure the money you’re spending on outsourcing your data to a third party isn’t wasted, you should be looking for a company that can provide a complete end-to-end infrastructure capable of delivering a full suite of data and voice services over a secure, resilient and scalable MPLS network.
Are they right for you?
All the areas I’ve discussed above are important in fleshing out what precisely you will get from a datacentre provider in return for your investment. In addition, you can also check if they have widely-recognised certifications such as ISO 27001 and PCI DSS.
But one of the best ways to find out if they are offering an enterprise-grade service that offers real value for money is to ask them two questions: do you use it yourselves, and can we come and look round?
If the answer to either question is ‘no’, look elsewhere.
Stefan Haase is divisional director for data cloud services at InTechnology.