Utility computing is now a reality. Just as the National Grid allows businesses to access electricity at the flick of a switch (rather than owning, manning and running their own generators), the cloud allows similar economies of scale to be applied to IT infrastructure.
The cloud is, however, not floating in the atmosphere or an unseen cyber dimension. It is found in purpose-built bricks and mortar or, more likely, concrete and steel data centers down here on earth. With an estimated US$128 billion spent on data centers worldwide in 2012, data center provision is big business in itself. 
The Uptime Institute, an independent global authority on data centers, classifies data centers into tiers. These run from Tier I, where there is no redundancy to protect the data center from external shock and downtime, to Tier IV, where there is full redundancy, allowing processing and cooling to continue running through an outage.
Location, location, location
Data centers require two main elements: energy and connectivity. Energy consumption is a huge area of cost for data centers as processors produce a lot of heat and need to be cooled to run smoothly.
Therefore, data center providers look for locations that can guarantee energy supply by being near a main hub, but can balance that with the costs of the location. Further to this, green taxes on energy units are pushing the need to innovate on cooling methods. Some multinationals are relying on Mother Nature, opting to develop sites within the Arctic Circle.
Others have opted for the milder air of Dublin, which offers not only fewer polar bears but also a position on the highest-capacity data cable connecting Europe and North America.
Connectivity is almost so obvious a requirement that it is easy to overlook. While advances in processor power and memory size have been huge over the past few years, wide area networks (WAN) still face challenges that can put the brakes on ‘on demand’.
The challenges of latency, bandwidth and line quality can only be addressed by greater investment in digital infrastructure from governments and private investors. To date, such investment remains patchy and faces budget cuts.
When things don’t run smoothly
Outages, such as that caused by Hurricane Sandy, demonstrate the cloud’s vulnerability to external shock. Even if the location of the data center is safe, it is still vulnerable to energy and internet cable disruption.
In the UK, an outage experienced by a major high street bank, NatWest, led to a surge in demand for data centers throughout the sector. Banks are already spending to meet an expected sector requirement of a main data center, a backup and an additional disaster recovery location.
While the cloud is often described as being infinite and everywhere, private consumer or client data held by businesses is subject to local legislation and regulation. EU citizens’ data stored outside the EU requires data security structures equal to those of the EU. Differing legislation on state access to private data is also a concern, blocking many companies from storing customer data in the United States.
While on closer inspection the cloud might not look as soft and fluffy as countless bloggers claim, with greater cloud usage the demand for data centers is likely to grow exponentially.
However, consumers’ increasing data security awareness may add to the pressure to prove that any use of public or hybrid cloud services involves a trustworthy partner. As choice increases, understanding what constitutes a quality data center will become all the more important.
To read the full article and our other B2B Tech Trends, please click here.