Opinion

Are data centres a waste of space? asks Roger Keenan, MD of City Lifeline

Roger Keenan, MD of City Lifeline, seems to suggest that the data centre is a waste of space and could be doomed. Or does he? Read this fascinating history of technology and draw your own conclusions.

Enterprise data centres, government data centres, commercial data centres and colocation houses have seen massive expansion over the last ten years or so. Every organisation in the world uses more computing power today than it did five years ago, and used more then than it did ten years ago. Even small children who can barely walk use more computing power. Is there any end? And how can the world cope with the space and electrical power that all this computing needs?

Over the last decade, the client/server model has migrated from larger organisations down to the point where most organisations of more than just a few people have a server holding their shared files and data, and multiple desktop or laptop computers to access them. VPNs and mobile smartphones have become the norm, with remote access to all sorts of facilities no-one would have dreamed of a few years ago. Many enterprises and organisations have coped with this by building their own server rooms. Many have put the servers in a cupboard-under-the-stairs server room, plugged them into the mains and hoped it would all be all right. Others have built out mini enterprise data centres to semi-professional standards. Most are very inefficient in terms of resilience, use of electrical power and use of available space.

A changing world

The world is changing. Technology continues to change at a ferocious rate. Moore's law states that computing power at constant price doubles roughly every eighteen months, and that has held true since Intel designed its first microprocessor, the 4004, in 1971. Things which were once inconceivable are now commonplace: the gate on a CMOS transistor inside a modern integrated circuit is now only eight atoms wide and can change state at over six thousand million transitions per second (i.e. a 3GHz clock). Twenty years ago, people would have said that was impossible, and thirty years ago no-one would even have imagined it.
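To get a feel for what that compounding implies, here is a back-of-the-envelope sketch in Python (illustrative only, and assuming the eighteen-month doubling figure holds exactly):

    # Back-of-the-envelope sketch of Moore's law compounding,
    # assuming computing power at constant price doubles every
    # 18 months (1.5 years).
    DOUBLING_PERIOD_YEARS = 1.5

    def growth_factor(years):
        """Multiple of today's computing power after the given number of years."""
        return 2 ** (years / DOUBLING_PERIOD_YEARS)

    for years in (5, 10, 20, 30):
        print(f"After {years} years: roughly {growth_factor(years):,.0f}x")

    # Prints roughly 10x after 5 years, 102x after 10,
    # 10,321x after 20 and 1,048,576x after 30.

On those numbers, thirty years of doubling turns one unit of computing power into over a million, which is why yesterday's impossibilities keep becoming today's commonplaces.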

Many computer applications are straightforward uses of commercial software packages. Whereas organisations once needed to keep the servers that ran them in-house, the availability of high-quality, high-bandwidth communications means that cloud computing has become a reality (but only for those securely connected to a fast, reliable communications backbone). Applications that once had to run on an in-house server to deliver a consistent speed of response to users can now run in much the same way, but with the server far more remote. So what does that mean for the data centre?

The need for reliable bandwidth 

Multiple instances of standard commercial applications run most efficiently and effectively in high-density environments, with hundreds of processors operating together.  That isn't a cupboard-under-the-stairs computer room. It's a large data centre.  For a large organisation that can afford a large data centre, it's just a refresh of the type that would happen every few years anyway.  But for smaller organisations the opportunity arises to substantially reduce the in-house data centre, or, in some cases, to do away with it altogether by moving the applications running on the in-house servers to remote virtualised servers operated by third-party suppliers - the cloud.

And what does that mean for the data centre?  It depends on which one.  For the third-party suppliers, there are big opportunities to run very large, very efficient data centres delivering routine computing functions at a much lower cost than smaller organisations can achieve in-house.  Big corporates have seen that, and new entrants such as Google and Amazon have entered the market, such is the scale of the business opportunity (Amazon runs a shop, for goodness' sake).  And for their colleagues in data communications, the need for reliable bandwidth just goes up and up. A new, rich world of growth and expansion.

Some jobs are at risk

But how about the data centre managers in charge of small and medium-sized data centres in small and medium-sized enterprises?  Their jobs are substantially at risk.  In a typical small distributed computing environment, 85 per cent of computing capacity sits idle at any one time.  If an organisation runs only standard applications on standard commercial software packages, and the site is located where reliable, high-quality bandwidth is available, then some of the data centre functions can move to remote virtualised servers in the cloud, where utilisation of the available computing capacity is very much higher, and that is reflected in lower costs; a rough sketch of that arithmetic follows.  But if some, then why not all?
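The arithmetic of that utilisation gap is easy to sketch (the 60 per cent figure below is an assumption for illustration; only the 85-per-cent-idle figure comes from the paragraph above):

    # Rough comparison of installed capacity needed in-house versus
    # in a large virtualised facility. 15% utilisation corresponds to
    # the 85%-idle figure quoted above; 60% is an assumed figure for
    # a well-run cloud facility, not a number from the article.
    def capacity_needed(workload, utilisation):
        """Units of installed capacity required to carry a workload."""
        return workload / utilisation

    workload = 100  # arbitrary units of useful computing work
    print(round(capacity_needed(workload, 0.15)))  # 667 units in-house
    print(round(capacity_needed(workload, 0.60)))  # 167 units in the cloud

    # The same work on roughly a quarter of the hardware, and that
    # difference is what feeds through into costs.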

Then the in-house data centre becomes just a rack of communications equipment collecting together the terminations of the user terminals and the communications lines going to the outside world.  The data centre isn't going to become extinct at all.  It's just that, like many things in life, the big and rich (like Amazon, Google and HP) will get bigger and richer and the small and poor (like in-house data centre managers) will get smaller and poorer.

 

This was first published in May 2011
