All too often, storage is simply taken for granted. After all, the price of storage per gigabyte has dropped from approximately $100 in 1997 to around $0.10 last year, and with more options for purchasing storage than ever before - cloud or in-house, tape, SSD or HDD - one might be forgiven for thinking that storage 'just works'.
The Data Deluge
But is it? According to Cisco, total IP traffic in Central and Eastern Europe will reach 3.7 Exabytes (3,700 Petabytes) per month by 2015. It may seem obvious, but much of this data will have to be stored somewhere - and this figure does not even include the vast amount of information already in storage today.
To give a practical example, Lancashire Constabulary is currently using a CCTV system which records car number plate details. Each snapshot is only 25KB, yet across the entire estate the system can record in excess of 200GB per day on traffic monitoring alone. If all 48 county police forces used such a system, it could result in around 9TB of data every day: more than 3PB a year.
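The figures above can be sanity-checked with some quick arithmetic. This sketch simply works forward from the per-snapshot and per-force numbers quoted in the article; the constants are the article's illustrative estimates, not measured values.

```python
# Back-of-envelope check of the ANPR storage figures quoted above.
# All inputs are the article's illustrative estimates.
SNAPSHOT_KB = 25          # size of one number-plate snapshot
DAILY_GB_PER_FORCE = 200  # recorded per day across one force's estate
FORCES = 48               # county police forces

# How many snapshots make up 200GB in a day?
snapshots_per_day = DAILY_GB_PER_FORCE * 1024 * 1024 // SNAPSHOT_KB  # ~8.4 million

# Scale up to all forces, then to a year.
daily_tb_all_forces = FORCES * DAILY_GB_PER_FORCE / 1024   # ~9.4 TB/day
yearly_pb = daily_tb_all_forces * 365 / 1024               # ~3.3 PB/year

print(f"{daily_tb_all_forces:.1f} TB/day, {yearly_pb:.1f} PB/year")
```

The result lands close to the article's rounded "around 9TB a day, 3PB a year".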
With data retention laws already in force - and new ones on the way - storage systems will only come under more strain. For example, a new FSA ruling takes effect later this month, requiring all FSA-regulated companies that supply their staff with mobile phones to record conversations made on those phones and retain the recordings for six months.
A Compound Problem
Technologically, we also face additional problems. Storage is becoming fragmented as companies acquire more low-cost storage piecemeal rather than unifying their storage technologies. The problem with this piecemeal approach is that management costs rapidly outstrip the costs of purchase and ownership. Indeed, traditional approaches to optimising IT are reaching the limits of what is possible, and fresh approaches are needed.
So the problem is one not only of size, but also of strategy. With the cost of managing data rising fast, simply adding more disc drives to the estate will soon be ineffective. Businesses could find themselves struggling with a mountain of duplicated data: staff holiday snaps stored on company storage, then backed up and retained for seven years. Seen this way, it is easy to understand how the costs of storing and managing data can spiral out of control.
Bridging the Gap
Fortunately, there is a range of solutions to the problem. Smart solutions providers need to take a holistic approach to storage to maximise the benefits; otherwise customers will simply chip away at the edges with little hope of a lasting fix.
There are a number of tactics which resellers can recommend to manage the data deluge, which are most effective when deployed together. These include deduplication, smart tiering, intelligent data compression and virtualisation.
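To make the first of those tactics concrete, here is a minimal sketch of content-based deduplication. The `DedupStore` class and its method names are hypothetical, invented for illustration; real deduplicating systems work at block or chunk level, but the principle is the same - identical content is stored once and referenced many times.

```python
import hashlib

class DedupStore:
    """Toy file store that keeps identical content only once (hypothetical)."""

    def __init__(self):
        self.blocks = {}  # content digest -> bytes, each unique payload stored once
        self.index = {}   # logical file name -> content digest

    def put(self, name: str, data: bytes) -> None:
        digest = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(digest, data)  # only store content if it is new
        self.index[name] = digest             # the file itself is just a pointer

    def get(self, name: str) -> bytes:
        return self.blocks[self.index[name]]

    def physical_bytes(self) -> int:
        return sum(len(b) for b in self.blocks.values())

store = DedupStore()
photo = b"\x89PNG..." * 1000  # stand-in for a holiday snap
store.put("alice/holiday.png", photo)
store.put("backup/alice/holiday.png", photo)  # the duplicate costs no extra space

print(len(store.index), "files,", len(store.blocks), "physical copy")
```

The same photo stored twice, and again in a backup, consumes the space of one copy - which is exactly why deduplication bites hardest on the duplicated-data problem described above.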
The data deluge is clearly being recognised, with 44% of IT decision makers citing improved storage efficiency as a high priority. By deploying multiple strategies across the IT estate, and putting in place reporting metrics to track the size and scale of the deluge, organisations can begin to harness their data.
Surfing on the Data Flood
Data is undoubtedly one of an organisation's most valuable assets - after its employees, of course. After all, most organisational data is useful information; it is simply over-replicated or inaccessible. Through good management practices, this information can be turned into intelligence, supporting and driving the organisation.
This intelligence can be the differentiator that pushes an organisation ahead of its competition: refining business strategy, sharpening customer engagement and shaping new products and services, so that it competes effectively and thrives in the market. Solutions providers that harness the power of the data deluge could find their customers supercharged by information rather than hampered by the flood.
This was first published in November 2011