Some interesting arguments were made in MicroScope’s article “Security fears holding back big data moves”, which addresses the security concerns hampering moves to big data.
Network managers are not the only people in IT dealing with “data overload”. With millions of lines of code and complex, multi-tiered applications, apps teams also struggle to manage huge volumes of user experience and transaction data and extract meaningful intelligence.
The article also reports that 56% of senior business managers said “security worries” had prevented them from deploying big data initiatives.
I believe that by bringing branch data back to the datacentre, IT departments can safeguard crucial corporate information, and enterprises will no longer be forced to grapple with myriad local data regulations and compliance requirements.
A data protection strategy should include moving data from vulnerable locations to highly secure datacentres, where critical and valuable information can be backed up, protected and encrypted.
It is no surprise that security tops the list of concerns users have about moving to the cloud. However, the strategies discussed above and in the big data article can help address apprehension about adopting cloud models.
As part of IT best practices for disaster recovery planning and service availability, wide area network (WAN) optimisation and application and network performance monitoring also improve the resilience of business-critical big data applications.
All of these factors should be considered when evaluating the effectiveness of big data projects in enterprises.
Paul Coates is regional vice president of UK and South Africa at Riverbed
Image credit: Thinkstock
This was first published in August 2013