Chris James –
While for some organisations turning large volumes of data into meaningful information is a real challenge, for over half of the business population it's simply not that important. In fact, according to recent analyst reports on Big Data, just under half of businesses worldwide are looking at implementing Big Data IT infrastructure changes, which means that over half are not. Disappointing? Maybe. Surprising? No. Adoption of new technologies is usually slower than you'd think, and the hype is often not reflected in the real world.
When we talk about Big Data, we're looking at growing volumes of data files that put pressure on storage infrastructures to deliver peak sustained performance without any downtime. But the IT industry has spent over a decade examining how businesses can identify when their IT systems are struggling and how to cope, so what's new?
Many organisations continue to use the same policy: when performance starts dropping they add storage capacity to the existing system to eke out more return from legacy equipment. Most enterprises are not in the enviable position of newer (or larger) kids on the block like Google and Amazon, and can’t afford to build brand new, sparkling data centres to house state-of-the-art IT infrastructures. Nor can most businesses see the benefit of the rip-and-replace approach; when things go wrong, they often prefer to stick with the technology they have (and know) and build upon it.
It's a well-known fact that new technologies and solutions become widely adopted only once they have been tried and tested. This has certainly been the case with the uptake of virtualisation software, and now that VMware has pretty much become the standard, customers are pushing vendors to provide compatible offerings. When it comes to adopting a new technology, if it's not the IT manager putting the brakes on, it will more often than not be the budget holder. In such a dynamic IT market this is understandable: a myriad of new products are launched daily and it's not easy to identify which technologies are keepers.
What we are witnessing is that canny data centre managers want to use technology to "see" the source of glitches, so that additional storage is purchased only when it's really needed and the right part of the IT infrastructure gets fixed. By adopting Infrastructure Performance Management (IPM) strategies, companies are saving millions of pounds: they avoid over-provisioning and make the best of what they already have, while retaining and, where possible, improving performance levels and SLAs.
This is great news for Virtual Instruments, as we can offer exactly the right tools to enable managers to implement IPM strategies. For those peddling Big Data solutions and other new technology, however, the business case is far muddier. These are times of austerity, and for a new technology to succeed it needs to offer clear benefits in terms of cost savings and performance while working smoothly with legacy equipment. While many such new technologies are emerging, their adoption will very much depend on the state of an organisation's existing systems, and as we all know, some IT systems are there for the long haul; just look at the IBM mainframe, which still has a home in many large organisations today!