Eric Jorgensen –
Over the past 50 years, data storage has changed and progressed on an unprecedented scale. Thinking back to 1956, when IBM launched the 305 RAMAC, the world’s first computer with a hard disk drive, which weighed around a tonne and offered approximately 4.4MB of storage, we can only imagine what the future might hold. While Moore’s law, which describes a long-term trend in the history of computing hardware, remains relevant, the way in which companies use their storage resources is fundamentally changing.
From my position at Virtual Instruments I have seen first-hand a shift in the way organisations use and rely on data. At a customer level, there is continual and growing pressure, exacerbated by budgetary constraints in a post-recessionary environment, to manage platforms and infrastructure projects as efficiently as possible. The challenge for companies is no longer a simple matter of holding a generic block of computing power; instead, they must align resources with application-based demand. Of course, many businesses continue to add more storage to accommodate an ever-increasing amount of data, but this often hurts their bottom lines and is not a long-term solution.
At a broader level, there is a great deal of uncertainty around where the industry is heading, and it remains unclear how data will be managed and stored over the next five to ten years. One financial services giant recently signed a major ten-year agreement with a supplier that will see the introduction of a cloud-based infrastructure within its datacentres. But there are significant and justified concerns around outsourcing data on such a large scale, and while we grapple with the risks of data loss and security breaches, a hybrid solution, whereby data is stored both on premises and offsite with a third party, seems likely.
What underpins this shift in companies’ relationship with storage is the emergence of Big Data, driven by a number of factors: the rapid spread of social media, governmental and corporate data-retention and protection policies, the increased adoption of data analytics, and the growing use of rich media by both businesses and end users. As we collect more and more information, we need effective ways of analysing and interpreting it so that we can make informed decisions, often in real time. Furthermore, new means of collecting data, such as putting SIM cards into fridges to inform supermarkets of customers’ levels of food and drink, will lead to significant changes for consumers and, in equal measure, for suppliers. These new insights into consumer behaviour will, for example, lead to changes in the back end of businesses, from sourcing to inventory.
We have moved into an era in which the trajectory Moore observed remains uncannily true. However, the key is no longer fast technology but smart technology: to negotiate the demands of storing and analysing growing volumes of data, businesses need to take a strategic approach to long-term planning if they are to get the most from their infrastructures and drive efficiency within a justifiable budget.