By Raj Patel, Sr. Director, Corporate and Field Marketing
It’s no secret that Flash offers many advantages over traditional disk-based storage. Moving to an all-Flash (or at least hybrid) data center, however, isn’t as simple as swapping out the technology and firing it up.
At the Flash Memory Summit 2016 in Santa Clara, our CMO Len Rosenthal got together with Storage Switzerland Lead Analyst George Crump to discuss what companies need to consider when moving to a Flash data center.
They talked about two main sets of questions companies have to ask when considering a move: what to evaluate before the purchase, and how to manage the environment once Flash is in production.
Any Flash product you purchase must be tested with your workloads. Don’t be swayed by vendor sales teams quoting impressive performance statistics; those benchmark numbers will differ significantly from what your own workloads will see. Workload acquisition and analysis is therefore critical before making any purchase or deployment decision around Flash.
To do this, you have to truly understand your workloads, of course. That’s where solutions that include workload acquisition and workload analysis are critically important.
Workload acquisition technology extracts all types of performance-related I/O profile data from production storage arrays and switches – everything from read/write mix and random vs. sequential access patterns, to data and metadata operations, compressibility and dedupe-ability – to build detailed workload profiles for your environment.
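To make these metrics concrete, here is a minimal sketch of what workload profiling computes from an I/O trace. The record format and function names are hypothetical illustrations, not Load DynamiX’s actual data model:

```python
# Hypothetical sketch: derive a few workload-profile metrics
# (read/write mix, sequentiality, compressibility) from an I/O trace.
import zlib
from dataclasses import dataclass

@dataclass
class IORecord:
    op: str        # "read" or "write"
    offset: int    # byte offset on the device
    size: int      # bytes transferred
    payload: bytes # data written (empty for reads)

def profile(trace):
    reads = sum(1 for r in trace if r.op == "read")
    writes = len(trace) - reads
    # An I/O is "sequential" if it begins where the previous one ended.
    seq = sum(
        1 for prev, cur in zip(trace, trace[1:])
        if cur.offset == prev.offset + prev.size
    )
    # Compressibility: compressed size relative to raw size of written data.
    raw = b"".join(r.payload for r in trace if r.op == "write")
    comp_ratio = len(zlib.compress(raw)) / len(raw) if raw else None
    return {
        "read_pct": 100.0 * reads / len(trace) if trace else 0.0,
        "write_pct": 100.0 * writes / len(trace) if trace else 0.0,
        "sequential_pct": 100.0 * seq / max(len(trace) - 1, 1),
        "compress_ratio": comp_ratio,
    }

trace = [
    IORecord("write", 0, 4096, b"\x00" * 4096),
    IORecord("write", 4096, 4096, b"\x00" * 4096),
    IORecord("read", 65536, 4096, b""),
]
print(profile(trace))
```

A real acquisition tool captures these signals continuously from production arrays and switches rather than from a static trace, but the metrics it reports are of this kind.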
Once you have this data, Load DynamiX Enterprise workload analysis and modeling will help you compare your options: one Flash technology vs. another, Flash vs. hybrid, vendor vs. vendor, different configurations, different amounts of Flash and so on.
The key takeaway here is that it’s your workload data. Flash vendors don’t work with your data and don’t know your workload profiles. Even the lower-end simulation tools available can’t provide the realism needed to accurately reproduce your specific workloads.
Once you’ve taken the plunge, you need a “real-time view of everything that’s going on in production,” Len says. This includes performance assurance, risk reduction, and even cost optimization by ensuring you are getting the most out of deployed assets before provisioning additional ones.
This lets you discover potential issues and fix them before they show up in applications. It also lets you proactively optimize performance, availability and utilization of infrastructure in a totally vendor-independent way.
Here at Virtual Instruments, with our VirtualWisdom platform, we monitor and correlate infrastructure data in real time across servers, switches and storage. This is the only way to get a handle on your environment and truly understand how applications are performing relative to the infrastructure delivering them.
Why are real-time awareness and analysis so important? For fast-moving industries like financial services, e-commerce and healthcare, five-minute-old data can be five minutes too late. As George Crump explains, “The whole event could’ve passed in five minutes.”
To get more tips from Len and George on how to answer these questions, and how Virtual Instruments technology can help, check out the whole video.