Much gets said about the huge data growth IT managers face, both from the traditional workloads we have all grown up with and from new-generation workloads like mobile, social, big data, and analytics. It's well understood that there is a mismatch between this growth and the budgets allocated to deal with it. For years the dominant conversation has been about lowering the cost of raw capacity and packing it with as much data as possible. There's certainly a lot to be gained from lower-cost physical infrastructure, tiering, and technologies like thin provisioning, compression, and deduplication. But as petabyte storage farms become commonplace and workloads become more sensitive to service levels, the job of balancing efficiency with performance, and performance with workload requirements, becomes much more intense. That's where IBM Spectrum Control excels: providing IT managers with intelligent analytics for managing storage.
Cost reduction and optimization
Most modern storage systems include tools that offer administrators a view of what's going on under the covers: system health, performance, utilization, and so on. Two challenges arise for administrators responsible for storage estates of any consequential size.
- Consolidating a view of all storage: Datacenter storage is often a mix of vendors, tiers, and protocols like block and file. Tools included with individual storage systems don't offer a broad enough perspective. Imagine being able to scan the complete environment and quickly identify capacity that isn't yet allocated to a workload, or that is allocated but hasn't seen any I/O activity in the last month. You could reallocate this idle capacity to maximize utilization. Imagine in that same scan identifying capacity that is working hard for you, and being able to forecast its future growth. No more guesswork. Imagine being able to monitor performance regardless of what storage tier or vendor you chose. You could apply that historical usage knowledge to tiering decisions and reduce cost.
- Managing consumer-oriented service levels: Businesses are interested in service levels for applications and business lines, not for pieces of hardware on the datacenter floor. Health, capacity, utilization, tiering… these are certainly all interesting at the storage system level, but their importance to the business is highlighted in more consumer-centric groupings like applications or business lines. Imagine being able to manage storage service levels, regardless of what hardware was in use, for an application like SAP or a business line like Corporate Accounting. When the financial quarter close is running (as it is at IBM at the time of this writing), you can show how the storage infrastructure associated with that business line is behaving.
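The idle-capacity scan described in the first point above can be pictured as a simple filter over volume activity data. The sketch below is purely illustrative, not Spectrum Control's API; the `Volume` model, field names, and the 30-day idle window are assumptions for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Volume:
    """Illustrative model: a volume with its allocation and I/O history."""
    name: str
    allocated_gib: int
    daily_iops: list = field(default_factory=list)  # I/O ops per day, oldest first

def find_reclaimable(volumes, idle_days=30):
    """Return volumes with no I/O activity over the last idle_days days."""
    idle = []
    for v in volumes:
        recent = v.daily_iops[-idle_days:]
        if sum(recent) == 0:  # no activity in the window (or never used)
            idle.append(v)
    return idle

volumes = [
    Volume("vol-app01", 500, daily_iops=[1200] * 30),   # busy
    Volume("vol-old-backup", 2000, daily_iops=[0] * 30),  # allocated, idle
    Volume("vol-unused", 750),                            # allocated, never written
]

for v in find_reclaimable(volumes):
    print(f"{v.name}: {v.allocated_gib} GiB reclaimable")
```

Across a real estate, a report like this would be fed by performance data collected from every monitored system, which is exactly the consolidated view a single-system tool cannot give you.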
We think this level of cost reduction and optimization should be quickly available to all storage administrators, whether they have deployed software-defined storage like IBM Spectrum Virtualize or IBM Spectrum Scale, or are still operating with traditional hardware-centric arrays. That's why we're making IBM Spectrum Control available as a Software-as-a-Service offering called Storage Insights. Take a look and learn about the beta.
Clients who prefer to deploy Spectrum Control software on premises have the added opportunity to exploit advanced analytics for optimizing cost (one of my personal favorites). Here’s the scenario.
Suppose you are one of those IT managers I described above, tasked with using multiple tiers of storage to balance efficiency with performance, and performance with workload requirements. You've got a substantial storage estate, so the prospect seems overwhelming. You've deployed Spectrum Control and you're about to experience the value of analytics firsthand.
For this example, let's suppose you have three pools of tier-1 storage and one pool of tier-2 storage that you want to analyze for re-tiering. Spectrum Control discovers that one of the volumes in the tier-2 pool is over-utilized. If it is moved to a tier-1 pool with sufficient performance capacity, the performance of the volume can be improved. The performance of the target pools is then analyzed and recommendations are generated. The recommendations involve up-tiering the volume from the tier-2 pool to a tier-1 pool. You can review the recommendations, or leverage the transparent data migration capability of Spectrum Virtualize to automatically move the volume to the tier-1 pool.

Using a similar analysis, Spectrum Control can make recommendations to down-tier under-utilized volumes that are occupying more expensive storage than necessary. A single tier analysis can result in multiple volume movements, with volumes moved to both lower and higher storage tiers. You can also schedule analysis tasks to run at specified intervals so you regularly monitor opportunities for re-tiering.

Another form of optimization is balancing. Pools in the same tier can have both low and high activity levels, but your goal might be to keep all the pools in a given tier close to the same utilization. Spectrum Control can identify the average utilization for a tier and the specific pools that deviate from that utilization by, say, more than 10%. By analyzing pools on the same tier, Spectrum Control identifies opportunities to move volumes and optimize overall utilization of your storage assets. Again, when used in concert with Spectrum Virtualize, these volumes can be moved transparently.

That's intelligent analytics for managing storage. If you are an IT manager responsible for making the most of your storage investment, consider IBM Spectrum Control and Storage Insights.
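The balancing step above can be sketched in a few lines: compute the average utilization of the pools in one tier, flag pools that deviate from it by more than a threshold, and pair overloaded pools with underloaded ones. This is a minimal illustration of the idea, not Spectrum Control's actual analytics; the pool names, data model, and 10% threshold are assumptions.

```python
def balancing_recommendations(pools, threshold=0.10):
    """pools: dict mapping pool name -> utilization (0.0-1.0), all on one tier.

    Returns the tier's average utilization and a list of suggested moves
    from pools above (average + threshold) to pools below (average - threshold).
    """
    avg = sum(pools.values()) / len(pools)
    overloaded = [n for n, u in pools.items() if u - avg > threshold]
    underloaded = [n for n, u in pools.items() if avg - u > threshold]
    recs = []
    # Pair the hottest pools with the coldest ones.
    for hot in sorted(overloaded, key=pools.get, reverse=True):
        for cold in sorted(underloaded, key=pools.get):
            recs.append(f"move volumes from {hot} to {cold}")
    return avg, recs

# Three tier-1 pools, as in the scenario above (utilizations are invented).
tier1 = {"pool-a": 0.85, "pool-b": 0.55, "pool-c": 0.62}
avg, recs = balancing_recommendations(tier1)
print(f"tier average utilization: {avg:.0%}")
for r in recs:
    print(r)
```

Here only pool-a sits more than 10% above the tier average and only pool-b sits more than 10% below it, so the analysis suggests a single move. A production analysis would also check that the target pool has the performance capacity to absorb the workload before recommending a migration.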
And join the conversation with #IBMStorage and #softwaredefined.