IBM Spectrum Protect – Crash Diet for Your Data Protection Budget

My career in storage started back in the late 1980s when the IT world revolved around the computer system and everything else was considered a sub-system (I guess in some ways that made me a sub-administrator). The discipline of managing storage assets was just taking hold, and the first order of business was to ensure all the corporate data was protected. Data that fed mainframe applications topped the list for most organizations, but data associated with mission-critical client-server workloads was growing rapidly. It was into this world that the great-great-grandfather of IBM Spectrum Protect was born.

Trivia question – leave a comment to play: Who can name the complete family lineage of IBM Spectrum Protect? Bonus points for expanding all the acronyms.

 

The world of IT has evolved a lot since then. Data is no longer a sub-thought; it is central – the new currency of business. The race to simply get all the important data protected is largely over, and Spectrum Protect is now a highly evolved, one-stop family that IT managers use to do that job. It is tightly integrated with workloads like databases, email systems and ERP applications; with the hypervisors they run in; with the file systems and storage devices they store their data on; and with the data capture tools that surround them, such as snapshot and replication. It also includes advanced data reduction techniques like deduplication and compression. Check out the live demo!

The question of simply ensuring your important data can be protected has been answered. The question now for most of the clients I talk to is just how efficiently the job of data protection can be done. They want to minimize the budget for data copies so they can shift investment to new business growth initiatives.

A few years ago IBM acquired Butterfly Software, a small company in the United Kingdom that had developed some BIG thoughts around communicating the economic benefits brought by certain approaches to storage. Butterfly had developed what they called an Analysis Engine Report (AER) that followed a straightforward thought process.

  1. Using a very lightweight collector, gather real data about the existing storage infrastructure at a potential customer.
  2. Using that data, explain in good detail how effective the as-is environment is and what costs will look like in five years' time if the customer continues on the current approach.
  3. Show what a transformed storage infrastructure would look like compared to the as-is approach, and more importantly what future costs could look like compared to continuing as-is (a minimal sketch of this comparison follows the list).
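To make that thought process concrete, here is a minimal sketch of the kind of five-year cost projection such a report performs. Every number in it (growth rate, cost per terabyte, the 3:1 data-reduction assumption) is illustrative only, not a Butterfly or IBM figure.

```python
# Hypothetical five-year projection: as-is environment versus a transformed one.
# All inputs are illustrative assumptions, not data from an actual AER study.

def project_costs(initial_tb, annual_growth, cost_per_tb_year, reduction_factor, years=5):
    """Return the projected yearly storage cost for each of the next `years` years."""
    costs = []
    capacity = initial_tb
    for _ in range(years):
        capacity *= (1 + annual_growth)            # protected data keeps growing
        purchased = capacity / reduction_factor    # capacity actually bought after reduction
        costs.append(purchased * cost_per_tb_year)
    return costs

# As-is: continue the current approach with no additional data reduction.
as_is = project_costs(initial_tb=500, annual_growth=0.30,
                      cost_per_tb_year=300, reduction_factor=1.0)

# Transformed: assume deduplication and compression shrink stored copies 3:1.
transformed = project_costs(initial_tb=500, annual_growth=0.30,
                            cost_per_tb_year=300, reduction_factor=3.0)

for year, (a, t) in enumerate(zip(as_is, transformed), start=1):
    print(f"Year {year}: as-is ${a:,.0f} vs transformed ${t:,.0f} ({1 - t/a:.0%} lower)")
```

With a flat 3:1 reduction the saving is constant year over year; a real study models per-workload reduction rates, hardware refreshes and labor, which is why actual results vary so much by environment.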

Using the Butterfly technology, IBM has partnered with clients to analyze thousands of different infrastructures scattered across every industry in most parts of the world and comprising exabytes of data. In all that analysis, our clients have discovered some remarkable things about software-defining storage and IBM’s ability to help transform the economic future of storage. One area of specialty for Butterfly is backup environments.

When compared to as-is competitive backup environments, transforming to an IBM Spectrum Protect approach can be, on average, 38% more efficient. Of course, your results may vary. For example, when we look just at the mass of results from as-is Symantec NetBackup, CommVault Simpana or EMC NetWorker environments, each shows that transforming to a Spectrum Protect approach produces different savings, and in these three cases at least, somewhat stronger ones. We've got data by industry and for many other competitive backup approaches, but you get the picture. Upgrading a backup environment to IBM Spectrum Protect is like a crash diet for your data protection budget.

The best way to see for yourself is to contact IBM or an IBM Business Partner and ask for a Butterfly Backup AER study.

Join the conversation with #IBMStorage and #softwaredefined

IBM Spectrum Control – Intelligent Analytics for Managing Storage

There is a lot that gets said about the huge data growth IT managers face, both from traditional workloads we have all grown up with and from new-generation workloads like mobile, social, big data and analytics. It's well understood that there is a mismatch between this growth and the budgets that are allocated to deal with the problem. For years the dominant conversation has been on lowering the cost of the raw capacity and on packing it with as much data as possible. There's certainly a lot to be gained from lower-cost physical infrastructure, tiering, and technologies like thin provisioning, compression and deduplication. But as petabyte storage farms become commonplace and workloads become more sensitive to service levels, the job of balancing efficiency with performance, and performance with workload requirements, becomes much more intense. That's where IBM Spectrum Control excels – providing IT managers with intelligent analytics for managing storage.

Cost reduction and optimization

Most modern storage systems include tools that offer administrators a view of what’s going on under their covers – health of the system, performance of the system, utilization of the system, and so on. Two challenges arise for administrators responsible for storage estates of any consequential size.

  1. Consolidating a view of all storage: Datacenter storage is often a mix of vendors, tiers, and protocols like block and file, and the tools included with individual storage systems don't offer a broad enough perspective. Imagine being able to scan the complete environment and quickly identify capacity that isn't yet allocated to a workload, or that is allocated but hasn't seen any I/O activity in the last month; you could reallocate this idle capacity to maximize utilization (a minimal sketch of such a scan follows this list). Imagine in that same scan identifying capacity that is working hard for you – and being able to forecast its future growth. No more guesswork. Imagine being able to monitor performance regardless of what storage tier or vendor you chose. You could apply that historical usage knowledge to tiering decisions and reduce cost.
  2. Managing consumer-oriented service levels: Businesses are interested in service levels for applications and business lines, not for pieces of hardware on the datacenter floor. Health, capacity, utilization, tiering… these are certainly all interesting at the storage system level, but their importance to the business is highlighted in more consumer-centric groupings like applications or business lines. Imagine being able to manage storage service levels – regardless of what hardware is in use – for an application like SAP or a business line like Corporate Accounting. When the financial quarter close is running (as it is in IBM at the time of this writing), you could show how the storage infrastructure associated with that business line is behaving.
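To illustrate the first challenge, here is a minimal sketch of the kind of idle-capacity scan described above, run over a hand-built inventory. The volume records and field names are assumptions made up for the example; they are not the Spectrum Control data model or API.

```python
from datetime import datetime, timedelta

# Hypothetical inventory rows; in practice they would come from a consolidated
# monitoring view that spans vendors and tiers, not be typed in by hand.
volumes = [
    {"name": "vol01", "array": "vendorA-box1", "allocated_gb": 2048, "last_io": datetime(2015, 1, 3)},
    {"name": "vol02", "array": "vendorB-box7", "allocated_gb": 512,  "last_io": None},  # allocated, never used
    {"name": "vol03", "array": "vendorA-box1", "allocated_gb": 4096, "last_io": datetime(2015, 3, 28)},
]

today = datetime(2015, 3, 31)      # fixed date so the example is reproducible
IDLE_AFTER = timedelta(days=30)    # "no I/O activity in the last month"

# A volume is a reclamation candidate if it has never been written to,
# or has seen no I/O activity within the idle window.
idle = [v for v in volumes
        if v["last_io"] is None or (today - v["last_io"]) > IDLE_AFTER]

for v in idle:
    print(f'{v["name"]} on {v["array"]}: {v["allocated_gb"]} GB idle')
print(f'Total reclaimable capacity: {sum(v["allocated_gb"] for v in idle)} GB')
```

The same inventory, enriched with performance samples, is what would make the growth forecasting and tier-placement decisions described above possible.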

We think this level of cost reduction and optimization should be quickly available to all storage administrators – whether they have deployed software defined storage like IBM Spectrum Virtualize or IBM Spectrum Scale, or are still operating with traditional hardware-centric arrays. That’s why we’re making IBM Spectrum Control available as a Software-as-a-Service offering called Storage Insights. Take a look and learn about the beta.

Intelligent Analytics

Clients who prefer to deploy Spectrum Control software on premises have the added opportunity to exploit advanced analytics for optimizing cost (one of my personal favorites). Here’s the scenario.

Suppose you are one of those IT managers I described above, tasked with using multiple tiers of storage to balance efficiency with performance, and performance with workload requirements. You've got a substantial storage estate, so the prospect seems overwhelming. You've deployed Spectrum Control and you're about to experience the value of analytics first hand.

For this example, let's suppose you have three pools of tier-1 storage and one pool of tier-2 storage that you want to analyze for re-tiering. Spectrum Control discovers that one of the volumes in the tier-2 pool is overutilized; if it is moved to a tier-1 pool with sufficient performance capacity, the performance of the volume can be improved.

The performance of the target pools is then analyzed and recommendations are generated. The recommendations involve up-tiering the volume from the tier-2 pool to a tier-1 pool. You can review the recommendations, or leverage the transparent data migration capability of Spectrum Virtualize to automatically move the volume to the tier-1 pool. Using a similar analysis, Spectrum Control can make recommendations to down-tier underutilized volumes that are occupying more expensive storage than necessary. A single tier analysis can result in multiple volume movements, with volumes moved to both lower and higher storage tiers. You can also schedule analysis tasks to run at specified intervals so you regularly monitor opportunities for re-tiering.

Another form of optimization is balancing. Pools in the same tier can have both low and high activity levels, but your goal might be to keep all the pools in a given tier close to the same utilization. Spectrum Control can identify the average utilization for a tier and the specific pools that deviate from that utilization by, say, more than 10%. By analyzing pools on the same tier, Spectrum Control identifies opportunities to move volumes and optimize overall utilization of your storage assets. Again, when used in concert with Spectrum Virtualize, these volumes can be moved transparently.
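To give a feel for the balancing analysis just described, here is a minimal sketch of the underlying idea: compute each tier's average utilization and flag pools that deviate from it by more than 10 points. Pool names and numbers are made up for illustration; this is not Spectrum Control code, and the real product weighs many more factors.

```python
# Hypothetical pools grouped by tier; utilization is a 0.0-1.0 fraction.
pools = [
    {"name": "tier1-pool-a", "tier": 1, "utilization": 0.82},
    {"name": "tier1-pool-b", "tier": 1, "utilization": 0.55},
    {"name": "tier1-pool-c", "tier": 1, "utilization": 0.68},
    {"name": "tier2-pool-a", "tier": 2, "utilization": 0.40},
]

THRESHOLD = 0.10  # flag pools more than 10 points away from their tier's average

# Group pools by tier, then compare each pool against its tier's average.
by_tier = {}
for pool in pools:
    by_tier.setdefault(pool["tier"], []).append(pool)

for tier, members in sorted(by_tier.items()):
    average = sum(p["utilization"] for p in members) / len(members)
    print(f"Tier {tier}: average utilization {average:.0%}")
    for p in members:
        delta = p["utilization"] - average
        if abs(delta) > THRESHOLD:
            role = "candidate donor (move volumes out)" if delta > 0 else "candidate recipient (move volumes in)"
            print(f"  {p['name']}: {p['utilization']:.0%} ({delta:+.0%} vs average) -> {role}")
```

In practice the recommendations pair a donor volume with a recipient pool that has spare performance capacity, and with Spectrum Virtualize in the data path the resulting moves are transparent to the application.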

That's intelligent analytics for managing storage. If you are an IT manager responsible for making the most of your storage investment, consider IBM Spectrum Control and Storage Insights.

And join the conversation with #IBMStorage and #softwaredefined