Predictable costs with IBM cloud-like pricing for on-premises storage systems: Remove the mystery!

5 minute read

Cloud rocks! In a lot of ways, it has become a philosophy in organizations everywhere. Forrester has observed that the flexibility of cloud economics is too great for this model to fail, but the firm is also quick to note that “leaders can operate under a cloud philosophy without ever using public cloud, via cloud-like pricing for physical assets.” On the same point, IDC suggests that 60% of organizations will be using a consumption-based OPEX model for acquiring on-premises infrastructure in the next several years. One reason is that it reduces risk. IT managers in the datacenter can provide cloud-like flexibility with capacity always at the ready, without the risk or cost of paying for something that’s not being used. And cloud-like pricing spares service providers from carrying the cost of additional infrastructure before they have sold that capacity to their customers. 

In this post, I’m exploring cloud-like pricing for on-premises storage from IBM.  It’s a straightforward approach that works like IT managers would intuitively expect it to, removing the mystery often encountered in competitive schemes, and providing ongoing predictability in costs. 

For IT managers and purchasing folks, the basic needs are really quite simple. 

  • Storage consumption goes up – and comes down. I only want to pay for what I’m using, nothing more. 
  • When storage capacity consumption bursts up, I need the capacity to be there and available. 
  • Storage technology evolves. When it’s time for new infrastructure, I don’t want overlapping costs, and I don’t want application disruption while my data moves. 

It’s a simple set of needs, but for most vendors it has been much easier said than done. Some ship over-provisioned systems with a relatively small ‘buffer capacity’ or ‘reserve capacity’ and a bill that goes up as the reserve is consumed. So far so good, but as the buffer is used up, the vendor has to install more capacity. There are two issues with this. One is that the bill doesn’t go back down when the added reserve is emptied again. Nice for them, isn’t it? The other is that most IT managers want to avoid the periodic capacity upgrades that could result in workload disruptions. Some vendors offer to replace your storage controller as technology evolves, but they don’t replace the actual flash drives that are holding your data. What good is that? Technology evolves there too, doesn’t it? Or maybe you can bring in a complete replacement system, but the process of moving data from the old system to the new one is disruptive to your applications. It doesn’t have to be that way. 

Meet IBM cloud pricing for on-premises storage. It’s simple. 

Step 1: Do you have a rough forecast of how much storage capacity you might need over the next three, four, five years? Most IT managers can get in the ballpark. IBM and its Business Partners can help refine your thinking into a reasonably educated guess. Whatever number you jointly come up with, IBM is prepared to install the full amount of capacity on your premises. Remember, nobody has paid for anything yet, so maybe you want to edge your estimate up a little just to remove some risk or provide for unexpected bursts.

Step 2: With your educated guess established, carve out some small part of the total capacity. For illustration purposes let’s say 35%, plus or minus (your projected growth rate may edge that number up or down a bit). This amount should represent the part you are absolutely certain you are going to utilize. Oftentimes this is the amount of capacity you are already using, so there’s no ambiguity. The data already exists, and as soon as your new IBM storage is installed, you’ll have that percentage of the capacity filled up. This percentage becomes your baseline. 

Step 3: How do you want to pay for your baseline? You get to choose. You can rent it per TB/month with as little as a 12-month commitment and cancel with 60 days’ notice any time after that. Or you can lease it per TB/month for 3-5 years. Or you can simply purchase it all up front. It’s your call.

Step 4: Well, there isn’t really a step 4. You have enough storage installed to meet the demand you’ve forecasted over the next several years, but you are only using and paying for a small part (in our example about 35%). There’s nothing more for you to do except focus on managing your business. IBM Storage Insights, which I discussed in my post Software for Simplifying Storage Operations, is included with your cloud pricing to make life easier for your IT manager. It’s also there to meter your actual storage capacity usage. Your daily usage above the baseline is automatically averaged across the month and you receive a quarterly bill. You are only billed for what you are using, and importantly, if your usage comes back down, so does your bill. It’s that simple. 
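The billing mechanics described in step 4 can be sketched in a few lines of Python. This is purely an illustration of the concept described above, not IBM's actual rating engine; the rates and capacity figures are hypothetical:

```python
# Hypothetical illustration of consumption-based billing:
# daily usage above a fixed baseline is averaged per month,
# then the three monthly charges are summed into a quarterly bill.

def quarterly_bill(daily_usage_tb, baseline_tb, rate_per_tb_month):
    """daily_usage_tb: a list of 3 months, each a list of daily TB readings."""
    total = 0.0
    for month in daily_usage_tb:
        # Only usage above the baseline counts as overage; never negative.
        overage = [max(0.0, day - baseline_tb) for day in month]
        avg_overage = sum(overage) / len(overage)
        total += (baseline_tb + avg_overage) * rate_per_tb_month
    return total

# Example: 350 TB baseline, usage bursting in month 2 then falling back.
months = [[350.0] * 30, [420.0] * 30, [360.0] * 30]
print(quarterly_bill(months, 350.0, 20.0))  # the bill tracks usage back down
```

Note how the month-3 charge is lower than month 2: when usage falls, so does the bill, which is the behavior the post highlights.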

See all that IBM FlashWatch has to offer

Fast forward three years

Actually, fast forward about two and a half years. It’s time to evaluate how new technology enhancements might benefit your organization and decide how you want to proceed at the end of your three-year term. You can choose to upgrade your complete system – controller AND storage – or keep your current system, or simply return the old system and walk away. If you choose to upgrade, you’ll feel like a kid in a candy shop. 

  • First, you get to choose what size system to upgrade to. Configure a bigger, faster system if you need to. Or a smaller system if you like. Or just keep capacity the same, only with current technology. If you keep things the same, IBM will guarantee the same or lower monthly price. 
  • Next, IBM will deliver the new storage a full 90 days before the term runs out on your current storage. Using the power of the IBM Spectrum Virtualize software foundation, data can be transparently migrated from the old system to the new one. And even though the new system is already installed, payments start only after the 90-day migration window, giving you continuity in your costs. Out with the old, in with the new, and no overlapping costs. 

And if that’s not enough…

In my post on Elegant Flexibility – IBM Spectrum Virtualize, I included a section on how these systems make your storage simply operationally resilient. In a world with increasing demands on data availability, IBM offers HyperSwap. When properly deployed by IBM Lab Services, IBM will guarantee 100% data availability. That’s not five-9s (99.999%) or six-9s (99.9999%); it’s 100% availability. The cost challenge is that HyperSwap storage configurations require twice as many storage systems. To help make that more affordable, with the cloud pricing for on-premises storage that we’ve been discussing, you can get the 2x systems needed for high availability and pay only 1.2x the baseline costs. 
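The economics of that high-availability offer are worth a quick back-of-the-envelope check. The monthly cost below is a made-up figure for illustration; only the 2x footprint and 1.2x pricing multipliers come from the text:

```python
# Hypothetical cost comparison for a HyperSwap pair under cloud-like pricing.
# Only the 2x / 1.2x multipliers come from the post; the dollar figure is invented.

baseline_monthly = 10_000.0                    # one system's baseline cost (illustrative)
traditional_ha = 2.0 * baseline_monthly        # buying two systems outright: 2x cost
cloud_priced_ha = 1.2 * baseline_monthly       # cloud pricing: 2x systems, 1.2x cost

savings = traditional_ha - cloud_priced_ha
print(f"HA premium drops from {traditional_ha:.0f} to {cloud_priced_ha:.0f}, "
      f"saving {savings:.0f}/month")
```

In other words, under these assumptions the second system costs 0.2x the baseline instead of a full 1.0x, a 40% reduction in the total HA bill.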

Share this post with your purchasing folks. 

What do you think? Are you ready to shift some financial risk to your supplier? IBM is ready to be your partner!

Software for simplifying storage operations: IBM Storage Insights

5 minute read

Business leaders rely on storage to keep the business running. For them, downtime in storage often means revenue loss. Line of business owners rely on storage to provide always-on fast access to data. “Irritated” may not be a strong enough word to describe their reaction when applications slow down or stop. Whether you are one of these business leaders or the IT infrastructure team on the front line of keeping applications running, there’s a new breed of intelligent help worth taking advantage of.  

Meet IBM Storage Insights and Spectrum Control – software designed to simplify your storage operations. This software leads the way for IBM Storage offerings in exploiting IBM Enterprise Design Thinking(1) to deliver great customer experiences. It combines the latest in AI assistance and analytic insights with secure SaaS delivery from the IBM Cloud. The result? A simply great experience for everyone from the application owner to the IT manager to the C-suite concerned with overall business uptime. 

Let’s look at some details

With the Storage Insights family, clients have the choice whether to take advantage of IBM’s free software-as-a-service, securely from the IBM Cloud, or to purchase and deploy the monitoring software on-premises. For this blog, I’m going to focus on the free software-as-a-service, called Storage Insights. I’ll touch on priced options, Storage Insights Pro and Spectrum Control, toward the end. 

Keeping track of system performance, capacity utilization and overall health is simple. In fact, clients can get a consistent operational view of all their IBM storage. But really, what distinguishes Storage Insights is its use of analytics and AI.

Analytics and AI love a good problem. Give them data to work on and it’s sometimes surprising what you can learn. Storage Insights collects about 23 million telemetry points from each system every day – and, as of the time of this post, with Storage Insights managing over three exabytes of storage, the data lake has a LOT of telemetry! From there, analytics can offer insights into configuration best practices that help reduce risk. They can also help predict component failures, giving IT managers the opportunity to be proactive and avoid business impact. And AI can detect anomalous patterns in workloads and system behavior, helping identify and resolve complex issues that arise in storage infrastructure.  
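IBM doesn't publish the models behind Storage Insights, but the general idea of flagging anomalous telemetry can be illustrated with a simple rolling-statistics sketch. Everything here (the z-score approach, the window size, the threshold, the latency numbers) is an assumption for illustration only:

```python
# Illustrative anomaly detection over a telemetry stream (NOT IBM's actual
# method): flag readings that deviate sharply from a trailing window.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=30, threshold=3.0):
    """Return indices of points more than `threshold` standard deviations
    away from the mean of the preceding `window` readings."""
    recent = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(samples):
        if len(recent) >= 2:  # need at least two points for a stdev
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append(i)
        recent.append(value)
    return anomalies

# Steady ~70-microsecond latency readings with one sudden spike at the end.
latency_us = [70.0 + (i % 3) for i in range(60)] + [250.0]
print(detect_anomalies(latency_us))  # → [60]
```

A production system would be far more sophisticated (seasonality, multi-metric correlation, learned baselines), but the principle is the same: learn what "normal" looks like from the data, then surface deviations before they become outages.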

IBM Storage Insights helps connect client teams, both infrastructure teams and lines of business, with IBM

First, all parties have a unified view of IBM Storage. This provides a single pane of glass to see an inventory of all a client’s IBM block storage systems and their characteristics. It also generates a live event feed so they know, up to the second, what is going on with their storage, enabling fast action when needed. 

Second, Storage Insights collects telemetry data and securely “calls home” with that data, providing up-to-the-second, multi-conditional storage alerting around both capacity and performance. 

Next, Storage Insights monitors the overall health of storage, checks its configuration against the best practices of similar systems in the industry, and watches system resource utilization, helping proactively avoid situations where resources are overtaxed. 

See a demonstration of Storage Insights predictive analytics

Finally, Storage Insights provides an enhanced level of customer service. Event filters eliminate unimportant noise, helping get to root issues quickly. Logs are automatically collected and communicated to IBM Support, eliminating the waiting and back-and-forth that, in the past, could slow time-to-resolution by as much as 50%. And Storage Insights gives you a direct means of opening, closing, and tracking support tickets. It’s one place for all your interactions with IBM Storage Support.

If clients choose to take advantage of the free Storage Insights service, there are a few important attributes they’ll be impressed with.

  • They gain an extended team. Not only does IBM take care of keeping the service up to date and running, IBM Support engineers become extended members of the client’s team, able to virtually sit with them on either side of the same Storage Insights dashboard and work issues.
  • Storage Insights is delivered securely from the IBM Cloud. Client security teams will be happy to know that this is a one-way communication of device telemetry (metadata only), that data at rest is AES 256-bit encrypted, and that Storage Insights is certified to the ISO/IEC 27001 Information Security Management standard to keep an organization’s information assets secure. 

As valuable as the free Storage Insights Service is, there’s even more that can be unlocked. Clients can start a free trial of these capabilities with a single mouse-click right from the Storage Insights interface. 

Storage Insights Pro deepens the available information on IBM storage exposing up to a year of history, broadens the coverage to include some non-IBM storage, and introduces a configurable reporting interface. Adding an on-premises deployment of Spectrum Control brings an almost infinitely customizable reporting interface and further broadens the number of non-IBM storage systems that can be directly managed.  

Most clients we work with have a heterogeneous mix of storage. In environments like that, keeping track of system performance, capacity utilization and overall health can be difficult. With Storage Insights Pro or Spectrum Control it can be simple. In fact, clients can get a consistent operational view of all their storage. 

Read the new Forrester study on the Total Economic Impact of storage built with Spectrum Virtualize and managed by Storage Insights

In every organization, there is a need to share information. Leaders want to understand and track key performance indicators. With Storage Insights Pro, almost anything seen in the cloud-based console can be shared directly with stakeholders as a scheduled report. This helps keep everyone on the same page and working toward common outcomes.  

What do you think? Could you use a simpler approach to monitoring storage system health, capacity, and performance from just about anywhere you happen to be? How about a simpler approach to creating tickets and uploading diagnostic information that automatically flows into the proper support queues? Leave a comment.

(1) IBM Enterprise Design Thinking is a best practice that IBM teaches and certifies practitioners on across the industry. It is the method we use to ensure our offerings deliver great experiences to our customers.

Storage Software for Hybrid Multicloud: IBM Spectrum Virtualize for Public Cloud

4 minute read

When your journey takes your infrastructure to hybrid multicloud, the same Spectrum Virtualize software that is available on IBM enterprise storage systems from entry to midrange to high-end, and across over 500 heterogeneous on-premises storage systems, can also be quickly deployed on multiple public clouds. It’s storage for hybrid multicloud made simple.

Certainly we have all noticed the rapid industry transition toward hybrid multicloud. With most organizations now using a mix of cloud models and more than one public cloud provider, and billions of US dollars expected to be spent in 2020 on hybrid multicloud infrastructure, hybrid multicloud is the new normal – and it is impacting strategic storage choices. You may be wondering “Why does storage in the cloud change how I should look at storage on-premises?” It’s an important question that I explored in my post on Innovating with an infrastructure-independent storage software foundation. It’s worth a quick 4 minute read. 

Most IT managers are quite adept at evaluating on-premises storage options. But this area of hybrid multicloud storage is relatively new. Spectrum Virtualize for Public Cloud runs on multiple public clouds, but for the purposes of this blog post, I’ll focus on Spectrum Virtualize for Public Cloud on Amazon Web Services as my example.

Consistency on-premises and on cloud

Peruse this list of cloud misconceptions and count the ones you may have heard before (I’ve heard them all). 

  • My IT management team operates the on-prem infrastructure. We won’t have to mess with the cloud. 
  • Cloud storage is cheap. Moving data there means you don’t have to worry about tiering and data reduction like you do on-premises.
  • Cloud storage has services for things like snapshotting and replication. I guess that’s what the application guys will use. 

One of the realities I find in cloud deployments is that, at least initially, applications are being lifted and shifted from on-prem to the cloud. Few, if any, modifications are being made and largely the same IT organization is being tasked with managing both the on-premises and cloud infrastructure. There are decades of investment in automation and skilled IT managers that organizations want to leverage, making consistency important. And just because companies are renting their storage infrastructure from a cloud provider doesn’t mean they no longer care about being as efficient as they can to keep their costs down. 

Learn how IBM Spectrum Virtualize for Public Cloud is helping service providers to help their clients extend on-premises storage to Amazon Web Services

On-premises, Spectrum Virtualize can bring consistent behavior to over 500 different IBM and non-IBM storage systems. On the cloud, Spectrum Virtualize for Public Cloud can bring that same consistency to Amazon Elastic Block Store (EBS). Consistency how? Here are a couple of examples. 

  • To make efficient use of your on-premises storage capacity, you would use deduplication, compression, and thin provisioning. Spectrum Virtualize for Public Cloud does the same for Amazon EBS. 
  • To make effective use of your on-premises storage tiers you would turn on automated tiering. Amazon also offers multiple EBS volume types – tiers of storage – and Spectrum Virtualize for Public Cloud includes the same AI-driven EasyTier function found on-premises, helping ensure the right data is placed on the right EBS tier at the right time. With EasyTier, most environments can be configured with a small amount of the highest performing tiers and a larger quantity of more economical storage – saving money on your bill from Amazon. 

Connecting storage across the hybrid multicloud

There are a number of compelling storage use cases we see organizations deploy across the hybrid multicloud. Leveraging the cloud as a replication target for business continuity. Snapshotting on-premises data to the cloud for DevOps or so applications can leverage analytic and AI services found on the cloud. Creating air-gap copies to the cloud to aid in cyber resiliency. Or simply migrating workloads either from on-premises to the cloud, or from one cloud provider to another. With all these use cases, it’s important to have consistency on both ends of the connection.  

With Spectrum Virtualize on-premises across any IBM or non-IBM storage systems you might choose, on your Amazon EBS storage, and on block storage from other cloud providers, your hybrid multicloud storage infrastructure can offer a consistent set of services, surfaced by a consistent set of APIs, and managed in a consistent manner regardless of your choice of on-premises storage vendors or cloud providers. As your application programmers begin leveraging environments like Red Hat OpenShift that let them build applications once and deploy anywhere, having a consistent approach to storage in all the places they might deploy becomes critical. At the same time, as an IT administrator, you still have the flexibility to source your infrastructure from almost any on-premises or cloud provider you like. It’s an infrastructure-independent approach that gives you both consistency and flexibility. 

Check back for the next post where I’ll dive a little deeper into software for simplifying storage operations. 

Entry Enterprise Storage: IBM FlashSystem 5000 / 5100

3 minute read

Small businesses are the engine behind the world’s economic output. According to the World Bank, “They represent about 90% of businesses and more than 50% of employment worldwide.” Why is it, then, that so many of the leading storage system vendors relegate small businesses to feature-depleted, second-tier storage systems, reserving their best efforts for large enterprises? Small businesses deserve more!

Listen to Oxford Falls Grammar School describe accelerating cost efficiencies with simple, more affordable IBM Storage.

Meet the IBM FlashSystem 5000 and 5100 – enterprise storage systems designed for entry environments.  

These systems are engineered with the best from IBM Enterprise Design Thinking(1) for intentional simplicity. From installation, to initial configuration, to ongoing monitoring and support, these systems are simple.  And that simplicity is extended to the full set of enterprise capabilities that many other vendors reserve for only their expensive high-end equipment. I’m referring to things like encryption, compression and deduplication, two and three-site replication that can include the cloud, and automation with open source tools like Red Hat Ansible. As I discussed in my post Change in the storage industry is coming – and it’s good!, IBM now offers a single software stack, Spectrum Virtualize, on enterprise systems from entry to high-end, on multiple public clouds, and across over 500 heterogeneous on-premises storage systems. That’s why small businesses can now benefit from rich capabilities once reserved only for large enterprises.

Like all storage based on Spectrum Virtualize…

These systems are simple to monitor.

Take a quick tour of Storage Insights.

Keeping track of system performance, capacity utilization, and overall health is simple. IBM Storage Insights is software-as-a-service (SaaS), delivered securely from the IBM Cloud.  It is included with any IBM FlashSystem storage as well as many other IBM storage offerings. Because it’s SaaS, it is like IBM has become a part of your operations and support team – keeping the service updated and running, watching for issues in your storage, and helping you resolve them. This consistent operational view of storage can contribute to better data availability for your applications, better asset utilization, and better overall performance. 

These systems are simple to support.

Watch a demonstration of opening a support ticket with Storage Insights.

To prevent potential issues from impacting your business, Storage Insights collects detailed telemetry from your system and applies analytics to predict potential failure conditions and proactively notify you of risks. Experience to date suggests about two-thirds of potential issues can be averted automatically. In the event you do need support, Storage Insights simplifies the process of opening and tracking tickets, as well as collecting and uploading diagnostic information, all with the goal of getting your issue resolved quickly. This simplicity can result in 40% faster action plans. 

Check back next week when I’ll discuss more about connecting these systems to the cloud with Spectrum Virtualize for Public Cloud

What do you think? Which enterprise storage capabilities have you been waiting for in an affordable entry package?

(1) IBM Enterprise Design Thinking is a best practice that IBM teaches and certifies practitioners on across the industry. It is the method we use to ensure our offerings deliver great experiences to our customers.

Midrange Enterprise Storage: IBM FlashSystem 7200

3 minute read

Whether you work for one of the hundreds of thousands of middle market companies that generate roughly one-third of the business revenue in the world, or for a larger firm whose core workloads don’t quite push the extreme limits of storage technology, one thing is true: your business can very likely benefit from what has been considered enterprise-class storage capability.  

Meet the IBM FlashSystem 7200 – an enterprise storage system designed for the midrange.  These systems combine the very latest hardware technology advancements with the power of a strategic storage software platform, IBM Spectrum Virtualize, and an AI-infused management service, IBM Storage Insights, to deliver great experiences from the application owner to the IT manager to the purchasing manager. The outcome is intentional simplicity. 

Learn more about IBM Hybrid Multicloud Storage Solutions

Like all storage based on Spectrum Virtualize…

These systems are simply flexible.

FlashSystem 7200 is a hybrid flash system that can be configured with several flash, storage class memory, and HDD combinations, and offers host connection options to best meet the requirements of your applications. 

While industry-standard flash is available in both NVMe for control enclosures and SAS for expansion enclosures, the highest-capacity, best-performing flash with the longest lifespan comes in the IBM FlashCore Modules. They also contain dedicated hardware that delivers compression and encryption with no performance penalty, making them unique in the world of NVMe-based storage. 

These systems can also be configured with Storage Class Memory, a different class of memory technology that achieves groundbreaking performance. Due to its relatively small capacities and higher price, for now most Storage Class Memory will be used as a type of cache or specialized tier. Making efficient use of this specialized tier is simple with the AI-driven EasyTier capability; we’ll talk about that in a moment. 

When it comes to connecting these systems to hosts and applications, there are also a wide variety of options including Fibre Channel adapters that support full NVMe over Fabrics (NVMe-oF).

These systems are simple to tier.

With the flexible drive options I covered above — including Storage Class Memory, all-flash, and hybrid configurations — the IBM FlashSystem family allows you to craft the optimal balance between cost and performance. To make the most of the tiers, these systems include the AI-driven IBM EasyTier capability that helps ensure the right data is placed on the right tier at the right time. The system takes care of transparently moving the hottest blocks of data up the tiers while the cooler blocks are moved down. There’s nothing for you to do except set it and forget it. And in the process, not only does your overall system throughput improve, but so does your cost effectiveness. With EasyTier, most environments can be configured with a small amount of the highest performing tiers and a larger quantity of more economical storage. You’ll find the vast majority of your I/O is handled in the top tiers. 
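The hot-up, cold-down placement idea can be sketched in a few lines. To be clear, this is NOT IBM's actual EasyTier algorithm (which works on live I/O heat maps and migrates extents continuously); it is just a minimal illustration of heat-based tier placement, with made-up extent names and capacities:

```python
# Illustrative heat-based tiering sketch (not IBM EasyTier itself):
# sort data extents by recent I/O count and fill the fastest tier first,
# spilling cooler extents down to larger, cheaper tiers.

def place_extents(extent_heat, tier_capacities):
    """extent_heat: {extent_id: io_count}.
    tier_capacities: list of (tier_name, max_extents), fastest tier first.
    Returns {extent_id: tier_name}."""
    placement = {}
    hottest_first = sorted(extent_heat, key=extent_heat.get, reverse=True)
    i = 0
    for tier, capacity in tier_capacities:
        for _ in range(capacity):
            if i >= len(hottest_first):
                return placement
            placement[hottest_first[i]] = tier
            i += 1
    return placement

heat = {"e1": 900, "e2": 15, "e3": 500, "e4": 2}
tiers = [("scm", 1), ("flash", 2), ("nearline", 10)]
print(place_extents(heat, tiers))
# hottest extent lands on SCM, next two on flash, the coldest on nearline
```

Because a small number of extents usually absorbs most of the I/O, even a tiny top tier ends up serving the bulk of requests, which is exactly the cost/performance trade the paragraph above describes.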

It’s that simple. 

To learn about other enterprise storage options, read my post on High-end Enterprise Storage or check back for the next post where I’ll introduce Entry Enterprise Storage from IBM along with another few Spectrum Virtualize capabilities found across the hybrid multicloud. 

High-end Enterprise Storage: IBM FlashSystem 9200 and 9200R

3 minute read

Rapid deployment of new applications combined with crazy fast growth in the amount of data being operated on is pushing enterprises of all shapes and sizes into territory that can only be described as the most demanding of environments. 

The radically simple idea IBM has introduced is that, with Spectrum Virtualize and Storage Insights, all the IBM FlashSystem storage from entry enterprise through to high-end enterprise, and all the other block storage you might have from other vendors, and the block storage you might use on multiple public clouds can all behave in a consistent manner. In my post Elegant flexibility – IBM Spectrum Virtualize and the SAN Volume Controller, I explored how this works with the block storage you may already have from multiple vendors. In this post I’m going to introduce IBM FlashSystem high-end enterprise storage and a few more of the consistent software capabilities Spectrum Virtualize and Storage Insights can bring across your hybrid multicloud. 

Meet the IBM FlashSystem 9200 and 9200R, enterprise storage systems designed for the most demanding environments.

These systems are simply fast. 

With its end-to-end NVMe throughput advantage, IBM FlashSystem 9200 sets a new standard in I/O performance density, the measure of work done per rack unit. A FlashSystem 9200 can deliver over 2.2 million I/Os per second per rack unit — twelve times better than the current generation of some competitors. It can also deliver faster business decisions with 70-microsecond latency — thirty percent faster than the current generation of other competitors. And for throughput, a scaled-out IBM FlashSystem 9200 can deliver a stunning 180GB/s — much faster than competitive alternatives(1). 

Like all storage based on Spectrum Virtualize…

These systems are simply efficient.

Making efficient use of available storage capacity can have a profound impact on client budgets. This is where data reduction comes in. Data reduction can come in two forms. First is the hardware-accelerated compression in IBM FlashCore modules. This is ideal for performance sensitive applications that can still benefit from data reduction. There is also software data reduction where Spectrum Virtualize applies several techniques including deduplication, compression, thin provisioning, zero detect, and SCSI Unmap. And it can do this in a consistent manner for all the block storage in your datacenter regardless of your choice in hardware vendors. 

To help size systems correctly, IBM guarantees capacity savings of up to a 5-to-1 ratio, based on a profile of your workload. If it is not practical to work with IBM to collect data and run reports on your workloads, you can take advantage of a self-certified guarantee of 2:1 data reduction. 
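The sizing arithmetic behind those guarantees is straightforward. The 100 TB raw figure below is hypothetical; only the 2:1 and 5:1 ratios come from the text:

```python
# Back-of-the-envelope sizing under a data-reduction guarantee.
# Only the 2:1 and 5:1 ratios come from the post; capacity is illustrative.

def effective_capacity(raw_tb, reduction_ratio):
    """Usable capacity when data reduces at reduction_ratio : 1."""
    return raw_tb * reduction_ratio

print(effective_capacity(100, 2))  # self-certified 2:1  -> 200 TB effective
print(effective_capacity(100, 5))  # workload-profiled 5:1 -> 500 TB effective
```

Run in reverse, the same ratio tells you how much raw flash to buy: at a guaranteed 2:1, storing 200 TB of data requires only 100 TB of physical capacity.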

These systems are simple to secure.

Encryption can come in two forms. First is encryption validated to the U.S. Government computer security standard FIPS 140-2, which can be enabled with IBM FlashCore modules. This is tamper-resistant hardware encryption for your most sensitive data. There is also software encryption that can be extended system-wide, in a consistent manner, to all the other block storage in your datacenter regardless of your choice in hardware vendors. There’s also a choice in how to take care of your encryption keys: either with a physical USB key in the system or with an enterprise key manager like IBM Security Key Lifecycle Manager or Gemalto SafeNet KeySecure.

Check back for the next post where I’ll introduce Midrange Enterprise Storage from IBM and a few more of the Spectrum Virtualize capabilities found across the hybrid multicloud. 

(1) Competitive performance comparisons taken from these third-party and vendor sources for DellEMC link, link; NetApp link; Pure link, link; that were publicly available as of the date of this post.

Elegant flexibility – IBM Spectrum Virtualize and the SAN Volume Controller

4 minute read

Elegant flexibility

Strategic storage choices are more about the software than the hardware it happens to be running on at the moment. Do you agree?

In my post Change in the storage industry is coming – and it’s good! I shared why I think there is a trend toward radical simplification starting in the storage industry. IBM started the trend and others are sure to follow in bids to remain competitive. I’m continuing the discussion on what IBM has done. For other available posts in the series, look here.

With its February 2020 announcement of Storage Made Simple for Hybrid Multicloud, IBM introduced a strategic storage software platform, IBM Spectrum Virtualize, and an AI-infused management service, IBM Storage Insights, that operate on a family of enterprise storage systems spanning from entry to midrange to high-end, all engineered with the best from IBM Enterprise Design Thinking(1) to deliver great customer experiences. Spectrum Virtualize and Storage Insights can also be applied to over 500 heterogeneous on-premises storage systems and multiple public clouds to give IT managers consistency in storage operations across their hybrid multicloud. 

Discussions about storage systems and their capabilities are often compartmentalized – here’s one storage system and what it does, here’s another storage system and what it does differently, etc. The radically simple idea IBM has introduced is that, with Spectrum Virtualize and Storage Insights, all the IBM FlashSystem storage from entry enterprise through to high-end enterprise, all the other block storage you might have from other vendors, and the block storage you might use on multiple public clouds can all behave in a consistent manner. It’s an elegant approach. To keep these posts consumable in length, I’m going to introduce one of the platforms and a few of the consistent software capabilities in each post. A good place to start is with the IBM SAN Volume Controller (SVC).

Meet the IBM SAN Volume Controller with Spectrum Virtualize

Now in its tenth generation, SVC has been running Spectrum Virtualize for over a decade, letting clients bring the power of a strategic software foundation to over 500 on-premises storage systems from a wide mix of vendors. If you are interested in modernizing the storage you already have, this is the place to start. 

Like all storage based on Spectrum Virtualize…

These systems make your storage simple to scale.

One of the enemies of storage efficiency is stranded pools of capacity. An enemy of operational efficiency is a heterogeneous storage infrastructure. SAN Volume Controller helps defeat both.

A single SAN Volume Controller cluster can combine up to 32 petabytes of storage capacity from over 500 different heterogeneous storage systems into a single pool of capacity. That capacity is presented as a single system, has a single point of management, and, importantly, a single set of APIs and procedures. Think of that for a moment. You have the freedom to maintain heterogeneous storage infrastructure, creating a little healthy competition between vendors, without the operational headaches that usually follow. And you can apply advanced features like encryption, data reduction, enhanced high-availability, and connection to the cloud across all your storage, even if it’s an older model that predates that sort of capability. Then, when it comes time to add more capacity or shift your vendor strategy, no big deal. Just insert the new and remove the old. Your data can be migrated without disruption and your operations remain consistent. It’s that simple. 

SAN Volume Controller makes your storage simple to scale
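To make the single-pool idea concrete, here is a minimal Python sketch of what a virtualization layer does conceptually. The class, method names, and the simplistic capacity accounting are all invented for illustration – this is not the Spectrum Virtualize API, just a model of "many back-ends, one pool, one limit":

```python
# Illustrative model only (NOT the real Spectrum Virtualize API) of how a
# virtualizer presents heterogeneous back-end systems as one pool.
PB = 10 ** 15                 # bytes in a decimal petabyte
MAX_POOL_BYTES = 32 * PB      # the 32 PB cluster limit mentioned above

class StoragePool:
    def __init__(self):
        self.backends = {}    # back-end name -> capacity in bytes

    def add_backend(self, name: str, capacity_bytes: int) -> None:
        """Add a back-end system's capacity to the single pool."""
        if self.total_bytes() + capacity_bytes > MAX_POOL_BYTES:
            raise ValueError("would exceed the 32 PB pool limit")
        self.backends[name] = capacity_bytes

    def remove_backend(self, name: str) -> None:
        # In the real product, data is migrated off non-disruptively first.
        del self.backends[name]

    def total_bytes(self) -> int:
        """The one number operations sees, regardless of vendor mix."""
        return sum(self.backends.values())

pool = StoragePool()
pool.add_backend("vendor-A-array", 4 * PB)
pool.add_backend("vendor-B-array", 2 * PB)
print(pool.total_bytes() // PB)  # one pool, one number to manage: 6
```

The point of the sketch is the shape of the abstraction: back-ends come and go underneath, while the pool, its management point, and its limit stay constant above.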

These systems make your storage simply operationally resilient.

Ensuring your data stays available to your applications is the primary function of storage. When you are operating storage with Spectrum Virtualize, you’re covered. In fact, for the most stringent requirements, IBM guarantees it.

Regardless of your choice of storage hardware vendor, Spectrum Virtualize offers a consistent approach to traditional 2-site and 3-site replication configurations using your choice of synchronous or asynchronous data communication. There is also IBM HyperSwap; when it is properly deployed by IBM Lab Services, IBM guarantees 100% data availability. That’s not five-9’s (99.999%) or six-9’s (99.9999%). It’s 100% availability. 
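For a sense of scale on those "nines", converting an availability percentage into allowed downtime per year is simple arithmetic. A small Python sketch (the function name is mine; the math is standard):

```python
# Convert an availability percentage into allowed downtime per year.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes

def downtime_minutes_per_year(availability_pct: float) -> float:
    """Allowed downtime (minutes/year) for a given availability percentage."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for label, pct in [("five-9's", 99.999), ("six-9's", 99.9999), ("100%", 100.0)]:
    print(f"{label:>8}: {downtime_minutes_per_year(pct):8.4f} minutes/year")
```

Five-9's works out to roughly five minutes of downtime a year and six-9's to roughly half a minute, which is why a 100% guarantee is a different kind of claim.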

Finally, Spectrum Virtualize running on SVC offers unique enhanced high-availability (HA) configurations that deliver zero-impact failover across two to four sites, with the option of fifth-site replication that can include public cloud. 

SAN Volume Controller makes your storage simply operationally resilient

Check back for the next post where I’ll introduce FlashSystem high-end enterprise storage and a few more of the Spectrum Virtualize capabilities found across the hybrid multicloud. 

What do you think? Are strategic storage choices more about the software than the hardware it happens to be running on at the moment? Join the conversation – leave a comment!

(1) IBM Enterprise Design Thinking is a best practice that IBM teaches and certifies practitioners on across the industry. It is the method we use to ensure our offerings deliver great experiences to our customers.

Change in the storage industry is coming – and it’s good!

3 minute read

Let me make a prediction. As an IT manager, you are about to see a wave of actions by storage vendors to consolidate and simplify their portfolios. Pendulums swing, and for too long the storage industry has been swinging toward extreme customization – delivering a crowded field of individual storage platforms for only subtly different use cases. Got a slightly different scalability requirement – introduce a new platform. Need some new function – pop out a new higher-end platform. Find a new niche in availability or performance requirements – deliver a new platform and charge more for it. Your smaller customers need a smaller entry point – offer a feature-depleted platform for less money. You get the picture. 

There’s a practical challenge for IT managers with all this.

Most enterprises possess many different use cases. They have mission-critical transaction processing workloads, Big Data and analytic workloads, departmental DevOps, some social and mobile workloads, etc. Even if an IT manager has a single-supplier purchasing strategy (which most don’t), they are faced with several differing platforms from that single supplier. “Differing how?” you ask. Well, in most cases, each of these platforms has a different approach to management and troubleshooting, different APIs for automation, and different paths to the cloud. And that means different skill and training requirements for the IT staff charged with operations.

Storage platform complexity c. 1H2020

There’s also a practical challenge for the vendors.

Each of these platforms often represents a separate development team and set of business tradeoffs. Think about it. Pick the vendor(s) you happen to buy from at the moment and consider a function like multi-site replication for example. As a vendor, you have enterprise customers who could take advantage of that capability from the entry to the high-end, but you can’t afford to separately develop it on every one of your platforms so you only offer it on your high-end stuff and charge your customers more. Or maybe you’re trying to work out a path to the cloud for your clients so they can leverage the cloud as a disaster recovery target, or for DevOps, or for AI and analytic services, or as an air-gap to help with cyber resiliency. Delivering those things requires an approach to replicating or snapshotting data from on-premises to the cloud. As you let out a big sigh, you realize you have to build that capability separately for each of your different platforms and that’s expensive. So, in most cases, you pick the subset of your portfolio you’ll be offering it on and just don’t offer it on the rest. 

Watch Dave Vellante (SiliconANGLE theCUBE) and Ed Walsh (IBM) discuss storage complexity

For the vendor, this kind of complexity drives up cost and slows down innovation. Of course I’m not privy to all the business conversations different vendors have, but I can only imagine that some of these dynamics have contributed to vendors doing silly things like offering multiple cost-tiers of hardware but not delivering any automated way for their clients to tier data from one of those platforms to another. How difficult it must be for a salesperson to make a big deal about some new low-cost hardware platform in their lineup… and then have to tell their customer that it’s up to them to figure out how to manually get the right data to that platform at the right time. 

For the IT manager, it makes every purchase a complex headache, and ongoing operations more difficult than it has to be.

That’s why I’m confident predicting that IT managers are about to see a wave of actions by storage vendors to consolidate and simplify their portfolios. IBM started the trend with its February 2020 announcement of Storage Made Simple for Hybrid Multicloud and other vendors are sure to follow if they hope to remain competitive. IBM now offers a single software stack, Spectrum Virtualize, on enterprise systems from entry to high-end, on multiple public clouds, and across over 500 heterogeneous on-premises storage systems.

Follow this series of posts on the IBM announcement

What do you think? Are you ready for a more simplified set of choices from the storage industry? Join the conversation – leave a comment!

Software Defined Storage Use Case – Block SAN Storage for Traditional Workloads

In my last post, IBM Spectrum Storage Suite – Revolutionizing How IT Managers License Software Defined Storage, I introduced a simple and predictable licensing model for most all the storage needs an IT manager might have. That’s a pretty big concept if you think about all the storage use cases an IT manager has to deal with.

  • Block SAN storage for traditional workloads
  • File storage for analytic or Big Data workloads
  • Object storage for cloud and mobile workloads
  • Scale-out block storage for VMware datastores
  • Storage for housing archive or backup copies

Just to name a few… The idea behind software defined storage is that an IT manager optimizes storage hardware capacity purchases for performance, environmentals (like power and space consumption), and cost. Then he ‘software defines’ that capacity into something useful – something that meets the needs of whatever particular use case he is trying to deal with. But is that really possible under a single, predictable software defined storage license? The best way I can think of to answer the question is to look at several of the most common use cases we see with our clients.

Perhaps the most widely deployed enterprise use case today is block SAN storage for the traditional workloads that all our businesses are built on – databases, email systems, ERP, customer relationship management and the like. Most IT managers know exactly what kind of storage capabilities they need to deploy for this use case. It’s stuff like:

Here’s the thing… This use case has been evolving for years and most IT managers have it deployed. The problem isn’t that the capabilities don’t exist. The problem is that the capabilities are most often tied directly to a specific piece of hardware. If you like a particular capability, the only way to get it is to buy that vendor’s hardware. It’s a hardware-defined model, and you’re locked in. With IBM Spectrum Storage, IBM has securely unboxed those capabilities from the hardware with software defined storage. All the capabilities I just mentioned can be accomplished with one IBM Spectrum Storage Suite software license, and you have complete flexibility to pick whatever hardware vendor or tier you like. The idea of software defined changes everything. With the software unboxed from the hardware, you really are free to choose whatever hardware you want from most any vendor you like. And since the software stays the same even while the hardware changes, you don’t pay an operational or procedural tax when you make those changes.

All of the capabilities mentioned above for addressing this Block SAN storage for traditional workloads use case can be accomplished with one IBM Spectrum Storage Suite software license. This may be the most widely deployed use case today, but it’s not the fastest growing use case. In my next posts, I’ll continue looking at the wide variety of use cases that are covered by the simple, predictable IBM Spectrum Storage Suite software defined storage license.

Are you interested in taking the first step with software defined storage? Contact your IBM Business Partner or sales representative. And join the conversation with #IBMStorage and #softwaredefined.

IBM Spectrum Storage Suite – Revolutionizing How IT Managers License Software Defined Storage

About a year ago IBM shook up the software defined storage world with IBM Spectrum Storage. It was the industry’s first complete family of software defined storage offerings that could drive cost efficiency and total management across most all the needs an IT manager might have:


  • Software defined storage for traditional workloads like database, email and ERP systems as well as new-gen workloads like analytic, mobile, Big Data, cloud and Cognitive business.
  • Capabilities needed for primary data as well as backup or archive copies.
  • Access via block, file and object protocols.
  • Operating on heterogeneous storage hardware from most any vendor – traditional SAN storage infrastructure as well as newer scale out infrastructure with storage-rich servers.
  • …all nicely integrated with a consistent set of interfaces and vocabulary.


In a world where much of this type of capability had been tied to some piece of hardware – hardware defined storage – IBM had securely unboxed storage and forever changed storage economics. In the last year, over 2,000 brand new clients have started with IBM Spectrum Storage. Through those conversations, we’ve learned two important things about how enterprises are approaching software defined storage – things that have led us to revolutionize how software defined storage is licensed.

  1. CFOs are exasperated with the unpredictability of storage and storage software licensing. IT managers generally have a good feel for what type of capabilities they need to accomplish the use cases they want. But have you ever paused to think about how confusing it must be for them to figure out exactly how much software they need to license? Think about it from the perspective of someone not using IBM Spectrum Storage.
    • SAN virtualization software can be licensed with a per-frame base plus a charge per managed TB
    • Storage resource management software can be licensed per system plus per tiered TB
    • Backup software can be licensed per operating platform or application plus a per-TB charge
    • Data stores for virtual machines can be licensed per server and per VM
    • Deduplication software can be licensed per system
    • …and the list goes on. If the IT manager can barely figure out how much of this stuff he wants at a given point in time, how is a CFO supposed to predict future costs?
  2. IT managers are dealing with transition in storage infrastructure. What software they need can shift rapidly. Most IT managers reading this post are responsible for estates of SAN storage. For the traditional workloads that our businesses are built on, this has been the dominant storage infrastructure approach for years. But there is a transition well underway. Who hasn’t been impacted by newer mobile and cloud workloads? What company or organization isn’t looking to make better use of Big Data and analytic workloads? These applications are being built to take advantage of a different kind of storage infrastructure, one characterized by direct-attach JBODs and fields of servers with lots of internal capacity – and really none of it connected to a SAN. IDC data suggests that IT managers are now purchasing more TBs of this type of storage than they are SAN storage, and the trend isn’t likely to reverse. In many enterprises, the capacity mix is shifting from traditional SAN to newer storage-rich servers. The transition presents a challenge in storage software licensing. When physical infrastructure shifts from big SAN-attached storage systems to scale-out storage-rich servers, the types of capabilities needed don’t change dramatically, but the specific vendor software packages change a lot. Instead of a virtualizer for block SAN storage, an IT manager might need to shift toward offerings that software-define his field of storage-rich servers into something useful like performance-optimized file storage for analytics, an integrated VMware block data store, or cost-optimized object storage for backup and archive. This is all new and the rate of change is unknown – and that presents a challenge. Flexibility is paramount.

With IBM Spectrum Storage Suite, IT managers now have a simple and predictable licensing model for the entire IBM Spectrum Storage family. Its straightforward per-TB pricing relates costs to the amount of storage capacity being software-defined, regardless of use case. That makes it easy for IT managers to grow and transition how they use storage, and for CFOs to predict costs. And with the Suite, clients can save up to 40% compared with licensing all capabilities separately.
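The shape of that comparison is easy to sketch in a few lines of Python. The per-TB rates and product categories below are entirely made up for illustration – only the "up to 40% savings" relationship comes from this post, not actual IBM pricing:

```python
# Hypothetical numbers only: these $/TB/year rates are invented to show the
# shape of the comparison, not actual IBM pricing.
ALACARTE_PER_TB = {             # separate point products, licensed a la carte
    "san_virtualization": 80,
    "resource_management": 30,
    "backup": 60,
    "object_storage": 40,
}

# Model the "up to 40% savings" claim as a single per-TB suite rate.
SUITE_PER_TB = 0.60 * sum(ALACARTE_PER_TB.values())

def annual_cost(tb: int, suite: bool) -> float:
    """Annual license cost for a given capacity, suite vs. a la carte."""
    rate = SUITE_PER_TB if suite else sum(ALACARTE_PER_TB.values())
    return tb * rate

capacity_tb = 500
print(annual_cost(capacity_tb, suite=False))  # sum of separate point products
print(annual_cost(capacity_tb, suite=True))   # one predictable per-TB rate
```

The predictability argument falls out of the structure: the suite cost is one rate times one number (capacity), whereas the a la carte total depends on how many separate per-frame, per-system, and per-VM meters happen to be running.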

Consider a typical enterprise starting with its existing SAN storage infrastructure but rapidly growing a new kind of infrastructure for new workloads. In most datacenters this transition is coming, but few really understand how fast or exactly which use cases will emerge. There’s going to be some experimentation and rapid change.

Attempting to navigate the next few years using à la carte licenses of point products from multiple vendors is going to be difficult. In fact, CFOs are going to push back against the unpredictability and may ration what software can be licensed. That can slow down innovation. IBM Spectrum Storage Suite offers cost predictability and frees IT managers to exploit any IBM Spectrum Storage capability required to get the job done.

Let’s suppose you are an IT manager at the front end of this picture. You’ve deployed block SAN storage for traditional workloads as your first IBM Spectrum Storage Suite use case. Now you want to explore another use case. Well, with IBM Spectrum Storage Suite you already own entitlement to all capabilities in the IBM Spectrum Storage family, so you are free to download any of the software you like. To help you quickly adopt the additional use cases your business may need, IBM Spectrum Storage Suite licensing offers the ability to perform extended tests in an evaluation sandbox without additional charge. So go ahead, experiment with your next use case. Prove it, become familiar with it, pay for it when it’s deployed for productive use.

Are you interested in taking the first step with software defined storage? Contact your IBM Business Partner or sales representative. And join the conversation with #IBMStorage and #softwaredefined.