Gartner predicts that within four years, 50% of all storage will be either on-premises Software-Defined Storage (SDS) or in the public cloud – more than triple today's 15%. A few years ago, IDC made a similar prediction, forecasting a compound annual growth rate for SDS of 13.5% between 2017 and 2021, with the market reaching $16.2bn by the end of that period – growth far outstripping traditional proprietary storage.
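As a back-of-the-envelope check on those IDC figures, compounding 13.5% annually backwards from the projected $16.2bn in 2021 implies a 2017 market of roughly $9.8bn. (The base-year figure here is inferred from the numbers above, not quoted from IDC.)

```python
# Back-of-the-envelope check on the IDC projection cited above.
# Known from the article: 13.5% CAGR over 2017-2021, reaching $16.2bn.
# The implied 2017 base-year market size is inferred, not quoted by IDC.
cagr = 0.135
market_2021_bn = 16.2
years = 2021 - 2017  # four compounding periods

implied_2017_bn = market_2021_bn / (1 + cagr) ** years
print(f"Implied 2017 SDS market: ${implied_2017_bn:.1f}bn")  # ~ $9.8bn

# Growing that base forward year by year recovers the 2021 figure:
market = implied_2017_bn
for _ in range(years):
    market *= 1 + cagr
print(f"Projected 2021 market: ${market:.1f}bn")  # ~ $16.2bn
```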
What’s driving this expansion?
SDS offers a number of clear benefits:
- Flexibility: SDS can handle different types of storage workloads from different applications on the same platform.
- Speed of provisioning: extending an existing system is faster than procuring and installing new disk arrays, and it can be done without disruption.
- Scalability: SDS scales without the fixed ceilings of proprietary arrays – providers like SUSE already support Ceph-based deployments running to hundreds of petabytes.
- Cost reduction: the biggest driver of them all – SDS enables enterprises to cut the capital cost of enterprise storage significantly by replacing expensive proprietary storage hardware with commodity, off-the-shelf hardware running SDS.
In summary, SDS provides greater flexibility, agility, and scalability at lower cost – in this context, the analysts' confidence in the sector's growth is entirely unsurprising.
And make no bones about it: many enterprises are struggling as storage costs consume an ever larger share of their IT budget. Demand grows every year, with more data being stored, analyzed, replicated between applications, and accessed by multiple applications. The Internet of Things (IoT) is just one example of the trends driving this growth; for many enterprises it is already here, and it brings significantly more data to collect and store.
Against this backdrop, simply adding storage the way you always have – rack by rack, dedicated to specific applications and housed in proprietary disk arrays – is unlikely to be a winning strategy. The costs it imposes will not get past the CIO and CFO, who like 'hockey stick' graphs when they apply to revenue, but not when they apply to IT costs.
Where is your organization on the SDS journey?
Have you done your homework already, gotten your business and technical cases straight, and taken the plunge into the new way of doing things? Or are you still nervously tip-toeing towards the edge of the diving board, thinking that the water looks a long way off and fearing that a misstep might end in a painful belly-flop rather than the smooth entry you were hoping for? When doing something new and unfamiliar, it isn't wise to assume that change will be easy or without challenges.
However, while it is true that you should ‘look before you leap’, it’s also true that nobody learns to swim in the classroom – you have to overcome your nerves and physically get into the water. Your soaring data storage costs resulting from exponential capacity growth are not going to go away by themselves – sooner or later you’re going to need to ‘hit the water’.
When you are ready to free yourself from proprietary hardware storage costs and gain the flexibility to seamlessly add storage capacity to an intelligent, self-healing, unified block, file and object software-defined storage system, SUSE is here to help you make your move.
Learn more about our storage solutions at SUSE: https://www.suse.com/programs/data-explosion/