Is the ‘Data Explosion’ Forcing You to Rethink Your Storage Strategy? | SUSE Communities



You don’t have to look far to find stats on how much more data is being created today than ever before. The big tech analysts – Gartner, Forrester and IDC – seem to be in competition, each trying to outdo the others with superlatives and lurid headlines. Their preferred adjective for describing data growth is ‘exponential’.

IDC expects worldwide data to grow to 175 zettabytes (ZB) by 2025, a compound annual growth rate of 61%. Network World called that ‘staggering’. Meanwhile, Cisco reports enormous increases in internet traffic as Internet of Things (IoT) devices (and still more people) come online, predicting that by 2022 there will be more internet traffic per year than was created in the first 22 years of the internet. Wikipedia calls it the ‘information explosion’.

According to Forbes, we are on the cusp of the fourth industrial revolution, and the way we work and live will change beyond recognition in the next few years. In fact, Forbes expects that change to be ‘exponential’, with much of it driven by big data analytics, IoT, machine learning, and AI.

The upshot of all these trends is more and more data that then needs to be stored. If your organization is like most others, you’re storing more data this year than you did last year. And most likely the rate of growth, in percentage terms, is greater this year than it was last year. The same was true the year before. And the year before that.

Unsurprisingly, all this growth is producing a storage budgeting headache. The challenge is that when it comes to budgets, CFOs don’t do ‘exponential’ increases. They don’t find ‘staggering’ IT costs impressive, and while they like ‘compound annual growth rates’ when applied to company revenues, they don’t like them when applied to spend on storage hardware.

Here at SUSE, we call this the ‘data explosion’, and for most organizations the outcome is unavoidable. Sooner or later, the IT team ends up with its back against the wall: no amount of data tiering, data compression – or even aggressive data deletion – can make traditional storage affordable.

Day in and day out we see smart CIOs making the shift to open source software-defined storage, so they can eliminate expensive proprietary license costs and the steep hardware costs that go with them. Yes, they know they will have to learn new skills and techniques – and they know they may well face some additional headcount costs, as our proprietary competitors are keen to point out. But they are also wise enough to recognize that, as Einstein reputedly said, ‘doing the same thing over and over again and expecting different results’ is the definition of madness – in this case, budget madness.

Is your current storage approach flexible enough to adapt and manage the data explosion? SUSE is here to help – we’ll get you skilled up and on the road to an affordable, sustainable approach that can turn your CFO’s frown upside down.

To learn more about how you can reduce your storage costs with SUSE, visit



Larry Morris is a Senior Product Manager for SUSE, focused on open source software-defined storage products. During his 25 years in the storage business, he has held various engineering and management responsibilities in product development, product management, program management, marketing, and total customer experience. While at Hewlett-Packard Enterprise, he was part of a three-person executive team that grew the Enterprise Virtual Array from a $300 million business to over $1.2 billion in yearly revenue. Larry holds a Bachelor of Science in Computer Science Engineering and a Master of Business Administration. He currently resides in Park City, Utah.