
Growing Pains: Data Storage Today


Depending on who you ask, digital data is growing at a compound annual growth rate of 20, 30 or even 40 percent. But if you’re in an IT department today, you don’t need a technology analyst to tell you the obvious. We’re undergoing a data explosion—and mostly, that’s a good thing.

We need to store more data because our customers are creating more data every day in their online interactions. Because we now have Internet of Things (IoT) systems gathering kinds of data we’ve never had available before. Because we now use video instead of just text.

Airplanes are one example of a technology transformed by the IoT. The Airbus A350 has almost 6,000 sensors and generates 2.5 terabytes a day, but the newest Airbus model will generate at least three times that amount of data.[1]
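To give a feel for that scale, here’s a minimal back-of-the-envelope sketch. The 2.5 TB/day baseline and the “at least three times” multiplier come from the cited article rather than from Airbus directly, and the yearly figure simply assumes daily operation:

```python
# Rough, illustrative calculation based on the figures cited above:
# 2.5 TB/day for the A350 and an "at least three times" multiplier for the newest model.
A350_TB_PER_DAY = 2.5
NEWEST_MULTIPLIER = 3

newest_tb_per_day = A350_TB_PER_DAY * NEWEST_MULTIPLIER    # ~7.5 TB per day
newest_tb_per_year = newest_tb_per_day * 365               # ~2,700 TB, roughly 2.7 PB, per aircraft per year

print(f"Newest model: ~{newest_tb_per_day:.1f} TB/day, ~{newest_tb_per_year / 1000:.1f} PB/year")
```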

And as Richard Fichera of Forrester Research says in our video on the biggest factors driving the data explosion, we need to store more data because nobody wants to throw any of it away. Consumers want every image they take on their smartphones to be stored forever—and with 1.5 billion new smartphones sold in 2017, even a couple gigabytes a year per user means many billions more gigabytes to store.[2]
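To make that back-of-the-envelope math concrete, here’s a minimal sketch. The 2 GB per user per year is an illustrative stand-in for “a couple gigabytes,” not a measured figure; the 1.5 billion devices is the 2017 sales number cited above:

```python
# Rough estimate of new consumer storage demand, assuming ~2 GB per user per year
# (an illustrative stand-in for "a couple gigabytes") across the ~1.5 billion
# smartphones sold in 2017.
smartphones_sold_2017 = 1.5e9
gb_per_user_per_year = 2          # assumption for illustration, not a measured figure

total_new_gb = smartphones_sold_2017 * gb_per_user_per_year
print(f"~{total_new_gb:,.0f} GB of new storage per year, or roughly {total_new_gb / 1e9:.0f} exabytes")
```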

And it’s not just a consumer issue. As enterprises interact with consumers online and on social media, the amount of data those enterprises need to store is growing too. There’s a strong reason for enterprises to want to keep it all: they now have analytics systems that can identify deep behavioral patterns in historical data and turn old data into actionable information.

In short: We need to store more data because data represents value. But that doesn’t mean storing it is easy.

For one thing, the stakes today are higher. Data compliance and data protection have both become challenges. Over the past two decades or so, governments have passed a number of laws, such as the Sarbanes-Oxley Act, the Health Insurance Portability and Accountability Act (HIPAA) and, most recently, the EU’s General Data Protection Regulation (GDPR), that dictate how, when, where and how much data organizations must store. Industry regulatory frameworks, like the Basel III guidelines for banking, can add to the challenge.

Protecting data is not a new challenge, but most will admit it’s a much bigger one now than it used to be. Gone are the days when all employees worked on a single physical corporate network. That makes access control harder—at the same time that high-profile breaches have shown the world just how valuable stolen data can be.

In this regard, as in many others, traditional storage systems aren’t keeping up. On-premises, hardware-based storage can be expensive and hard to manage, but more importantly, it simply can’t scale at the pace required today. Storage in the public cloud is one option, but while it can be less expensive up front, service tiering and access costs—not to mention compliance issues—make it unviable for many.

Perhaps worst of all, no system will scale affordably if your costs rise with every gigabyte you store. Solving the problem takes an entirely new way of thinking about enterprise storage.

Over the course of the next few weeks, you’ll see more posts here exploring different aspects of the data explosion. But I’ll give you a sneak peek: There is a solution to today’s data explosion. It’s software-defined storage. And with SUSE, implementing that software-defined solution can be easy and affordable. The data explosion doesn’t have to hurt. Stay tuned and we’ll show you why.

[1] www.datasciencecentral.com/profiles/blogs/that-s-data-science-airbus-puts-10-000-sensors-in-every-single
[2] www.statista.com/statistics/263437/global-smartphone-sales-to-end-users-since-2007/ 

