I think this is a really intriguing question, to which curiously the answer is both “yes” and “no”. As with many things in business, the answer is not straightforward, so let me explain.
Over the past few decades, organizations have come to rely on their own data centers to run business applications, connect their users, and store their data. Initially, these data centers were largely hardware-centric. In the early days, a mainframe and terminals were the order of the day, before we moved on to the RISC/UNIX era, followed more recently by the server sprawl period of commodity x86 servers.
But now, the whole concept of an organization-owned data center is going through a radical change. It started with virtualization, which separated the direct relationship between application software and the underlying hardware infrastructure. This helped improve server utilization, efficiency, and provisioning speed. The next step towards an even greater level of abstraction is the move to a software-defined infrastructure (SDI), including compute, storage and networking.
Today we’re in a world where agility, innovation, automation and speed of operations are everything. DevOps and CI/CD are the buzzwords of the moment. Features and functionality are now being defined by software, instead of relying on more expensive and largely proprietary hardware solutions.
To that end, cloud computing is on the rise and gaining momentum. Some 94 percent of enterprises are using cloud computing, with 84 percent saying they have already adopted a multi-cloud strategy. That means that just about every organization is using the cloud and more workloads are being migrated to cloud platforms all the time.
Does that mean that the data center is dead and buried?
No, it doesn’t.
Analyst firm IDC reports that IT infrastructure spending is currently split evenly between traditional data centers and cloud computing. It’s worth noting that the cloud revenue in this report includes private cloud, meaning the organization-owned data center spending is still over 65 percent of the total. IDC estimates that traditional non-cloud IT infrastructure grew by 12.3 percent last year.
Why is traditional data center spending still growing? The Uptime Institute points out that “platform choice…is a balancing act based upon cost, capacity, timing, security, support, and regulatory needs.” The applications and services that are best kept in your own data center are the most critical ones to your business. Some reports suggest that 80 percent of mission-critical workloads and sensitive data are still running on-premises due to performance and regulatory requirements.
That will change over time. More applications will inevitably be migrated to the cloud. But it’s likely that legacy and cloud-native workloads will continue to co-exist for at least the next decade. That leaves most organizations trying to juggle a multi-modal IT infrastructure and strategy. They’re trying to optimize existing data center environments, while at the same time embracing more agile SDI environments.
Let me come back to our original question one more time.
Are we ready to ditch the data center?
In one sense, the answer is a resounding yes.
That’s because the whole idea of centralized data is becoming a bit of a misnomer. Data is being generated everywhere these days – certainly not just in a central location.
The Internet of Things (IoT) has become a reality. Mobile devices and wearables mean we’re personally producing masses of biometric data. Data is flowing from smart cities, homes, and spaces. Automated vehicles, factories, and farms are now teeming with sensors and compute devices. We can add artificial intelligence (AI), virtual reality/augmented reality (VR/AR), healthcare applications, energy, and utilities. The list goes on and on.
IDC estimates there’ll be 31 billion IoT endpoints by 2021. The data from all these endpoints will need to be collected, stored and processed. And that has to happen nearer to where the data is generated. Sending it all back to a central location for analysis and analytics is simply impractical. The new frontier of edge computing is here and it’s going to be increasingly important in the future.
Cloud computing has been around for almost two decades, and “edge computing” is now becoming a commonplace term. Maybe it’s time we adopted the term “core computing” instead of referring to a “data center”.
That’s why here at SUSE, we’re focused on building an open source solution portfolio that seamlessly spans from edge to core to cloud. To learn more about our approach to digital transformation, please visit https://www.suse.com/programs/digital-transformation/