How AI and ML will drive increased use of software-defined storage
Few technologies have been as hyped as artificial intelligence and machine learning in recent years, but then few technologies have the same potential for transformative change.
You won’t find a CMO anywhere on the planet unmoved by the possibility of increasing sales and customer loyalty through personalisation at scale: the right offer to the right individual at the right time. Across the corridor in the CFO’s office there’s excitement about the potential for fraud reduction as algorithms trawl data for patterns of activity that raise digital red flags.
Down the hall in the CIO’s office, the atmosphere reaches fever pitch as multiple use cases around security and automation raise the pulse rate. COOs everywhere are seeing the potential to eliminate the manpower required for ‘robotic’ processes that would be better performed by software and machines: taking out the human errors, speeding everything up, and removing the labour cost at the same time – what’s not to like?
It’s not only businesses that are in a hurry to embrace the possibilities. AI will revolutionise areas as different as medicine and public transport. We’ve heard about the potential of AI to shake up cancer diagnostics by comparing CT and X-Ray scans of new patients with the entire history of all patients, and IoT-based traffic management is fast moving from science fiction to everyday real-life experience.
In fact, the transformative potential of this technology is so great that there are strong voices who argue that non-adoption, or even delayed adoption will put companies out of business. ‘Companies that wait to adopt AI may never catch up’, as the Harvard Business Review put it.
As consumers, we are coming to expect our life experiences to be ‘smart’, and we will choose to buy products and use services that make our experiences simple, easy and fast. Already 54 million US adults own a smart speaker. We’ve been collectively immersed in Amazon Alexa advertising and we are starting to expect that our cars will drive themselves along a faster route, that our shopping lists will write themselves (and the shopping deliver itself) and that increasingly we will be able to delegate mundane tasks to our own personal AI assistants.
Nobody likes sifting the web or comparison sites for cheap home insurance, so ‘Siri, who has the best home insurance policy for my house?’ will soon morph into ‘Siri, monitor home insurance policies and automatically change my provider when a better deal becomes available’. We will seek to automate our own individual robotic processes. ‘Boring’ details will be taken care of by machines. These Natural Language Processing queries – talking to machines – are powered by AI. It should be no surprise that Forrester reports 63% of enterprises are already adopting AI.
The storage impact
AI applications in business and public life – diverse as they are – have one thing in common: they ingest data from many different sources and then analyse that data to derive actionable intelligence. Behind the scenes of every great AI-enabled application is the data on which the AI is basing its learnings and its decisions. No data, no insight. Your personalised retail offer involves AI examining your transaction history, your browsing history and frequently your activity on social media and even your physical location.
Decisions are only as good as the data on which they are based. The more data you have, the better your AI decisions will be. As a result, organisations will store ever more data to improve AI’s decisions and the business outcomes derived from them. AI is thus a fundamental driver of growing storage requirements.
Organisations that succeed with AI and ML will be those that manage to store and make available enormous volumes of data: data from many different storage sources, across on-premises and multi-cloud environments, and collaborative data gathered from the organisation’s supply chain and its associated cloud providers.
It’s hard to imagine any organisation delivering on the potential of AI and ML without adopting software-defined storage.
Three reasons you’ll need software-defined storage
- Cost control and affordability. You could – in theory at least – use traditional hardware arrays. However, the massive amounts of data and rapid data growth would put unmanageable pressure on your CapEx spending. You’re going to need an approach that allows you to use commodity hardware: paying for the comfort of a top branded logo on the outside of the box is no longer a viable option.
- Seamless scalability. AI and ML are going to be unforgiving when it comes to the requirement to scale out. Organisations need a storage solution that can scale into the hundreds of petabytes or even into the exabytes without interruption or downtime. “Forklift upgrades” are no longer an option organisations can afford.
- Reduced complexity. Organisations need a single storage system that can store data in any format: object, block or file. AI applications need to cross-reference all the data available. For example, an AI application might access medical history stored in block format, while previous X-ray and ultrasound data is stored in file format and CT scan and MRI data is stored in object format. The AI application needs to pull data from all three formats.
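To make the “reduced complexity” point concrete, here is a minimal, purely illustrative Python sketch of a unified storage facade that lets an AI job cross-reference object, block and file data through one interface. All class names are hypothetical and the backends are in-memory stand-ins, not real Ceph clients; a production deployment would instead use the actual object (e.g. S3-compatible), block and file interfaces that a system like Ceph exposes.

```python
# Illustrative sketch only: a toy "unified storage" facade. The three
# backends mimic the access semantics of each format but hold data in
# memory; they are NOT real storage clients.

class ObjectBackend:
    """Key/value semantics, as an S3-compatible object store exposes."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

class FileBackend:
    """Path-based semantics, as a POSIX file system exposes."""
    def __init__(self):
        self._files = {}
    def write(self, path, data):
        self._files[path] = data
    def read(self, path):
        return self._files[path]

class BlockBackend:
    """Addressed-block semantics, as a block device exposes."""
    def __init__(self):
        self._blocks = {}
    def write_block(self, lba, data):
        self._blocks[lba] = data
    def read_block(self, lba):
        return self._blocks[lba]

class UnifiedStore:
    """Single facade the AI application talks to, whatever the format."""
    def __init__(self):
        self.objects = ObjectBackend()
        self.files = FileBackend()
        self.blocks = BlockBackend()
    def fetch(self, fmt, locator):
        if fmt == "object":
            return self.objects.get(locator)
        if fmt == "file":
            return self.files.read(locator)
        if fmt == "block":
            return self.blocks.read_block(locator)
        raise ValueError(f"unknown format: {fmt}")

# An AI diagnostic job cross-referencing all three formats:
store = UnifiedStore()
store.objects.put("scans/ct-001", b"ct-scan-bytes")
store.files.write("/xrays/patient-42.dcm", b"xray-bytes")
store.blocks.write_block(0, b"ehr-record-bytes")

record = [
    store.fetch("object", "scans/ct-001"),
    store.fetch("file", "/xrays/patient-42.dcm"),
    store.fetch("block", 0),
]
```

The design point is simply that the application code above never changes when data moves between formats; only the locator does.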
SUSE supports customers using AI and ML systems all around the world, with easily scalable open source software-defined storage solutions, powered by Ceph.
To learn more about how SUSE can help you reduce costs and complexity, and to more easily scale your storage, visit suse.com/programs/data-explosion/