AI-Assisted Infrastructure at Scale


SUSE’s vision for AI-assisted infrastructure is rapidly becoming a reality, ushering in a new era where complexity is managed through simple, natural language commands. The MCP Server tech preview for SUSE Multi-Linux Manager and Trento is the latest step in a portfolio-wide effort to help customers transform IT operations from a reactive, manual process into a proactive, automated and optimized one.

Imagine managing your entire Linux fleet by asking:

“Do we have any servers affected by a critical vulnerability?”

The system might reply:

“Yes, five systems require immediate patching. Two of those will need a reboot to complete the process. Proceed with scheduling?” The answer also details which machines are affected, the reasoning behind the analysis, and the suggested mitigations along with their rationale.

And you command:

“Fix them.” And you watch as the suggested mitigations are applied.

That’s the direction we’re heading. At SUSE, we’re shaping a future where managing your entire infrastructure at scale becomes more intuitive, adaptive and aligned with business goals, where natural language, policy and automation all work together securely under human supervision.

More broadly, this is part of SUSE’s long-term strategy to deliver an AI-assisted infrastructure. One where Linux itself becomes context-aware, secure by design and seamlessly integrated with intelligent management and automation layers. Together, SUSE Linux Enterprise 16, SUSE Multi-Linux Manager and Trento (in SLES for SAP) form the foundation for this evolution, enabling organizations to prepare their environments today for the intelligent operations of tomorrow.

The availability of the technology preview of the Model Context Protocol (MCP) Server for SUSE Multi-Linux Manager is a tangible step toward that vision.


The strategic vision: AI integration and automation

This MCP Server is the foundational component that makes this vision possible. It acts as a secure, open-standard bridge that translates your natural language requests into direct actions across your Linux infrastructure.

Crucially, this technology is built for enterprise integration. The MCP Server exposes a standardized API designed to be consumed by MCP host components (like those in SUSE Linux Enterprise Server 16) and connected to the Large Language Model (LLM) of your choice.
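For context, MCP messages are JSON-RPC 2.0, and an MCP host discovers what a server offers by sending a "tools/list" request. The sketch below shows what such a request looks like on the wire; transport details (stdio vs. HTTP) and any server-specific fields are omitted:

```python
import json

# MCP speaks JSON-RPC 2.0. A host first sends "tools/list" to discover
# which tools a server exposes, then "tools/call" to invoke one of them.
def make_request(request_id, method, params=None):
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# Ask the server which tools it exposes.
list_tools = make_request(1, "tools/list")
print(list_tools)
```

The server's reply enumerates each tool with its name, description and input schema, which is what lets any MCP-capable host (or LLM agent) use the SUSE Multi-Linux Manager tools without prior knowledge of them.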

Furthermore, this architecture also enables integration with third-party services, such as IT service management (ITSM) platforms, allowing AI to automatically log tickets, execute tasks based on business rules and even respond directly to business needs. The entire process remains open, transparent and secure, always under human supervision and full customer control.

Video Demo: See the MCP Server in Action

Here is a short demonstration of the conversational workflow in action, showing how natural language translates to system administration tasks across managed fleets.

Testing Environment Note: The MCP Server is packaged as a container, making it easy to deploy using Podman. It is designed to be tested alongside the MCP Host component, which is available as a Technology Preview in SUSE Linux Enterprise Server 16.

🔗 Try it now: MCP Server for Uyuni (Tech Preview)

Download and instructions: https://github.com/uyuni-project/mcp-server-uyuni/pkgs/container/mcp-server-uyuni

Now in Tech Preview

SUSE is publishing the MCP Server as an early-access technology preview. As this is an agile project, tools, attributes and names may change in subsequent versions based on community feedback and development requirements.

While the current release focuses on core functionality, the next version will introduce OAuth-based authentication. This will deliver an enterprise-ready identity and access mechanism, further strengthening security for production environments.
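For readers planning ahead, OAuth-based access typically means the client obtains a bearer token before talking to the server. The sketch below shows the standard client-credentials exchange from RFC 6749; the credential values are placeholders, and the final SUSE implementation may differ:

```python
import urllib.parse

# OAuth 2.0 client-credentials grant (RFC 6749, section 4.4): a machine
# client exchanges its id/secret for a bearer token, then presents that
# token on every subsequent request. Credential values are placeholders.
def token_request_body(client_id, client_secret):
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })

# Form-encoded body POSTed to the identity provider's token endpoint.
body = token_request_body("mcp-host", "s3cret")
print(body)
```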

The toolkit: capabilities available for testing

This is your invitation to go hands-on. The MCP Server exposes a rich set of backend system-management functions to any connected LLM. You can test how an AI agent moves from gathering context (information) to triggering action (execution).

In this first tech preview, you can begin experimenting with the foundational tools that show how AI will interact directly with managed Linux systems. This interaction aims to be always secure, transparent, and under your control.

Security & Auditing
Key tools: get_systems_needing_security_update_for_cve, check_all_systems_for_updates
Testing scenario: Run the full conversational sequence: identify servers affected by a CVE and confirm their patch status.

Automation & Scheduling
Key tools: schedule_apply_pending_updates_to_system, schedule_system_reboot, cancel_action
Testing scenario: Validate the reliability of scheduling a patch deployment or a system reboot from natural language commands.

Visibility & Metrics
Key tools: get_list_of_active_systems, get_cpu_of_a_system
Testing scenario: Can the AI accurately retrieve and summarize the active system inventory or CPU load across the fleet?

Inventory Control
Key tools: add_system, remove_system, list_activation_keys
Testing scenario: Test the functions that manage the full lifecycle and provisioning of new systems.
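To get a feel for what invoking one of these tools looks like at the protocol level, here is a sketch of a "tools/call" request targeting the CVE tool. The argument name ("cve_identifier") and the CVE number are assumptions for illustration only; the real input schema is discoverable through a tools/list request:

```python
import json

# Build a JSON-RPC 2.0 "tools/call" request, the standard MCP method
# for invoking a named tool with structured arguments.
def make_tool_call(request_id, tool_name, arguments):
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical argument name -- check the tool's schema for the real one.
req = make_tool_call(2, "get_systems_needing_security_update_for_cve",
                     {"cve_identifier": "CVE-2024-1086"})
print(json.dumps(req, indent=2))
```

This is the kind of message an LLM agent emits behind the scenes when you ask "Do we have any servers affected by a critical vulnerability?"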

Why test this agile tech preview now?

Your participation is vital to realizing the vision of AI-assisted infrastructure. This is your chance to directly influence the stability and architecture of the next generation of Linux management tools.

  • Validate the Integration: Test how the MCP Server exposes a robust, standardized API for large-scale integration. Confirm its suitability for linking with external services and your own Agentic AI projects.
  • Experience Agentic Capabilities: See the shift from simple status checks to agentic automation that can execute complex, multi-step tasks.
  • Influence the Roadmap: This is an agile project that will evolve quickly. Your feedback on the exposed tools and their functionality will directly shape the final product’s features and reliability.

We commit to delivering on our promise of an AI-ready infrastructure. Your contribution to testing this foundational component is a crucial step in that direction.

The beginning of the journey

The MCP Server for SUSE Multi-Linux Manager, together with the MCP Host in SUSE Linux 16 and Trento for SAP environments, marks the continuation of SUSE’s AI-assisted infrastructure era. These releases come on the heels of several key AI milestones, including Universal Proxy, the integrated Model Context Protocol (MCP) proxy within SUSE AI, the introduction of MCP components in SUSE Linux Enterprise Server 16, and Liz, a context-aware AI agent included in SUSE Rancher Prime. As these capabilities evolve, they will enable administrators to move from reactive management to intelligent collaboration with their systems, a collaboration guided by open standards, human oversight and enterprise-grade security.

Join us at the start of this journey. Participate in the AI Universal Proxy project, explore the tech previews, and share your feedback to help shape how SUSE defines the future of enterprise Linux operations.
Learn more about AI-ready infrastructure on the SUSE Multi-Linux Manager and SUSE Linux 16 web pages.

Rick Spencer is the General Manager of Business Critical Linux at SUSE, where he drives innovation with his passion for open source and Linux. He is an experienced technology executive with a career spanning organizations including Microsoft, Canonical and Bitnami. His career has been centred on open-source principles, fostering community engagement, and deep respect for user-centered design and customer-centered delivery. At SUSE, Rick and his team are focused on the core SUSE Linux products and related services, delivering to other core teams in SUSE, and working with the open source community. This work is centred on their passion for open source, Linux, and community building and interaction. Rick is based in Maryland, US.