SUSE has announced SUSE AI Factory with NVIDIA, a new software stack designed to let organizations build, deploy, and scale artificial intelligence applications with a more controlled approach, from data centers to the edge and the public cloud. The company describes it as a unified solution based on SUSE AI and NVIDIA AI Enterprise, aimed at bridging the gap between local development and large-scale enterprise deployment.
The announcement comes at a time when the enterprise AI conversation is no longer just about more powerful models, but also about governance, security, digital sovereignty, and operational simplicity. SUSE supports this vision with IDC’s forecast that by 2028, 60% of the Global 2000 companies will operate “AI factories” as core AI infrastructure, while governments adopting this approach could deploy AI up to five times faster than those that do not.
An AI factory, packaged for enterprise
What SUSE is offering here is not a model or a chatbot, but a standardized way to set up AI infrastructure. According to the company, SUSE AI Factory provides prevalidated blueprints, GitOps workflows, and a unified deployment and lifecycle experience so teams can move from sandbox to production without manually assembling disparate bits from multiple vendors.
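SUSE has not published the blueprints themselves, but Rancher's GitOps engine is Fleet, so a workflow of this kind would plausibly look like the following sketch. The resource kind and fields are real Fleet constructs; the repository URL, paths, and labels are hypothetical placeholders:

```yaml
# Hypothetical Fleet GitRepo resource: tells Rancher's GitOps engine to
# continuously sync and apply the AI blueprint manifests found in a Git repo.
apiVersion: fleet.cattle.io/v1alpha1
kind: GitRepo
metadata:
  name: ai-factory-blueprint            # hypothetical name
  namespace: fleet-default
spec:
  repo: https://git.example.com/platform/ai-blueprints   # placeholder URL
  branch: main
  paths:
    - inference/                        # e.g. model-serving manifests
  targets:
    - clusterSelector:
        matchLabels:
          env: production               # roll out only to matching clusters
```

Once applied, any commit to the repository is reconciled automatically across the targeted clusters, which is what lets a team promote the same manifests from sandbox to production without manual reassembly.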
This approach aligns well with market trends. Many companies no longer want just to “try AI,” but to industrialize it: deploying reproducible, auditable, and governable services across varied environments, including isolated clusters, the tactical edge, and private infrastructure. SUSE emphasizes that its solution is designed to deploy AI workloads consistently across any footprint, from local development workstations to air-gapped environments.
What SUSE with NVIDIA offers
The new offering relies on specific NVIDIA components, including NVIDIA NIM microservices, open models like Nemotron, NVIDIA NeMo for building and managing agents, NVIDIA Run:ai for GPU orchestration, NVIDIA’s Kubernetes operators, the secure runtime OpenShell, and NemoClaw. It also incorporates SUSE’s K3s as the foundation to deploy autonomous agents with greater control.
Practically speaking, the architecture aims to cover four layers simultaneously: Kubernetes infrastructure, GPU operation, inference/model serving, and agent security. This combination is what lets SUSE speak of an “AI factory” rather than just a basic MLOps platform.
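As a minimal sketch of how the infrastructure and GPU layers meet in practice: K3s ships a built-in Helm controller, so a single HelmChart manifest is enough to install NVIDIA’s GPU Operator on a cluster. The resource kind and NVIDIA Helm repository below are real; the chart values are illustrative, not a statement of what SUSE AI Factory configures:

```yaml
# HelmChart resource consumed by K3s's built-in Helm controller.
# Placing it in kube-system triggers an automatic chart install.
apiVersion: helm.cattle.io/v1
kind: HelmChart
metadata:
  name: gpu-operator
  namespace: kube-system              # K3s watches this namespace
spec:
  repo: https://helm.ngc.nvidia.com/nvidia   # NVIDIA's public Helm repo
  chart: gpu-operator
  targetNamespace: gpu-operator
  createNamespace: true
  valuesContent: |-
    driver:
      enabled: true                   # let the operator manage GPU drivers
```

The GPU Operator then handles driver, container-toolkit, and device-plugin lifecycle on the nodes, which is the “GPU operation” layer the article describes.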
| Layer | SUSE Components Mentioned | Main Function |
|---|---|---|
| Infrastructure Base | SUSE AI, SUSE Rancher Prime, SUSE Linux Enterprise Server, K3s | Deployment, operation, and management of clusters and enterprise runtimes |
| AI & Serving | NVIDIA AI Enterprise, NIM microservices, Nemotron models | Inference services, open models, and AI software in production |
| Agents | NVIDIA NeMo, NemoClaw, OpenShell | Build, coordination, and secure execution of autonomous agents |
| GPU Resources | NVIDIA Run:ai, Kubernetes operators | GPU orchestration, automation, and stack operation |
These elements are sourced from SUSE’s communication and NVIDIA’s public documentation on AI Enterprise, NemoClaw, and OpenShell.
Security, sovereignty, and compliance as a business argument
SUSE places a strong emphasis on digital sovereignty. The company states that AI Factory with NVIDIA is designed for organizations that need to leverage advanced AI technology while keeping sensitive logic and proprietary data within their private infrastructure. It also explicitly mentions compliance with strict regulatory frameworks, including the EU AI Act. The solution promises a single point of enterprise accountability across the entire stack, including NVIDIA AI Enterprise components.
This messaging aligns with a broader trend in the European market: AI is now purchased not just for performance, but for the ability to maintain physical, legal, and operational control over data, models, and execution environments. NVIDIA further reinforces this narrative by positioning the SUSE partnership as a response to the demand for open infrastructure suited for regulated workloads requiring strict data governance.
Less DIY, more turnkey
A common challenge in enterprise AI is excessive assembly. Many organizations end up combining Kubernetes, observability, GPU orchestration, models, security policies, and deployment pipelines on their own. SUSE aims to turn that complexity into a more integrated offering, with prevalidated blueprints for common use cases and a unified experience based on Rancher or automated GitOps workflows.
While this doesn’t guarantee an effortless adoption, it addresses a real need: reducing setup time, lowering operational overhead, and preventing each AI project from starting from scratch. In this sense, SUSE AI Factory is less like an experimental platform and more like an enterprise AI standardization solution.
Fujitsu and the role of partners
SUSE has also sought to back the launch with partners. Fsas Technologies Europe, a Fujitsu company, appears as a launch partner. Its CTO, Udo Würtz, notes that companies want to use AI but need assurances that their data remains under their control, and presents the joint stack as a stable foundation for combining NVIDIA computing with SUSE’s open and secure infrastructure.
This point is important because it clarifies that SUSE doesn’t intend to sell just packaged software but a framework for integrators and partners to develop sector-specific, end-to-end sovereignty solutions. The commercial value of AI Factory, therefore, is not only in licensing but in the ecosystem built around it.
When will it arrive?
For now, SUSE has indicated that a preview of SUSE AI Factory with NVIDIA will be showcased at SUSECON, with general availability expected later this year. They have not announced a firm launch date or full details on pricing or packaging.
Overall, the announcement sends a clear message: the enterprise AI race is entering a phase where having top models won’t be enough. Many companies now want a combination of infrastructure, control, support, compliance, and deployment speed. This is precisely the space SUSE aims to occupy with NVIDIA.
Frequently Asked Questions
What exactly is SUSE AI Factory with NVIDIA?
It is an enterprise AI software stack built with SUSE AI and NVIDIA AI Enterprise, designed to deploy and operate AI applications consistently from local development to production on edge, data centers, or public cloud.
Which NVIDIA technologies are included?
Among others, NVIDIA NIM microservices, Nemotron open models, NVIDIA NeMo, Run:ai, Kubernetes operators, OpenShell, and NemoClaw.
Why does SUSE emphasize digital sovereignty so much?
Because this offering targets organizations that need to maintain control over data, models, and infrastructure, especially in regulated environments under frameworks like the EU AI Act.
When will it be available?
SUSE has indicated that the preview will be shown at SUSECON, with general availability expected later in 2026.
via: suse

