Cloudera and Dell integrate ObjectScale to bring private AI “where the data is”: a unified platform with governance, performance, and predictable costs

Cloudera and Dell Technologies have taken a coordinated step to address one of the paradoxes of AI in large organizations: it is impossible to industrialize AI if the data is scattered across heterogeneous architectures, with each access requiring a move. At EVOLVE25, the data and AI event hosted by Cloudera on September 25 at the Glasshouse in New York, the two companies announced the integration of Dell ObjectScale with Cloudera: a joint validation that enables running all Cloudera compute engines directly against ObjectScale storage. According to the companies, the result is a private AI platform (the so-called "AI-in-a-Box") designed for scalability, manageability, and cost predictability.

The ambition of this proposal is clear from the figures in their latest report, "The Evolution of AI: The State of Enterprise AI and Data Architecture", published the same day: IT leaders continue to use varied storage architectures: 63% mention private cloud, 52% public cloud, and 42% data warehouses. In this mosaic, bringing data and models to a common ground without sacrificing security, governance, or latency has become the major hurdle. The integration with ObjectScale, Dell's S3-compatible object storage, tries to shift compute to the data instead of the other way around.

“Companies need AI systems that grow with them, that keep data secure, and that offer clear and predictable costs,” summarized Abhas Ricky, Cloudera’s Chief Strategy Officer. “Combining Dell ObjectScale and Cloudera allows for industrializing AI use cases with governed data, deploying them efficiently, and creating intelligent agents—all with predictable economics and no hidden rates.”

What does “private AI” mean in this context?

The notion of “private AI” promoted by Cloudera and Dell is a platform within the perimeter controlled by the company—data center, private cloud, or hybrid environments—where data does not leave the premises and AI services (training, fine-tuning, inference, and agents) are deployed alongside corporate storage, with unified governance. For Cloudera, this piece is the Private AI platform presented at EVOLVE25; for Dell, ObjectScale provides the high-performance, low-latency object storage layer on which Cloudera’s engines operate.

In practice, the integration means that shared customers can store structured and unstructured data in ObjectScale and consume it from Cloudera's compute services:

  • Cloudera AI Workbench: a secure environment for building, training, and tuning AI models using governed data.
  • Cloudera Inference Service: the mechanism to deploy and serve models at scale efficiently and affordably.
  • Cloudera Agent Studio: the tool to design and orchestrate AI agents that automate tasks across business operations.

This trinity of Workbench, Inference, and Agent Studio is meant to cover the full life cycle of enterprise AI, but approached close to the data and with governed metadata: who accesses it, for what purpose, which data and model versions are involved, and under what policies.
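As a loose illustration of those governed-metadata questions (who, for what purpose, which data and model versions, under which policies), a lineage record can be pictured as a small structured entry. The field names below are hypothetical, not Cloudera's actual metadata schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical lineage record; the fields mirror the governance
# questions in the text, not any real Cloudera API.
@dataclass
class AccessRecord:
    principal: str          # who accessed the data
    purpose: str            # declared purpose of use
    dataset_version: str    # which data version was read
    model_version: str      # which model version was involved
    policy_id: str          # which policy governed the access
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

record = AccessRecord(
    principal="data-scientist@example.com",
    purpose="fine-tuning",
    dataset_version="claims-2024-q3",
    model_version="llm-internal-v2",
    policy_id="pii-masking-v1",
)
print(record.principal, record.policy_id)
```

The point of keeping such records consistent across Workbench, Inference, and Agent Studio is that an auditor can later answer, for any model, which data it saw and under which policy.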

Moving data is expensive (and risky); bringing compute close to the object changes the game

Cloudera does not hide the motivation: organizations fail at industrializing AI when, for each use case, they need to consolidate data that resides in private cloud, public cloud, data warehouses, or legacy systems. It is not just the cost of moving petabytes, but also the risk of breaking compliance (for example, sector-specific regulations or sovereignty requirements) and losing lineage.

ObjectScale, being S3-compatible, operates as that single repository where both structured data (tables, events) and unstructured data (documents, images, audio, video) converge within Dell's infrastructure, but exposed as standard object storage. Certification with Cloudera means that the platform's engines, from Spark to specific AI services, can run directly against ObjectScale with low latency, without intermediate extracts or copies. The key, they say, is that the metadata (policies, catalogs, lineage) remains consistent.
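Because ObjectScale speaks the standard S3 protocol, engines such as Spark can be pointed at it by overriding the endpoint in the Hadoop s3a connector and reading objects in place. A minimal sketch, with placeholder endpoint and bucket names (nothing here comes from the announcement itself):

```python
# Hypothetical s3a settings for pointing Spark/Hadoop at an
# S3-compatible store; the endpoint and bucket are placeholders.
objectscale = {
    "fs.s3a.endpoint": "https://objectscale.internal.example:9021",
    "fs.s3a.path.style.access": "true",   # many on-prem S3 stores need path-style URLs
    "fs.s3a.connection.ssl.enabled": "true",
}

def s3a_uri(bucket: str, key: str) -> str:
    """Build the s3a:// URI a compute engine would read directly,
    instead of extracting a copy into another system first."""
    return f"s3a://{bucket}/{key}"

path = s3a_uri("governed-data", "claims/2024/q3.parquet")
print(path)  # s3a://governed-data/claims/2024/q3.parquet
```

This is the whole "compute to the data" idea in miniature: the job changes its endpoint configuration, not the location of the petabytes.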

“This collaboration reflects our commitment to giving clients more flexibility to manage and scale their data,” explained Travis Vigil, Senior Vice President of ISG Product Management at Dell Technologies. “With Dell ObjectScale integrated with Cloudera, we bring storage and AI closer for faster, better decision-making that drives growth.”

What can a regulated sector expect?

Banking, healthcare, government agencies, and energy companies have repeated the same mantra in recent years: they want AI, but not at the cost of losing control of their data. Cloudera and Dell aim precisely there: an end-to-end validated platform that runs in their own data centers or private clouds, governs access, and measures economic efficiency transparently. When it talks about "predictable economics", Cloudera invokes the idea that, with data in place and compute attached to object storage, operational costs (and opportunity costs) stay contained.

In the company’s words, this combination “reduces total costs,” “simplifies” AI lifecycle management, and “deploys” private agents with confidence and efficiency. It doesn’t promise miracles: it ensures consistency between governance, performance, and economic clarity—a triad especially critical when justifying investments to auditors and regulators.

Who contributes what?

  • Cloudera: its data and AI platform—built on open source—which unifies security, governance, and AI services (Workbench, Inference, Agent Studio) and, crucially, the ability to “bring AI to wherever the data resides.”
  • Dell: ObjectScale, its scalable S3-compatible object storage designed for on-premises and private cloud environments, with Dell’s hallmark in infrastructure and integrations.

Together: a validated solution ("AI-in-a-Box") that organizations can adopt without cobbling together a makeshift "Frankenstein" of parts, and which promises low latency, governed access, and a consistent metadata foundation.

The pain points this aims to address

  1. Data location: when data does not reside in one place (and it never does), AI becomes a series of extractions and copies.
  2. Governance: without a catalog and cross-cutting policies, it’s impossible to explain which model used what data and with what consent or legal basis.
  3. Latency: in analytics and especially in inference, bringing the model closer to the object reduces times and costs.
  4. Economics: cost opacity, both in cloud and on-prem, makes forecasting the ROI of an AI factory difficult.
  5. Operational complexity: too many point tools generate technical debt and risk.

The survey cited by Cloudera supports this diagnosis: most organizations combine private and public cloud, and still rely on traditional data warehouses. Without the ability to manage 100% of data—across all formats—where it resides, applying AI with full visibility seems impossible.

What does this change for data and AI teams?

For data scientists: AI Workbench becomes a safe space to train and fine-tune models without removing datasets from ObjectScale; metadata coherence makes traceability and reproducibility easier.

For platforms: Inference Service enables serving models close to the objects with efficiency, and Agent Studio adds a layer to design private agents that automate business processes.

For CISOs and compliance: the fact that accesses are governed and auditable—and data does not migrate to uncontrolled environments—simplifies compliance (e.g., access logs, minimization of transfers, retention).

For CFOs: “predictable economics” isn’t just a slogan. If data and compute stay within the corporate domain and latency drops, it becomes easier to model the cost per use case.

A note of realism: available integration, staged journey

Cloudera and Dell speak of a certified integration, that is, one that is tested and validated, not of a turnkey product that reworks architectures and processes on its own. The initiative is part of the broader discourse around AI-in-a-Box and Private AI, which each organization will need to tailor to its operating model. The announcement points readers to registration for upcoming EVOLVE25 events and further information on the private AI platform with ObjectScale.

Why now?

Because AI has moved from experimentation to operation, and silos or gaps between data and models are no longer tolerated. Moving data across clouds or regions increases costs and risks compliance. Latencies also matter when AI assists daily processes—ranging from internal chatbots to real-time document retrieval and responses.

The Cloudera-Dell fit aims to address this with low-latency object data infrastructure and AI engines tightly integrated with that storage, supported by governed metadata documenting who did what with which data.


Frequently Asked Questions

What is Dell ObjectScale and how does it integrate with the Cloudera platform for private AI?
ObjectScale is Dell Technologies’ object storage, compatible with S3, designed to scale in data centers and private clouds. The certified integration enables Cloudera’s engines—such as AI Workbench, Inference Service, and Agent Studio—to operate directly against ObjectScale, with governed metadata and low latency, avoiding intermediate copies or data extractions.

What benefits does “bringing compute to the data” offer in a private AI architecture?
Proximity of training, fine-tuning, and inference reduces latency, limits data movement—thus lowering risk and cost—and simplifies compliance and governance. Keeping sensitive datasets within controlled environments in regulated sectors is key to scaling AI without compromising security or privacy.

What Cloudera components leverage ObjectScale for AI use cases?
Three parts: Cloudera AI Workbench (building and training models with governed data), Cloudera Inference Service (deploying and serving models at scale), and Cloudera Agent Studio (designing private agents that automate tasks). All directly access data stored in Dell ObjectScale.

How to get started with Cloudera + Dell ObjectScale for private AI in large enterprises?
The first step is inventorying data architectures (private cloud, public cloud, warehouses), identifying datasets and candidate use cases, and evaluating ObjectScale as a reference S3-compatible solution. From there, pilot with AI Workbench (training/fine-tuning) and Inference Service (near-object serving), integrating governance and metadata from the start to ensure traceability and predictable costs.

via: cloudera
