At the Splunk .conf25 conference, Cisco announced the launch of Cisco Data Fabric, an architecture aimed at transforming how organizations manage their machine data and turn it into fuel for artificial intelligence applications.
Powered by the Splunk platform, the new framework promises to significantly reduce the cost and complexity of handling information from sensors, servers, networks, and applications, while enabling its use in training custom models, agentic workflows, and real-time business and technical data correlation.
An Untapped Data Mine
“Organizations are sitting on a gold mine of machine data that has so far been too complex and costly to leverage for AI,” stated Jeetu Patel, Cisco’s President and Chief Product Officer. With Data Fabric, the company aims for that data—ranging from industrial metrics to application logs—to become real-time operational intelligence.
Key Aspects of Cisco Data Fabric
The architecture introduced includes several technological pillars:
- Unified data management at extreme scale: simplifies transforming information in hybrid environments (edge, cloud, and on-premises) into actionable insights for security (SecOps), IT (ITOps), development (DevOps), and networking (NetOps).
- Real-time federated search and analysis: allows data to be queried directly at its source, with current support for Amazon S3 and future integration with Apache Iceberg, Delta Lake, Snowflake, and Azure.
- Open and flexible architecture: based on open standards, integrates plug-and-play tools, and enables innovation without proprietary dependencies.
- Time series models (scheduled for November 2025 on Hugging Face): for anomaly detection, forecasting, and automated root cause analysis.
- Splunk Machine Data Lake: a persistent, optimized repository for training AI models and conducting enterprise analytics.
- Cisco AI Canvas: a cloud-based collaborative space offering AI assistants and a “virtual war room” for team analysis, investigation, and visualization.
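To make the anomaly-detection use case above concrete: a minimal, self-contained sketch of flagging outliers in a machine-data metric with a rolling z-score. This is not Cisco's time series model or any Splunk API, only an illustration of the kind of analysis those models are meant to automate; the function name and thresholds are assumptions for the example.

```python
# Illustrative rolling z-score anomaly detector for a machine-data metric
# (e.g., CPU utilization). NOT Cisco's model; a generic sketch only.
from collections import deque
from math import sqrt

def detect_anomalies(values, window=10, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the rolling mean of the previous `window` points."""
    history = deque(maxlen=window)
    anomalies = []
    for i, v in enumerate(values):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((x - mean) ** 2 for x in history) / window
            std = sqrt(var)
            if std > 0 and abs(v - mean) / std > threshold:
                anomalies.append(i)
        history.append(v)  # update the window after scoring the point
    return anomalies

# A steady utilization series with one injected spike at index 20:
series = [50.0 + (i % 3) for i in range(30)]
series[20] = 95.0
print(detect_anomalies(series))  # → [20]
```

In production, models such as the time series models Cisco plans to publish replace this hand-tuned statistical rule with learned seasonality and trend handling, but the input (a stream of timestamped metrics) and output (flagged anomalies feeding root cause analysis) are the same shape.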
AI at the Core of the Strategy
“We want to provide customers with the fastest and safest path from data to action,” emphasized Kamal Hathi, SVP of Splunk. With AI capabilities integrated throughout the data lifecycle—from ingestion to search and collaboration—Cisco aims to enhance organizational productivity and digital resilience.
According to IDC, which evaluated the announcement, Data Fabric addresses a critical issue: the difficulty of unifying and securing large volumes of machine data to enable resilient AI systems. “Its federated approach eliminates the need to move data and accelerates access to practical insights,” said Archana Venkatraman, Director of Cloud Data Management Research.
Availability and Roadmap
Cisco Data Fabric is already available, with additional advancements scheduled through 2026:
- S3 replay for federated analysis (October 2025).
- Time series model on Hugging Face (November 2025).
- AI Canvas and Machine Data Lake integration in 2026.
- New connectors and data sources also in 2026.
Impact on Businesses
Through this initiative, Cisco aims to position itself as a central player in the enterprise AI era, where energy efficiency, response speed, and data sovereignty have become strategic pillars.
The launch comes at a time when companies across sectors, from banking and retail to heavy industry and telecommunications, need to shift their operations from reaction to anticipation, something the combination of Splunk and Cisco promises to enable with this new architecture.
Frequently Asked Questions (FAQ)
What is Cisco Data Fabric?
It’s a unified architecture that transforms machine data (logs, metrics, events) into AI-ready intelligence, reducing costs and complexity.
How does it help businesses?
It allows real-time data analysis, anomaly detection, failure prediction, custom model training, and faster, more secure decision-making.
What role does Splunk play in this solution?
Splunk provides its observability and security platform as the foundation for Data Fabric, including the Machine Data Lake and the AI Toolkit.
When will the complete solution be available?
The core is already available. Features like the time series model and new integrations are expected between late 2025 and 2026.