Confluent, Inc., a pioneer in data streaming, has announced the general availability (GA) of its Delta Lake and Databricks Unity Catalog integrations within Confluent Tableflow, along with early access (EA) availability for Microsoft OneLake. With these enhancements, Tableflow positions itself as a comprehensive, fully managed solution connecting operational, analytical, and artificial intelligence (AI) systems across hybrid and multicloud environments. The platform can now materialize Apache Kafka® topics directly into Delta Lake or Apache Iceberg™ tables, with automated quality controls, catalog synchronization, and enterprise-grade security.
Since its launch, Tableflow has revolutionized how organizations prepare and manage real-time data for analysis, eliminating complex ETL tasks and manual Lakehouse integrations that slow processes. With the general availability of Delta Lake and Unity Catalog integrations, along with OneLake support, Confluent bolsters its role within the multicloud ecosystem. These updates provide a unified platform that connects real-time and analytical data under strong enterprise governance, driving the development of AI applications and instant analytics that help companies maintain their competitive edge.
“Companies want to maximize the use of their real-time data, but the gap between real-time data processing and analytics has always been a significant hurdle,” says Shaun Clowes, Chief Product Officer at Confluent. “With Tableflow, we remove this barrier by enabling direct connectivity between Kafka and managed lakehouses. This means high-quality data is ready for analysis and AI right when it’s created,” he adds.
Enterprise-Ready for Production
The GA release introduces new enterprise-grade features that position Tableflow as one of the most comprehensive, reliable, and secure streaming-to-table solutions available today, allowing organizations to:
- Simplify analysis: Delta Lake support (GA) instantly turns Kafka topics into Delta Lake tables stored in cloud object storage such as Amazon S3 or Azure Data Lake Storage. Delta Lake and Iceberg formats can now be enabled simultaneously on the same topic for flexible, cross-format analysis.
- Unify governance: Support for Unity Catalog (GA) automatically syncs metadata, schemas, and access policies between Tableflow and Databricks Unity Catalog, ensuring centralized governance and consistent data management organization-wide.
- Enhance reliability: A Dead Letter Queue (DLQ) captures and isolates malformed records without disrupting the data flow. Backed by schema management, this error-handling mechanism provides greater transparency, faster recovery, and built-in data quality.
- Save time and reduce complexity: The Upsert functionality automatically updates and inserts records as data changes, keeping Delta Lake and Iceberg tables consistent, duplicate-free, and analysis-ready without manual maintenance.
- Strengthen security: Bring Your Own Key (BYOK) support extends customer-managed encryption keys to Tableflow, giving organizations full control over data at rest and supporting compliance in highly regulated sectors such as finance, healthcare, and government.
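The Upsert behavior described above can be pictured with a minimal sketch. The record shape and the `id` key field below are illustrative assumptions for demonstration, not Tableflow's actual implementation: the point is that later records with the same key update the existing row rather than creating a duplicate.

```python
# Minimal sketch of upsert (merge-by-key) semantics. The record shape and
# "id" key are hypothetical; later records with the same key overwrite
# earlier ones, so the table stays duplicate-free and analysis-ready.

def upsert(table: dict, records: list[dict], key: str = "id") -> dict:
    """Apply a stream of change records to a table keyed by `key`."""
    for record in records:
        table[record[key]] = record  # insert new key or update existing row
    return table

if __name__ == "__main__":
    stream = [
        {"id": 1, "status": "created"},
        {"id": 2, "status": "created"},
        {"id": 1, "status": "shipped"},  # update to key 1, not a duplicate row
    ]
    table = upsert({}, stream)
    print(sorted((k, v["status"]) for k, v in table.items()))
    # two rows remain, and key 1 reflects its latest state
```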
Building on existing capabilities such as schema evolution, compression, and automated table maintenance, as well as integrations with Apache Iceberg, AWS Glue, and Snowflake Open Catalog, Tableflow now offers a comprehensive platform for teams that need real-time data that is analysis-ready, compliant, and resilient.
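The dead-letter-queue pattern mentioned in the reliability feature can be sketched in a few lines. The validation rule below (records must be dicts containing an `id`) is an illustrative assumption, not Tableflow's actual schema check; what matters is that bad records are captured for inspection while the stream keeps flowing.

```python
# Sketch of the dead-letter-queue pattern: malformed records are isolated
# without stopping the pipeline. The "must be a dict with an 'id'" rule is
# a stand-in for real schema validation.

def process(records: list) -> tuple[list, list]:
    """Split a stream into valid rows and dead-lettered records."""
    table, dlq = [], []
    for record in records:
        if isinstance(record, dict) and "id" in record:
            table.append(record)          # valid row lands in the table
        else:
            dlq.append(record)            # captured for inspection; flow continues
    return table, dlq

if __name__ == "__main__":
    rows, dead = process([{"id": 1}, "garbage", {"no_key": True}, {"id": 2}])
    print(len(rows), len(dead))  # valid rows vs. isolated records
```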
“Providing real-time insights from IoT data in smart buildings is essential to our mission,” says David Kinney, Attune’s Chief Solutions Architect. “With just a few clicks, Confluent Tableflow allows us to materialize Kafka topics into reliable, analysis-ready tables, giving us precise visibility into both customer interactions and device behavior. These high-quality datasets now fuel analysis, machine learning models, and generative AI applications—all built on a trusted data foundation. Tableflow has simplified our data architecture and opened new ways to leverage data more effectively,” he adds.
Now Available in Microsoft OneLake
Tableflow is now also available in early access on Azure, integrated with OneLake, expanding its reach and giving customers greater flexibility for multicloud deployments. This is especially significant for organizations using Azure Databricks and Microsoft Fabric, where the Delta Lake and Unity Catalog integrations are now fully supported. Together, they deliver a seamless, governed analytics experience, from real-time data streams to cloud lakehouses. With these improvements, customers can now:
- Reduce time-to-insight: Instantly materialize Kafka topics as open tables in Microsoft OneLake and query them via Microsoft Fabric or their preferred tool using OneLake Table APIs, without manual ETL or schema management.
- Eliminate complexity and operational costs: Automate schema assignment, type conversion, and table maintenance for streaming data, supporting governance and reliability in native Azure analytics workflows.
- Enable analytics services and Azure AI: Seamlessly integrate Azure’s analytics and AI services using Microsoft OneLake Table APIs to enhance real-time insights and AI use cases. Easily manage deployments via Confluent Cloud UI, CLI, or Terraform.
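The automatic schema assignment and type conversion mentioned above can be illustrated with a small sketch. The mapping from record field types to open-table column types below is an assumption for demonstration only, not the conversion rules Tableflow actually applies.

```python
# Illustrative sketch of automatic schema assignment for a streaming record:
# infer a column-name -> column-type schema from field values. The type
# mapping is hypothetical, chosen only to show the idea.

TYPE_MAP = {
    bool: "BOOLEAN",   # checked by exact type, so bool does not fall into int
    int: "BIGINT",
    float: "DOUBLE",
    str: "STRING",
}

def infer_schema(record: dict) -> dict:
    """Derive a schema from one record; unknown types fall back to STRING."""
    return {name: TYPE_MAP.get(type(value), "STRING")
            for name, value in record.items()}

if __name__ == "__main__":
    event = {"device_id": "sensor-42", "temperature": 21.5, "online": True, "seq": 7}
    print(infer_schema(event))
```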
The EA launch marks a major milestone in expanding Tableflow’s multicloud presence and strengthening Confluent’s collaboration with Microsoft and Databricks.
“Access to real-time data is critical for enabling quick, accurate decision-making,” says Dipti Borkar, Vice President and General Manager of Microsoft OneLake and ISV Ecosystem. “Now that Confluent Tableflow is available on Microsoft Azure, customers can stream Kafka events into OneLake as Apache Iceberg or Delta Lake tables, and query them instantly through Microsoft Fabric and popular third-party engines via OneLake Table APIs—reducing complexity and accelerating decision-making,” she states.

