Snowflake has unveiled a wave of new features with a clear goal: bring agentic AI to where the data lives (not the other way around) and cut the time it takes a company to go from idea to an operational, governed agent from months to weeks. The company behind the AI Data Cloud is making Snowflake Intelligence, its enterprise agent, generally available; consolidating its lakehouse around Horizon Catalog and Openflow; and introducing a suite of developer tools that unifies the creation, testing, and deployment of AI applications on a single secure platform.
The move is significant. Snowflake reports that, in the last three months alone, more than 1,000 customers have used Snowflake Intelligence to deploy over 15,000 agents into production, with names like Cisco, Toyota Motor Europe, TS Imagine, and the US Bobsled/Skeleton team among the early adopters. According to Christian Kleinerman, SVP of Product, this is the logical evolution of a decade: “Snowflake has been the cornerstone of data strategy for thousands of companies. The next step is to bring AI to that data so that each customer can unlock their own intelligence. We democratize the power of AI so that every employee can make better decisions, faster.”
Snowflake Intelligence: an agent to ask “what” and understand the “why”
Now generally available, Snowflake Intelligence reaches a base of more than 12,000 customers and aspires to become the natural interface to enterprise knowledge. Its promise: given a question in natural language, the agent researches, reasons about the context, and delivers recommendations backed by the organization’s governed data.
Under the hood, Snowflake combines industry-leading models (including Anthropic’s) with proprietary research on performance and validation. The company states that recent optimizations make text-to-SQL translation up to three times faster while maintaining accuracy. It also introduces Agent GPA (Goal, Plan, Action), an evaluation framework that detects up to 95% of errors in tests on standard datasets, with “near-human” levels of detection. The goal is not just to excel on the first try but to reduce uncertainty: fewer doubtful answers, more traceability, and greater trust in regulated environments.
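To make the text-to-SQL claim concrete, here is the kind of query an agent might generate from a natural-language question. The table and column names are hypothetical, used purely for illustration, and are not taken from the announcement.

```sql
-- Question: "Why did EMEA churn rise last quarter?"
-- Illustrative SQL an agent could produce against governed tables
-- (analytics.customer_churn is a hypothetical table).
SELECT DATE_TRUNC('month', churn_date)   AS month,
       COUNT(*)                          AS churned_accounts,
       AVG(days_since_last_login)        AS avg_inactivity_days
FROM   analytics.customer_churn
WHERE  region = 'EMEA'
  AND  churn_date >= DATEADD('quarter', -1, CURRENT_DATE())
GROUP  BY month
ORDER  BY month;
```

The answer the agent returns would pair a result set like this with an explanation of the drivers it found, which is where the “why” comes from.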
Security and governance are fundamental, not add-ons. Snowflake Intelligence respects permissions and policies, operates within the AI Data Cloud perimeter, and prevents data exfiltration outside authorized domains. For Thierry Martin, Data & AI Director at Toyota Motor Europe, the impact is tangible: “We’ve shortened agent deployment from months to weeks. Our team now spends more time on what truly matters: business context and semantic models. This provides a competitive edge: deploying secure and compliant solutions faster and without moving data.”
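As a rough sketch of what “respects permissions and policies” means in practice, standard Snowflake masking policies already behave this way: an agent querying on behalf of a user sees only what that user’s role is allowed to see. The object and role names below are hypothetical.

```sql
-- Only FINANCE_ANALYST sees raw salaries; any other role (and therefore any
-- agent acting on its behalf) gets NULL back for this column.
CREATE OR REPLACE MASKING POLICY salary_mask AS (val NUMBER) RETURNS NUMBER ->
  CASE WHEN CURRENT_ROLE() = 'FINANCE_ANALYST' THEN val ELSE NULL END;

ALTER TABLE hr.employees MODIFY COLUMN salary SET MASKING POLICY salary_mask;
```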
Enterprise lakehouse: open data, connected, and provider-neutral
The second key announcement is the consolidation of Snowflake’s lakehouse as a platform for AI agents. The company makes Openflow, the engine that automates ingestion and orchestration from virtually any source, generally available, and enhances Horizon Catalog, the unified catalog that provides semantic context, security, and governance across regions, clouds, and formats.
To reduce friction across ecosystems, Snowflake integrates open APIs such as Apache Polaris™ (Incubating) and the Apache Iceberg™ REST Catalog directly into Horizon. In practice, this gives customers an enterprise lakehouse that centralizes governance and provides interoperable access across open table formats. The operational takeaway? Fewer ad hoc bridges between catalogs, less vendor lock-in, and more freedom for agents to access data where it resides, with consistent policies.
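In concrete terms, the pattern is to register the external catalog once and then surface its tables alongside native ones. The sketch below assumes an Iceberg REST endpoint and bearer-token authentication; parameter names are approximate and the endpoint is invented, so check the current Snowflake documentation before relying on it.

```sql
-- Register an external Iceberg REST catalog (endpoint and names are hypothetical).
CREATE CATALOG INTEGRATION lake_rest_catalog
  CATALOG_SOURCE = ICEBERG_REST
  TABLE_FORMAT   = ICEBERG
  REST_CONFIG    = ( CATALOG_URI = 'https://catalog.example.com/api/catalog' )
  REST_AUTHENTICATION = ( TYPE = BEARER  BEARER_TOKEN = '...' )
  ENABLED = TRUE;

-- Expose one of its tables so agents can query it under Horizon's policies.
CREATE ICEBERG TABLE sales_orders
  EXTERNAL_VOLUME    = 'lake_volume'
  CATALOG            = 'lake_rest_catalog'
  CATALOG_TABLE_NAME = 'sales_orders';
```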
Additional advancements focus on near real-time use cases:
- Interactive Tables and Warehouses (private preview): enabling near-real-time experiences with agents and applications.
- Streaming analytics (upcoming private preview): acting on live data within seconds, combined with historical data for scenarios such as fraud detection, personalization, recommendations, or OEE/IoT (see the sketch after this list).
- Near-real-time CDC from Oracle (private preview): built on Openflow, it pushes continuous transactional updates into the Snowflake AI Data Cloud.
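The previews above have no public syntax yet, but the GA building block for this pattern, a Dynamic Table with a short target lag, already shows the shape of it: live events joined with historical context and kept fresh automatically. The table names and the fraud heuristic below are illustrative.

```sql
-- Keep an enriched fraud-screening view no more than one minute stale
-- (raw.payments and analytics.account_history are hypothetical tables).
CREATE OR REPLACE DYNAMIC TABLE analytics.suspicious_payments
  TARGET_LAG = '1 minute'
  WAREHOUSE  = compute_wh
AS
SELECT p.payment_id,
       p.account_id,
       p.amount,
       h.avg_amount_90d,
       p.amount / NULLIF(h.avg_amount_90d, 0) AS amount_ratio
FROM   raw.payments p
JOIN   analytics.account_history h USING (account_id)
WHERE  p.amount > 5 * h.avg_amount_90d;
```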
To boost resilience, Business Continuity & Disaster Recovery (public preview) now extends to managed Iceberg tables, adding an extra safeguard against incidents. Following its acquisition of Crunchy Data, Snowflake also announces Snowflake Postgres (coming soon), a fully managed Postgres service within the platform, along with pg_lake (already open source), a set of extensions that integrate Postgres with the lakehouse. It is a clear signal of where enterprise data architecture is headed: open formats, native compatibility, and a common catalog.
Developer tools: from idea to agent on one platform
The third pillar of the announcement is a developer suite aimed at reducing friction in the build-test-deploy cycle of AI applications.
- Cortex Code (private preview): a revamped AI assistant integrated into Snowflake’s UI that lets users interact with the environment in natural language. It helps them understand platform features, optimize complex queries, and tune them to save costs.
- Cortex AISQL (general availability): brings AI inference pipelines into declarative SQL within Dynamic Tables (also GA); see the sketch after this list. With AI Redact (upcoming public preview), it can detect and mask sensitive data in unstructured content, enabling multimodal datasets with security and privacy.
- Workspaces (GA): Snowflake’s centralized development environment now integrates Git (GA) and VS Code (GA). The result: real collaboration, versioning, and working from your preferred IDE without leaving the platform.
- dbt Projects on Snowflake (GA): manage the development, testing, deployment, and monitoring of dbt projects directly inside Snowflake, cutting down on auxiliary tools and chores. For existing code, Snowpark Connect for Apache Spark (GA) lets teams run Spark workloads on Snowflake’s secure engine without extensive rewrites.
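To illustrate the AISQL idea referenced above, the sketch below mixes documented Cortex functions into a Dynamic Table so that AI enrichment runs as part of a declarative pipeline. Table names are hypothetical and the model choice is illustrative (availability varies by region).

```sql
-- AI enrichment expressed as declarative SQL: a sentiment score plus a
-- one-line summary for each support ticket, refreshed automatically.
CREATE OR REPLACE DYNAMIC TABLE support.ticket_insights
  TARGET_LAG = '15 minutes'
  WAREHOUSE  = compute_wh
AS
SELECT ticket_id,
       created_at,
       SNOWFLAKE.CORTEX.SENTIMENT(body) AS sentiment_score,
       SNOWFLAKE.CORTEX.COMPLETE(
         'claude-3-5-sonnet',
         'Summarize this support ticket in one sentence: ' || body
       ) AS summary
FROM   support.tickets;
```

The design point is that governance, scheduling, and lineage stay with the table definition, so the AI step is versioned and audited like any other transformation.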
The common thread is clear: concentrate the work on one governed platform to speed up delivery and lower the Total Cost of Ownership (TCO). If the data never moves, permissions and traceability travel with the code.
What it means for CIOs and data teams (three takeaways)
1) From dashboards to conversations with data. With Snowflake Intelligence, access shifts from “consulting a report” to “asking and reasoning”. The value lies less in static dashboards and more in explanations (the “why”) with business context and active controls.
2) Truly interoperable lakehouse. With Horizon and Openflow supporting Polaris and the Iceberg REST Catalog, the catalog becomes the central authority for governance and security over open table formats. This unlocks agentic AI that sees and understands diverse data without rebuilding pipelines.
3) Shorter toolchain, less debt. With Workspaces, AISQL, dbt, and Snowpark Spark, there are fewer jumps between tools, less glue code, and more reproducibility. Standardization doesn’t eliminate flexibility; it channels it.
Immediate impact use cases
- Customer support: an agent interacts with ticket history, policies, and internal knowledge; answers the what and the why, and suggests next steps.
- Risk and fraud detection: streaming data with historical context to decide in seconds; catalog governance reduces false positives due to sampling biases.
- Operations and IoT: analyzing live telemetry alongside maintenance and inventory data; alerts and explanations that teams can audit.
- Finance/FP&A: faster text→SQL queries; simulations on interactive tables without rebuilding the entire pipeline.
A cultural note: democratize responsibly
The company emphasizes the verb democratize. But “democratize” doesn’t mean “without governance”. The announcement rests on three guarantees:
- Identity and permissions: the agent sees only what it should.
- Traceability: there is a record of the plan and actions taken (Agent GPA), the data used, and the output produced.
- Open formats: interoperate to avoid blind dependencies.
Agentic AI without these three pillars is an appealing prototype; with them, it becomes operational for the business.
Highlighted roadmap
- GA: Snowflake Intelligence, Openflow, Horizon improvements; Cortex AISQL and Dynamic Tables; Workspaces with Git and VS Code; dbt Projects; Snowpark Connect for Apache Spark.
- Public preview: Business Continuity & DR for managed Iceberg; Snowflake Postgres (coming soon).
- Private preview: Interactive Tables and Warehouses; Streaming analytics; CDC from Oracle on Openflow; Cortex Code; AI Redact (coming within AISQL).
- Open source: pg_lake (already available).
Fine print that customers will ask about
- Does it work with my existing sources and catalogs? Integration with Polaris and the Iceberg REST Catalog reduces friction with open formats.
- What about privacy? AI Redact aims to mask sensitive content in text/image/audio before inference, and Horizon enforces policies.
- And costs? Cortex Code and optimizations in AISQL/Dynamic Tables focus on more efficient queries; TCO depends on usage, but less tooling and less data movement generally lower operational bills.
- How do I get started? Workspaces and dbt Projects support gradual migrations: first governance and catalog, then agents on high-value datasets.
Frequently Asked Questions
What is Snowflake Intelligence and how is it useful for enterprises?
It’s a corporate agent that lets users ask questions in natural language and receive traceable answers based on governed data. It handles everything from complex queries (text→SQL) to explanations of the “why” behind metrics, respecting permissions and policies. It is generally available to Snowflake’s customer base.
How do Horizon Catalog and Openflow facilitate preparing data for agentic AI?
Horizon provides a unified catalog, semantic context, and cross-cloud, cross-region governance; Openflow automates integration and ingestion from multiple sources. The integration of Polaris and Iceberg REST centralizes security and access over open table formats, removing fragile “bridges” and reducing vendor dependence.
What are the new tools for developers to create and deploy agents?
Cortex Code (AI assistant in UI), AISQL pipelines (declarative with Dynamic Tables), AI Redact (sensitive data masking), Workspaces with Git/VS Code, dbt Projects inside Snowflake, and Snowpark Connect for existing workloads—all operate within a governed environment and support versioning.
What does compatibility with Apache Iceberg, Polaris, and Postgres imply?
It enables a compliant, interoperable enterprise lakehouse: open tables (Iceberg), open catalog (Polaris), managed Postgres inside Snowflake, plus pg_lake as open source. The result: centralized governance and tool flexibility without lock-in.

