Snowflake is bringing industry-leading enterprise AI to more users with new advancements in Snowflake Cortex AI and Snowflake ML.

Snowflake, the AI Data Cloud company, unveiled a series of innovations and enhancements to Snowflake Cortex AI at its annual user conference, Snowflake Summit 2024. These new features are designed to power the next generation of AI-based enterprise applications, making it easy, efficient, and reliable for customers to create AI-driven solutions.

One of the key highlights among the new features is the introduction of enhanced chat experiences, which allow organizations to develop chatbots in a matter of minutes. These chatbots interact directly with enterprise data, providing quick and accurate responses to user queries and significantly enhancing a company’s ability to extract valuable insights from its data in real time.

Furthermore, Snowflake is further democratizing access to artificial intelligence, allowing any user to customize AI applications for specific industry use cases. This is achieved through a new no-code interactive interface, access to industry-leading large language models (LLMs), and serverless fine-tuning. These tools enable users to tailor and optimize AI models to the specific needs of their sectors.

To accelerate the operationalization of models, Snowflake has introduced an integrated experience for machine learning (ML) through Snowflake ML. This tool allows developers to build, discover, and manage models and features throughout the ML lifecycle. Snowflake’s unified platform for generative AI and ML enables all areas of the enterprise to extract greater value from their data, providing security, governance, and full control to deliver responsible and trustworthy AI at scale.

With these innovations, Snowflake is positioned to lead the next wave of enterprise AI, providing organizations with the tools needed to develop advanced applications quickly and securely. The ability to customize and operate AI models efficiently will allow companies to stay competitive in an increasingly AI-driven business environment.

“Snowflake is at the forefront of enterprise AI, making easy, efficient, and trustworthy AI available to all users so they can tackle their most complex business challenges without compromising security or governance,” said Baris Gultekin, Head of AI, Snowflake. “Our latest advances in Snowflake Cortex AI remove barriers to entry so that all organizations can leverage AI to create powerful AI applications at scale and unlock unique differentiation with their enterprise data in the AI Data Cloud.”

Offering the entire enterprise the power to communicate with data through new chat experiences

LLM-powered chatbots are a powerful way for any user to ask questions about their company data in natural language, unlocking the insights organizations need for critical decision-making more quickly and efficiently. Snowflake introduces two new chat features, Snowflake Cortex Analyst and Snowflake Cortex Search, which enable users to develop these chatbots on their structured and unstructured data in minutes, without operational complexity. Cortex Analyst, built with Meta’s Llama 3 and Mistral Large models, allows companies to securely develop applications on their analytical data in Snowflake. Cortex Search leverages the cutting-edge retrieval and ranking technology from Neeva (acquired by Snowflake in May 2023) along with Snowflake Arctic embed, allowing users to build search and analysis applications over text-based documents and other data sets through enterprise-grade hybrid search (a combination of vector and text search) delivered as a service.
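
As an illustration of how these building blocks come together, the sketch below stands up a Cortex Search service over a hypothetical table of pre-chunked support documents using the Snowpark Python API. The warehouse, table, and service names are placeholders, and the exact DDL options may differ by account and release.

```python
# Minimal sketch: create a Cortex Search service over pre-chunked documents
# with the Snowpark Python API. Warehouse, table, and service names are
# placeholders; the DDL follows the announced CORTEX SEARCH SERVICE syntax.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "DEMO_WH",
    "database": "DOCS_DB",
    "schema": "PUBLIC",
}
session = Session.builder.configs(connection_parameters).create()

# Hybrid (vector + keyword) search service, kept fresh from the source table.
session.sql("""
    CREATE OR REPLACE CORTEX SEARCH SERVICE support_doc_search
      ON chunk_text
      ATTRIBUTES doc_title
      WAREHOUSE = DEMO_WH
      TARGET_LAG = '1 hour'
      AS (SELECT chunk_text, doc_title FROM support_doc_chunks)
""").collect()
```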

“Data security and governance are of utmost importance for Zoom when leveraging AI for our business analytics. We trust Snowflake AI Data Cloud to support our internal business functions and develop customer insights as we continue to democratize AI across our organization,” said Awinash Sinha, Corporate CIO, Zoom. “By combining the power of Snowflake Cortex AI and Streamlit, we have been able to quickly create applications leveraging large pre-trained language models in just a few days. This is enabling our teams to quickly and easily access useful AI-driven answers.”

“While companies often use dashboards to consume information from their data for strategic decision-making, this approach has some drawbacks, such as information overload, limited flexibility, and time-consuming development,” said Mukesh Dubey, Product Owner Data Platform, CH NA, Bayer. “What if internal functional users could ask specific questions directly about their company data and get answers with basic visualizations? The core of this capability is high-quality responses to a natural language query about structured data, used sustainably from an operational perspective. This is exactly what Snowflake Cortex Analyst allows us to do. What excites me the most is that we have just scratched the surface, and we are looking forward to unlocking more value with Snowflake Cortex AI.”

Data security is key to creating production-grade AI applications and chat experiences, and then scaling them across the enterprise. To that end, Snowflake introduces Snowflake Cortex Guard, which leverages Meta’s Llama Guard, an LLM-based input-output safeguard, to filter and flag harmful content, such as violence and hate, self-harm, or criminal activity, across the organization’s data and assets. With Cortex Guard, Snowflake is further unlocking trusted AI for enterprises, helping customers ensure that the models available to them are safe and usable.
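
A minimal sketch of what this looks like in practice is shown below, assuming the announced 'guardrails' option on the Cortex COMPLETE function and a configured default Snowpark connection; the model choice and prompt are illustrative only.

```python
# Minimal sketch: call a Cortex LLM with Cortex Guard enabled. The
# 'guardrails' option name follows the announcement and is an assumption;
# the model choice and prompt are illustrative.
from snowflake.snowpark import Session

session = Session.builder.getOrCreate()  # assumes a configured default connection

row = session.sql("""
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        [{'role': 'user', 'content': 'Summarize the key risks in our vendor contracts.'}],
        {'guardrails': TRUE}
    ) AS response
""").collect()[0]
print(row["RESPONSE"])
```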

Snowflake reinforces AI experiences to accelerate productivity

In addition to enabling easy development of customized chat experiences, Snowflake offers customers pre-built AI-powered experiences, powered by Snowflake’s world-class models. With Document AI, users can easily extract content from documents, such as invoice amounts or contract terms, using Snowflake’s best-in-class multimodal LLM, Snowflake Arctic-TILT, which outperforms GPT-4 and achieved the top score on DocVQA, the standard benchmark for visual question answering on documents. Organizations like Northern Trust use Document AI to intelligently process documents at scale and reduce operational expenses through increased efficiency. Snowflake is also advancing its revolutionary text-to-SQL assistant, Snowflake Copilot, which combines the strengths of Mistral Large with Snowflake’s proprietary SQL generation model to accelerate productivity for all SQL users.
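
For a sense of how that extraction is consumed downstream, the hedged sketch below queries a Document AI model build (created and published in Snowsight beforehand) over a stage of invoices. The model, stage, and version identifiers are placeholders, and the PREDICT call follows the Document AI pattern documented at the time of writing.

```python
# Hedged sketch: read fields from staged invoices with a Document AI model
# build named invoice_model (created and published in Snowsight). Stage,
# model, and version identifiers are placeholders.
from snowflake.snowpark import Session

session = Session.builder.getOrCreate()  # assumes a configured default connection

rows = session.sql("""
    SELECT
        RELATIVE_PATH,
        invoice_model!PREDICT(
            GET_PRESIGNED_URL(@invoice_stage, RELATIVE_PATH), 1) AS extracted
    FROM DIRECTORY(@invoice_stage)
""").collect()
for row in rows:
    print(row["RELATIVE_PATH"], row["EXTRACTED"])
```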

Unlock no-code AI development with the new Snowflake AI & ML Studio

Snowflake Cortex AI provides customers with a robust set of latest-generation models from leading providers such as Google, Meta, Mistral AI, and Reka, as well as Snowflake’s own open-source Arctic LLM, to accelerate AI development. With the new Snowflake AI & ML Studio, an interactive no-code interface, Snowflake is democratizing how any user can bring these powerful models to their company’s data, letting teams get started with AI development and move their AI applications into production faster. Users can also easily test and evaluate these models to find the most suitable and cost-effective option for their specific use cases, ultimately accelerating the path to production and optimizing operational costs.
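
The Studio itself is a no-code experience, but the same side-by-side comparison can be sketched programmatically. The snippet below, which assumes the snowflake-ml-python package and Cortex model identifiers current at the time of writing, runs one prompt across several candidate models so their quality and cost can be weighed.

```python
# Minimal sketch: run one prompt across several Cortex models to compare
# output quality before picking one. Model identifiers reflect Cortex names
# at the time of writing; the prompt is illustrative.
from snowflake.snowpark import Session
from snowflake.cortex import Complete

session = Session.builder.getOrCreate()  # assumes a configured default connection

prompt = "Classify this support ticket as billing, technical, or account-related: ..."
for model in ["snowflake-arctic", "mistral-large", "llama3-70b", "reka-flash"]:
    answer = Complete(model, prompt, session=session)
    print(f"--- {model} ---\n{answer}\n")
```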

To help organizations further improve LLM performance and deliver more personalized experiences, Snowflake is introducing Cortex Fine-Tuning, accessible through AI & ML Studio or a simple SQL function. This serverless customization is available for a subset of Meta and Mistral AI models. These customized models can be easily utilized through a Cortex AI function, with managed access through Snowflake’s role-based access controls.
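
Below is a minimal sketch of the SQL-function path, assuming placeholder table and model names and the FINETUNE signature documented at the time of writing (training data exposed as 'prompt' and 'completion' columns).

```python
# Minimal sketch: launch a serverless Cortex Fine-Tuning job from SQL.
# Table and model names are placeholders; the FINETUNE signature and the
# 'prompt'/'completion' column convention follow current documentation.
from snowflake.snowpark import Session

session = Session.builder.getOrCreate()  # assumes a configured default connection

job = session.sql("""
    SELECT SNOWFLAKE.CORTEX.FINETUNE(
        'CREATE',
        'support_mistral_7b',                       -- name for the tuned model
        'mistral-7b',                               -- base model to adapt
        'SELECT prompt, completion FROM train_set',
        'SELECT prompt, completion FROM val_set'
    ) AS job_id
""").collect()
print(job[0]["JOB_ID"])

# Once the job succeeds, the tuned model is invoked like any other Cortex model:
# SELECT SNOWFLAKE.CORTEX.COMPLETE('support_mistral_7b', 'How do I reset my password?');
```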

Streamline model and feature management with unified and governed MLOps through Snowflake ML

Once ML and LLM models are developed, most organizations struggle to continuously operate them in production over evolving data sets. Snowflake ML brings MLOps capabilities to AI Data Cloud, so that teams can seamlessly discover, manage, and govern their features, models, and metadata throughout the ML lifecycle, from data preprocessing to model management. These centralized MLOps functions also integrate with the rest of the Snowflake platform, including Snowflake Notebooks and Snowpark ML, to provide a seamless comprehensive experience.

Snowflake’s suite of MLOps capabilities includes the Snowflake Model Registry, which allows users to govern access to and usage of all types of AI models so they can deliver more personalized experiences and cost-saving automations with reliability and efficiency. Snowflake also announces the Snowflake Feature Store, an integrated solution for data scientists and ML engineers to create, store, manage, and serve consistent ML features for model training and inference, and ML Lineage, which lets teams track the usage of features, data sets, and models throughout the end-to-end ML lifecycle.
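
As a rough sketch of how these pieces are used from Python, the snippet below registers an entity and a refreshable feature view in the Feature Store and logs a toy model to the Model Registry. Database, schema, warehouse, and object names are placeholders, and the API shapes follow the snowflake-ml-python package at the time of writing.

```python
# Minimal sketch: register governed features and a model with Snowflake ML.
# All object names are placeholders; APIs follow snowflake-ml-python.
import pandas as pd
from sklearn.linear_model import LogisticRegression

from snowflake.snowpark import Session
from snowflake.ml.feature_store import FeatureStore, Entity, FeatureView, CreationMode
from snowflake.ml.registry import Registry

session = Session.builder.getOrCreate()  # assumes a configured default connection

# --- Feature Store: define an entity and a refreshable feature view ---
fs = FeatureStore(
    session=session,
    database="ML_DB",
    name="FEATURES",
    default_warehouse="DEMO_WH",
    creation_mode=CreationMode.CREATE_IF_NOT_EXIST,
)
customer = Entity(name="CUSTOMER", join_keys=["CUSTOMER_ID"])
fs.register_entity(customer)

activity = FeatureView(
    name="CUSTOMER_ACTIVITY",
    entities=[customer],
    feature_df=session.table("RAW_EVENTS").group_by("CUSTOMER_ID").count(),
    refresh_freq="1 day",  # kept fresh automatically for training and inference
)
fs.register_feature_view(activity, version="1")

# --- Model Registry: log a (toy) model so access and versions are governed ---
X = pd.DataFrame({"COUNT": [1, 5, 12, 30]})
y = [0, 0, 1, 1]
model = LogisticRegression().fit(X, y)

reg = Registry(session=session, database_name="ML_DB", schema_name="MODELS")
reg.log_model(
    model,
    model_name="CHURN_CLASSIFIER",
    version_name="V1",
    sample_input_data=X,
)
```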
