Snowflake partners with Mistral AI to bring cutting-edge language models to businesses through Snowflake Cortex.

Snowflake (NYSE: SNOW), the Data Cloud company, and Mistral AI, one of the leading European providers of AI solutions, have announced a global agreement to bring Mistral AI’s most powerful language models directly to Snowflake customers in the Data Cloud. Through this multi-year partnership, which includes a parallel investment in Mistral’s Series A by Snowflake Ventures, Mistral AI and Snowflake will provide the capabilities that businesses need to seamlessly harness the power of leading large language models (LLMs), while maintaining security, privacy, and governance over their most valuable asset: their data.

Thanks to the new collaboration with Mistral AI, Snowflake customers now have access to Mistral Large, Mistral AI’s newest and most powerful LLM, with benchmark results that place it among the highest-performing models in the world. Beyond benchmark results, Mistral AI’s new flagship model has unique reasoning capabilities, is proficient in code and mathematics, and is fluent in five languages – French, English, German, Spanish, and Italian – in line with Mistral AI’s commitment to promoting the cultural and linguistic specificities of generative AI technology. It can also process hundreds of pages of documents in a single request. In addition, Snowflake customers have access to Mixtral 8x7B, Mistral AI’s open-source model that surpasses OpenAI’s GPT-3.5 in speed and quality on most benchmarks, along with Mistral 7B, Mistral AI’s foundational model optimized for low latency, low memory requirements, and high performance for its size. Mistral AI’s models are now available to customers in public preview as part of Snowflake Cortex, Snowflake’s fully managed vector search and LLM service that allows organizations to accelerate analysis and quickly create secure AI applications with their enterprise data.
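For readers who want a concrete picture of how these models are reached from the Data Cloud, the following is a minimal sketch using the Snowpark Python API and the SNOWFLAKE.CORTEX.COMPLETE function; the connection parameters, the example prompt, and the 'mistral-large' model identifier are assumptions to verify against the Snowflake Cortex documentation for your account.

```python
# Minimal sketch: calling a Mistral AI model through Snowflake Cortex
# from Snowpark Python. The connection parameters are placeholders and
# the 'mistral-large' model name is an assumption -- verify both against
# your account's Snowflake Cortex documentation.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# COMPLETE takes a model identifier and a prompt, and returns the
# model's response as text.
result = session.sql(
    """
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        'Summarize the key terms of this contract in one paragraph: ...'
    ) AS response
    """
).collect()

print(result[0]["RESPONSE"])
```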

“By partnering with Mistral AI, Snowflake is putting one of the most powerful LLMs in the market into the hands of our customers, enabling each user to build cutting-edge AI-driven applications in a simple and scalable way,” said Sridhar Ramaswamy, CEO of Snowflake. “With Snowflake as the foundation for reliable data, we are transforming the way businesses harness the power of LLMs through Snowflake Cortex so they can cost-effectively address new AI use cases within the security and privacy boundaries of the Data Cloud.”

“Snowflake’s commitment to security, privacy, and governance aligns with Mistral AI’s ambition to put cutting-edge AI in the hands of everyone and make it accessible everywhere. Mistral AI shares Snowflake’s values in developing efficient, useful, and reliable AI models to drive how organizations worldwide leverage generative AI,” said Arthur Mensch, CEO and co-founder of Mistral AI. “With our models available on Snowflake Data Cloud, we are further democratizing AI so that users can create more sophisticated AI applications that drive value at a global scale.”

At Snowday 2023, Snowflake first announced Snowflake Cortex support for industry-leading LLMs for specialized tasks such as sentiment analysis, translation, and summarization, along with foundational LLMs – starting with Meta AI’s Llama 2 model – for use cases such as retrieval-augmented generation (RAG). Snowflake continues to invest in its generative AI efforts through the partnership with Mistral AI and the expansion of the foundational LLM lineup in Snowflake Cortex, giving organizations an easy path to bring cutting-edge generative AI to every part of their business. To deliver a serverless experience that puts AI within reach of a broad set of users, Snowflake Cortex eliminates long procurement cycles and complex GPU infrastructure management; to that end, Snowflake partners with NVIDIA to offer a full-stack accelerated computing platform that leverages the NVIDIA Triton Inference Server, among other tools.

With Snowflake Cortex LLM Functions in public preview, Snowflake users can harness AI with their enterprise data to support a wide range of use cases. Using specialized functions, any SQL-savvy user can leverage smaller LLMs to cost-effectively address specific tasks such as sentiment analysis, translation, and summarization in seconds, as sketched below. For more complex use cases, Python developers can move seamlessly from concept to full-stack AI applications, such as chatbots, by combining foundational LLMs – including Mistral AI’s models in Snowflake Cortex – with chat elements (public preview coming soon) within Streamlit in Snowflake. This simplified experience also extends to RAG through built-in vector functions and Snowflake vector data types (both soon in public preview), while ensuring data never leaves Snowflake’s security and governance perimeter.
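As a rough illustration of the specialized functions mentioned above, the sketch below calls sentiment analysis, translation, and summarization functions from Python code running inside Snowflake (for example, a Streamlit in Snowflake app), where an active Snowpark session is available; the table and column names are hypothetical, and the exact function signatures should be checked against the Snowflake Cortex documentation.

```python
# Sketch of the task-specific Cortex LLM Functions described above, issued
# from Python code running inside Snowflake (e.g., a Streamlit in Snowflake
# app), where an active Snowpark session already exists. Table and column
# names ("customer_reviews", "review_text") are hypothetical, and the
# function signatures are assumptions to verify against the Cortex docs.
from snowflake.snowpark.context import get_active_session

session = get_active_session()

# Sentiment analysis: one score per review.
session.sql("""
    SELECT review_text,
           SNOWFLAKE.CORTEX.SENTIMENT(review_text) AS sentiment_score
    FROM customer_reviews
""").show()

# Translation, assumed here to take (text, source_language, target_language).
session.sql("""
    SELECT SNOWFLAKE.CORTEX.TRANSLATE(review_text, 'fr', 'en') AS review_en
    FROM customer_reviews
""").show()

# Summarization of longer free-text fields.
session.sql("""
    SELECT SNOWFLAKE.CORTEX.SUMMARIZE(review_text) AS review_summary
    FROM customer_reviews
""").show()
```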

Snowflake is committed to driving innovation in AI not only for its customers and the Data Cloud ecosystem but for the technology community at large. That is why Snowflake has recently joined the AI Alliance, an international community of developers, researchers, and organizations dedicated to promoting open, safe, and responsible AI. Through the AI Alliance, Snowflake will continue to comprehensively and openly address both the challenges and opportunities of generative AI in order to further democratize its benefits.
