The system was initially announced at the company's Data Cloud Summit in June of this year and is a fully managed service that gives businesses a conversational interface to their data. Users simply ask business questions in plain English, and the agentic AI system handles the rest: converting the prompt to SQL, querying the data, running checks, and returning the desired answer.
Snowflake's Head of AI, Baris Gultekin, told VentureBeat that the service uses multiple large language models (LLMs) working in tandem to deliver insights with roughly 90% accuracy. He claimed this surpasses the accuracy of existing LLM-based text-to-SQL solutions, including products from Databricks, and accelerates analytical workflows by giving business users instant access to the insights they need to make critical decisions.
Simplifying Analysis with Cortex Analyst
While businesses continue to invest heavily in AI-driven data generation and forecasting, data analysis still plays a transformative role in business success. Organizations mine historical, structured data stored in tables for insights that drive decisions in areas such as marketing and sales.
Today, however, the analytics ecosystem is still driven largely by business intelligence (BI) dashboards, which visualize data through charts, graphs, and maps. The approach works, but it can be rigid: drilling down into a specific metric is difficult, and deeper questions often land on the desks of already overloaded analysts.
"When you see an anomaly in the data on a dashboard, you immediately come up with three different questions to understand what happened. When you ask these questions, an analyst gets involved, performs the analysis, and provides answers in about a week. But then, you may have more follow-up questions, which can keep the analysis loop open and slow down the decision-making process," said Gultekin.
To bridge this gap, many have begun exploring the potential of large language models, which excel at unlocking insights from unstructured data such as lengthy PDF documents. The idea is to feed the models the raw schema of the structured data so that they can support a text-to-SQL conversational experience, letting users interact with their data and ask relevant business questions in real time.
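In practice, this naive pattern amounts to packing the schema and the user's question into a single prompt and asking the model to emit SQL. The sketch below illustrates the idea; `call_llm` is a hypothetical stand-in for whatever completion API is available, and the table is invented for the example.

```python
# Illustrative sketch of the naive text-to-SQL pattern described above.
# `call_llm` is a hypothetical completion function, not a specific vendor API.

SCHEMA = """
CREATE TABLE daily_revenue (
    date DATE,
    region VARCHAR,
    revenue NUMBER(12, 2)
);
"""

def naive_text_to_sql(question: str, call_llm) -> str:
    # Pack the raw schema and the business question into one prompt
    # and ask the model for a single SQL statement.
    prompt = (
        "You are a SQL assistant. Given this schema:\n"
        f"{SCHEMA}\n"
        f"Write one SQL query that answers: {question}\n"
        "Return only the SQL."
    )
    return call_llm(prompt)

# Example: naive_text_to_sql("What was revenue by region last quarter?", call_llm)
```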
However, as these LLM-based products emerged, Snowflake discovered a major issue: low accuracy. In internal benchmark tests representing real-world use cases, state-of-the-art models such as GPT-4 produced analytical insights with only about 51% accuracy, while dedicated text-to-SQL solutions, including Databricks' Genie, reached 79%.
"Accuracy is the most important when you ask business questions. 51% accuracy is unacceptable. We almost doubled this ratio to about 90% through a series of closely collaborating large language models used for Cortex Analyst," pointed out Gultekin.
Once integrated into an enterprise application, Cortex Analyst takes business questions written in natural language and passes them through a pipeline of large language model (LLM) agents that work against data in Snowflake's Data Cloud to return accurate, non-misleading answers. The agents handle distinct tasks: analyzing the intent of the question and deciding whether it can be answered, generating and running SQL queries based on the question, and verifying the correctness of the answers before returning them to the user.
"We built a system that can determine if a question can be answered or if it's ambiguous due to inaccessible data. If the question is ambiguous, we ask the user to rephrase it and provide suggestions. Only when it's determined that the large language models can answer the question, we pass it to a series of SQL-generating agent models that judge the correctness of the SQL, correct any errors, and then run the SQL to provide the answer," explained Gultekin.
The Head of AI did not disclose specific details about the models powering Cortex Analyst, but Snowflake has confirmed that it is using its own Arctic model as well as a combination of models from Mistral and Meta.
So, how does it work exactly?
To ensure the LLM agents behind Cortex Analyst understand the full structure of a customer's data and can respond with accurate, context-aware answers, the company asks customers to provide semantic descriptions of their data assets during setup. This compensates for the gaps in the raw schema and lets the models capture the intent behind questions, including the user's vocabulary and domain-specific terminology.
"In practical applications, you have thousands of tables and hundreds of thousands of columns with strange names. For example, 'Rev 1' and 'Rev 2' might refer to iterations of revenue. Our customers can specify these metrics and their meanings in the semantic description, enabling the system to use them when providing answers," added Gultekin.
The company currently offers Cortex Analyst as a REST API, so it can be integrated into any application, giving developers the flexibility to decide how they call the service and surface the results to business users. There is also the option to build dedicated applications in Streamlit with Cortex Analyst as the core engine.
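Calling a REST service like this from an application is straightforward. The sketch below shows the general shape of such a call in Python; the endpoint path, authentication header, and payload fields are assumptions for illustration, so consult Snowflake's documentation for the actual contract.

```python
# Minimal sketch of calling Cortex Analyst over REST from Python.
# The URL path, auth token, and payload fields below are assumed, not the
# documented API contract; placeholders are marked in angle brackets.
import requests

ACCOUNT_URL = "https://<account>.snowflakecomputing.com"  # placeholder account URL
TOKEN = "<oauth-or-jwt-token>"                            # placeholder credential

def ask_cortex_analyst(question: str) -> dict:
    resp = requests.post(
        f"{ACCOUNT_URL}/api/v2/cortex/analyst/message",   # assumed endpoint path
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "messages": [
                {"role": "user", "content": [{"type": "text", "text": question}]}
            ],
            # Hypothetical stage path to the customer's semantic model file.
            "semantic_model_file": "@my_stage/revenue_semantic_model.yaml",
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()
```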
During the private preview, around 40 to 50 companies, including pharmaceutical giant Bayer, deployed Cortex Analyst to interact with their data and accelerate analytical workflows. With businesses keen to adopt LLMs without blowing their budgets, the public preview is expected to grow that number: the service lets companies use LLMs for analysis without taking on the full implementation effort and cost themselves.
Snowflake has also confirmed that additional features, including support for multi-turn conversations for interactive experiences and support for more complex tables and schemas, will be available in the coming days.