At its annual Snowflake Summit conference this week, Snowflake advanced its case for making its data lake the foundation for building and deploying artificial intelligence (AI) applications by privately previewing a set of no-code tools, dubbed Snowflake AI & ML Studio, that can be used to create and fine-tune, for example, a chatbot based on a large language model (LLM).

Snowflake also plans to preview two new chat capabilities shortly. Snowflake Cortex Analyst, based on LLMs from Meta and Mistral AI, allows organizations to securely build applications on top of their analytical data hosted in the Snowflake cloud. Snowflake Cortex Search combines the retrieval and ranking technology that Snowflake gained last year through its acquisition of Neeva with Snowflake Arctic, the company’s own family of LLMs, enabling applications built on documents and other text-based datasets via a hybrid search service that understands both vector and text data.
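Cortex exposes model access as SQL functions, so calling an LLM can be done directly from a query. A minimal sketch, assuming the `SNOWFLAKE.CORTEX.COMPLETE` function; the model name and prompt here are purely illustrative:

```sql
-- Invoke a Cortex-hosted LLM from SQL.
-- The model identifier and prompt text are illustrative examples.
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large',
    'Summarize last quarter''s sales trends in two sentences.'
);
```

Because the function runs inside Snowflake, the data being reasoned over never has to leave the platform, which is the crux of the company’s pitch.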

At the same time, Snowflake is developing its own AI models. A forthcoming Document AI model will allow users to extract content, such as invoice amounts or contract terms, from documents using Snowflake’s multimodal LLM, dubbed Snowflake Arctic-TILT. Snowflake also plans to soon make generally available its text-to-SQL assistant, dubbed Snowflake Copilot, which combines a Mistral LLM with Snowflake’s proprietary SQL generation model.

In addition, Snowflake is previewing Cortex Fine-Tuning, a serverless customization capability for Meta and Mistral LLMs that makes it simpler to customize those models using the Snowflake AI & ML Studio tool or a SQL function.
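Taking the SQL-function route, kicking off a fine-tuning job might look roughly like the following sketch. The model name, training table, and column names are hypothetical, and the exact signature of `SNOWFLAKE.CORTEX.FINETUNE` may differ from what is shown:

```sql
-- Launch a serverless fine-tuning job against a Mistral base model.
-- 'my_support_bot' and the support_training_data table (with prompt
-- and completion columns) are hypothetical placeholders.
SELECT SNOWFLAKE.CORTEX.FINETUNE(
    'CREATE',
    'my_support_bot',
    'mistral-7b',
    'SELECT prompt, completion FROM support_training_data'
);
```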

Snowflake is also adding a governance framework, dubbed Snowflake ML, that can be used to build, discover and govern the various models the company makes available via the Snowflake Cortex AI service. Snowflake will also soon add Snowflake Cortex Guard, which identifies harmful content using the Llama Guard model developed by Meta.

Those capabilities complement a suite of machine learning operations (MLOps) tools that includes the now generally available Snowflake Model Registry, a Snowflake Feature Store in public preview, and a private preview of ML Lineage for managing data sets. Snowflake is also previewing a Snowflake Notebooks tool for its AI Data Cloud, along with an application programming interface (API) that brings Snowpark pandas to Python developers.

Snowflake has also embraced NVIDIA AI Enterprise software to integrate NeMo Retriever microservices into Snowflake Cortex AI. The Snowflake Arctic LLM is now also compatible with both the NVIDIA NIM inference microservices, which encapsulate AI models in containers, and NVIDIA TensorRT-LLM software.

The company is also making it simpler to integrate MLOps and DevOps workflows by previewing a declarative Database Change Management capability along with integrations with Git repositories. DevOps engineers will also soon be able to leverage Snowflake’s Python API to manage resources and invoke an open source Snowflake command-line interface (CLI).

Snowflake is also adding Snowflake Trail, a rich set of observability capabilities based on the open source OpenTelemetry framework, for Snowpark and Snowpark Container Services, and is previewing integrations between the Snowflake Native App Framework and Snowpark Container Services.

Finally, Snowflake is adding support for the Apache Iceberg table format via Polaris Catalog, a platform the company plans to open source within the next 90 days.

These latest additions extend a Snowflake effort to convince developers to build applications on a platform where large amounts of enterprise data already reside, says Christian Kleinerman, executive vice president of product for Snowflake. “All the data is in one place,” he says. “They can bring their business logic to the cloud.”

It’s not clear to what degree developers and data science teams are heeding that call, but one thing is certain: data gravity is only going to become a bigger factor in the age of AI.