Working with AI

Snowplow supports agentic and LLM-powered workflows in several ways.

Snowplow CLI MCP server

The Snowplow CLI includes a Model Context Protocol (MCP) server that connects AI assistants to your Snowplow tracking plans. This can help you design tracking plans faster and more consistently.
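Most MCP-capable assistants are configured by pointing them at a command that starts the server. As a sketch, assuming your assistant uses the common JSON client configuration format and that the CLI exposes an mcp subcommand (check snowplow-cli --help for the exact invocation), the setup might look like:

```json
{
  "mcpServers": {
    "snowplow": {
      "command": "snowplow-cli",
      "args": ["mcp"]
    }
  }
}
```

The assistant launches the server as a subprocess and communicates with it over stdio, so no separate service needs to be running.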

Documentation index in llms.txt

This documentation follows the llms.txt standard, providing structured information to help an LLM use the site.

An index is available at llms.txt.

An extended version is also available at llms-full.txt, which includes the complete text of the current pages. For token efficiency, sections relating to older versions of components aren't included in this file. These sections are still listed in the llms.txt index, labeled as [previous version].

The llms-full.txt file is very large. It might be more effective to access individual pages as needed, using the Markdown access method described below.

Documentation pages as Markdown

Every documentation page is available as Markdown. To download a page's content, use the Download or Copy Markdown buttons above the page title.

Following the llms.txt standard, you can access the Markdown page directly by changing the trailing / in the URL to .md. For example:

  • HTML: https://docs.snowplow.io/docs/signals/concepts/
  • Markdown: https://docs.snowplow.io/docs/signals/concepts.md
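The URL rewrite above is mechanical, so a tool or script can apply it automatically. A minimal sketch in Python (the helper name to_markdown_url is illustrative, not part of any Snowplow tooling):

```python
def to_markdown_url(url: str) -> str:
    """Convert a docs.snowplow.io page URL to its Markdown equivalent
    by replacing the trailing slash with .md, per the llms.txt convention."""
    if url.endswith("/"):
        return url[:-1] + ".md"
    return url + ".md"


print(to_markdown_url("https://docs.snowplow.io/docs/signals/concepts/"))
# https://docs.snowplow.io/docs/signals/concepts.md
```

An LLM agent with URL-fetching tools can use the same transformation to pull Markdown instead of HTML for any page it needs.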

Let your LLM know that this format is available so it can retrieve content efficiently.

Signals

Use Signals to provide real-time behavioral context to your AI applications. Signals computes user attributes from your event stream and warehouse data, and makes them available to your applications via the Profiles Store API.
