# LLM Signal

LLM Signal measures AI traffic, verifies source quality, and tracks brand visibility in LLM responses.

## Canonical Documentation

- Docs home: https://www.llmsignal.app/docs
- CLI docs: https://www.llmsignal.app/docs/cli
- OpenAPI: https://www.llmsignal.app/openapi/agent-v1.yaml
- llms.txt: https://www.llmsignal.app/llms.txt

## CLI

- Package: @llmsignal/cli
- npm: https://www.npmjs.com/package/@llmsignal/cli
- Install: npx @llmsignal/cli@latest --help
- Commands: snippet, ping, ingest, canary, bootstrap, status, verify

## Public API

- POST https://www.llmsignal.app/api/ingest (public ingestion; requires siteId + apiKey)
- GET/POST https://www.llmsignal.app/api/agent/v1/bootstrap (agent bootstrap payload; requires siteId + apiKey)
- GET/POST https://www.llmsignal.app/api/agent/v1/status (agent health status; requires siteId + apiKey)

## Authenticated API

- GET https://www.llmsignal.app/api/dashboard
- GET/POST https://www.llmsignal.app/api/sites
- GET/POST https://www.llmsignal.app/api/prompts

## Notes for AI Agents

- Tenant isolation and Supabase RLS are required constraints.
- Do not expose service-role or provider keys in client-side code.
- Prefer machine-readable JSON errors and explicit status fields.

## Contact

- support@llmsignal.app
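## Example: Calling the Public Ingest Endpoint

A minimal TypeScript sketch of a POST to the public ingestion endpoint. Only the URL and the siteId + apiKey requirement come from the endpoint list above; the field placement (JSON body rather than headers) and the `events` payload shape are assumptions for illustration — check the OpenAPI spec for the authoritative schema.

```typescript
// Hypothetical request builder for POST /api/ingest.
// siteId + apiKey are required per the docs; the body layout is assumed.
interface IngestEvent {
  type: string;       // assumed: event kind, e.g. "pageview"
  timestamp: string;  // assumed: ISO 8601
}

interface IngestRequest {
  siteId: string;
  apiKey: string;
  events: IngestEvent[];
}

function buildIngestRequest(
  siteId: string,
  apiKey: string,
  events: IngestEvent[],
): { url: string; method: string; headers: Record<string, string>; body: string } {
  const payload: IngestRequest = { siteId, apiKey, events };
  return {
    url: "https://www.llmsignal.app/api/ingest",
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  };
}

// Send with fetch (Node 18+ or browser):
//   const req = buildIngestRequest(siteId, apiKey, events);
//   await fetch(req.url, { method: req.method, headers: req.headers, body: req.body });
```

Per the notes above, keep the apiKey server-side: never embed it in client-side code.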