Documentation Index

Fetch the complete documentation index at: https://docs.sequentum.com/llms.txt

Use this file to discover all available pages before exploring further.

Sequentum exposes a Model Context Protocol server at https://mcp.sequentum.com/mcp. It’s a typed wrapper over the REST API — same operations (list, run, schedule, monitor agents), surfaced as MCP tools that any MCP-aware LLM can call.

The primary use case is automating Sequentum from inside an LLM-driven pipeline. A Python service or n8n workflow calls Claude with the Sequentum MCP server attached; the model picks the right agent, runs it, and your pipeline gets structured records back — no separate REST integration. Interactive use from Claude Desktop is a useful secondary surface for analysts; dev-tool clients (Cursor, Claude Code) are fine for prototyping but aren’t the production path.
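As a rough sketch of the pipeline pattern, the request below attaches the Sequentum MCP server to a Claude API call so the model can choose and run an agent itself. It assumes the Anthropic Messages API's MCP connector (a beta feature at the time of writing); the exact field names, beta flag, and model id may differ in your SDK version, and the prompt is illustrative only.

```python
def build_claude_request(prompt: str) -> dict:
    """Build a Messages API payload with the Sequentum MCP server attached.

    Assumes the Anthropic MCP connector's request shape; verify field
    names against the current API reference before relying on this.
    """
    return {
        "model": "claude-sonnet-4-20250514",  # substitute your current model id
        "max_tokens": 1024,
        # The MCP connector lets the model call the server's tools directly,
        # so no separate REST integration is needed in your pipeline code.
        "mcp_servers": [
            {
                "type": "url",
                "url": "https://mcp.sequentum.com/mcp",
                "name": "sequentum",
                # "authorization_token": "...",  # Sequentum credential, if required
            }
        ],
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_claude_request(
    "Run the latest pricing agent and summarize the records."
)
```

The payload is plain data, so the same shape works whether you POST it yourself or pass the fields to an SDK client.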

Connect

Wire Sequentum into a pipeline, automation platform, or chat client — hosted or self-hosted.

Tools

Every MCP tool the server exposes, mapped to its REST equivalent.

Where to use MCP

| Pattern | Use it for |
| --- | --- |
| Claude API + Sequentum MCP server | Production pipelines where an LLM orchestrates Sequentum runs. The killer use case. |
| Automation platforms with MCP support (n8n, Zapier, AWS Step Functions) | No-code / low-code pipeline glue. |
| Claude Desktop | Interactive analyst workflows — ad-hoc data pulls, run inspection, schedule changes. |
| Self-hosted (`npx -y sequentum-mcp`) | Air-gapped or on-prem environments that can’t reach mcp.sequentum.com. |
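For the self-hosted pattern, a typical Claude Desktop entry might look like the following. The `mcpServers` block is Claude Desktop's standard `claude_desktop_config.json` format; the package name comes from the command above, and the server key `sequentum` is an arbitrary label of your choosing.

```json
{
  "mcpServers": {
    "sequentum": {
      "command": "npx",
      "args": ["-y", "sequentum-mcp"]
    }
  }
}
```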
For straight code-driven integration without an LLM in the loop, the REST API is the simpler choice. MCP earns its keep when a model is already orchestrating. Wire it up →