Documentation
Trellis users run the trellis-cli command-line tool to manage their grids. When you launch the CLI in serve mode, it starts a local HTTP API server that any Anthropic-compatible agent harness can connect to. Every Trellis account includes free access to a public demo grid; for best performance and security, we recommend creating a private grid.
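Because a serve node exposes an Anthropic-compatible HTTP API, you can talk to it with nothing but the standard library. Below is a minimal sketch that builds a Messages-style request against the local endpoint; the `/v1/messages` route, the model name, and the default port/token (taken from the `claude` example on this page) are illustrative assumptions, not Trellis specifics:

```python
import json
import os
import urllib.request

# Same environment variables used to point Claude Code at a serve node;
# the defaults mirror the example invocation on this page.
base_url = os.environ.get("ANTHROPIC_BASE_URL", "http://localhost:11434")
auth_token = os.environ.get("ANTHROPIC_AUTH_TOKEN", "local")

# Anthropic Messages API-style payload; the model name is a placeholder.
payload = {
    "model": "your-grid-model",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello from Trellis"}],
}

request = urllib.request.Request(
    f"{base_url}/v1/messages",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "content-type": "application/json",
        "x-api-key": auth_token,
        "anthropic-version": "2023-06-01",
    },
    method="POST",
)

# With a serve node running, sending the request returns the reply as JSON:
# response = json.load(urllib.request.urlopen(request))
```

Any client that can POST JSON works the same way; the SDK-based harnesses below are just a convenience on top of this.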
Download the CLI
Install the trellis-cli binary with a single command:

```shell
wget trellis.unfoldml.com/cli
```

Sign up
Create an account to get access to the management dashboard. From there you can generate join tokens and (with a paid plan) create grids and invite your organization.
Run inference
Launch a serve node to expose a local Anthropic-compatible API.
```shell
trellis-cli serve --token <joinToken> --bootstrap-url <bootstrapURL>
```

Agent harnesses like Claude Code can connect directly to your new local endpoint:
```shell
ANTHROPIC_BASE_URL=http://localhost:11434 ANTHROPIC_AUTH_TOKEN=local claude
```

Level up to Managed or Self-Hosted
Check out our plans to create private grids with support for unlimited nodes.
Load an LLM on a grid
From the dashboard, you can deploy any Hugging Face model to your grid.
Invite your team
With a private grid, you can invite your teammates to join. Each user can launch nodes with their own join tokens, and you can track usage by email in the dashboard.