Overview
The Crabtalk Hub is a community registry for AI agent resources — MCP servers,
skills, agents, and commands. Resources are indexed as TOML manifests in the
hub repository and installed via the
crabtalk CLI.
Index Layout
Each manifest lives at <scope>/<name>.toml:
microsoft/playwright.toml
notion/notion.toml
crabtalk/search.toml
scope is typically the author, organization, or tool namespace. A single
.toml file can declare multiple related resources under the same package.
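As a sketch of a multi-resource manifest, one file might declare an MCP server together with an agent that uses it. All names and URLs below are illustrative, not real packages:

```toml
[package]
name = "acme-tools"
description = "ACME tools bundle"
repository = "https://github.com/acme/tools"

# An MCP server spawned over stdio.
[mcps.acme]
command = "npx"
args = ["-y", "@acme/mcp-server"]

# An agent granted access to that server by name.
[agents.acme-helper]
description = "Agent that drives the ACME MCP server"
mcps = ["acme"]
```

Both resources install together under the single `acme-tools` package.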
Installation
crabtalk hub install scope/name
This:
- Fetches the manifest from the hub index.
- Clones the source repository (if `package.repository` is set) to `~/.crabtalk/.cache/repos/<slug>/`.
- Copies the manifest to `~/.crabtalk/packages/<scope>/<name>.toml`.
- Merges resource definitions (MCP servers, agents, commands) into the user's `crab.toml`.
- Runs `[package.setup]` if present.
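As a concrete sketch, installing the `notion/notion` package from the index layout above would leave files roughly as follows. The exact slug is placeholder text, not a real path:

```sh
crabtalk hub install notion/notion

# Manifest copied to:
#   ~/.crabtalk/packages/notion/notion.toml
# Source repo (if package.repository is set) cloned to:
#   ~/.crabtalk/.cache/repos/<slug>/
# MCP server, agent, and command definitions merged into the user's crab.toml
```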
Directory Conventions
| Path | Purpose |
|---|---|
| `~/.crabtalk/packages/` | Installed package manifests |
| `~/.crabtalk/config/` | Service configuration files (`.toml`) |
| `~/.crabtalk/.cache/repos/` | Cached source repository clones |
| `~/.crabtalk/local/agents/` | Agent prompt files (`.md`) |
| `~/.crabtalk/local/skills/` | Installed skill directories (auto-loaded) |
| `~/.crabtalk/config.toml` | Runtime configuration (MCP servers, agents, commands) |
Skill Discovery
Skills are auto-loaded — no manifest registration required. Crabtalk scans two locations for skill directories:
- Local skills — `~/.crabtalk/local/skills/<name>/`
- Package skills — `~/.crabtalk/.cache/repos/<slug>/skills/<name>/`
Any directory containing a SKILL.md file is picked up automatically. Skills
follow the agentskills standard and are invoked via
/skill-name in the REPL.
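For instance, a local skill laid out like this (the skill name is illustrative) is picked up automatically:

```text
~/.crabtalk/local/skills/
└── code-review/
    ├── SKILL.md        # presence of this file registers the skill
    └── checklist.md    # any supporting files the skill references
```

It would then be invoked as `/code-review` in the REPL.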
Manifest
Every hub package is a single TOML file with a required [package] section
followed by one or more resource sections: [mcps.*], [agents.*], and
[commands.*].
[package]
name = "my-package"
description = "Short description"
repository = "https://github.com/org/repo"
[mcps.my-server]
command = "npx"
args = ["-y", "my-mcp-server"]
[package]
Package metadata. The only required field is name.
| Field | Required | Default | Description |
|---|---|---|---|
| `name` | yes | — | Package name |
| `description` | no | `""` | Short description for hub display |
| `logo` | no | `""` | Logo URL for hub display |
| `repository` | no | `""` | Source repository URL (required for skills and agents) |
| `branch` | no | repo default | Branch to clone |
| `version` | no | — | Package version |
| `authors` | no | `[]` | Package authors |
| `license` | no | — | License identifier |
| `keywords` | no | `[]` | Searchable tags for hub discovery |
| `setup` | no | — | Post-install setup (see below) |
[package.setup]
Run after installation. Two variants — use one or the other, not both.
Bash script — runs from the cached repo directory. If the value ends with
.sh, it is executed as a script file (bash setup.sh). Otherwise it is passed
as inline script (bash -c '...'):
[package.setup]
script = "setup.sh"
Or inline:
[package.setup]
script = '''
npx playwright install chromium
echo "done"
'''
Inference prompt — sent to the daemon. If the value ends with .md, it is
read as a file path relative to the repo root:
[package.setup]
prompt = "setup.md"
[mcps.*]
Registers an MCP server. The map key becomes the server identifier.
[mcps.notion]
command = "npx"
args = ["-y", "@notionhq/notion-mcp-server"]
env = { NOTION_TOKEN = "" }
| Field | Required | Default | Description |
|---|---|---|---|
| `command` | yes* | `""` | Executable to spawn (stdio transport) |
| `args` | no | `[]` | Command-line arguments |
| `env` | no | `{}` | Environment variables |
| `name` | no | map key | Display name |
| `auto_restart` | no | `true` | Auto-restart on failure |
| `url` | no | — | HTTP URL for streamable HTTP transport |
* Either command or url must be set. Use command for stdio transport,
url for HTTP.
[agents.*]
Agent prompt bundles. Prompt files are discovered by convention from
agents/<key>.md in the source repository, or specified explicitly via the
prompt field.
[agents.garrytan]
description = "AI engineering workflow agent"
thinking = true
skills = ["agent-browser"]
| Field | Required | Default | Description |
|---|---|---|---|
| `description` | yes | — | Agent description |
| `prompt` | no | `""` | Path to prompt `.md` file (relative to repo root) |
| `skills` | no | `[]` | Skill names this agent can access |
| `model` | no | — | Model override for this agent |
| `thinking` | no | `false` | Enable thinking/reasoning mode |
| `mcps` | no | `[]` | MCP server names this agent can access |
[commands.*]
Rust crate commands. Auto-installed via cargo install during hub install.
[commands.search]
description = "Meta-search aggregator"
crate = "crabtalk-search"
| Field | Required | Description |
|---|---|---|
| `description` | yes | Human-readable description |
| `crate` | yes | Crate name on crates.io (installed via `cargo install`) |
Installed commands are managed via crabtalk <command> start|stop|run|logs.
Publishing
How to develop and publish a package to the Crabtalk Hub.
1. Write a manifest
Create a <name>.toml file following the manifest format.
Start with the [package] section and add the resource sections you need.
[package]
name = "my-package"
description = "What it does"
repository = "https://github.com/org/my-package"
keywords = ["relevant", "tags"]
[mcps.my-server]
command = "npx"
args = ["-y", "@scope/my-mcp-server"]
2. Test locally
Validate your manifest before publishing:
crabtalk hub test path/to/my-package.toml
This parses the manifest, checks that required fields are present, and verifies that the resources can be resolved.
3. Submit to the hub
- Fork the hub repository.
- Add your manifest at `<scope>/<name>.toml`. Use your organization name or username as the scope.
- Open a pull request.
my-org/my-package.toml
Checklist
Before submitting:
- `package.name` is set
- `package.description` is clear and concise
- `package.repository` is set (required for agents and skills)
- `package.keywords` help users find your package
- `package.setup` works on a clean machine (if present)
- MCP servers start without errors
- Agent prompts exist at the expected paths in the source repo
- `crabtalk hub test` passes
Writing an MCP Server Package
An MCP server package registers an external tool server that the daemon connects to at runtime. This is the most common package type — wrapping an existing MCP server for one-command installation.
Minimal manifest
[package]
name = "my-server"
description = "What the server does"
keywords = ["relevant", "tags"]
[mcps.my-server]
command = "npx"
args = ["-y", "@scope/my-mcp-server"]
That’s it. On crabtalk hub install, this config is merged into the user’s
config.toml and the server starts automatically.
Choosing a transport
MCP servers connect via stdio or HTTP. Use one or the other.
Stdio (most common) — the daemon spawns the process directly:
[mcps.my-server]
command = "npx"
args = ["-y", "@scope/my-mcp-server"]
HTTP — the server runs independently and the daemon connects to it:
[mcps.my-server]
url = "http://localhost:3000/mcp"
Environment variables
If the server needs API keys or configuration, declare them with empty defaults. The user fills them in after install:
[mcps.notion]
command = "npx"
args = ["-y", "@notionhq/notion-mcp-server"]
env = { NOTION_TOKEN = "" }
Post-install setup
If the server needs a setup step (e.g., downloading browser binaries), add
[package.setup]:
[package.setup]
script = "npx playwright install chromium"
See the Playwright example for a complete manifest.
Playwright
MCP server with a post-install setup script.
[package]
name = "playwright"
description = "Browser automation with Playwright MCP server and CLI tools"
logo = "https://crabtalk.ai/logos/playwright.png"
repository = "https://github.com/microsoft/playwright-cli"
keywords = ["playwright", "browser", "automation", "mcp"]
[package.setup]
script = "npx playwright install chromium"
[mcps.playwright]
command = "npx"
args = ["-y", "@playwright/mcp"]
What’s happening
- `[package]` — Standard metadata. `repository` points to the source repo (used if the package also declares skills).
- `[package.setup]` — After install, runs `npx playwright install chromium` via bash to download the browser binary. This is the script variant of setup.
- `[mcps.playwright]` — Registers an MCP server using stdio transport. On install, this config is merged into the user's `crab.toml`. The server spawns via `npx -y @playwright/mcp`.
Writing an Agent Package
An agent package bundles a system prompt, optional skills, and configuration into an installable unit. This is how you distribute a complete agent persona.
Minimal manifest
[package]
name = "my-agent"
description = "What the agent does"
repository = "https://github.com/org/my-agent"
keywords = ["agent"]
[agents.my-agent]
description = "One-line agent description"
The repository field is required — the installer clones it to find the
prompt file and any bundled skills.
Prompt discovery
Agent prompts are discovered by convention. For an agent key my-agent, the
installer looks for agents/my-agent.md in the source repository and copies
it to ~/.crabtalk/local/agents/my-agent.md.
You can also specify the path explicitly:
[agents.my-agent]
description = "One-line agent description"
prompt = "prompts/custom-path.md"
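Putting the conventions together, a minimal agent repo might be laid out like this. The names are illustrative; the manifest itself lives in the hub index at `<scope>/my-agent.toml`, not in this repo:

```text
my-agent/
├── agents/
│   └── my-agent.md      # prompt, discovered by convention from the agent key
└── skills/
    └── code-review/
        └── SKILL.md     # bundled skill, auto-discovered at runtime
```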
Bundling skills
If your repo includes skills under a skills/ directory, they are
auto-discovered at runtime. You can declare which skills the agent uses:
[agents.my-agent]
description = "One-line agent description"
skills = ["code-review", "deploy"]
Agent options
[agents.my-agent]
description = "One-line agent description"
thinking = true # enable reasoning mode
model = "claude-sonnet" # model override
mcps = ["playwright"] # MCP servers this agent can access
Cloning a specific branch
If the agent lives on a non-default branch:
[package]
name = "my-agent"
repository = "https://github.com/org/my-agent"
branch = "crabtalk"
Interactive setup
For agents that need configuration (API keys, preferences), use a setup prompt instead of a bash script. The prompt is sent to the daemon for inference:
[package.setup]
prompt = "setup.md"
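What goes in `setup.md` is ordinary prompt text for the daemon. A hypothetical example, with contents entirely illustrative:

```markdown
Help the user finish setting up this agent:

1. Ask for any required API keys.
2. Write them into the relevant `env` entries in the user's config.
3. Confirm the related MCP servers start cleanly.
```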
See the Gstack example for a complete manifest.
Gstack
Agent package with an inference-based setup prompt.
[package]
name = "gstack"
version = "0.0.1"
authors = ["garrytan <garrytan@gmail.com>"]
license = "MIT"
repository = "https://github.com/crabtalk/gstack"
branch = "crabtalk"
keywords = ["agent", "stack"]
description = "garrytan's AI engineering workflow agent powered by crabtalk"
[package.setup]
prompt = "setup.md"
[agents.garrytan]
description = "AI engineering workflow agent — suggests the right skill at the right time"
thinking = true
What’s happening
- `[package]` — Uses the optional `version`, `authors`, `license`, and `branch` fields. `branch = "crabtalk"` tells the installer to clone that specific branch instead of the repo default.
- `[package.setup]` — The prompt variant. `setup.md` is read from the repo root and sent to the daemon for inference. This is useful for interactive setup that requires the agent's help (e.g., configuring API keys).
- `[agents.garrytan]` — Declares an agent with `thinking = true` for reasoning mode. The prompt file is discovered by convention at `agents/garrytan.md` in the source repo.
Writing a Command Package
A command package registers a Rust binary crate as a managed service. The hub
installs the crate via cargo install and the daemon manages its lifecycle.
Minimal manifest
[package]
name = "my-command"
description = "What the command does"
repository = "https://github.com/org/my-command"
keywords = ["relevant", "tags"]
[commands.my-command]
description = "What the command does"
crate = "crabtalk-my-command"
On crabtalk hub install, the crate is installed from crates.io via
cargo install crabtalk-my-command.
Naming convention
Command binaries follow the pattern crabtalk-<name>. This allows the CLI to
dispatch crabtalk <name> to the correct binary — the same pattern Cargo uses
for subcommands.
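The dispatch rule can be sketched in Rust. `binary_name` and `dispatch` are illustrative helpers, not the actual CLI internals:

```rust
use std::process::{Command, ExitStatus};

/// Illustrative only: map a subcommand to its binary name,
/// following the `crabtalk-<name>` convention described above.
fn binary_name(subcommand: &str) -> String {
    format!("crabtalk-{subcommand}")
}

/// Spawn the matching binary from $PATH and forward the arguments,
/// so `crabtalk search start` runs `crabtalk-search start`.
fn dispatch(subcommand: &str, args: &[&str]) -> std::io::Result<ExitStatus> {
    Command::new(binary_name(subcommand)).args(args).status()
}
```

Cargo resolves `cargo <name>` to a `cargo-<name>` binary on `$PATH` the same way, which is the precedent the convention follows.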
Lifecycle management
Once installed, commands are managed via:
crabtalk <name> start # start the service
crabtalk <name> stop # stop the service
crabtalk <name> run # run in foreground
crabtalk <name> logs # view service logs
Multiple commands
A single package can declare multiple commands if they ship from the same repository:
[commands.search]
description = "Meta-search aggregator"
crate = "crabtalk-search"
[commands.telegram]
description = "Telegram bot gateway"
crate = "crabtalk-telegram"
See the Search and WeChat examples for complete manifests.
Search
Command service — registers a locally-installed binary as a managed command.
[package]
name = "search"
description = "Meta-search aggregator (Bing, Brave, DuckDuckGo, Mojeek, Wikipedia)"
logo = "https://crabtalk.ai/logos/search.png"
repository = "https://github.com/crabtalk/crabtalk"
keywords = ["search", "web", "bing", "brave", "duckduckgo", "mojeek", "wikipedia"]
[commands.search]
description = "Meta-search aggregator (Bing, Brave, DuckDuckGo, Mojeek, Wikipedia)"
crate = "crabtalk-search"
What’s happening
- `[package]` — Standard metadata. The `repository` is the main crabtalk monorepo since the search binary is built from it.
- `[commands.search]` — The `crabtalk-search` crate is auto-installed via `cargo install` during `hub install`. Once installed, it is managed via `crabtalk search start|stop|run|logs`.
WeChat
Command service — WeChat bot gateway for agents.
[package]
name = "wechat"
description = "WeChat bot gateway for agents"
logo = "https://crabtalk.ai/logos/wechat.png"
repository = "https://github.com/crabtalk/crabtalk"
keywords = ["wechat", "bot", "gateway", "chat"]
[commands.wechat]
description = "WeChat bot gateway for agents"
crate = "crabtalk-wechat"
What’s happening
- `[package]` — Standard metadata. The `repository` is the main crabtalk monorepo since the wechat binary is built from it.
- `[commands.wechat]` — The `crabtalk-wechat` crate is auto-installed via `cargo install` during `hub install`. Once installed, it is managed via `crabtalk wechat start|stop|run|logs`.