Local MCP boundary
Agents connect to the running Clanker Cloud app runtime; the CLI is the open-source launcher for stdio or localhost-style MCP clients.
Clanker Cloud gives MCP-capable agents a local way to ask grounded questions about cloud and Kubernetes infrastructure from the app runtime. The agent connects locally instead of guessing from repository files or using a hosted privileged bridge.
Use the app for Kubernetes troubleshooting, AWS and multi-cloud context, Cloudflare and edge investigation, cost questions, GitHub and CI/CD context, and review-before-apply workflows. The open-source CLI launches the same MCP surface for agent clients that need a command.
The MCP surface is the agent door into live infrastructure context: local runtime, local credentials, provider evidence, and reviewed actions.
The same runtime can gather AWS, GCP, Azure, Kubernetes, Cloudflare, GitHub, Vercel, Railway, Verda, and other provider context.
MCP-capable tools such as Claude Code, Codex, OpenClaw, Hermes-style agents, and internal agents can use the local tool surface.
Read-only questions are first-class. Apply-style changes still require reviewed plans and explicit approval.
The current product positioning covers cloud providers, Kubernetes, GitHub, and bring-your-own AI keys from one local operating surface.
Open the app on the machine that already has kubeconfig and provider credentials, then confirm the workspace can see the cloud or Kubernetes context you want the agent to use.
1. Open Clanker Cloud.
2. Connect or select local provider context.
3. Ask a grounded app query before giving an agent broader MCP access.

Use the open-source CLI launcher when your MCP client needs to start the same local Clanker runtime directly.
```shell
clanker mcp --transport stdio
```

Use HTTP when your local client connects to a localhost endpoint exposed by the Clanker runtime. The CLI launcher listens at /mcp.
```shell
clanker mcp --transport http
```

Point the MCP client at the local CLI launcher while keeping provider credentials on the same machine that runs Clanker Cloud.
```json
{
  "mcpServers": {
    "clanker": {
      "command": "clanker",
      "args": ["mcp", "--transport", "stdio"]
    }
  }
}
```

| Query intent | Example query | Evidence used |
|---|---|---|
| Kubernetes troubleshooting | Why is checkout returning 502 in prod? | Ingress, services, endpoints, pods, events, rollout context. |
| AWS investigation | What changed around the cost spike this week? | Cost Explorer, resource metadata, regions, tags, recent deploy context. |
| Cloudflare and edge | Which public routes reach EKS and bypass auth? | DNS, Workers routes, WAF, tunnels, ingress, load balancers. |
| Review before apply | Draft the safest plan to add private storage for uploads. | Provider state, requested intent, tags, cost estimate, plan artifact. |
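The mcpServers registration shown earlier can be generated and sanity-checked with a short script before handing it to a client. A minimal sketch using only the standard library (the output filename is hypothetical):

```python
import json

def clanker_mcp_entry(transport: str = "stdio") -> dict:
    """Build the mcpServers entry that registers the local Clanker launcher."""
    return {
        "mcpServers": {
            "clanker": {
                "command": "clanker",
                "args": ["mcp", "--transport", transport],
            }
        }
    }

config = clanker_mcp_entry()

# Sanity checks before an MCP client reads the file.
server = config["mcpServers"]["clanker"]
assert server["command"] == "clanker"
assert server["args"][0] == "mcp"

# Write the registration where your MCP client expects it (path is illustrative).
with open("mcp_config.json", "w") as f:
    json.dump(config, f, indent=2)
```

Keeping the entry generated rather than hand-edited makes it easy to switch transports without typos.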
The agent can query live provider and cluster state instead of relying only on repo files.
The MCP client uses the local Clanker runtime and the access already trusted on that machine.
Investigation is easy to run. High-impact changes stay behind reviewed plans and explicit approval.
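The review-before-apply boundary can be sketched as a generic pattern. Nothing below is a Clanker API; it only illustrates keeping writes behind an explicit approval step while leaving reads free:

```python
from dataclasses import dataclass

@dataclass
class Plan:
    """A reviewed change plan: actions are visible before anything runs."""
    description: str
    actions: list[str]
    approved: bool = False

    def approve(self) -> None:
        # Explicit operator approval; an agent cannot flip this implicitly.
        self.approved = True

    def apply(self) -> list[str]:
        if not self.approved:
            raise PermissionError("plan has not been approved for apply")
        return [f"applied: {a}" for a in self.actions]

plan = Plan(
    description="Add private storage for uploads",
    actions=["create bucket (private ACL)", "attach least-privilege policy"],
)

try:
    plan.apply()  # reviewing the plan is free; applying without approval fails
except PermissionError:
    pass

plan.approve()    # the explicit human step
results = plan.apply()
```

The key property is that the plan object is inspectable before `approve()` is ever called, which is the same shape as the review-before-apply workflow described above.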
Yes. Clanker Cloud exposes local MCP access that can route questions to Kubernetes and cloud context through the app runtime and the open-source CLI launcher, using the kubeconfig and provider access already trusted on the operator machine.
An MCP-capable agent can launch or connect to Clanker MCP locally, then ask grounded infrastructure questions through the same local runtime that powers the Clanker Cloud app instead of guessing from source code alone.
No. MCP gives agents a local tool surface, but high-impact infrastructure writes still belong behind reviewed plans and explicit approval.
Start with the app-first Claude Code and Kubernetes example, then review the local credential boundary before giving agents broader context.
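Both transports above carry JSON-RPC 2.0 messages, as defined by the Model Context Protocol specification. A minimal sketch of the initialize request an agent client sends first; the protocol version string and client name are illustrative, not Clanker-specific:

```python
import json

def initialize_request(request_id: int = 1) -> str:
    """Serialize the first message an MCP client sends over any transport."""
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Version string is illustrative; use the one your client supports.
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-agent", "version": "0.1"},
        },
    }
    return json.dumps(payload)

message = initialize_request()
```

In broad terms, over stdio this JSON is written newline-delimited to the launcher's stdin, and over HTTP it is POSTed to the localhost /mcp endpoint.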