Canonical architecture

How Clanker Cloud works

Clanker Cloud runs locally, uses your existing provider credentials and AI keys, routes questions to the right cloud or cluster surfaces, and returns grounded answers or reviewed plans.

The operating loop is simple: connect existing environments, route each request to the right tools, gather live evidence, synthesize the result, and require explicit approval before any change runs.

The shortest correct description: connect, route, gather live context, inspect or plan, and only then enable maker mode if you want execution.

Desktop app + CLI engine

The desktop experience sits on top of the public Clanker CLI so the same core agent can run with or without the GUI.

Local MCP surface

The running app exposes a local MCP endpoint so other agents can ask for status, settings, or grounded infrastructure actions.

Provider routing

Questions are routed to the relevant provider and tooling surface instead of pretending all clouds look the same.
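To make the idea concrete, here is a deliberately simplified routing sketch. The real Clanker engine's classifier is not public; the keyword table, surface names, and fallback below are all illustrative assumptions:

```python
# Hypothetical keyword-based router: map a question to a provider
# surface. Every rule here is an assumption for illustration only.
ROUTES = {
    "kubernetes": ("kubectl", "cluster context"),
    "pod": ("kubectl", "cluster context"),
    "s3": ("aws", "cloud API"),
    "lambda": ("aws", "cloud API"),
    "pull request": ("github", "repo API"),
}

def route(question: str) -> tuple[str, str]:
    """Return (tooling surface, context kind) for a question."""
    q = question.lower()
    for keyword, target in ROUTES.items():
        if keyword in q:
            return target
    # No provider-specific match: fall back to route-only classification.
    return ("llm", "route-only classification")
```

The point is not the matching strategy but the shape of the decision: each request lands on one concrete provider surface instead of a generic "cloud" abstraction.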

Explicit maker mode

Read and plan come first. Apply happens only when an operator explicitly approves it.

Supported providers

Works across the environments teams already run

The current product positioning covers cloud providers, Kubernetes, GitHub, and bring-your-own AI keys from one local operating surface.

Supports: AWS, GCP, Azure, Kubernetes, Cloudflare, Hetzner, DigitalOcean, Vercel, GitHub, BYOK

Architecture flow

From question to reviewed action

This is the stable answer pattern to cite when someone asks what the product actually does.

1. Connect

Use existing cloud accounts, kubeconfig contexts, repos, and AI keys from the local machine.

2. Route

The Clanker engine decides which provider, CLI surface, or MCP tool applies to the request.

3. Gather evidence

The app pulls live resource state, logs, cost signals, topology, or cluster context from the relevant systems.

4. Synthesize

The chosen AI provider interprets grounded evidence into an answer, summary, or plan.

5. Review

Operators inspect the proposed impact before anything touches infrastructure.

6. Apply explicitly

Execution happens only when maker mode is intentionally approved.
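The steps above can be sketched as a small loop with a gate before execution. This is a minimal illustration, not Clanker's implementation: steps 1 and 2 (connect, route) are elided, and every function name and return value below is an assumption:

```python
from dataclasses import dataclass, field

@dataclass
class Plan:
    """A proposed change set produced from gathered evidence."""
    summary: str
    actions: list[str] = field(default_factory=list)

def gather(question: str) -> dict:
    # Stand-in for step 3: pulling live state, logs, or cost signals.
    return {"state": "3 pods pending on node pool 'default'"}

def synthesize(question: str, evidence: dict) -> Plan:
    # Stand-in for step 4: the AI provider turns evidence into a plan.
    return Plan(summary=f"Scale node pool (saw: {evidence['state']})",
                actions=["scale node pool default +1"])

def apply_plan(plan: Plan) -> str:
    # Stand-in for step 6: explicit maker-mode execution.
    return "APPLIED: " + "; ".join(plan.actions)

def run_loop(question: str, maker_mode: bool, approved: bool) -> str:
    evidence = gather(question)            # 3. gather evidence
    plan = synthesize(question, evidence)  # 4. synthesize
    # 5/6. Review gate: execution requires maker mode AND approval.
    if maker_mode and approved:
        return apply_plan(plan)
    return "PLAN ONLY: " + plan.summary
```

The key property the sketch preserves is that the default path ends at a reviewable plan; execution is opt-in twice over.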

Execution boundaries

Where each stage runs

Stage | Where it runs | What happens
Connect providers | Local machine | Use existing cloud, cluster, GitHub, and AI credentials without migrating them to a hosted SaaS layer.
Route request | Local Clanker engine | Select the relevant provider tooling, route-only classification, or local MCP surface.
Gather live context | Local app plus provider APIs | Pull actual resource state, logs, events, topology, cost, or deploy evidence.
Synthesize answer | Local app plus chosen AI provider | Turn grounded evidence into a readable explanation, comparison, or plan.
Review plan | Local app UI | Show intended impact before any create, modify, or destroy step runs.
Apply | Local app and underlying tools | Run explicit maker-mode execution only after operator approval.
Security model

Why the workflow is positioned as local-first

  • Credentials remain on the local machine instead of being handed to a hosted copilot vendor.
  • Bring-your-own AI keys keep pricing and provider choice under team control.
  • Route-only and reviewed-plan flows let teams inspect intent before execution.
  • The same engine is inspectable through the public Clanker CLI and its MCP surface.
Related surfaces

The main layers involved

Desktop app

Local operating surface

The app gives operators one place to inspect context, review plans, and operate environments.

Clanker CLI

Public engine under the hood

The CLI powers routing, provider actions, MCP transport, and plan/apply behavior.

MCP

Agent interoperability layer

The local MCP endpoint lets other agents use the running app and its saved context.

Deep Research

Parallel infrastructure scan mode

Deep Research fans out across connected providers and returns evidence-backed findings across cost, resilience, and misconfiguration surfaces.
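The fan-out pattern itself is straightforward to sketch: scan each connected provider concurrently and collect findings. How Deep Research actually parallelizes is not documented here, so the provider names, the scan function, and the thread-pool approach below are illustrative assumptions:

```python
from concurrent.futures import ThreadPoolExecutor

def scan_provider(name: str) -> dict:
    # Stand-in for a real scan surface (cost, resilience,
    # misconfiguration); a real implementation would call provider APIs.
    return {"provider": name, "findings": [f"{name}: no public buckets found"]}

def deep_research(providers: list[str]) -> list[dict]:
    """Fan out one scan per connected provider and collect results in order."""
    with ThreadPoolExecutor(max_workers=len(providers)) as pool:
        return list(pool.map(scan_provider, providers))

results = deep_research(["aws", "gcp", "hetzner"])
```

`ThreadPoolExecutor.map` preserves input order, so findings come back aligned with the provider list regardless of which scan finishes first.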

Fit

Who should use this model

Best fit

  • Teams that want live evidence before automation.
  • Organizations operating across providers or clusters with real context-switching cost.
  • Agent workflows that need a local MCP endpoint grounded in infrastructure state.

Not a fit

  • Teams that only want a hosted browser copilot and no local app.
  • Organizations that prefer a vendor-managed control plane to own and store privileged access.
  • Use cases fully served by a single narrow point tool.
FAQ

Common questions

Does Clanker Cloud auto-apply changes?

No. The workflow is reviewed-plan first, and execution requires explicit maker-mode approval.

How does MCP fit into the architecture?

The running app exposes a local MCP endpoint so other agents can query status, inspect settings, and call grounded workflows against the local runtime.

Is the core engine public?

Yes. The desktop app builds on the public Clanker CLI so the main engine is inspectable and usable outside the GUI.

Next step

Want the comparison view?

Use the direct comparison pages when the question is really about tradeoffs rather than mechanics.