Canonical definition

What is Clanker Cloud?

Clanker Cloud is a local-first desktop app for infrastructure operations. It lets teams ask questions about live environments, inspect topology and cost signals, review change plans, and then approve execution from one workspace.

It is designed for builders shipping AI-built apps, lean DevOps and platform teams, AI researchers, and agent-driven workflows that need live infrastructure context without handing privileged credentials to a hosted copilot layer.

Clanker Cloud is an AI workspace for infrastructure: a local-first desktop app that gathers live context first, generates reviewed plans second, and only acts when an operator approves.

Local-first by default

Credentials stay on the operator's machine, and the app uses the same local cloud and cluster access patterns teams already run.

Provider-agnostic

Works across AWS, GCP, Azure, Kubernetes, Cloudflare, Hetzner, DigitalOcean, GitHub, and bring-your-own AI keys.

Open-source engine

Built on the public Clanker CLI so teams can inspect the underlying engine and use the same core agent from the terminal.

Built for action with guardrails

The workflow is read-first, plan-second, and explicit-approval-only before infrastructure changes happen.

Supported providers

Works across the environments teams already run

Current support covers cloud providers, Kubernetes, GitHub, and bring-your-own AI keys from one local operating surface.

Supports: AWS, GCP, Azure, Kubernetes, Cloudflare, Hetzner, DigitalOcean, Vercel, GitHub, and BYOK (bring-your-own AI keys).

Who it is for

Three core ICPs

The best fit is a team that already has real infrastructure and wants faster answers without another hosted trust boundary.

Builders

Founders and full-stack teams shipping AI-built apps

Move from repo to production plan, keep credentials local, and keep logs, topology, and cost context attached after launch.

DevOps

Lean infra, platform, and SRE teams

Investigate incidents, correlate provider state, review changes, and keep multi-cloud context in one operating surface.

Agents

AI researchers and agent-driven workflows

Expose a local MCP surface, use bring-your-own keys, and ground agent output in live environment evidence.

What it includes

Core workflow surfaces

Ask

Plain-English infrastructure investigation

Ask what changed, what is failing, what talks to what, or where the cost anomaly came from.

Inspect

Topology and live environment context

Ground answers in real resource state, dependencies, cluster signals, and provider evidence.

Plan

Reviewed plans before execution

See the intended impact before anything is created, modified, or destroyed.

Extend

MCP and custom-agent interoperability

Use the local MCP endpoint or the CLI to connect Clanker Cloud into agent workflows and internal tooling.
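As one illustration, an MCP-capable client could register the local endpoint with a standard MCP server entry. The server name, command, and arguments below are hypothetical placeholders to show the shape of the wiring, not documented Clanker flags:

```json
{
  "mcpServers": {
    "clanker-local": {
      "command": "clanker",
      "args": ["mcp", "serve"]
    }
  }
}
```

Because the endpoint runs locally, the agent inherits the operator's existing credentials and access boundaries rather than receiving its own.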

Security model

What stays under operator control

These guarantees follow directly from the architecture rather than from policy copy alone.

  • Cloud credentials and cluster contexts stay on the machine running the app.
  • AI provider calls use bring-your-own keys instead of a reseller token layer.
  • The engine gathers live evidence before generating a plan or recommendation.
  • Changes require explicit operator approval instead of silent background apply behavior.
  • The same core engine is available through the public Clanker CLI.
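The read-first, plan-second, approve-only loop above can be sketched as a minimal gate. All names here are hypothetical stand-ins for illustration, not the actual Clanker engine API:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a read -> plan -> approve gate.
# None of these names come from the real Clanker engine.

@dataclass
class Plan:
    summary: str
    actions: list[str] = field(default_factory=list)
    approved: bool = False

def gather_evidence(query: str) -> dict:
    """Read-only step: collect live state before anything is proposed."""
    return {"query": query, "resources": ["web-1", "web-2"]}  # stand-in data

def generate_plan(evidence: dict) -> Plan:
    """Plan step: propose changes, but do not execute them."""
    return Plan(
        summary=f"Restart {len(evidence['resources'])} instances",
        actions=[f"restart {r}" for r in evidence["resources"]],
    )

def apply(plan: Plan) -> list[str]:
    """Execute step: refuses to run without explicit approval."""
    if not plan.approved:
        raise PermissionError("plan requires explicit operator approval")
    return [f"applied: {a}" for a in plan.actions]

evidence = gather_evidence("why is web latency up?")
plan = generate_plan(evidence)
try:
    apply(plan)  # blocked: no approval has been given yet
except PermissionError:
    print("blocked without approval")
plan.approved = True  # the operator reviews the plan and explicitly approves
print(apply(plan))
```

The point of the structure is that execution is unreachable without a reviewed plan carrying an explicit approval flag, which mirrors the review-before-apply guarantee described above.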
Best fit

When it fits and when it does not

Best fit

  • Teams operating across multiple providers, clusters, repos, or runtime surfaces.
  • Builders who want deploy planning and post-deploy context in one workflow.
  • Organizations that care about credential custody, BYOK, and review-before-apply control.

Not a fit

  • Teams that only want a hosted chat assistant with no local runtime or provider setup.
  • Organizations looking for a fully managed SaaS control plane to own all credentials for them.
  • Single-purpose use cases where a narrow point tool already solves the whole workflow.
FAQ

Common questions

Does Clanker Cloud replace cloud consoles entirely?

No. Cloud consoles remain the source of truth. Clanker Cloud acts as the faster operating layer for investigation, planning, topology, and cross-provider context.

Which providers does Clanker Cloud support?

Current support covers AWS, GCP, Azure, Kubernetes, Cloudflare, Hetzner, DigitalOcean, GitHub, and bring-your-own AI provider keys.

Why does local-first matter here?

Because the architecture keeps credentials and operator control on the machine running the app instead of moving privileged access into a hosted copilot layer.

Next step

Need the workflow view?

Read the stable architecture explainer or compare Clanker Cloud against the hosted-copilot pattern.