Quickstart · From Clone to a Running Agent

Setup Heimdall
From Git Clone To Verified Traffic

This guide follows the upstream repository quickstart for the transparent local-agent flow, from cloning the repo to running the local agent and verifying placeholder-based secret injection.

1. Controlled Host

Proxy Server

Run proxy-server on a machine your developer workstations can reach. This is where secrets stay, where placeholders are resolved, and where the tunnel listener runs.

2. Developer Workstation

Local Agent

Run heimdall-local-agent on the developer machine. In transparent mode it intercepts HTTPS traffic locally, then forwards approved requests through the authenticated tunnel.

3. Optional Control Plane

Admin Panel

Enable the built-in panel if you want runtime management for clients, stored secrets, AWS-backed secrets, and audit logs. It lives under /panel/.

The Recommended Transparent Flow

Use this path when you want the developer machine to work without setting HTTPS_PROXY per app. If you are validating on a VPS, CI runner, or a root-owned workload, the upstream docs recommend starting with the explicit proxy guide instead.
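If you do go the explicit route instead, the difference can be sketched as follows: each process is pointed at the proxy through the standard proxy environment variables rather than having traffic intercepted transparently. The host and port here are illustrative, taken from the sample server config, not values the docs mandate.

```shell
# Explicit-proxy alternative (illustrative): point one shell's processes at
# the proxy via env vars instead of installing transparent interception.
# Host and port below match the example config, not a required value.
export HTTPS_PROXY="http://proxy.example.com:8080"
export HTTP_PROXY="http://proxy.example.com:8080"
echo "Proxy for this shell: $HTTPS_PROXY"
```

Apps that honor these variables (curl, most language SDKs) will then send their requests through the proxy without any transparent install on the machine.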

0
Clone The Repo

Get the source locally

Heimdall is open source, and setup starts with cloning the public GitHub repository.

git clone https://github.com/BenTimor/Heimdall.git
cd Heimdall
1
Prerequisites

Make sure you have the required tools and access

  • Node.js 20+
  • pnpm
  • Rust 1.75+, only if you plan to build the local agent from source
  • A real secret to inject, such as OPENAI_API_KEY
  • Elevated privileges for the install step: an elevated shell on Windows or sudo on Linux
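A quick preflight can catch a too-old toolchain before anything else fails. This is a sketch assuming POSIX sh; the version strings are stand-ins, so substitute the real output of `node --version` and `rustc --version` on your machine.

```shell
# Preflight sketch: compare a tool's major version against a minimum.
# The version strings below are placeholders, not detected values.
check_major() {   # usage: check_major <name> <version-string> <min-major>
  v="${2#v}"                 # strip a leading "v" if present (node style)
  major="${v%%.*}"           # keep only the major component
  if [ "$major" -ge "$3" ]; then
    echo "$1 OK ($2)"
  else
    echo "$1 too old ($2, need $3+)"
  fi
}
check_major "Node.js" "v20.11.1" 20
check_major "Rust"    "1.75.0"   1    # only needed when building from source
```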
2
Proxy Server

Prepare the proxy and tunnel listener

On the host that will keep the real secrets, install dependencies, generate certificates, and create the server config from the example file.

cd proxy-server
pnpm install
pnpm run generate-ca
pnpm run generate-tunnel-cert proxy.example.com
cp config/server-config.example.yaml config/server-config.yaml

Then update config/server-config.yaml so it has a reachable public host, at least one authenticated client, the secrets you want to resolve, and the tunnel listener enabled.

proxy:
  host: "0.0.0.0"
  port: 8080
  publicHost: "proxy.example.com"

ca:
  certFile: "certs/ca.crt"
  keyFile: "certs/ca.key"

secrets:
  OPENAI_API_KEY:
    provider: "env"
    path: "OPENAI_API_KEY"
    allowedDomains: ["api.openai.com"]

auth:
  enabled: true
  clients:
    - machineId: "dev-machine-01"
      token: "some-secure-token-here"

logging:
  level: "info"
  audit:
    enabled: true

tunnel:
  enabled: true
  host: "0.0.0.0"
  port: 8443
  tls:
    certFile: "certs/tunnel.crt"
    keyFile: "certs/tunnel.key"
  heartbeatIntervalMs: 30000
  heartbeatTimeoutMs: 90000

Start the proxy after exporting the real secret value on that host:

export OPENAI_API_KEY="sk-your-real-key"
pnpm run dev
3
Optional Panel

Enable the admin panel and connect to it safely

The panel is optional, but it is the easiest way to manage clients, stored secrets, AWS-backed secrets, and audit logs at runtime. To enable it, add or uncomment the panel section in proxy-server/config/server-config.yaml.

panel:
  enabled: true
  port: 9090
  host: "127.0.0.1"
  dbPath: "data/heimdall.db"
  defaultAdminPassword: "change-me-immediately"
  sessionTtlHours: 24
  encryptionKeyFile: "data/encryption.key"

Same machine access

Keep panel.host on 127.0.0.1, start the server, then open http://127.0.0.1:9090/panel/ in your browser.

Remote access

The upstream deployment guide recommends SSH port-forwarding instead of exposing the panel broadly.

ssh -L 9090:127.0.0.1:9090 your-user@proxy.example.com

After the tunnel is open, browse to http://127.0.0.1:9090/panel/ locally.

  • Default username: admin
  • Default password: whatever you set in panel.defaultAdminPassword
  • Change that password immediately after the first login
4
Trust Material

Copy the Heimdall CA certificate to the developer machine

Copy proxy-server/certs/ca.crt from the server to the developer machine as something like heimdall-ca.crt.

  • It lets the local agent trust the tunnel server certificate
  • It is also installed into the workstation trust store so apps trust Heimdall's MITM certificates
5
Local Agent

Download or build heimdall-local-agent

Use a release archive if one is published for your team, or build the local agent from source.

cd local-agent
cargo build --release
cp config/agent-config.example.yaml config/agent-config.yaml

Then edit config/agent-config.yaml so it points at your tunnel host and uses credentials that match an entry under auth.clients on the proxy.

server:
  host: "proxy.example.com"
  port: 8443
  ca_cert: "/path/to/heimdall-ca.crt"

auth:
  machine_id: "dev-machine-01"
  token: "some-secure-token-here"

transparent:
  enabled: true
  host: "0.0.0.0"
  port: 19443
  method: "auto"
6
Transparent Mode Install

Install Heimdall on the developer machine

Run the install step with elevated privileges so Heimdall can install the CA certificate and enable transparent interception.

heimdall-local-agent install \
  --config ./agent-config.yaml \
  --ca-cert /path/to/heimdall-ca.crt

If you built from source instead of using a packaged binary, use:

target/release/heimdall-local-agent install \
  --config config/agent-config.yaml \
  --ca-cert /path/to/heimdall-ca.crt
  • This installs the Heimdall CA certificate into the workstation trust store
  • This enables transparent interception
  • This saves enough state for uninstall to reverse the changes later
Linux note: transparent mode redirects outbound IPv4 and IPv6 traffic with iptables and ip6tables. Root-owned client processes are excluded to avoid tunnel loops.
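The Linux note above can be pictured with the rough shape of the redirect rule. This is illustrative only: the agent installs and removes the real rules itself, so do not add anything like this by hand. The transparent port (19443) comes from the sample agent config.

```shell
# Illustrative only -- the agent manages the real iptables/ip6tables rules.
# Roughly: redirect outbound HTTPS to the agent's transparent port while
# excluding root-owned (uid 0) processes, which is how the tunnel's own
# traffic avoids being looped back into the agent.
rule='-t nat -A OUTPUT -p tcp --dport 443 -m owner ! --uid-owner 0 -j REDIRECT --to-ports 19443'
echo "iptables  $rule"
echo "ip6tables $rule"
```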
7
Run And Verify

Start the local agent and confirm placeholder injection works

heimdall-local-agent run --config ./agent-config.yaml

If you built from source instead of using a packaged binary, use:

target/release/heimdall-local-agent run --config config/agent-config.yaml

With the transparent install in place, apps should work without setting HTTPS_PROXY per process. Verify with the placeholder token shown in the repo docs:

curl -4 https://api.openai.com/v1/models \
  -H "Authorization: Bearer __OPENAI_API_KEY__"

curl -6 https://api.openai.com/v1/models \
  -H "Authorization: Bearer __OPENAI_API_KEY__"
  • The application only sees the placeholder
  • The real secret is resolved and injected by the proxy server
  • On Linux, run the verification from a non-root shell

Helpful extra commands from the local-agent docs:

heimdall-local-agent test --config config/agent-config.yaml
heimdall-local-agent status
8
Connect Your App

Send placeholders instead of live secrets

The current repo examples use placeholders such as __OPENAI_API_KEY__. Your app or agent should send that placeholder value, and the proxy replaces it only when the destination is allowed by allowedDomains.

Authorization: Bearer __OPENAI_API_KEY__

That means your AI agent never gets the real secret in its prompt, memory, or logs.
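One common pattern (an assumption on our part, not something the docs prescribe) is to hand the app the placeholder through the same environment variable it already reads, so no application code changes at all:

```shell
# Hypothetical pattern: the app's SDK reads OPENAI_API_KEY as usual, but the
# value it sees is only the placeholder. The proxy swaps in the real key,
# and only for destinations permitted by allowedDomains.
export OPENAI_API_KEY="__OPENAI_API_KEY__"
echo "App will send: Authorization: Bearer $OPENAI_API_KEY"
```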

Where To Go After First Setup

Once the transparent flow is working, these upstream docs cover the deeper operator paths.