Quickstart

Get Knowledge Tree running with Docker Compose and run your first discovery in minutes.

Prerequisites

  • Docker and Docker Compose
  • 4 GB RAM minimum (8 GB recommended for LLM enrichment)
  • Cloud credentials for at least one provider (AWS, Azure, GCP) or a Kubernetes cluster

Step 1: Clone and Configure

git clone https://github.com/knowledge-tree/knowledge-tree.git
cd knowledge-tree
cp configs/dev.yaml configs/local.yaml

Edit configs/local.yaml to enable the discovery plugins for your environment. At minimum, enable one plugin (e.g., Kubernetes or DNS) to see results immediately.
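As a starting point, a minimal sketch of what enabling a plugin might look like. The exact key names (`discovery`, `plugins`, `enabled`) are assumptions based on common config layouts; check the comments in configs/dev.yaml for the real schema:

```yaml
# configs/local.yaml -- hypothetical key names, verify against configs/dev.yaml
discovery:
  plugins:
    dns:
      enabled: true
      zones:
        - example.com   # replace with a zone you control
```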

Step 2: Start with Docker Compose

cd deploy/docker
docker compose up -d

This starts PostgreSQL (with the AGE and pgvector extensions), the API server, the enricher service, and the web UI. The API is available at http://localhost:8080.

Step 3: Run Your First Discovery

# Using the CLI
./kt-discover run --config configs/local.yaml

# Or trigger via API
curl -X POST http://localhost:8080/api/v1/discovery/run \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY"

Step 4: Explore

  • Web UI — Open http://localhost:8080 for the React dashboard
  • Service catalog — GET /api/v1/services
  • Graph query — POST /api/v1/graph/query with a Cypher query
  • Cost intelligence — GET /api/v1/cost/intelligence
  • Compliance — GET /api/v1/compliance
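For example, a graph query can be sent with curl. The endpoint and the use of Cypher come from the list above; the `{"query": ...}` request-body shape and the query itself (node label `Service`, relationship `DEPENDS_ON`) are assumptions, so check the API reference for the exact field and schema names:

```shell
# Write the Cypher query to a file (body shape is an assumption)
cat > query.json <<'EOF'
{"query": "MATCH (s:Service)-[:DEPENDS_ON]->(d:Service) RETURN s.name, d.name LIMIT 10"}
EOF

# POST it to the graph query endpoint
curl -s -X POST http://localhost:8080/api/v1/graph/query \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d @query.json \
  || echo "request failed -- is the stack from Step 2 still running?"
```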

First Run with Kubernetes

If you have a Kubernetes cluster and a working kubeconfig, the Kubernetes plugin works out of the box with no extra credentials. Just enable it in your config and run discovery.
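A sketch of what that might look like, with the same caveat as above: the key names (`discovery.plugins.kubernetes`, `kubeconfig`) are assumptions to be verified against configs/dev.yaml:

```yaml
# configs/local.yaml -- hypothetical key names
discovery:
  plugins:
    kubernetes:
      enabled: true
      kubeconfig: ~/.kube/config   # likely optional if the default location is used
```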

Step 5: Configure Enrichment (Optional)

To enable AI-powered summaries and runbooks, configure an LLM provider in your config:

llm:
  provider: "ollama"
  base_url: "http://localhost:11434"
  model: "llama3:70b"
  max_tokens: 4096

Or use a cloud provider like OpenRouter, OpenAI, or Bedrock. The enricher will automatically generate descriptions, runbooks, and security analyses for discovered resources.
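For a cloud provider, the config presumably mirrors the Ollama example above. A sketch for OpenRouter, assuming the same `llm` keys plus an `api_key` field (the `api_key` name and env-var interpolation are assumptions; the base URL is OpenRouter's real OpenAI-compatible endpoint):

```yaml
llm:
  provider: "openrouter"
  base_url: "https://openrouter.ai/api/v1"
  api_key: "${OPENROUTER_API_KEY}"     # assumed: env-var interpolation; verify in config docs
  model: "anthropic/claude-3.5-sonnet" # any model ID from the OpenRouter catalog
  max_tokens: 4096
```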

Next Steps