Chat Interface

Ask natural-language questions about your infrastructure. Knowledge Tree queries the graph and returns grounded answers.

Overview

The chat interface lets you ask questions about your infrastructure in plain English. It queries the knowledge graph, assembles context packs, and sends them to a configured LLM for a grounded response.

Endpoints

# WebSocket (streaming responses)
ws://localhost:8080/api/v1/chat

# REST (single response)
POST /api/v1/chat/message
Content-Type: application/json
{"message": "What depends on the production database?"}
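The REST endpoint above can be called from any HTTP client. A minimal Python sketch using only the standard library (the URL and response shape are taken from this page; a running server on localhost:8080 is assumed):

```python
import json
import urllib.request

# REST endpoint from the Endpoints section above.
API_URL = "http://localhost:8080/api/v1/chat/message"

def build_chat_request(message: str) -> urllib.request.Request:
    """Build a POST request matching the chat message endpoint."""
    body = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(message: str) -> dict:
    """Send a question and return the parsed JSON response.

    Requires a running Knowledge Tree server; the response schema
    is not documented here, so the raw JSON is returned as-is.
    """
    with urllib.request.urlopen(build_chat_request(message)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

For streaming responses, connect to the WebSocket endpoint instead; the REST call returns a single complete answer.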

How It Works

  1. Your question is analyzed to identify relevant resources
  2. A context pack is assembled from the knowledge graph (resource details, relationships, documentation)
  3. The context pack and your question are sent to the configured LLM
  4. The LLM generates a response grounded in your actual infrastructure data
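The four steps above can be sketched as a toy pipeline. All names here (`find_relevant`, `assemble_context_pack`, the in-memory graph) are hypothetical illustrations, not the actual Knowledge Tree internals:

```python
# Toy in-memory knowledge graph: resource name -> details.
GRAPH = {
    "orders-db": {"type": "rds", "region": "us-east-1"},
    "orders-api": {"type": "service", "depends_on": ["orders-db"]},
}

def find_relevant(question: str) -> list[str]:
    # Step 1: identify relevant resources (here, a naive substring match
    # of resource names against the question text).
    return [name for name in GRAPH if name in question.lower()]

def assemble_context_pack(names: list[str]) -> dict:
    # Step 2: pull resource details and relationships from the graph.
    return {name: GRAPH[name] for name in names}

def build_prompt(question: str, pack: dict) -> str:
    # Step 3: combine the context pack and question for the LLM.
    return f"Context: {pack}\n\nQuestion: {question}"

# Step 4 (sending the prompt to the configured LLM) is omitted here;
# `prompt` is what would be sent.
question = "Which services depend on orders-db?"
prompt = build_prompt(question, assemble_context_pack(find_relevant(question)))
```

Because the LLM only sees the assembled context pack, answers stay grounded in what the graph actually contains.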

Example Questions

  • "What resources are in us-east-1?"
  • "Which services depend on the orders database?"
  • "Show me all unencrypted EBS volumes"
  • "What changed in the last discovery run?"
  • "Give me a runbook for the API gateway service"
  • "What is our estimated monthly AWS spend?"

Configuration

The chat uses the same LLM configuration as the enrichment service. Configure it in your config file:

llm:
  provider: "openrouter"
  base_url: "https://openrouter.ai/api/v1"
  api_key: "${OPENROUTER_API_KEY}"
  model: "moonshotai/kimi-k2.5"
  max_tokens: 4096
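The `${OPENROUTER_API_KEY}` value is an environment-variable placeholder, which keeps the secret out of the config file. A minimal sketch of how such a placeholder could be resolved at load time (this assumes `${VAR}`-style expansion against the process environment; the actual Knowledge Tree loader may differ):

```python
import os

# Hypothetical: force a known value so the expansion is visible.
os.environ["OPENROUTER_API_KEY"] = "sk-demo"

# One line of the config as raw text; os.path.expandvars substitutes
# ${OPENROUTER_API_KEY} with the environment variable's value.
raw = 'api_key: "${OPENROUTER_API_KEY}"'
expanded = os.path.expandvars(raw)
```

Export the real key in the shell that starts the service (e.g. `export OPENROUTER_API_KEY=...`) rather than committing it to the config file.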