Architecture
How Knowledge Tree's components fit together: discovery plugins, the knowledge graph, the enrichment pipeline, and the API layer.
High-Level Overview
Knowledge Tree follows a pipeline architecture with four main stages: discovery, storage, enrichment, and serving. Each stage is decoupled and can scale independently.
Discovery Plugins ──▶ Knowledge Graph ──▶ Enrichment ──▶ REST API + MCP
    (Go/gRPC)         (PostgreSQL +       (Python +        + Web UI
                       AGE + pgvector)      LLMs)

Discovery Layer
Discovery plugins run as separate processes, communicating with the main server over gRPC using HashiCorp's go-plugin system. Each plugin implements the DiscoveryPlugin interface and emits resources and relationships as they are discovered.
- Plugin Manager — Loads, starts, and monitors plugin processes. Handles lifecycle (start, stop, health checks).
- Orchestrator — Schedules discovery runs per scope using robfig/cron. Manages concurrency limits and error handling.
- Pipeline — Receives events from plugins, applies redaction, and writes to storage in batched transactions.
Storage Layer
All data lives in PostgreSQL with two extensions:
- Apache AGE — Adds property graph support. Resources are labeled nodes with typed, directed edges. Supports Cypher query syntax.
- pgvector — Stores vector embeddings that power the semantic similarity search used by AI enrichment and the MCP server.
The storage layer exposes three interfaces:
| Interface | Purpose | Key Operations |
|---|---|---|
| GraphStore | Property graph (nodes + edges) | AddNode, QueryNodes, Traverse, SearchNodes |
| MetadataStore | Relational metadata | Scopes, Runs, Config, Audit entries |
| VectorStore | Embedding search | StoreEmbedding, SearchEmbeddings |
Enrichment Layer
The Python enricher service (port 8001) connects to local or cloud LLMs to generate AI-powered content. It supports Ollama, OpenAI-compatible APIs, and AWS Bedrock with temperature tuning per enrichment type.
| Type | Temperature | Output |
|---|---|---|
| Service Description | 0.2 | Structured resource catalog entry |
| Security Analysis | 0.1 | JSON with CIS benchmark findings |
| Executive Summary | 0.4 | Business-oriented infrastructure overview |
| Runbook | 0.2 | Markdown with CLI commands and thresholds |
| Architecture | 0.3 | Pattern analysis with Mermaid diagrams |
| Anomaly Detection | 0.1 | JSON diff report with severity ratings |
API Layer
The Go API server exposes a REST API (port 8080) built on chi with middleware for auth, CORS, rate limiting, and scope isolation.
Plugin System
Plugins communicate via gRPC over a Unix socket using HashiCorp's go-plugin framework. The protocol is defined in Protocol Buffers. Plugins run as separate OS processes, isolated from the main server.
The SDK (sdk/ package) provides a public Go API for writing custom discovery plugins.