LLM providers
Use any OpenAI-compatible API, local Ollama models, or air-gapped providers.
Overview
Coming soon
This section is a skeleton.
Detailed documentation, runnable examples, and screenshots will land here as the feature stabilizes. For the most accurate current behaviour, check the source repository or reach out to the team.
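Until the full documentation lands, here is a minimal sketch of what "any OpenAI-compatible API" means in practice, using only the Python standard library. It assumes Ollama's default local endpoint (`http://localhost:11434/v1`) and a hypothetical model name (`llama3`); substitute the base URL of any other OpenAI-compatible or air-gapped provider.

```python
import json
import urllib.request

# Assumed default: Ollama exposes an OpenAI-compatible API on port 11434.
# Any other OpenAI-compatible provider works by swapping this URL.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def ask(prompt: str, model: str = "llama3") -> str:
    """Send one chat-completion request to a local, OpenAI-compatible server."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return choices[0].message.content
    return body["choices"][0]["message"]["content"]

# Example (requires a running Ollama instance with the model pulled):
# print(ask("Summarize this repository in one sentence."))
```

Because the request shape follows the OpenAI chat-completions format, the same function works against hosted providers by changing the URL and adding an `Authorization: Bearer <key>` header.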