How it works

```shell
complior fix --ai
```
When `--ai` is used, the document generation pipeline adds one extra step: an LLM fills the `[TO BE COMPLETED]` and `<!-- GUIDANCE -->` markers, using passport data as context.
| Without `--ai` | With `--ai` |
| --- | --- |
| Documents scaffolded with placeholders | LLM fills placeholders (90–95% complete) |
| Compliance flags already set | Same flags, richer content |
| 25–80% pre-filled from passport | 90–95% filled |
| Instant | +10–30 seconds per document |
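The enrichment step can be pictured as a simple transform over the scaffolded markdown. This is a minimal sketch, not the real pipeline: the function name, the `Passport` shape, and the replacement text are all illustrative (in the actual tool an LLM generates the text from passport context).

```typescript
// Illustrative passport shape: key/value metadata about the AI system.
type Passport = Record<string, string>;

// Hypothetical sketch of the --ai step: replace [TO BE COMPLETED] markers
// with passport-derived text. The real pipeline calls an LLM here.
function fillPlaceholders(doc: string, passport: Passport): string {
  return doc.replace(
    /\[TO BE COMPLETED\]/g,
    () => `(generated from: ${Object.keys(passport).join(", ")})`
  );
}

const draft = "Purpose: [TO BE COMPLETED]\n<!-- GUIDANCE: describe scope -->";
console.log(fillPlaceholders(draft, { system_name: "demo", risk_class: "high" }));
```

Note that the `<!-- GUIDANCE -->` comment survives this sketch untouched; in the real step both marker kinds are filled.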

Safety

The LLM call is routed through the `@complior/sdk` proxy with the prohibited, sanitize, and rate-limit hooks active, so the LLM cannot bypass compliance checks.
The LLM does NOT modify the passport JSON; it only enriches the document markdown files. Compliance flags (`fria_completed`, `policy_generated`) are set identically with or without `--ai`.
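The hook-wrapped call can be sketched as a pipeline of prompt transforms in front of the model. This is a hypothetical illustration, not the actual `@complior/sdk` API: the `Hook` type, `makeProxy`, and the sanitize rule are assumptions made for the example.

```typescript
// A hook transforms (or rejects) the prompt before the LLM sees it.
type Hook = (prompt: string) => string;

// Hypothetical proxy: run every hook in order, then call the model.
function makeProxy(callLLM: (p: string) => string, hooks: Hook[]) {
  return (prompt: string): string => {
    const safe = hooks.reduce((p, h) => h(p), prompt);
    return callLLM(safe);
  };
}

// Example sanitize hook: redact anything that looks like an API key.
const sanitize: Hook = (p) => p.replace(/sk-[A-Za-z0-9]+/g, "[REDACTED]");

const proxy = makeProxy((p) => `LLM says: ${p}`, [sanitize]);
console.log(proxy("key sk-abc123 in prompt"));
```

Because the compliance flags are written outside this call path, nothing the model returns can change them.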

Fallback

If the LLM call fails (network error, rate limit, invalid key), the fixer returns the document from the scaffold stage: placeholders remain, but all compliance flags stay intact. No data loss, no silent failure.

BYOK configuration

Configure your LLM provider in `.complior/config.toml`:

```toml
[llm]
provider = "openai"             # openai, anthropic, openrouter, ollama
model = "gpt-4o"
api_key_env = "OPENAI_API_KEY"  # reads from environment variable
```