When using OpenAI as the LLM provider (the default), the API key needs the following capabilities, matching what the original TS port required.
- Model: write/respond, used for `chat.completions.parse` (structured output via Pydantic).
- Create a Project API key restricted to a single project.
- Set the role to All for the model API.
- Cap the project's monthly spend in Settings → Billing.
- Set `LLM_API_KEY` on the `sf-pulse-python-env` env group.
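For local development outside the env group, the same variable can be exported in the shell. The key value below is a placeholder, not a real key:

```shell
# Placeholder value: substitute your real project-scoped OpenAI key.
export LLM_API_KEY="sk-proj-..."
```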
To use Anthropic instead, set `LLM_API_KEY` to a key that begins with `sk-ant-`. The factory auto-detects the provider from the prefix; no other config is needed.
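The prefix check can be sketched roughly as follows; the function name and return values here are illustrative, not the project's actual factory API:

```python
def detect_provider(api_key: str) -> str:
    """Guess the LLM provider from the API key prefix (illustrative sketch).

    Anthropic keys begin with "sk-ant-", so that prefix must be checked
    before the more general OpenAI "sk-" prefix.
    """
    if api_key.startswith("sk-ant-"):
        return "anthropic"
    if api_key.startswith("sk-"):
        return "openai"
    raise ValueError("Unrecognized LLM_API_KEY prefix")
```

Because `sk-ant-` also matches `sk-`, the Anthropic check has to come first; reversing the two branches would silently route Anthropic keys to OpenAI.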
The Anthropic provider uses tool use to coerce structured output. Models that support tool use (`claude-3-5-sonnet`, `claude-3-5-haiku`, `claude-haiku-4-5`, etc.) work out of the box. Set `LLM_MODEL` to override the default.
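The coercion works by declaring a single tool whose input schema is the desired output shape and forcing the model to call it, so the answer arrives as structured tool input rather than free text. A minimal sketch of the request payload, with a hypothetical helper and tool name rather than the project's actual code:

```python
def build_structured_request(model: str, prompt: str, schema: dict) -> dict:
    """Build Anthropic Messages API kwargs that force structured output.

    Illustrative sketch: the model's answer comes back as the "input" of
    the forced tool call, which the caller validates against the schema.
    """
    return {
        "model": model,
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
        "tools": [{
            "name": "record_result",  # hypothetical tool name
            "description": "Return the structured answer.",
            "input_schema": schema,   # JSON Schema for the desired output
        }],
        # Forcing this tool guarantees a structured (non-prose) reply.
        "tool_choice": {"type": "tool", "name": "record_result"},
    }


req = build_structured_request(
    "claude-3-5-haiku-latest",
    "Summarize the latest pulse.",
    {"type": "object", "properties": {"summary": {"type": "string"}}},
)
```

These kwargs would then be passed to the Anthropic SDK's messages-create call; the response's tool-use block carries the structured result.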