
feat: add xAI Grok support via OpenAI Responses format#73

Closed
Shriiii01 wants to merge 10 commits into ekailabs:main from Shriiii01:feature/xai-responses-support

Conversation

@Shriiii01
Contributor

Fixes #19. Enables Codex to use Grok models through /v1/responses.

Shriiii01 and others added 5 commits February 7, 2026 16:54
- Add OllamaProvider class with OpenAI-compatible API support
- Register Ollama in ProviderRegistry with model selection rules
- Add Ollama configuration to AppConfig (baseUrl, apiKey, enabled)
- Add Ollama to chat_completions_providers_v1.json catalog with 16 popular models
- Add ollama.yaml pricing file (free/local models)
- Update ProviderName type to include 'ollama'
- Add OLLAMA_BASE_URL and OLLAMA_API_KEY to .env.example

Ollama runs models locally and exposes an OpenAI-compatible API at
http://localhost:11434/v1 by default. Users can configure a custom
base URL via the OLLAMA_BASE_URL environment variable.
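As a rough illustration of the configuration described above, the sketch below resolves the Ollama settings (baseUrl, apiKey, enabled) from environment variables, falling back to the default local endpoint. The function and interface names, and the `OLLAMA_ENABLED` variable, are illustrative assumptions, not code from this PR.

```typescript
// Hypothetical sketch: resolving Ollama connection settings from the
// environment, mirroring the AppConfig fields named in the commit.
interface OllamaConfig {
  baseUrl: string;
  apiKey: string;
  enabled: boolean;
}

function resolveOllamaConfig(env: Record<string, string | undefined>): OllamaConfig {
  return {
    // Ollama exposes an OpenAI-compatible API here by default.
    baseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434/v1",
    // A local Ollama server ignores the key, but OpenAI-style
    // clients generally require a non-empty value.
    apiKey: env.OLLAMA_API_KEY ?? "ollama",
    // OLLAMA_ENABLED is an assumed toggle for this sketch; enabled unless
    // explicitly set to "false".
    enabled: (env.OLLAMA_ENABLED ?? "true").toLowerCase() !== "false",
  };
}
```

For example, `resolveOllamaConfig(process.env)` with no variables set would point the provider at `http://localhost:11434/v1`.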
- Added Ollama to responses_providers_v1.json catalog
- Created OllamaResponsesPassthrough class implementing Responses API
- Registered Ollama in responses-passthrough-registry.ts

Ollama supports the OpenResponses API specification at the /v1/responses endpoint,
providing future-proof support as /chat/completions may be deprecated.
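Since Ollama speaks the Responses format natively, a passthrough provider only needs to re-target the request rather than transform the payload. The minimal sketch below shows that idea; the class name matches the commit, but the method, its signature, and the `UpstreamRequest` shape are assumptions for illustration, not the PR's actual implementation.

```typescript
// Hypothetical shape of the forwarded request; not from the PR.
interface UpstreamRequest {
  url: string;
  method: string;
  headers: Record<string, string>;
  body: string;
}

// Sketch of a Responses-API passthrough: the incoming /v1/responses
// payload is forwarded to the Ollama base URL unchanged.
class OllamaResponsesPassthrough {
  constructor(private baseUrl: string, private apiKey: string) {}

  buildUpstreamRequest(payload: unknown): UpstreamRequest {
    return {
      url: `${this.baseUrl}/responses`,
      method: "POST",
      headers: {
        "content-type": "application/json",
        authorization: `Bearer ${this.apiKey}`,
      },
      // No translation step: the payload is already in Responses format.
      body: JSON.stringify(payload),
    };
  }
}
```

The design choice here is that a passthrough keeps the proxy agnostic to future Responses-format changes, since it never parses the payload.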
Fixes ekailabs#19. Enables Codex to use Grok models through /v1/responses.
@Shriiii01 Shriiii01 force-pushed the feature/xai-responses-support branch 2 times, most recently from f1e4754 to a34a2d0 Compare March 7, 2026 20:46
@sm86 sm86 closed this Apr 5, 2026


Development

Successfully merging this pull request may close these issues.

Feature Request: Support xAI Grok Models via OpenAI Responses Format

2 participants