
Conversation

@Octane0411

Summary

This PR introduces multi-provider support, allowing the agents to work with LLM providers beyond Anthropic, such as OpenAI, Gemini, and DeepSeek.

It implements an Adapter pattern (provider_utils.py) that wraps OpenAI-compatible clients to match the Anthropic interface. Crucially, this change is fully backward compatible with Anthropic and existing .env configurations. If no provider is specified, it defaults to Anthropic, so existing setups will continue to work without modification.
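The adapter idea can be sketched roughly as below. This is an illustrative outline, not the actual code in provider_utils.py: the function name `adapt_openai_response` and the exact field mappings are assumptions, though the OpenAI and Anthropic response shapes it converts between are the real ones.

```python
def adapt_openai_response(openai_resp: dict) -> dict:
    """Reshape an OpenAI chat.completions-style payload into the
    Anthropic Messages shape (content blocks + stop_reason), so code
    written against the Anthropic client keeps working unchanged."""
    choice = openai_resp["choices"][0]
    # OpenAI and Anthropic use different names for the stop condition.
    finish_map = {"stop": "end_turn", "length": "max_tokens"}
    return {
        "role": "assistant",
        "content": [{"type": "text", "text": choice["message"]["content"]}],
        "stop_reason": finish_map.get(
            choice["finish_reason"], choice["finish_reason"]
        ),
        "usage": {
            "input_tokens": openai_resp["usage"]["prompt_tokens"],
            "output_tokens": openai_resp["usage"]["completion_tokens"],
        },
    }
```

With a wrapper like this, the agents only ever see Anthropic-shaped responses, regardless of which backend actually served the request.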

Changes

  • Added provider_utils.py: A utility module that handles client initialization and adapts OpenAI responses to the Anthropic format.
  • Updated Agents (v0 - v4): Replaced direct Anthropic client initialization with get_client() and get_model() from provider_utils.
  • Updated requirements.txt: Added openai dependency.
  • Updated .env.example: Added configuration examples for OpenAI, Gemini, and custom providers.
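The provider/model selection described above might look something like the following sketch. The environment variable names (`LLM_PROVIDER`, `LLM_MODEL`) and the default model IDs are assumptions for illustration; only the defaulting behavior (falling back to Anthropic when nothing is configured) is taken from the PR description.

```python
import os

# Placeholder defaults per provider; the real provider_utils.py may
# use different env var names and model IDs.
PROVIDER_DEFAULT_MODELS = {
    "anthropic": "claude-3-5-sonnet-20241022",
    "openai": "gpt-4o",
    "gemini": "gemini-1.5-pro",
}

def get_provider() -> str:
    # Default to Anthropic so existing .env files keep working unchanged.
    return os.getenv("LLM_PROVIDER", "anthropic").lower()

def get_model() -> str:
    provider = get_provider()
    default = PROVIDER_DEFAULT_MODELS.get(
        provider, PROVIDER_DEFAULT_MODELS["anthropic"]
    )
    # An explicit LLM_MODEL always wins over the per-provider default.
    return os.getenv("LLM_MODEL", default)
```

An agent would then call `get_client()` and `get_model()` instead of constructing an Anthropic client directly, and the same code path serves every configured provider.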

Related Issue

Closes #12

Testing

Tested locally with the Gemini API and verified that all agents work correctly.


Development

Successfully merging this pull request may close these issues.

feat: Add multi-provider support (GPT, Gemini, etc.)

1 participant