33 changes: 31 additions & 2 deletions README.md
@@ -202,7 +202,7 @@ OpenEvolve implements a sophisticated **evolutionary coding pipeline** that goes
<details>
<summary><b>Advanced LLM Integration</b></summary>

- **Universal API**: Works with OpenAI, Google, local models, and proxies
- **Universal API**: Works with OpenAI, Google, Claude Code CLI, local models, and proxies
- **Intelligent Ensembles**: Weighted combinations with sophisticated fallback
- **Test-Time Compute**: Enhanced reasoning through proxy systems (see [OptiLLM setup](#llm-provider-setup))
- **Plugin Ecosystem**: Support for advanced reasoning plugins
@@ -233,7 +233,7 @@ OpenEvolve implements a sophisticated **evolutionary coding pipeline** that goes

### Requirements
- **Python**: 3.10+
- **LLM Access**: Any OpenAI-compatible API
- **LLM Access**: Any OpenAI-compatible API, or [Claude Code CLI](https://docs.anthropic.com/en/docs/claude-code)
- **Optional**: Docker for containerized runs

### Installation Options
@@ -356,6 +356,35 @@ llm:

</details>

<details>
<summary><b>🔮 Claude Code CLI (No API Key)</b></summary>

Use the [Claude Code CLI](https://docs.anthropic.com/en/docs/claude-code) as the LLM backend. No API key is required; authentication reuses the CLI's existing OAuth session.

```bash
# Install and authenticate
npm install -g @anthropic-ai/claude-code
claude login
```

```yaml
# config.yaml
llm:
provider: "claude_code"
models:
- name: "sonnet"
weight: 0.8
max_tokens: 16000
max_budget_usd: 1.0
- name: "haiku"
weight: 0.2
max_tokens: 8000
```

See the [Claude Code quickstart example](examples/claude_code_quickstart/) for a complete walkthrough.

</details>

## Examples Gallery

<div align="center">
72 changes: 72 additions & 0 deletions examples/claude_code_quickstart/README.md
@@ -0,0 +1,72 @@
# Claude Code CLI Quickstart

This example shows how to use the [Claude Code CLI](https://docs.anthropic.com/en/docs/claude-code) as the LLM backend for OpenEvolve. No API keys are needed — authentication uses the CLI's existing OAuth session.

## Prerequisites

1. **Install Claude Code CLI:**
```bash
npm install -g @anthropic-ai/claude-code
```

2. **Authenticate:**
```bash
claude login
```

3. **Install OpenEvolve:**
```bash
pip install openevolve
```

## Run

```bash
python openevolve-run.py \
examples/claude_code_quickstart/initial_program.py \
examples/claude_code_quickstart/evaluator.py \
--config examples/claude_code_quickstart/config.yaml \
--iterations 50
```

## How It Works

The `config.yaml` sets `provider: "claude_code"`, which routes all LLM calls through the `claude -p` subprocess instead of the OpenAI-compatible API. The CLI handles authentication, model selection, and billing.
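
Concretely, each LLM call is a one-shot `claude -p` invocation. The sketch below shows what that subprocess round trip can look like; it is illustrative only, and OpenEvolve's actual `claude_code` backend may build the command or parse the output differently:

```python
import json
import subprocess

# Minimal sketch of shelling out to the Claude Code CLI (illustrative;
# not OpenEvolve's actual backend code).
def build_claude_cmd(prompt: str, model: str = "sonnet") -> list:
    # `claude -p` runs one non-interactive prompt; `--output-format json`
    # wraps the reply in a JSON envelope that includes a "result" field.
    return ["claude", "-p", prompt, "--model", model, "--output-format", "json"]

def claude_generate(prompt: str, model: str = "sonnet", timeout: int = 300) -> str:
    proc = subprocess.run(
        build_claude_cmd(prompt, model),
        capture_output=True, text=True, timeout=timeout, check=True,
    )
    return json.loads(proc.stdout)["result"]
```

Because the command is a plain subprocess call, the `timeout` config option maps directly onto the subprocess timeout.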

### Key Config Options

| Field | Description | Default |
|-------|-------------|---------|
| `provider` | Set to `"claude_code"` to use the CLI backend | `"openai"` |
| `name` | Claude model name (`sonnet`, `haiku`, `opus`) | `"sonnet"` |
| `max_budget_usd` | Per-call spending cap in USD | `1.0` |
| `timeout` | CLI timeout in seconds | `300` |
| `retries` | Number of retry attempts on failure | `3` |
| `retry_delay` | Seconds between retries | `5` |
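
Taken together, the table's options slot into `config.yaml` like this (values are illustrative; `retries` and `retry_delay` sit at the `llm` level, as in this example's config file):

```yaml
llm:
  provider: "claude_code"
  models:
    - name: "sonnet"
      max_tokens: 16000
      timeout: 300          # CLI timeout in seconds
      max_budget_usd: 1.0   # per-call spending cap in USD
  retries: 3
  retry_delay: 5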

### Ensemble Example

You can mix Claude models in an ensemble, just like with OpenAI models:

```yaml
llm:
provider: "claude_code"
models:
- name: "sonnet"
weight: 0.8
max_tokens: 16000
- name: "haiku"
weight: 0.2
max_tokens: 8000
```
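
The `weight` fields behave like sampling probabilities: each generation, one model is drawn in proportion to its weight. Conceptually it works like the sketch below (a simplified illustration, not OpenEvolve's internal scheduler):

```python
import random

# Conceptual sketch of weighted ensemble selection; OpenEvolve's actual
# scheduler may differ (e.g. normalization, fallback handling).
def pick_model(models):
    names = [m["name"] for m in models]
    weights = [m["weight"] for m in models]
    # random.choices normalizes the weights, so they need not sum to 1.0
    return random.choices(names, weights=weights, k=1)[0]

ensemble = [{"name": "sonnet", "weight": 0.8}, {"name": "haiku", "weight": 0.2}]
```

With the weights above, roughly 80% of generations would go to `sonnet` and 20% to the cheaper `haiku`.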

### Programmatic Usage

You can also inject the Claude Code backend at runtime without modifying config files:

```python
from openevolve.llm.claude_code import init_claude_code_client

# `config` is an already-loaded OpenEvolve configuration object
for model_cfg in config.llm.models:
    model_cfg.init_client = init_claude_code_client
```
57 changes: 57 additions & 0 deletions examples/claude_code_quickstart/config.yaml
@@ -0,0 +1,57 @@
# Configuration for function minimization using Claude Code CLI as the LLM backend.
# No API keys needed — authentication uses `claude login` (OAuth session).
#
# Prerequisites:
# 1. Install Claude Code CLI: npm install -g @anthropic-ai/claude-code
# 2. Authenticate: claude login
#
# Run:
# python openevolve-run.py \
# examples/claude_code_quickstart/initial_program.py \
# examples/claude_code_quickstart/evaluator.py \
# --config examples/claude_code_quickstart/config.yaml \
# --iterations 50

max_iterations: 50
checkpoint_interval: 10

llm:
provider: "claude_code"
models:
- name: "sonnet"
weight: 0.8
max_tokens: 16000
timeout: 300
max_budget_usd: 1.0
- name: "haiku"
weight: 0.2
max_tokens: 8000
timeout: 120
max_budget_usd: 0.5
retries: 3
retry_delay: 5

prompt:
system_message: >
You are an expert programmer specializing in optimization algorithms.
Your task is to improve a function minimization algorithm to find the
global minimum of a complex function with many local minima.
The function is f(x, y) = sin(x) * cos(y) + sin(x*y) + (x^2 + y^2)/20.
Focus on improving the search_algorithm function to reliably find the
global minimum, escaping local minima that might trap simple algorithms.

database:
population_size: 50
archive_size: 20
num_islands: 3
elite_selection_ratio: 0.2
exploitation_ratio: 0.7
similarity_threshold: 0.99

evaluator:
timeout: 60
cascade_thresholds: [1.3]
parallel_evaluations: 3

diff_based_evolution: true
max_code_length: 20000