Summary
The SDK's TypeScript types include a provider configuration option in createSession() that suggests support for custom OpenAI-compatible providers (like Ollama), but the CLI completely ignores this configuration and always connects to GitHub's hosted models.
This prevents developers from:
- Using local Ollama models for privacy-sensitive code review
- Running the SDK in air-gapped/offline environments
- Avoiding API costs by using local models
- Testing with self-hosted model infrastructure
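For context, Ollama's OpenAI-compatible API (served under `/v1`, e.g. `POST /v1/chat/completions`) is exactly the surface the `provider` option appears to target. A small sketch of the request an honored `baseUrl` would be expected to produce (the helper name is mine, not part of the SDK):

```ts
// Hypothetical helper: builds the OpenAI-compatible chat-completions request
// that an honored provider.baseUrl would be expected to send to Ollama.
interface ChatRequest {
  url: string;
  body: { model: string; messages: { role: string; content: string }[] };
}

function buildChatRequest(baseUrl: string, model: string, prompt: string): ChatRequest {
  return {
    // Trim a trailing slash so ".../v1" and ".../v1/" yield the same URL.
    url: `${baseUrl.replace(/\/$/, '')}/chat/completions`,
    body: { model, messages: [{ role: 'user', content: prompt }] },
  };
}
```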
Expected Behavior
Based on the TypeScript types, this should work:
```ts
const session = await client.createSession({
  model: 'deepseek-coder-v2:16b',
  provider: {
    type: 'openai',
    baseUrl: 'http://localhost:11434/v1' // Ollama endpoint
  },
  streaming: true
});
```

Expected: SDK connects to http://localhost:11434/v1 instead of GitHub's API.
Actual Behavior
The provider config is silently ignored:
- Session creates successfully ✅
- CLI always connects to GitHub's hosted API ❌
- Local Ollama models are never called ❌
- No error or warning that provider config was ignored ❌
Evidence / Investigation Results
1. TypeScript Types Suggest Feature Exists
Location: node_modules/@github/copilot-sdk/dist/index.d.ts
```ts
export interface SessionOptions {
  model: string;
  provider?: {
    type: 'openai';
    baseUrl: string;
    apiKey?: string;
  };
  streaming?: boolean;
  systemMessage?: string;
  tools?: Tool[];
}
```

The types clearly define provider as an optional parameter with baseUrl support.
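As a sanity check, the declaration can be copied into a scratch file and shown to accept the Ollama-style configuration with no compile error (the interface body is reproduced from the snippet above; `Tool` is stubbed since its definition is not quoted):

```ts
// Reproduced from the SDK's .d.ts above; Tool is stubbed because its
// definition is not part of the quoted snippet.
type Tool = unknown;

interface SessionOptions {
  model: string;
  provider?: {
    type: 'openai';
    baseUrl: string;
    apiKey?: string;
  };
  streaming?: boolean;
  systemMessage?: string;
  tools?: Tool[];
}

// Compiles cleanly: the type system fully sanctions a custom baseUrl.
const opts: SessionOptions = {
  model: 'deepseek-coder-v2:16b',
  provider: { type: 'openai', baseUrl: 'http://localhost:11434/v1' },
  streaming: true,
};
```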
2. CLI Ignores Provider Config
Test Code:
```ts
import { CopilotClient } from '@github/copilot-sdk';

const client = new CopilotClient({
  logLevel: 'debug',
  cliPath: './node_modules/.bin/copilot'
});

// Ollama is running at localhost:11434
const session = await client.createSession({
  model: 'deepseek-coder-v2:16b',
  provider: {
    type: 'openai',
    baseUrl: 'http://localhost:11434/v1' // Should use this
  },
  streaming: true
});

await session.send({ prompt: 'Say hello' });
```

Result:
```
✅ Session created
📨 Event: user.message
📨 Event: assistant.turn_start
📨 Event: session.usage_info
📨 Event: pending_messages.modified (repeats indefinitely)
⏰ Timeout after 30s (no response from Ollama)
```
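The absence of traffic can also be demonstrated without Ollama at all, by pointing `provider.baseUrl` at a local stub server that counts incoming requests. This sketch uses Node's built-in `http` module; the function name and shape are mine:

```ts
// Hypothetical stub: stand in for the provider on a local port and count
// what actually arrives. If the CLI honored provider.baseUrl, this counter
// would move; in practice it stays at zero while the session still responds.
import http from 'node:http';

function createCountingStub(): Promise<{ port: number; hits: () => number; close: () => void }> {
  let count = 0;
  const server = http.createServer((_req, res) => {
    count += 1;
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ ok: true }));
  });
  return new Promise((resolve) => {
    // Port 0 asks the OS for any free port.
    server.listen(0, '127.0.0.1', () => {
      const { port } = server.address() as import('node:net').AddressInfo;
      resolve({ port, hits: () => count, close: () => server.close() });
    });
  });
}
```

Passing `baseUrl: 'http://127.0.0.1:<port>/v1'` to `createSession()` and watching `hits()` stay at 0 makes the ignored config directly observable.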
Ollama Logs (running with ollama serve):

```sh
# Ollama receives NO requests from the SDK
# Confirmed with: ollama serve --debug
# Only sees health checks from curl, zero requests from Copilot CLI
```

3. Environment Variable Overrides Don't Work
Attempted various environment variables to force provider config:
```sh
# None of these work
export GITHUB_COPILOT_API_URL="http://localhost:11434/v1"
export COPILOT_API_URL="http://localhost:11434/v1"
export COPILOT_ENDPOINT="http://localhost:11434/v1"
export OPENAI_API_BASE="http://localhost:11434/v1"
export OPENAI_BASE_URL="http://localhost:11434/v1"
```

Result: All ignored, CLI still connects to GitHub.