Merged
32 changes: 32 additions & 0 deletions docs/architecture/acp-client-runtime/plan.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,32 @@
# ACP Client Runtime Plan

## Runtime Boundary

Add `src/main/presenter/acpClientPresenter/` as the internal ACP runtime boundary. The first implementation keeps the current provider-facing API stable by making `AcpProvider` a compatibility adapter over the runtime.

The runtime owns:

- connection/process lifecycle and debug backlog;
- session lifecycle and MCP forwarding;
- prompt concurrency;
- client-side handlers for permissions, filesystem, terminal, and auth;
- event mapping from ACP updates to DeepChat stream/workspace events.
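
The ownership list above can be sketched as a thin runtime facade that the compatibility adapter delegates to. All names below are illustrative assumptions, not DeepChat's actual API surface:

```typescript
// Sketch only: the interface and adapter names are illustrative.
interface AcpRuntimeFacade {
  startConnection(agentId: string, workdir: string): { connectionId: string }
  cancelPrompt(sessionId: string): boolean
}

// The plan keeps AcpProvider's public behavior stable by making it a
// compatibility adapter that forwards into the runtime boundary.
class AcpProviderAdapter {
  constructor(private readonly runtime: AcpRuntimeFacade) {}

  connect(agentId: string, workdir: string): string {
    return this.runtime.startConnection(agentId, workdir).connectionId
  }
}

// Minimal in-memory runtime stub for illustration.
const runtime: AcpRuntimeFacade = {
  startConnection: (agentId, workdir) => ({ connectionId: `${agentId}:${workdir}` }),
  cancelPrompt: () => true
}
const adapter = new AcpProviderAdapter(runtime)
```

The connection id format mirrors the `${agentId}:${workdir}` keys used elsewhere in this PR.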

## Implementation Sequence

1. Create SDD docs and conformance checklist.
2. Add the ACP runtime facade and prompt controller.
3. Move provider construction to the runtime facade while keeping public behavior compatible.
4. Remove warmup `session/new` probing and let real session responses drive config/mode/model state.
5. Register persisted load-session handlers before `session/load` so replayed updates are not dropped.
6. Route debug actions through the initialized connection state and real MCP selections.
7. Harden fs and terminal workdir guards.
8. Refresh settings on ACP agent change events and release running processes before repair.
9. Add focused regression tests, then run formatting, i18n, and lint checks.
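
Step 5's ordering rule can be illustrated with a toy update bus (not DeepChat's real types): the handler must be attached before `session/load` replays updates, or those updates are silently dropped.

```typescript
// Illustrative sketch of why handler registration order matters.
type SessionUpdate = { sessionId: string; text: string }

class SessionUpdateBus {
  private readonly handlers = new Map<string, (u: SessionUpdate) => void>()
  readonly dropped: SessionUpdate[] = []

  on(sessionId: string, handler: (u: SessionUpdate) => void): void {
    this.handlers.set(sessionId, handler)
  }

  emit(update: SessionUpdate): void {
    const handler = this.handlers.get(update.sessionId)
    if (handler) {
      handler(update)
    } else {
      // This is the failure mode step 5 prevents.
      this.dropped.push(update)
    }
  }
}

const bus = new SessionUpdateBus()
const replayed: SessionUpdate[] = []
bus.on('s1', (u) => replayed.push(u)) // register BEFORE triggering session/load
bus.emit({ sessionId: 's1', text: 'replayed update' })
```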

## Compatibility

- Existing ACP registry/manual agent configuration remains valid.
- Existing ACP conversation/session records remain valid.
- Provider id `acp` and model ids remain unchanged.
- No renderer IPC contract changes are required for this slice.
47 changes: 47 additions & 0 deletions docs/architecture/acp-client-runtime/spec.md
@@ -0,0 +1,47 @@
# ACP Client Runtime

## Problem

DeepChat's ACP integration currently behaves like a provider helper instead of a full ACP client runtime. This causes protocol drift and product bugs:

- warmup can create a throwaway session via `session/new` with empty MCP servers, so agent configuration is probed in a different context than real turns;
- debug `initialize` can be invoked on a connection that was already initialized;
- settings updates for install/repair/enable are not always reflected automatically;
- ACP sessions, file-system access, terminal access, permissions, and session updates do not have a single runtime boundary.

## Goals

- Introduce an internal ACP client runtime boundary that owns connection, session, handler, mapping, workspace, registry, and debug concerns.
- Align DeepChat with ACP's client responsibilities: spawn stdio agents, initialize once, create/load sessions with the current cwd and MCP servers, forward prompts, handle client-side fs/terminal/permission calls, and map `session/update`.
- Preserve DeepChat's existing public model selector, Workspace, MCP, Skills, Remote Control, IPC, and renderer route surfaces.
- Match Zed's configuration boundary: DeepChat forwards cwd/env/MCP/model/mode/config options; ACP agents read their own native configuration directly.

## Non-Goals

- Do not add a Claude-specific or Codex-specific local configuration resolver.
- Do not reset or migrate already persisted ACP summary/session state beyond compatible schema additions.
- Do not redesign the model selector UI in this change.

## Protocol Matrix

| ACP area | DeepChat behavior |
| --- | --- |
| `initialize` | Sent once per process connection after stdio spawn. Capabilities and auth methods are cached on the connection handle. |
| `authenticate` | Routed through the ACP auth/terminal handler when required by the agent. |
| `session/new` | Uses the resolved conversation workdir and DeepChat MCP selections filtered by agent transport capabilities. |
| `session/load` | Used only when the agent declares load support and DeepChat has a persisted ACP session id. |
| `session/prompt` | One active prompt per ACP session; response `stopReason` is persisted as turn completion state. |
| `session/update` | Mapped to DeepChat message/content/tool/plan/diff/terminal/permission events. |
| `session/request_permission` | Routed through DeepChat permission UI/policy and remote-control-safe resolution. |
| `fs/read_text_file` and `fs/write_text_file` | Guarded by realpath validation against registered workdirs. |
| Terminal lifecycle | Bound to the session workdir and routed through DeepChat terminal management. |
| Debug | Shows lifecycle, request, response, notification, permission, stderr, and error entries without re-initializing an existing connection. |
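
The initialize-once row above can be sketched with a cached promise on the connection handle; `doInitialize` and the result shape are illustrative, not the SDK's actual types:

```typescript
// Sketch of the "sent once per process connection" rule.
type InitializeResult = { protocolVersion: number; authMethods: string[] }

class AcpConnectionInit {
  private initPromise: Promise<InitializeResult> | null = null

  constructor(private readonly doInitialize: () => Promise<InitializeResult>) {}

  // Debug actions and real sessions share one cached initialize; a second
  // call reuses the cached result instead of re-sending the request.
  initialize(): Promise<InitializeResult> {
    this.initPromise ??= this.doInitialize()
    return this.initPromise
  }
}
```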

## Acceptance Criteria

- Warmup never creates a throwaway session solely to fetch config state.
- Debug initialize starts or reports the initialized connection and does not send a second ACP initialize request.
- Real sessions are the source of models/modes/config options.
- Settings refresh after enable/install/repair/uninstall without requiring a manual reopen.
- ACP fs and terminal operations are scoped to the registered workdir.
- Non-ACP providers keep their existing context-budget behavior.
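
The workdir-scoping criterion can be sketched as a containment check; real code would also resolve symlinks (realpath) before comparing, and the function name is an assumption:

```typescript
import path from 'node:path'

// Sketch of the fs/terminal workdir guard: a request is allowed only when
// the resolved target stays inside a registered workdir.
function isInsideWorkdir(workdir: string, target: string): boolean {
  const root = path.resolve(workdir)
  const resolved = path.resolve(root, target)
  const relative = path.relative(root, resolved)
  // Empty string means the workdir itself; a '..' prefix or an absolute
  // remainder means the target escaped the registered workdir.
  return relative === '' || (!relative.startsWith('..') && !path.isAbsolute(relative))
}
```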
16 changes: 16 additions & 0 deletions docs/architecture/acp-client-runtime/tasks.md
@@ -0,0 +1,16 @@
# ACP Client Runtime Tasks

- [x] Add SDD spec, plan, and tasks documents.
- [x] Add internal ACP runtime facade and prompt controller.
- [x] Stop warmup from creating temporary sessions with empty MCP servers.
- [x] Use real MCP selections for debug `session/new` and `session/load`.
- [x] Make debug `initialize` report initialized connection state instead of sending a second initialize request.
- [x] Register load-session handlers before `session/load` replay can emit updates.
- [x] Enforce realpath workspace guards for ACP fs reads/writes.
- [x] Bind terminal cwd to the registered ACP session workdir.
- [x] Refresh ACP settings on agent-change events.
- [x] Release running ACP processes before repair.
- [x] Add targeted ACP regression coverage.
- [x] Run `pnpm run format`.
- [x] Run `pnpm run i18n`.
- [x] Run `pnpm run lint`.
26 changes: 26 additions & 0 deletions docs/issues/acp-context-budget-bypass/plan.md
@@ -0,0 +1,26 @@
# ACP Context Budget Bypass Plan

## Implementation

- Add an internal `providerId === 'acp'` budget-bypass helper in `AgentRuntimePresenter`.
- For ACP new turns and resume turns, use existing persisted summary state but skip new
compaction-intent preparation caused by DeepChat context pressure.
- Build ACP request contexts with an effectively unbounded context budget so DeepChat does not trim
history based on ACP model metadata.
- In the provider loop, bypass `preflightRequestContext`, context-pressure recovery, overflow error
creation, and effective max-token shrinking for ACP. Keep rate-limit, abort, and steer handling.
- Treat ACP resume tool-budget checks as unbounded so DeepChat does not replace tool results with
context-window errors before the ACP provider sees the continuation.
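
A minimal sketch of the bypass predicate described above; the helper and option names are assumptions, not DeepChat's real types:

```typescript
// ACP turns get an effectively unbounded budget, so DeepChat never trims
// history or raises overflow errors for them; other providers keep the
// model-derived window.
const ACP_PROVIDER_ID = 'acp'

interface ContextBudgetInput {
  providerId: string
  modelContextWindow: number
}

function resolveContextBudget(input: ContextBudgetInput): number {
  return input.providerId === ACP_PROVIDER_ID
    ? Number.MAX_SAFE_INTEGER
    : input.modelContextWindow
}
```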

## Compatibility

- The bypass applies to all ACP agents, including registry and custom agents.
- Non-ACP behavior remains unchanged.
- Existing persisted summaries remain available and are appended when present; this change only
prevents ACP turns from creating new budget-driven summaries.

## Tests

- Add main-process tests for oversized ACP prompt delivery and lack of budget overflow errors.
- Add coverage proving ACP context pressure does not start DeepChat compaction.
- Keep existing non-ACP overflow tests as regression coverage.
36 changes: 36 additions & 0 deletions docs/issues/acp-context-budget-bypass/spec.md
@@ -0,0 +1,36 @@
# ACP Context Budget Bypass Spec

> Status: Draft
> Date: 2026-05-12

## Background

ACP agents such as Claude Code maintain their own conversation and model context policy. DeepChat
currently routes ACP sessions through the generic agent runtime context preflight, which can block
ACP prompts before the ACP agent receives them with:

`Request was not sent because it cannot fit within the model context window after applying the safety margin.`

That check is appropriate for DeepChat-managed model calls, but ACP-backed requests should be
delegated to the ACP agent.

## Goals

- Skip DeepChat model-context preflight and recovery for every `providerId === 'acp'` request.
- Let ACP agents receive the prompt and handle context-window pressure themselves.
- Preserve current behavior for non-ACP providers.

## Acceptance Criteria

- ACP requests that exceed DeepChat's estimated context budget still reach the ACP provider.
- ACP requests do not trigger DeepChat context-pressure compaction or request trimming solely due to
DeepChat's context-window estimate.
- ACP request max tokens are not shrunk by DeepChat's safety-margin preflight.
- Non-ACP providers keep existing preflight, compaction recovery, and overflow failure behavior.
- No public API, IPC, schema, or renderer UI changes are introduced.

## Non-Goals

- Redesign ACP prompt formatting.
- Reset or migrate existing session summaries.
- Change ACP workdir, permission, rate-limit, abort, or session lifecycle behavior.
9 changes: 9 additions & 0 deletions docs/issues/acp-context-budget-bypass/tasks.md
@@ -0,0 +1,9 @@
# ACP Context Budget Bypass Tasks

- [x] Add SDD spec and plan.
- [x] Add ACP budget-bypass helper.
- [x] Skip ACP budget compaction and request trimming for new and resume turns.
- [x] Bypass provider-loop preflight and max-token shrinking for ACP.
- [x] Treat ACP resume tool budget as unbounded.
- [x] Add focused main-process regression tests.
- [x] Run targeted tests and required project checks.
16 changes: 16 additions & 0 deletions docs/issues/acp-pr-1614-review-fixes/plan.md
@@ -0,0 +1,16 @@
# Implementation Plan

## Change

- Update ACP process cwd resolution and initialization timeout cleanup in place.
- Harden ACP session cleanup paths with guarded cleanup and original-error rethrowing.
- Await ACP turn persistence in `AcpProvider.runPrompt`, using small helper methods to keep error handling explicit.
- Replace the duplicated ACP event union with the shared contract plus provider-only tool event variants.
- Update the reviewed locale entries for the lifecycle event kind.

## Validation

- Add focused test coverage for relative terminal cwd resolution.
- Run focused ACP tests where changed.
- Run repository-required `pnpm run format`, `pnpm run i18n`, and `pnpm run lint`, plus
`pnpm run typecheck`.
21 changes: 21 additions & 0 deletions docs/issues/acp-pr-1614-review-fixes/spec.md
@@ -0,0 +1,21 @@
# ACP PR 1614 Review Fixes

## Problem

PR review feedback on the ACP client runtime changes found several small reliability and consistency issues that should be fixed before merge.

## Acceptance Criteria

- Terminal cwd values supplied by ACP agents are resolved relative to the active session workdir when they are not absolute, and cwd escapes still fall back safely.
- ACP session initialization cleanup preserves the original initialization error even if unbinding fails.
- Persisted ACP session hook disposal cannot prevent fallback to `newSession`.
- ACP connection initialization clears its timeout handle after `Promise.race` settles.
- ACP turn persistence awaits start and finish writes and reports persistence failures instead of silently dropping them.
- ACP event types use the shared contract as the source of truth.
- Newly added lifecycle event labels are localized for reviewed locales.
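
The timeout-cleanup criterion can be sketched as clearing the timer handle after `Promise.race` settles on either branch; the function name is illustrative:

```typescript
// Sketch: the timer is cleared whether initialize wins or the timeout fires.
async function initializeWithTimeout<T>(init: Promise<T>, timeoutMs: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error('ACP initialize timed out')), timeoutMs)
  })
  try {
    return await Promise.race([init, timeout])
  } finally {
    // Without this, a successful initialize leaves a live timer behind.
    if (timer !== undefined) clearTimeout(timer)
  }
}
```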

## Non-Goals

- No broad ACP behavior redesign.
- No full localization rewrite outside the reviewed lifecycle labels.
- No database schema changes beyond the existing ACP turn persistence shape.
7 changes: 7 additions & 0 deletions docs/issues/acp-pr-1614-review-fixes/tasks.md
@@ -0,0 +1,7 @@
# Tasks

- [x] Add SDD artifacts.
- [x] Fix validated PR review findings.
- [x] Add or update focused tests.
- [x] Run validation commands.
- [x] Push the synchronized PR branch.
49 changes: 49 additions & 0 deletions src/main/presenter/acpClientPresenter/connection/AcpConnectionManager.ts
@@ -0,0 +1,49 @@
import type { LLM_PROVIDER, IConfigPresenter, AcpAgentConfig } from '@shared/presenter'
import type { ProviderMcpRuntimePort } from '@/presenter/llmProviderPresenter/runtimePorts'
import { AcpProcessManager, type AcpProcessHandle } from '@/presenter/llmProviderPresenter/acp'
import type { AcpConnectionRef, StartAcpConnectionInput } from '../types'
import { PROTOCOL_VERSION } from '@agentclientprotocol/sdk'

export class AcpConnectionManager {
  readonly processManager: AcpProcessManager

  constructor(
    provider: LLM_PROVIDER,
    configPresenter: IConfigPresenter,
    mcpRuntime?: ProviderMcpRuntimePort
  ) {
    this.processManager = new AcpProcessManager({
      providerId: provider.id,
      resolveLaunchSpec: (agentId, workdir) =>
        configPresenter.resolveAcpLaunchSpec(agentId, workdir),
      getAgentState: (agentId) => configPresenter.getAcpAgentState(agentId),
      getNpmRegistry: async () => mcpRuntime?.getNpmRegistry?.() ?? null,
      getUvRegistry: async () => mcpRuntime?.getUvRegistry?.() ?? null
    })
  }

  async startConnection(input: StartAcpConnectionInput): Promise<AcpConnectionRef> {
    const handle = await this.processManager.getConnection(input.agent, input.workdir)
    return this.toRef(handle)
  }

  async release(agentId: string): Promise<void> {
    await this.processManager.release(agentId)
  }

  toRef(handle: AcpProcessHandle): AcpConnectionRef {
    return {
      id: `${handle.agentId}:${handle.workdir}`,
      agentId: handle.agentId,
      workdir: handle.workdir,
      protocolVersion: String(PROTOCOL_VERSION),
      capabilities: handle.agentCapabilities,
      authMethods: handle.authMethods,
      status: handle.status === 'ready' ? 'ready' : 'error'
    }
  }

  async getConnection(agent: AcpAgentConfig, workdir?: string): Promise<AcpProcessHandle> {
    return this.processManager.getConnection(agent, workdir)
  }
}
43 changes: 43 additions & 0 deletions src/main/presenter/acpClientPresenter/connection/AcpDebugLog.ts
@@ -0,0 +1,43 @@
import { nanoid } from 'nanoid'
import type { AcpDebugEventEntry, AcpDebugEventKind } from '@shared/presenter'

const MAX_DEBUG_EVENTS_PER_AGENT = 300

export class AcpDebugLog {
  private readonly eventsByAgent = new Map<string, AcpDebugEventEntry[]>()

  append(
    agentId: string,
    entry: Omit<AcpDebugEventEntry, 'id' | 'timestamp' | 'agentId'>
  ): AcpDebugEventEntry {
    const event: AcpDebugEventEntry = {
      ...entry,
      id: nanoid(),
      timestamp: Date.now(),
      agentId
    }
    const events = this.eventsByAgent.get(agentId) ?? []
    events.push(event)
    if (events.length > MAX_DEBUG_EVENTS_PER_AGENT) {
      events.splice(0, events.length - MAX_DEBUG_EVENTS_PER_AGENT)
    }
    this.eventsByAgent.set(agentId, events)
    return event
  }

  appendLifecycle(agentId: string, action: string, payload?: unknown): AcpDebugEventEntry {
    return this.append(agentId, { kind: 'lifecycle' as AcpDebugEventKind, action, payload })
  }

  list(agentId: string): AcpDebugEventEntry[] {
    return [...(this.eventsByAgent.get(agentId) ?? [])]
  }

  clear(agentId?: string): void {
    if (agentId) {
      this.eventsByAgent.delete(agentId)
      return
    }
    this.eventsByAgent.clear()
  }
}
1 change: 1 addition & 0 deletions src/main/presenter/acpClientPresenter/handlers/index.ts
@@ -0,0 +1 @@
export { AcpFsHandler, AcpTerminalManager } from '@/presenter/llmProviderPresenter/acp'
78 changes: 78 additions & 0 deletions src/main/presenter/acpClientPresenter/index.ts
@@ -0,0 +1,78 @@
import type { IConfigPresenter, LLM_PROVIDER } from '@shared/presenter'
import {
  AcpSessionPersistence,
  type AcpProcessHandle,
  type AcpSessionRecord
} from '@/presenter/llmProviderPresenter/acp'
import type { ProviderMcpRuntimePort } from '@/presenter/llmProviderPresenter/runtimePorts'
import { AcpConnectionManager } from './connection/AcpConnectionManager'
import { AcpSessionRuntime } from './session/AcpSessionRuntime'
import { AcpPromptController } from './session/AcpPromptController'
import { AcpEventMapper } from './mapper/AcpEventMapper'
import type { AcpConnectionRef, CancelAcpPromptInput, StartAcpConnectionInput } from './types'

export class AcpClientPresenter {
  readonly connectionManager: AcpConnectionManager
  readonly sessionRuntime: AcpSessionRuntime
  readonly promptController = new AcpPromptController()
  readonly eventMapper = new AcpEventMapper()
  readonly sessionPersistence: AcpSessionPersistence

  constructor(input: {
    provider: LLM_PROVIDER
    configPresenter: IConfigPresenter
    sessionPersistence: AcpSessionPersistence
    mcpRuntime?: ProviderMcpRuntimePort
  }) {
    this.sessionPersistence = input.sessionPersistence
    this.connectionManager = new AcpConnectionManager(
      input.provider,
      input.configPresenter,
      input.mcpRuntime
    )
    this.sessionRuntime = new AcpSessionRuntime({
      providerId: input.provider.id,
      processManager: this.connectionManager.processManager,
      sessionPersistence: input.sessionPersistence,
      configPresenter: input.configPresenter
    })
  }

  get processManager() {
    return this.connectionManager.processManager
  }

  get sessionManager() {
    return this.sessionRuntime.sessionManager
  }

  async startConnection(input: StartAcpConnectionInput): Promise<AcpConnectionRef> {
    return this.connectionManager.startConnection(input)
  }

  async cancel(input: CancelAcpPromptInput): Promise<void> {
    const session = this.sessionManager.getSessionById(input.sessionId)
    await session?.connection.cancel({ sessionId: input.sessionId })
    this.promptController.cancel(input.sessionId)
  }

  toConnectionRef(handle: AcpProcessHandle): AcpConnectionRef {
    return this.connectionManager.toRef(handle)
  }

  toSessionRef(session: AcpSessionRecord) {
    return {
      id: session.sessionId,
      acpSessionId: session.sessionId,
      conversationId: session.conversationId,
      connectionId: `${session.agentId}:${session.workdir}`,
      workdir: session.workdir,
      modeId: session.currentModeId,
      status: session.status
    }
  }
}

export type * from './types'
export { AcpPromptController, type AcpPromptTurn } from './session/AcpPromptController'
export { AcpDebugLog } from './connection/AcpDebugLog'