The `@trigger.dev/sdk` provides a custom [ChatTransport](https://sdk.vercel.ai/d

No custom API routes needed. Your chat backend is a Trigger.dev task.

<Accordion title="How it works (sequence diagrams)">

### First message flow

```mermaid
sequenceDiagram
participant User
participant useChat as useChat + Transport
participant API as Trigger.dev API
participant Task as chat.task Worker
participant LLM as LLM Provider

User->>useChat: sendMessage("Hello")
useChat->>useChat: No session for chatId → trigger new run
useChat->>API: triggerTask(payload, tags: [chat:id])
API-->>useChat: { runId, publicAccessToken }
useChat->>useChat: Store session, subscribe to SSE

API->>Task: Start run with ChatTaskWirePayload
Task->>Task: onChatStart({ chatId, messages, clientData })
Task->>Task: onTurnStart({ chatId, messages })
Task->>LLM: streamText({ model, messages, abortSignal })
LLM-->>Task: Stream response chunks
Task->>API: streams.pipe("chat", uiStream)
API-->>useChat: SSE: UIMessageChunks
useChat-->>User: Render streaming text
Task->>API: Write __trigger_turn_complete
API-->>useChat: SSE: turn complete + refreshed token
useChat->>useChat: Close stream, update session
Task->>Task: onTurnComplete({ messages, stopped: false })
Task->>Task: Wait for next message (warm → suspend)
```
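
The worker side of this loop can be sketched in plain TypeScript. The hook names (`onChatStart`, `onTurnStart`, `onTurnComplete`) and the `__trigger_turn_complete` sentinel come straight from the diagram; the message shape and the `emit` callback (standing in for `streams.pipe`) are illustrative assumptions, not the SDK's real API.

```typescript
// Hypothetical simulation of one chat turn, mirroring the diagram above.
type Message = { role: "user" | "assistant"; content: string };

interface TurnHooks {
  onChatStart?: (ctx: { chatId: string; messages: Message[] }) => void;
  onTurnStart?: (ctx: { chatId: string; messages: Message[] }) => void;
  onTurnComplete?: (ctx: { messages: Message[]; stopped: boolean }) => void;
}

// Streams chunks from a (mocked) LLM, appends the assistant reply to the
// accumulated history, and emits the turn-complete sentinel last.
async function runFirstTurn(
  chatId: string,
  messages: Message[],
  llmChunks: AsyncIterable<string>,
  hooks: TurnHooks,
  emit: (chunk: string) => void
): Promise<Message[]> {
  hooks.onChatStart?.({ chatId, messages });
  hooks.onTurnStart?.({ chatId, messages });

  let text = "";
  for await (const chunk of llmChunks) {
    text += chunk;
    emit(chunk); // corresponds to streams.pipe("chat", uiStream)
  }
  emit("__trigger_turn_complete");

  const next: Message[] = [...messages, { role: "assistant", content: text }];
  hooks.onTurnComplete?.({ messages: next, stopped: false });
  return next;
}
```

After the sentinel is written, the worker does not exit: per the diagram it waits warm for the next input-stream message, then suspends.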

### Multi-turn flow

```mermaid
sequenceDiagram
participant User
participant useChat as useChat + Transport
participant API as Trigger.dev API
participant Task as chat.task Worker
participant LLM as LLM Provider

Note over Task: Suspended, waiting for message

User->>useChat: sendMessage("Tell me more")
useChat->>useChat: Session exists → send via input stream
useChat->>API: sendInputStream(runId, "chat-messages", payload)
Note right of useChat: Only sends new message (not full history)

API->>Task: Deliver to messagesInput
Task->>Task: Wake from suspend
Task->>Task: Append to accumulated messages
Task->>Task: onTurnStart({ turn: 1 })
Task->>LLM: streamText({ messages: [all accumulated] })
LLM-->>Task: Stream response
Task->>API: streams.pipe("chat", uiStream)
API-->>useChat: SSE: UIMessageChunks
useChat-->>User: Render streaming text
Task->>API: Write __trigger_turn_complete
Task->>Task: onTurnComplete({ turn: 1 })
Task->>Task: Wait for next message (warm → suspend)
```
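
The client-side branch in this diagram is the transport's core decision: no session for the `chatId` means trigger a new run with the full history; an existing session means only the new message goes over the run's input stream. A minimal sketch, with the session shape and return type as illustrative assumptions rather than the SDK's actual types:

```typescript
// Hypothetical routing logic mirroring the first-message vs follow-up decision.
interface Session {
  runId: string;
  publicAccessToken: string;
}

type Outgoing =
  | { kind: "trigger"; payload: { chatId: string; messages: unknown[] } }
  | { kind: "inputStream"; runId: string; streamKey: "chat-messages"; payload: unknown };

function routeMessage(
  sessions: Map<string, Session>,
  chatId: string,
  history: unknown[],
  newMessage: unknown
): Outgoing {
  const session = sessions.get(chatId);
  if (!session) {
    // First message: trigger a new run carrying the full history.
    return { kind: "trigger", payload: { chatId, messages: [...history, newMessage] } };
  }
  // Follow-up: the suspended worker already holds the history, so send only the delta.
  return {
    kind: "inputStream",
    runId: session.runId,
    streamKey: "chat-messages",
    payload: newMessage,
  };
}
```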

### Stop signal flow

```mermaid
sequenceDiagram
participant User
participant useChat as useChat + Transport
participant API as Trigger.dev API
participant Task as chat.task Worker
participant LLM as LLM Provider

Note over Task: Streaming response...

User->>useChat: Click "Stop"
useChat->>API: sendInputStream(runId, "chat-stop", { stop: true })
API->>Task: Deliver to stopInput
Task->>Task: stopController.abort()
LLM-->>Task: Stream ends (AbortError)
Task->>Task: cleanupAbortedParts(responseMessage)
Note right of Task: Remove partial tool calls,<br/>mark streaming parts as done
Task->>API: Write __trigger_turn_complete
API-->>useChat: SSE: turn complete
Task->>Task: onTurnComplete({ stopped: true })
Task->>Task: Wait for next message
```
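
The cleanup step the diagram notes (`cleanupAbortedParts`) has two jobs: drop tool calls that never completed and seal any text part that was still streaming when the abort landed. A sketch with a deliberately simplified part shape (real AI SDK `UIMessage` parts carry more fields):

```typescript
// Hypothetical cleanup after an aborted stream, mirroring the diagram's note.
type Part =
  | { type: "text"; text: string; state: "streaming" | "done" }
  | { type: "tool-call"; toolName: string; state: "partial" | "complete" };

function cleanupAbortedParts(parts: Part[]): Part[] {
  return (
    parts
      // Partial tool calls cannot be resumed, so remove them entirely.
      .filter((p) => !(p.type === "tool-call" && p.state === "partial"))
      // Mark any mid-stream text part as done so the UI stops rendering it as live.
      .map((p) =>
        p.type === "text" && p.state === "streaming" ? { ...p, state: "done" as const } : p
      )
  );
}
```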

</Accordion>

<Note>
  Requires `@trigger.dev/sdk` version **4.4.0 or later** and the `ai` package **v5.0.0 or later**.
</Note>