Commit dd5da5a

use input streams and rename chatTask and chatState to chat.task and chat.state
1 parent 41764a5 commit dd5da5a

File tree

9 files changed: +295 -197 lines changed


docs/guides/ai-chat.mdx

Lines changed: 20 additions & 17 deletions
````diff
@@ -24,22 +24,24 @@ No custom API routes needed. Your chat backend is a Trigger.dev task.
 
 ### 1. Define a chat task
 
-Use `chatTask` from `@trigger.dev/sdk/ai` to define a task that handles chat messages. The payload is automatically typed as `ChatTaskPayload`.
+Use `chat.task` from `@trigger.dev/sdk/ai` to define a task that handles chat messages. The payload is automatically typed as `ChatTaskPayload` with abort signals.
 
 If you return a `StreamTextResult` from `run`, it's **automatically piped** to the frontend.
 
 ```ts trigger/chat.ts
-import { chatTask } from "@trigger.dev/sdk/ai";
+import { chat } from "@trigger.dev/sdk/ai";
 import { streamText, convertToModelMessages } from "ai";
 import { openai } from "@ai-sdk/openai";
 
-export const myChat = chatTask({
+export const myChat = chat.task({
   id: "my-chat",
-  run: async ({ messages }) => {
+  run: async ({ messages, signal }) => {
     // messages is UIMessage[] from the frontend
+    // signal fires on stop or run cancel
     return streamText({
       model: openai("gpt-4o"),
       messages: convertToModelMessages(messages),
+      abortSignal: signal,
     });
     // Returning a StreamTextResult auto-pipes it to the frontend
   },
````
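The abort wiring this hunk introduces can be sketched without the SDK. Below, a hypothetical `fakeTokenStream` stands in for a model call like `streamText`, and `consumeUntil` plays the role of a run whose signal fires on stop; only the `AbortController`/`AbortSignal` usage is standard Web/Node API, everything else is illustrative:

```typescript
// A hypothetical token stream that honors an AbortSignal, mirroring how
// `abortSignal: signal` lets the model stop generating mid-stream.
async function* fakeTokenStream(tokens: string[], signal: AbortSignal) {
  for (const token of tokens) {
    if (signal.aborted) return; // stop button or run cancel fired
    yield token;
  }
}

// Consume the stream, aborting after `stopAfter` tokens (a simulated stop button).
export async function consumeUntil(
  tokens: string[],
  stopAfter: number
): Promise<string[]> {
  const controller = new AbortController();
  const out: string[] = [];
  for await (const token of fakeTokenStream(tokens, controller.signal)) {
    out.push(token);
    if (out.length === stopAfter) controller.abort();
  }
  return out;
}

consumeUntil(["The", "quick", "brown", "fox"], 2).then((got) => {
  console.log(got.join(" ")); // → "The quick"
});
```

The same shape applies in the task above: the SDK supplies `signal`, and forwarding it via `abortSignal` is what makes the stream actually stop.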
````diff
@@ -116,35 +118,36 @@ export function Chat({ accessToken }: { accessToken: string }) {
 The easiest approach — return the `streamText` result from `run` and it's automatically piped to the frontend:
 
 ```ts
-import { chatTask } from "@trigger.dev/sdk/ai";
+import { chat } from "@trigger.dev/sdk/ai";
 import { streamText, convertToModelMessages } from "ai";
 import { openai } from "@ai-sdk/openai";
 
-export const simpleChat = chatTask({
+export const simpleChat = chat.task({
   id: "simple-chat",
-  run: async ({ messages }) => {
+  run: async ({ messages, signal }) => {
     return streamText({
       model: openai("gpt-4o"),
       system: "You are a helpful assistant.",
       messages: convertToModelMessages(messages),
+      abortSignal: signal,
     });
   },
 });
 ```
 
-### Complex: use pipeChat() from anywhere
+### Complex: use chat.pipe() from anywhere
 
-For complex agent flows where `streamText` is called deep inside your code, use `pipeChat()`. It works from **anywhere inside a task** — even nested function calls.
+For complex agent flows where `streamText` is called deep inside your code, use `chat.pipe()`. It works from **anywhere inside a task** — even nested function calls.
 
 ```ts trigger/agent-chat.ts
-import { chatTask, pipeChat } from "@trigger.dev/sdk/ai";
+import { chat } from "@trigger.dev/sdk/ai";
 import { streamText, convertToModelMessages } from "ai";
 import { openai } from "@ai-sdk/openai";
 
-export const agentChat = chatTask({
+export const agentChat = chat.task({
   id: "agent-chat",
   run: async ({ messages }) => {
-    // Don't return anything — pipeChat is called inside
+    // Don't return anything — chat.pipe is called inside
    await runAgentLoop(convertToModelMessages(messages));
   },
 });
````
````diff
@@ -159,17 +162,17 @@ async function runAgentLoop(messages: CoreMessage[]) {
   });
 
   // Pipe from anywhere — no need to return it
-  await pipeChat(result);
+  await chat.pipe(result);
 }
 ```
 
-### Manual: use task() with pipeChat()
+### Manual: use task() with chat.pipe()
 
-If you need full control over task options, use the standard `task()` with `ChatTaskPayload` and `pipeChat()`:
+If you need full control over task options, use the standard `task()` with `ChatTaskPayload` and `chat.pipe()`:
 
 ```ts
 import { task } from "@trigger.dev/sdk";
-import { pipeChat, type ChatTaskPayload } from "@trigger.dev/sdk/ai";
+import { chat, type ChatTaskPayload } from "@trigger.dev/sdk/ai";
 import { streamText, convertToModelMessages } from "ai";
 import { openai } from "@ai-sdk/openai";
 
````
````diff
@@ -183,7 +186,7 @@ export const manualChat = task({
       messages: convertToModelMessages(payload.messages),
     });
 
-    await pipeChat(result);
+    await chat.pipe(result);
   },
 });
 ```
````
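A helper like `chat.pipe()` that "works from anywhere inside a task" is typically built on task-scoped context. Below is a minimal sketch of that pattern using Node's `AsyncLocalStorage`; it illustrates the technique only, not Trigger.dev's actual implementation, and `runWithSink`/`pipeFromAnywhere` are hypothetical names:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// A task-scoped sink that deeply nested code can reach without prop drilling.
type Sink = { write(chunk: string): void };
const sinkStorage = new AsyncLocalStorage<Sink>();

// The task wrapper installs the sink for everything executed inside it.
export function runWithSink<T>(sink: Sink, fn: () => T): T {
  return sinkStorage.run(sink, fn);
}

// Callable at any nesting depth, like chat.pipe inside runAgentLoop.
export function pipeFromAnywhere(chunk: string): void {
  const sink = sinkStorage.getStore();
  if (!sink) throw new Error("pipeFromAnywhere called outside a task");
  sink.write(chunk);
}

// A nested helper pipes without the sink being passed down to it.
const received: string[] = [];
function nestedAgentStep(): void {
  pipeFromAnywhere("streamed chunk");
}
runWithSink({ write: (c) => received.push(c) }, () => nestedAgentStep());
console.log(received); // → [ 'streamed chunk' ]
```

Because `AsyncLocalStorage` context propagates across `await`, a helper built this way keeps working inside nested async calls, which is what lets the `run` function above stay empty-handed while `runAgentLoop` pipes the result.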
