chore: Add cursor rules for AI integrations contributions #19167
RulaKhaled merged 4 commits into develop
Conversation
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Codecov Results 📊 (generated by Codecov Action)
- Vercel AI: `packages/core/src/tracing/vercel-ai/constants.ts:8-23`
- LangChain: `packages/core/src/tracing/langchain/index.ts:199-207`
m: I think we should not encode the line numbers here, as they'll inevitably change without us realizing this file needs updating too, and stale references might lead to the model confusing things. I'd just list `packages/core/src/tracing/vercel-ai|langchain`, basically.
**Non-streaming:** Use `startSpan()`, set attributes immediately from response

**Streaming:** Use `startSpanManual()` with this pattern:
m: I'd also mention here that it should still prefer adding event listeners / hooks to streams if available, instead of always going the async-generator route.
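A minimal sketch of the async-generator fallback being discussed, with hypothetical helper and attribute names (when the provider's stream exposes event hooks such as an `on("end")` listener, attaching those is preferable to re-yielding chunks like this):

```typescript
// Illustrative only: wrap a provider stream so the span is ended when the
// stream finishes, while chunks pass through to the consumer untouched.
interface SpanLike {
  setAttribute(key: string, value: unknown): void;
  end(): void;
}

async function* instrumentStream<T>(
  stream: AsyncIterable<T>,
  span: SpanLike,
): AsyncGenerator<T> {
  let chunks = 0;
  try {
    for await (const chunk of stream) {
      chunks++;
      yield chunk; // forward each chunk unchanged
    }
    // record how many chunks were seen once the stream completes normally
    span.setAttribute("ai.response.chunks", chunks);
  } finally {
    // end the span whether the consumer finished or bailed out early
    span.end();
  }
}
```

This is why the rules pair streaming with `startSpanManual()`: the span must outlive the instrumented call and only end when the stream does.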
- OpenAI: `packages/core/src/tracing/openai/streaming.ts`
- Anthropic: `packages/core/src/tracing/anthropic-ai/streaming.ts`
- Detection: `packages/core/src/tracing/openai/index.ts:183-221`
m: Same here: instead of pointing to the lines, maybe we can call out in plain text how `isStreamRequested` works and point to the file.
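In the spirit of that plain-text call-out, a hypothetical sketch of what such a detection helper typically checks (this is illustrative, not the SDK's actual `isStreamRequested` implementation in `packages/core/src/tracing/openai/index.ts`):

```typescript
// Illustrative stream detection: streaming is usually requested either via a
// dedicated streaming method, or via a `stream: true` flag in the options.
type RequestOptions = { stream?: boolean } & Record<string, unknown>;

function isStreamRequested(
  methodName: string,
  options: RequestOptions | undefined,
): boolean {
  // dedicated streaming entry points (e.g. `chat.completions.stream`)
  if (methodName.endsWith(".stream")) return true;
  // regular methods opt in via a `stream: true` flag in the request options
  return options?.stream === true;
}
```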
2. **Node.js:** Add performance optimization in `packages/node/src/integrations/tracing/{provider}/index.ts`
   - Use `callWhenPatched()` to defer processor registration
   - Only register when package is actually imported (see vercelai:36)
m: Same here, regarding line numbers.
2. **Node.js Instrumentation:** Patch module exports in `instrumentation.ts`
   - Wrap client constructor
   - Check `_INTERNAL_shouldSkipAiProviderWrapping()` (for LangChain)
   - See openai/instrumentation.ts:70-86
m: Same here, regarding line numbers.
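The constructor-wrapping step above could look roughly like this (an illustrative Proxy-based sketch, not the SDK's actual instrumentation code): every client instance created by user code gets instrumented, while a skip flag, analogous to the `_INTERNAL_shouldSkipAiProviderWrapping()` check for LangChain, lets higher-level integrations opt out of double instrumentation.

```typescript
// Illustrative constructor wrapping via Proxy.
let skipWrapping = false;

function wrapClientConstructor<T extends new (...args: any[]) => object>(
  Client: T,
  instrumentInstance: (instance: InstanceType<T>) => void,
): T {
  return new Proxy(Client, {
    construct(target, args, newTarget) {
      const instance = Reflect.construct(target, args, newTarget) as InstanceType<T>;
      // skip flag prevents double instrumentation by higher-level wrappers
      if (!skipWrapping) instrumentInstance(instance);
      return instance;
    },
  });
}
```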
3. **Node.js Integration:** Export instrumentation function
   - Use `generateInstrumentOnce()` helper
   - See openai/index.ts:6-9
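A hedged sketch of what an "instrument once" guard like the `generateInstrumentOnce()` helper amounts to (a simplified stand-in; the real helper is an SDK-internal utility and takes more than shown here): repeated calls to the returned function only set up instrumentation the first time.

```typescript
// Simplified once-guard: `_name` stands in for the integration name the real
// helper uses as a dedup key.
function generateInstrumentOnce(_name: string, setup: () => void): () => void {
  let done = false;
  return () => {
    if (done) return;
    done = true;
    setup();
  };
}
```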
2. **Node.js Instrumentation:** Auto-inject callbacks
   - Patch runnable methods to add handler automatically
   - **Important:** Disable underlying AI provider wrapping (langchain/instrumentation.ts:103-105)
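The callback auto-injection described above can be sketched as follows (hypothetical types and a single method shown; the real patch covers more runnable methods): the runnable's `invoke` is wrapped so the SDK's handler is appended to whatever callbacks the caller already passed.

```typescript
// Illustrative auto-injection of a LangChain-style callback handler.
type Handler = { name: string };
type InvokeConfig = { callbacks?: Handler[] };

interface RunnableLike {
  invoke(input: unknown, config?: InvokeConfig): unknown;
}

function patchInvoke(runnable: RunnableLike, handler: Handler): void {
  const original = runnable.invoke.bind(runnable);
  runnable.invoke = (input: unknown, config?: InvokeConfig) => {
    // preserve user-supplied callbacks, append the instrumentation handler
    const callbacks = [...(config?.callbacks ?? []), handler];
    return original(input, { ...config, callbacks });
  };
}
```

Because the handler observes the run at the LangChain level, the underlying provider's own wrapping is disabled, as noted above, so spans are not recorded twice.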
node-overhead report 🧳
Note: This is a synthetic benchmark with a minimal express app and does not necessarily reflect the real-world performance impact in an application.
This PR adds `.cursor/rules/adding-a-new-ai-integration.mdc`, a complete reference guide for implementing AI provider integrations (OpenAI, Anthropic, Vercel AI, LangChain, etc.) in the Sentry JavaScript SDK.

Closes #19168 (added automatically)