
fix(deno): Clear pre-existing OTel global before registering TracerProvider#19723

Open
sergical wants to merge 8 commits into develop from fix/deno-otel-trace-disable

Conversation

Member

@sergical sergical commented Mar 9, 2026

Summary

  • Calls trace.disable() before trace.setGlobalTracerProvider() in @sentry/deno's OTel tracer setup
  • This fixes silent registration failure when Supabase Edge Runtime (or Deno's native OTel) pre-registers a TracerProvider on the @opentelemetry/api global (Symbol.for('opentelemetry.js.api.1'))
  • Without this fix, OTel-instrumented spans (e.g. gen_ai.* from AI SDK, or any library using @opentelemetry/api) never reach Sentry because Sentry's TracerProvider fails to register as the global. Sentry's own startSpan() API is unaffected since it bypasses the OTel global.

Context

Supabase Edge Runtime (Deno 2.1.4+) registers its own TracerProvider before user code runs. The OTel API's trace.setGlobalTracerProvider() is a no-op if a provider is already registered (it only logs a diag warning), so Sentry's tracer silently gets ignored.
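The no-op behavior can be sketched with a minimal, self-contained simulation. This is an assumption-laden stand-in: it mirrors only what the PR describes (the `Symbol.for('opentelemetry.js.api.1')` global slot and the silent no-op), not the real `@opentelemetry/api` internals.

```typescript
// Simulation of the OTel global registration semantics described above.
// Only the Symbol.for key comes from the PR; everything else is illustrative.
const OTEL_GLOBAL_KEY = Symbol.for('opentelemetry.js.api.1');
const g = globalThis as unknown as Record<symbol, unknown>;

function setGlobalTracerProvider(provider: string): boolean {
  if (g[OTEL_GLOBAL_KEY] !== undefined) {
    // The real API only logs a diag warning here and keeps the existing provider.
    return false;
  }
  g[OTEL_GLOBAL_KEY] = provider;
  return true;
}

function disable(): void {
  // Clears the global slot, mirroring trace.disable().
  delete g[OTEL_GLOBAL_KEY];
}

// The runtime (e.g. Supabase Edge) registers its provider before user code runs:
setGlobalTracerProvider('supabase-provider');

// Without the fix, Sentry's registration is silently ignored:
const withoutFix = setGlobalTracerProvider('sentry-provider');

// With the fix, clearing the global first lets Sentry's provider win:
disable();
const withFix = setGlobalTracerProvider('sentry-provider');

console.log({ withoutFix, withFix }); // { withoutFix: false, withFix: true }
```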

What works without the fix: Sentry.startSpan() — goes through Sentry's internal pipeline, not the OTel global.

What breaks without the fix: Any spans created via @opentelemetry/api (AI SDK's gen_ai.* spans, HTTP instrumentations, etc.) — these hit the pre-existing Supabase provider instead of Sentry's.

Calling trace.disable() clears the global, allowing trace.setGlobalTracerProvider() to succeed. This matches the pattern already used in cleanupOtel() in the test file and is safe because:

  1. It only runs once during Sentry.init()
  2. Any pre-existing provider is immediately replaced by Sentry's
  3. It's gated behind skipOpenTelemetrySetup so users with custom OTel setups can opt out
  4. The Cloudflare package was investigated and doesn't have the same issue

Test plan

  • Updated the "should override pre-existing OTel provider with Sentry provider" unit test — simulates a pre-existing provider and verifies Sentry overrides it
  • Updated the "should override native Deno OpenTelemetry when enabled" unit test — verifies Sentry captures spans even when OTEL_DENO=true
  • E2E test app (dev-packages/e2e-tests/test-applications/deno/) — Deno server with pre-existing OTel provider, 5 tests:
    • Error capture (Sentry.captureException)
    • Sentry.startSpan transaction
    • OTel tracer.startSpan despite pre-existing provider (core regression test)
    • OTel tracer.startActiveSpan (AI SDK pattern)
    • Sentry + OTel interop (OTel child inside Sentry parent)
  • Verified manually with Supabase Edge Function + AI SDK: Sentry.startSpan() spans appeared in Sentry both before and after the fix, but gen_ai.* OTel spans only appeared after the fix

🤖 Generated with Claude Code

Closes #19724

…ovider

Supabase Edge Runtime (and Deno's native OTel) pre-registers on the
`@opentelemetry/api` global, causing `trace.setGlobalTracerProvider()`
to silently fail. Call `trace.disable()` first so Sentry's provider
always wins.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Contributor

github-actions bot commented Mar 9, 2026

size-limit report 📦

⚠️ Warning: Base artifact is not the latest one, because the latest workflow run is not done yet. This may lead to incorrect results. Try to re-run all tests to get up to date results.

Path Size % Change Change
@sentry/browser 25.64 kB +0.05% +12 B 🔺
@sentry/browser - with treeshaking flags 24.14 kB +0.03% +7 B 🔺
@sentry/browser (incl. Tracing) 42.62 kB +0.44% +184 B 🔺
@sentry/browser (incl. Tracing, Profiling) 47.28 kB +0.4% +187 B 🔺
@sentry/browser (incl. Tracing, Replay) 81.42 kB +0.22% +171 B 🔺
@sentry/browser (incl. Tracing, Replay) - with treeshaking flags 71 kB +0.18% +125 B 🔺
@sentry/browser (incl. Tracing, Replay with Canvas) 86.12 kB +0.21% +178 B 🔺
@sentry/browser (incl. Tracing, Replay, Feedback) 98.37 kB +0.18% +167 B 🔺
@sentry/browser (incl. Feedback) 42.45 kB +0.03% +9 B 🔺
@sentry/browser (incl. sendFeedback) 30.31 kB +0.05% +14 B 🔺
@sentry/browser (incl. FeedbackAsync) 35.36 kB +0.04% +12 B 🔺
@sentry/browser (incl. Metrics) 26.92 kB +0.48% +126 B 🔺
@sentry/browser (incl. Logs) 27.07 kB +0.48% +128 B 🔺
@sentry/browser (incl. Metrics & Logs) 27.74 kB +0.46% +127 B 🔺
@sentry/react 27.39 kB +0.04% +9 B 🔺
@sentry/react (incl. Tracing) 44.95 kB +0.41% +180 B 🔺
@sentry/vue 30.08 kB +0.03% +7 B 🔺
@sentry/vue (incl. Tracing) 44.48 kB +0.42% +183 B 🔺
@sentry/svelte 25.66 kB +0.04% +9 B 🔺
CDN Bundle 28.27 kB +0.35% +97 B 🔺
CDN Bundle (incl. Tracing) 43.5 kB +0.55% +237 B 🔺
CDN Bundle (incl. Logs, Metrics) 29.13 kB +0.42% +121 B 🔺
CDN Bundle (incl. Tracing, Logs, Metrics) 44.34 kB +0.56% +243 B 🔺
CDN Bundle (incl. Replay, Logs, Metrics) 68.2 kB +0.16% +107 B 🔺
CDN Bundle (incl. Tracing, Replay) 80.32 kB +0.23% +183 B 🔺
CDN Bundle (incl. Tracing, Replay, Logs, Metrics) 81.22 kB +0.27% +218 B 🔺
CDN Bundle (incl. Tracing, Replay, Feedback) 85.86 kB +0.24% +205 B 🔺
CDN Bundle (incl. Tracing, Replay, Feedback, Logs, Metrics) 86.76 kB +0.27% +226 B 🔺
CDN Bundle - uncompressed 82.56 kB +0.26% +211 B 🔺
CDN Bundle (incl. Tracing) - uncompressed 128.5 kB +0.34% +433 B 🔺
CDN Bundle (incl. Logs, Metrics) - uncompressed 85.43 kB +0.29% +244 B 🔺
CDN Bundle (incl. Tracing, Logs, Metrics) - uncompressed 131.37 kB +0.36% +466 B 🔺
CDN Bundle (incl. Replay, Logs, Metrics) - uncompressed 209.06 kB +0.11% +212 B 🔺
CDN Bundle (incl. Tracing, Replay) - uncompressed 245.35 kB +0.17% +401 B 🔺
CDN Bundle (incl. Tracing, Replay, Logs, Metrics) - uncompressed 248.21 kB +0.18% +434 B 🔺
CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed 258.26 kB +0.16% +401 B 🔺
CDN Bundle (incl. Tracing, Replay, Feedback, Logs, Metrics) - uncompressed 261.11 kB +0.17% +434 B 🔺
@sentry/nextjs (client) 47.37 kB +0.39% +184 B 🔺
@sentry/sveltekit (client) 43.07 kB +0.42% +178 B 🔺
@sentry/node-core 52.27 kB +0.06% +31 B 🔺
@sentry/node 174.76 kB +0.04% +53 B 🔺
@sentry/node - without tracing 97.43 kB +0.06% +50 B 🔺
@sentry/aws-serverless 113.23 kB +0.04% +45 B 🔺

View base workflow run

Member

@Lms24 Lms24 left a comment


Without this fix, AI SDK OTel spans (gen_ai.*) never reach Sentry because the Sentry TracerProvider is never actually set as the global

H: Could you clarify this? I suspect that either no spans at all are sent or all of them should be sent. What makes gen_ai spans special here? This sounds to me like an agent partially analyzed a problem and didn't grasp the full scope of it. I don't think we can merge this until we know the consequences and the current state of tracing.

export function setupOpenTelemetryTracer(): void {
  // Clear any pre-existing OTel global registration (e.g. from Supabase Edge Runtime
  // or Deno's built-in OTel) so Sentry's TracerProvider gets registered successfully.
  trace.disable();
Member


m l: I'm wondering if this backfires for people using Sentry with a custom OTel setup or deliberately with Deno's native tracing (OTLP exporter). The good news is that we don't document this setup for Deno, so I think we can just ignore it for the moment and walk back on this change if anyone complains.

Update: I just saw that we gate this function call with skipOpenTelemetrySetup, so users can opt out of it. That's good. So I guess the worst consequence here is that anyone using native tracing with Sentry might need to set this flag now. Which we can classify as a fix because that's how we intended the SDK to work anyway. Downgraded from logaf M to L
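The opt-out gating mentioned here can be sketched as follows. Only the option name `skipOpenTelemetrySetup` comes from the PR; the `init` and setup bodies are illustrative stand-ins, not the SDK source.

```typescript
// Sketch of the skipOpenTelemetrySetup gating described above (illustrative).
const setupCalls: string[] = [];

function setupOpenTelemetryTracer(): void {
  // In the real SDK this is where trace.disable() +
  // trace.setGlobalTracerProvider() run.
  setupCalls.push('otel-setup');
}

interface DenoClientOptions {
  skipOpenTelemetrySetup?: boolean;
}

function init(options: DenoClientOptions = {}): void {
  if (!options.skipOpenTelemetrySetup) {
    setupOpenTelemetryTracer();
  }
}

init({ skipOpenTelemetrySetup: true }); // custom OTel setups: global left untouched
init();                                 // default: Sentry clears and registers
console.log(setupCalls.length); // 1
```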

Member


comment, no direct action required: I'm fine with these unit tests for now, but to be clear, they don't prove that the fix works as intended in an actual app. Long-term I'd like us to at least add one e2e app for Deno (or an integration test setup like the one for Node) to verify this more reliably. This goes back to my main review comment: I don't think we've fully grasped the scope of the current behavior yet. If we had such a test, we could say more reliably that at least some spans are sent.

Member Author


Agreed on the e2e gap — the unit tests verify the mechanism but not real-world behavior. We tested manually with a Supabase Edge Function + AI SDK and confirmed gen_ai.* spans appear in Sentry with the fix and don't without it. Happy to share the test app as a reference.

The e2e infrastructure is all there (Verdaccio, test-utils, Playwright). A Deno e2e app would just need a Deno.serve() server with Sentry + a route triggering OTel-instrumented spans, then Playwright tests asserting they arrive. Could be a follow-up PR.

Member Author

@sergical sergical left a comment


Could you clarify this? I suspect that either no spans at all are sent or all of them should be sent. What makes gen_ai spans special here?

The scope is OTel-instrumented spans specifically:

  • Sentry.startSpan() works fine — it goes through Sentry's internal span pipeline, not the OTel global TracerProvider. We confirmed this: custom Sentry.startSpan spans showed up in Sentry.
  • Spans from OTel-instrumented libraries are lost — AI SDK (and anything else using @opentelemetry/api's tracer.startSpan()) hits the pre-existing Supabase provider instead of Sentry's. The gen_ai.* spans from AI SDK were the ones we noticed missing, but it would affect any OTel instrumentation.

This was confirmed in the test application I was running https://github.com/sergical/supabase-ai-test/blob/main/supabase/functions/ai-chat/index.ts#L14

Supabase Edge Runtime registers a TracerProvider on the OTel global before user code runs. trace.setGlobalTracerProvider() is a no-op when a provider already exists (just logs a diag warning), so Sentry's provider silently fails to register. Sentry's own API bypasses this because it doesn't go through the OTel global to create spans.
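The two span paths can be contrasted with a self-contained sketch. Provider and tracer shapes here are illustrative stand-ins; only the routing behavior (global-slot lookup vs. Sentry's internal pipeline) is the point.

```typescript
// Illustrative contrast of the two span paths described above.
const GLOBAL_KEY = Symbol.for('opentelemetry.js.api.1');
const reg = globalThis as unknown as Record<symbol, unknown>;

interface Provider {
  spans: string[];
}
const supabase: Provider = { spans: [] };
const sentry: Provider = { spans: [] };

// The pre-existing runtime provider claims the global slot first...
reg[GLOBAL_KEY] = supabase;
// ...so Sentry's later registration is a no-op (slot already taken):
if (reg[GLOBAL_KEY] === undefined) {
  reg[GLOBAL_KEY] = sentry;
}

// Path 1: a library using @opentelemetry/api resolves the global provider:
function libraryStartSpan(name: string): void {
  (reg[GLOBAL_KEY] as Provider).spans.push(name);
}

// Path 2: Sentry.startSpan goes through Sentry's internal pipeline directly:
function sentryStartSpan(name: string): void {
  sentry.spans.push(name);
}

libraryStartSpan('gen_ai.generate'); // lands in the Supabase provider — lost to Sentry
sentryStartSpan('my-transaction');   // reaches Sentry regardless of the global

console.log(supabase.spans); // [ 'gen_ai.generate' ]
console.log(sentry.spans);   // [ 'my-transaction' ]
```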

I'll update the PR description to be more precise about this.

sergical and others added 2 commits March 10, 2026 14:26
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Contributor

github-actions bot commented Mar 10, 2026

node-overhead report 🧳

Note: This is a synthetic benchmark with a minimal express app and does not necessarily reflect the real-world performance impact in an application.

Scenario Requests/s % of Baseline Prev. Requests/s Change %
GET Baseline 10,967 - 8,966 +22%
GET With Sentry 1,915 17% 1,700 +13%
GET With Sentry (error only) 7,454 68% 5,974 +25%
POST Baseline 1,135 - 1,189 -5%
POST With Sentry 545 48% 590 -8%
POST With Sentry (error only) 1,009 89% 1,039 -3%
MYSQL Baseline 3,888 - 3,250 +20%
MYSQL With Sentry 462 12% 451 +2%
MYSQL With Sentry (error only) 3,178 82% 2,632 +21%

View base workflow run

- Add @sentry/deno to package.json so matrix builder triggers on SDK changes
- Add Deno setup step in build.yml for CI runtime
- Fix deno.json relative import paths in copyToTemp.ts for temp dir copies

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.

Bugbot Autofix prepared a fix for the issue found in the latest run.

  • ✅ Fixed: E2E test filters look for root span in child spans array
    • Changed the transaction wait filters to match on event.transaction for root spans so the tests no longer hang.

Preview (23e98514d3)
diff --git a/dev-packages/e2e-tests/test-applications/deno/tests/transactions.test.ts b/dev-packages/e2e-tests/test-applications/deno/tests/transactions.test.ts
--- a/dev-packages/e2e-tests/test-applications/deno/tests/transactions.test.ts
+++ b/dev-packages/e2e-tests/test-applications/deno/tests/transactions.test.ts
@@ -3,7 +3,7 @@
 
 test('Sends transaction with Sentry.startSpan', async ({ baseURL }) => {
   const transactionPromise = waitForTransaction('deno', event => {
-    return event?.spans?.some(span => span.description === 'test-sentry-span') ?? false;
+    return event?.transaction === 'test-sentry-span';
   });
 
   await fetch(`${baseURL}/test-sentry-span`);
@@ -62,7 +62,7 @@
 
 test('OTel span appears as child of Sentry span (interop)', async ({ baseURL }) => {
   const transactionPromise = waitForTransaction('deno', event => {
-    return event?.spans?.some(span => span.description === 'sentry-parent') ?? false;
+    return event?.transaction === 'sentry-parent';
   });
 
   await fetch(`${baseURL}/test-interop`);
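The reason the changed filters matter can be shown with a small illustration. The event shape here is a simplification (an assumption, not the full Sentry transaction schema): for a root-only transaction, the root span's name lives on `event.transaction`, while `event.spans` holds only child spans.

```typescript
// Why the old filter never matches a root-only transaction (illustrative).
interface TransactionEvent {
  transaction?: string;
  spans?: Array<{ description?: string }>;
}

// Root span name on `transaction`; no child spans in this test:
const event: TransactionEvent = { transaction: 'test-sentry-span', spans: [] };

const oldFilter = (e: TransactionEvent): boolean =>
  e.spans?.some(span => span.description === 'test-sentry-span') ?? false;
const newFilter = (e: TransactionEvent): boolean =>
  e.transaction === 'test-sentry-span';

// The old filter returns false forever, so the wait hangs; the new one matches:
console.log(oldFilter(event), newFilter(event)); // false true
```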


sergical and others added 3 commits March 10, 2026 15:13
Replace `existsSync` check + `readFileSync` with a single `readFileSync`
wrapped in try-catch to eliminate the CodeQL-flagged race condition.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Deno couldn't find `@opentelemetry/api` in node_modules because it
wasn't listed in package.json. Adding `"nodeModulesDir": "auto"` lets
Deno auto-install npm: specifier deps on startup.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Revert `nodeModulesDir: "auto"` and add `@opentelemetry/api` to
package.json instead. This lets pnpm install OTel into node_modules
so Deno resolves it in manual mode without creating a `.deno/` dir
that duplicates playwright.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>