
fix: improve regression-tests workflow structure and concurrency#2948

Open
jorgemoya wants to merge 3 commits into `canary` from `fix-regressions-test-catalyst`

Conversation

@jorgemoya
Contributor

@jorgemoya jorgemoya commented Mar 23, 2026

Summary

  • Remove the top-level workflow concurrency group — each job now manages its own concurrency independently
  • Simplify the `detect-provider` job: remove the `is-canary-branch` output, consolidate duplicated logic, and hoist `is-preview` / `PKG_NAME` to avoid repeated conditionals
  • Fix preview concurrency groups: use `deployment.ref` instead of `deployment.sha` so same-branch runs actually cancel stale audits (`cancel-in-progress` was a no-op before, since each SHA produced a unique group)
  • Fix production concurrency groups: use ref-based groups for preview environments (stale baseline audits get cancelled) and SHA-based groups for production (each canary deploy stays isolated)
  • Simplify the `unlighthouse-audit-production` condition from `(is-preview == 'true' && production-url != '') || is-canary-branch == 'true'` to just `production-url != ''`
  • Fix the `unlighthouse-report` condition to use `is-preview == 'false' && production-url != ''` instead of the removed `is-canary-branch` output
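
A minimal sketch of the job-level concurrency described above (job and group names are illustrative, not necessarily the exact ones in the workflow):

```yaml
jobs:
  # Preview audit: ref-based group, so a newer push to the same branch
  # cancels the stale in-flight audit.
  unlighthouse-audit-preview:
    concurrency:
      group: regression-preview-${{ github.event.deployment.ref }}
      cancel-in-progress: true

  # Production audit: ref-based (cancellable) for PR preview baselines,
  # SHA-based (isolated, never cancelled) for canary production deploys.
  unlighthouse-audit-production:
    concurrency:
      group: regression-production-${{ github.event.deployment.sha }}
      cancel-in-progress: false
```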

Test plan

  • Push two commits quickly to an open PR → first run's preview + production baseline audits get cancelled by the second
  • Merge to canary → production audit runs to completion, never cancelled
  • Two different PRs deploying simultaneously → separate groups (different refs), no interference
  • Cloudflare and unknown provider paths still set all required outputs

🤖 Generated with Claude Code

@changeset-bot

changeset-bot bot commented Mar 23, 2026

⚠️ No Changeset found

Latest commit: 0cf7429

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types

Click here to learn what changesets are, and how to add one.

Click here if you're a maintainer who wants to add a changeset to this PR

@vercel

vercel bot commented Mar 23, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Actions Updated (UTC)
catalyst Ready Ready Preview, Comment Mar 24, 2026 1:57pm


@jorgemoya jorgemoya changed the title fix: prevent cancel in progress for regression-tests fix: split regression tests into preview and production workflows Mar 23, 2026
@github-actions
Contributor

github-actions bot commented Mar 23, 2026

Bundle Size Report

Comparing against baseline from e198d89 (2026-03-24).

No bundle size changes detected.

@jorgemoya jorgemoya force-pushed the fix-regressions-test-catalyst branch from 1ebcd6c to 6d4c38b Compare March 23, 2026 22:15
@jorgemoya jorgemoya marked this pull request as ready for review March 23, 2026 22:17
@jorgemoya jorgemoya requested a review from a team as a code owner March 23, 2026 22:17
@github-actions
Contributor

github-actions bot commented Mar 23, 2026

Unlighthouse Performance Comparison — Vercel

Comparing PR preview deployment Unlighthouse scores vs production Unlighthouse scores.

Summary Score

Aggregate score across all categories as reported by Unlighthouse.

|       | Prod Desktop | Prod Mobile | Preview Desktop | Preview Mobile |
| ----- | ------------ | ----------- | --------------- | -------------- |
| Score | 92           | 94          | 91              | 94             |

Category Scores

| Category       | Prod Desktop | Prod Mobile | Preview Desktop | Preview Mobile |
| -------------- | ------------ | ----------- | --------------- | -------------- |
| Performance    | 74           | 99          | 75              | 76             |
| Accessibility  | 95           | 95          | 95              | 95             |
| Best Practices | 100          | 100         | 95              | 100            |
| SEO            | 100          | 100         | 100             | 100            |

Core Web Vitals

| Metric              | Prod Desktop | Prod Mobile | Preview Desktop | Preview Mobile |
| ------------------- | ------------ | ----------- | --------------- | -------------- |
| LCP                 | 4.9 s        | 2.0 s       | 3.7 s           | 4.4 s          |
| CLS                 | 0.037        | 0           | 0.05            | 0.191          |
| FCP                 | 1.2 s        | 1.2 s       | 1.2 s           | 1.2 s          |
| TBT                 | 10 ms        | 0 ms        | 30 ms           | 0 ms           |
| Max Potential FID   | 60 ms        | 50 ms       | 70 ms           | 60 ms          |
| Time to Interactive | 4.9 s        | 2.0 s       | 5.6 s           | 4.4 s          |

Full Unlighthouse report →


```diff
 concurrency:
-  group: ${{ github.workflow }}-${{ github.ref }}-${{ github.event.deployment_status.target_url }}
+  group: regression-preview-${{ github.event.deployment.sha }}
```
Contributor


I believe you can move this down to the job level so you don't need to split this into two separate workflow files (see https://docs.github.com/en/actions/how-tos/write-workflows/choose-when-workflows-run/control-workflow-concurrency), and key off the outputs from `detect-provider`.
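
The suggestion above might look roughly like this (a sketch — job names, the output name, and the detection step are illustrative, based on the workflow described in this PR):

```yaml
jobs:
  detect-provider:
    runs-on: ubuntu-latest
    outputs:
      is-preview: ${{ steps.detect.outputs.is-preview }}
    steps:
      - id: detect
        # Placeholder for the real provider/environment detection logic.
        run: echo "is-preview=true" >> "$GITHUB_OUTPUT"

  unlighthouse-audit:
    needs: detect-provider
    # Job-level concurrency keyed off detect-provider outputs:
    # ref-based groups (with cancellation) for previews,
    # SHA-based groups (no cancellation) for production.
    concurrency:
      group: regression-${{ needs.detect-provider.outputs.is-preview == 'true' && github.event.deployment.ref || github.event.deployment.sha }}
      cancel-in-progress: ${{ needs.detect-provider.outputs.is-preview == 'true' }}
    runs-on: ubuntu-latest
    steps:
      - run: echo "audit steps go here"
```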

Contributor Author


Updated 👍

@jorgemoya jorgemoya changed the title fix: split regression tests into preview and production workflows fix: use SHA-based concurrency for regression tests Mar 24, 2026
@jorgemoya jorgemoya changed the title fix: use SHA-based concurrency for regression tests fix: use ref-based concurrency groups for regression tests Mar 24, 2026
@jorgemoya jorgemoya changed the title fix: use ref-based concurrency groups for regression tests fix: improve regression-tests workflow structure and concurrency Mar 24, 2026
@jorgemoya jorgemoya requested a review from chanceaclark March 24, 2026 13:59
