feat(app): harden export flow and release gates #4
- add codex verification and branch hygiene guardrails
- bootstrap husky, version sync, and local toolchain support
- align the desktop finish lane with generated-file protections

Tests: bash .codex/scripts/run_verify_commands.sh
- add the release runbook and export-quality ADR
- capture perf baselines and supporting proof scripts
- bundle release verification helpers for repeatable closeout

Tests: bash .codex/scripts/run_verify_commands.sh
- switch history export to PDF and update the desktop bridge
- mask saved API keys and add frontend and Rust-side coverage
- refresh docs to match the shipped export and privacy behavior

Tests: bash .codex/scripts/run_verify_commands.sh
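The API-key masking mentioned above could be sketched roughly like this (a minimal illustration only; `maskApiKey` and the visible-character count are assumptions, not the shipped frontend code):

```typescript
// Hypothetical sketch of masking a saved API key for display:
// keep a few leading/trailing characters and hide the middle.
function maskApiKey(key: string, visible: number = 4): string {
  // Very short keys are fully masked so nothing useful leaks.
  if (key.length <= visible * 2) {
    return '•'.repeat(key.length);
  }
  return (
    key.slice(0, visible) +
    '•'.repeat(key.length - visible * 2) +
    key.slice(-visible)
  );
}

console.log(maskApiKey('sk-1234567890abcdef')); // sk-1•••••••••••cdef
```

Masking on display (rather than storing a masked copy) keeps the real key usable by the Rust side while never echoing it back in full to the UI.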
- replace the invalid `--bundles none` CLI usage in desktop smoke builds
- keep the smoke job focused on compile-only validation across GitHub runners

Tests: not run (workflow-only change)
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 550e2f49bf
```
required: ['content_input_id'],
properties: { content_input_id: { type: 'string' } },
```
Use camelCase keys in generated command contract
The generated contract is described as the frontend command surface, but request schemas still require snake_case keys (for example content_input_id here, plus api_key, page_size, and source_url elsewhere in the same file). In this commit, the actual invoke payloads were switched to camelCase in src/lib/tauriApi.ts (contentInputId, apiKey, pageSize, sourceUrl), so consumers following the generated OpenAPI will send incorrect arguments and hit missing-parameter errors at runtime.
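One way to reconcile the two surfaces, assuming the generated contract is ordinary JSON Schema objects, is to camel-case the schema keys to match the payloads (`toCamelCase` and `camelizeSchema` are hypothetical helpers for illustration, not code from this PR):

```typescript
// Hypothetical post-processing step: rename snake_case schema keys to
// camelCase so the generated contract matches the invoke payloads.
function toCamelCase(key: string): string {
  return key.replace(/_([a-z])/g, (_m, ch: string) => ch.toUpperCase());
}

interface RequestSchema {
  required?: string[];
  properties?: Record<string, unknown>;
}

function camelizeSchema(schema: RequestSchema): RequestSchema {
  return {
    ...schema,
    // Rename both the `required` entries and the `properties` keys so the
    // two stay consistent with each other.
    required: (schema.required ?? []).map(toCamelCase),
    properties: Object.fromEntries(
      Object.entries(schema.properties ?? {}).map(([k, v]) => [toCamelCase(k), v]),
    ),
  };
}

const fixed = camelizeSchema({
  required: ['content_input_id'],
  properties: { content_input_id: { type: 'string' } },
});
console.log(fixed.required); // [ 'contentInputId' ]
```

Alternatively, the generator itself could be configured to emit camelCase; the key point is that the contract and `src/lib/tauriApi.ts` must agree on one convention.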
```
process.exit(2);
}

const ratio = (c - b) / b;
```
Handle zero baselines before computing perf ratios
compare-metric.mjs divides by the baseline value without checking for zero, and this same commit seeds baselines like .perf-baselines/build-time.json and .perf-baselines/bundle.json with 0. Any non-zero current metric will produce Infinity and be treated as a regression, which makes the comparison script unusable (or permanently failing) when these baseline files are used.
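A guard along these lines would keep seeded-zero baselines from exploding the ratio (sketch only; `regressionRatio` is a hypothetical name, not the actual compare-metric.mjs function):

```typescript
// Hypothetical guard for the ratio computation: a zero baseline means
// "not yet measured", so report no ratio instead of Infinity.
function regressionRatio(current: number, baseline: number): number | null {
  if (baseline === 0) {
    return null; // caller can seed the baseline from `current` instead
  }
  return (current - baseline) / baseline;
}

console.log(regressionRatio(120, 100)); // 0.2
console.log(regressionRatio(5, 0)); // null
```

The `null` branch lets the comparison script distinguish "first run, record a baseline" from "real regression", instead of permanently failing on freshly seeded files.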
What
Why
How
Testing
bash .codex/scripts/run_verify_commands.sh ✅

Lockfile rationale
pnpm-lock.yaml changed to capture the release-closeout dependency graph and generated contract tooling updates already exercised in the local verify lane.

Baseline governance
- label applied: local release-closeout baseline refresh
- sync the .perf-baselines/* files with the release-closeout branch if perf signal regresses on GitHub

Risk / Notes
- openapi/openapi.generated.json changed as part of the contract/release lane