Changes from all commits
Commits
20 commits
5810cbe  docs: close phase 5 status gap (flyingrobots, Mar 17, 2026)
219ba2c  feat(tooling): harden post-phase5 dev loop (flyingrobots, Mar 17, 2026)
a6ffe8f  test(tooling): disable signing in verify-local fixtures (flyingrobots, Mar 17, 2026)
e1e8e4b  feat(wasm): remove legacy read adapters for abi v2 (flyingrobots, Mar 17, 2026)
dfd31bd  feat(wasm): land abi v3 logical clock boundary (flyingrobots, Mar 18, 2026)
9d29321  chore(hooks): record hook timings in local csv logs (flyingrobots, Mar 18, 2026)
7874587  feat(replay): rebuild full worldline state from provenance (flyingrobots, Mar 18, 2026)
4ab2bd7  feat(replay): cut over playback to full worldline checkpoints (flyingrobots, Mar 18, 2026)
8f868fb  feat(replay): accelerate provenance replay from full-state checkpoints (flyingrobots, Mar 18, 2026)
2c2086a  feat(replay): unify playback with provenance replay (flyingrobots, Mar 19, 2026)
1d94c7d  docs: mark phase 7 complete in implementation plan (flyingrobots, Mar 19, 2026)
9f62438  fix(tooling): skip deleted markdown paths in docs lint (flyingrobots, Mar 19, 2026)
3ae7143  fix(replay): harden scheduler and checkpoint invariants (flyingrobots, Mar 19, 2026)
c8df037  fix(ttd-browser): adapt browser cursors to typed ticks (flyingrobots, Mar 19, 2026)
e753280  chore(tooling): clean up PR feedback follow-ups (flyingrobots, Mar 19, 2026)
a78999e  test(playback): prove checkpoint restore path (flyingrobots, Mar 19, 2026)
7124311  test(warp-core): fix append-only provenance invariants (flyingrobots, Mar 19, 2026)
754bd88  test(playback): remove redundant checkpoint clone (flyingrobots, Mar 20, 2026)
de346c9  test(playback): document stable isolated-node fixture roots (flyingrobots, Mar 20, 2026)
a7b3355  fix(replay): harden replay checkpoints and review tooling (flyingrobots, Mar 20, 2026)
58 changes: 58 additions & 0 deletions .githooks/_timing.sh
@@ -0,0 +1,58 @@
#!/usr/bin/env bash
# SPDX-License-Identifier: Apache-2.0
# © James Ross Ω FLYING•ROBOTS <https://github.com/flyingrobots>

hook_timing_now_ns() {
if command -v python3 >/dev/null 2>&1; then
python3 - <<'PY'
import time

print(time.monotonic_ns())
PY
else
printf '%s000000000\n' "$(date +%s)"
fi
}
Comment on lines +5 to +15
🧹 Nitpick | 🔵 Trivial

Fallback mixing could produce invalid deltas if python3 availability changes mid-hook.

If python3 is available at hook_timing_prepare but not at hook_timing_append (or vice versa), you'll subtract epoch-seconds-as-nanoseconds from monotonic nanoseconds, producing garbage. The negative clamp at lines 41-43 catches the symptom but masks the cause.

This is local dev tooling and the scenario is unlikely, so low priority.

Possible fix: cache the timing method at prepare time

```diff
 hook_timing_prepare() {
   DX_HOOK_REPO_ROOT="$1"
   DX_HOOK_NAME="$2"
+  if command -v python3 >/dev/null 2>&1; then
+    DX_HOOK_TIMING_METHOD="python"
+  else
+    DX_HOOK_TIMING_METHOD="date"
+  fi
   DX_HOOK_START_NS="$(hook_timing_now_ns)"
   DX_HOOK_TIMING_RECORDED=0
 }
```

Then use DX_HOOK_TIMING_METHOD in hook_timing_now_ns to ensure consistent method.
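The consumer side of that change could look like this sketch (hypothetical; `DX_HOOK_TIMING_METHOD` is the variable proposed above, not something already in the repo):

```shell
# Hypothetical sketch: hook_timing_now_ns honors the method cached by
# hook_timing_prepare instead of re-probing for python3 at call time.
hook_timing_now_ns() {
  if [[ "${DX_HOOK_TIMING_METHOD:-date}" == "python" ]]; then
    python3 -c 'import time; print(time.monotonic_ns())'
  else
    # Same fallback as the current code: epoch seconds widened to nanoseconds.
    printf '%s000000000\n' "$(date +%s)"
  fi
}
```

Because prepare and append both read the same cached value, the start and end timestamps always come from the same clock source.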

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.githooks/_timing.sh around lines 5 - 15, The current hook_timing_now_ns may
use python3 or date depending on availability, which can change between
hook_timing_prepare and hook_timing_append and produce invalid deltas; modify
hook_timing_prepare to detect and export a consistent timing method (e.g. set
DX_HOOK_TIMING_METHOD="monotonic_ns" or "epoch_s_as_ns") and then update
hook_timing_now_ns to read DX_HOOK_TIMING_METHOD and always use the cached
method (calling python3 only if DX_HOOK_TIMING_METHOD indicates monotonic_ns,
otherwise using date), ensuring prepare/append use the same clock source; keep
existing negative-clamp logic but remove reliance on ephemeral command detection
at call time.


hook_timing_prepare() {
DX_HOOK_REPO_ROOT="$1"
DX_HOOK_NAME="$2"
DX_HOOK_START_NS="$(hook_timing_now_ns)"
DX_HOOK_TIMING_RECORDED=0
}

hook_timing_append() {
local exit_code="${1:-$?}"
if [[ "${DX_HOOK_TIMING_RECORDED:-0}" == "1" ]]; then
return 0
fi
DX_HOOK_TIMING_RECORDED=1

local repo_root="${DX_HOOK_REPO_ROOT:-}"
local hook_name="${DX_HOOK_NAME:-}"
local start_ns="${DX_HOOK_START_NS:-}"
if [[ -z "$repo_root" || -z "$hook_name" || -z "$start_ns" ]]; then
return 0
fi

local end_ns elapsed_ns elapsed_ms csv_dir csv_file timestamp_utc
end_ns="$(hook_timing_now_ns)"
elapsed_ns=$(( end_ns - start_ns ))
if (( elapsed_ns < 0 )); then
elapsed_ns=0
fi
elapsed_ms=$(( elapsed_ns / 1000000 ))
csv_dir="${repo_root}/.dx-debug"
csv_file="${csv_dir}/${hook_name}-times.csv"
timestamp_utc="$(date -u +"%Y-%m-%dT%H:%M:%SZ")"

mkdir -p "$csv_dir" 2>/dev/null || return 0
if [[ ! -f "$csv_file" ]]; then
printf 'timestamp_utc,elapsed_ms,exit_code,pid\n' >>"$csv_file" 2>/dev/null || return 0
fi
Comment on lines +49 to +52
🧹 Nitpick | 🔵 Trivial

CSV header write has a minor race condition.

Two hooks starting simultaneously could both see a missing file and both append headers. For local dev tooling this is cosmetic.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.githooks/_timing.sh around lines 49 - 52, The header-write has a race: two
concurrent hook runs can both see no file and append duplicate headers; fix by
obtaining an exclusive lock on "$csv_file" before checking/writing (use a file
descriptor with flock), e.g. open "$csv_file" for append (exec
200>>"$csv_file"), flock 200, then check if the file is empty/has no header
(e.g. [[ ! -s "$csv_file" ]] or grep for the header) and only then write the
header to the locked FD (printf 'timestamp_utc,elapsed_ms,exit_code,pid\n'
>&200), finally release the lock (flock -u) and close the FD; reference
variables: csv_file, csv_dir and the header string
'timestamp_utc,elapsed_ms,exit_code,pid\n'.
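Under those assumptions, the lock-guarded header write could be sketched like this (hypothetical; it assumes util-linux `flock` and degrades to the current unlocked behavior when `flock` is absent):

```shell
# Hypothetical lock-guarded header write for the timing CSV.
csv_dir=".dx-debug"
csv_file="${csv_dir}/example-times.csv"
mkdir -p "$csv_dir"
# Open the CSV on FD 200 and take an exclusive lock before the
# check-then-append, so two concurrent hooks cannot both write the header.
exec 200>>"$csv_file"
if command -v flock >/dev/null 2>&1; then
  flock 200
fi
if [[ ! -s "$csv_file" ]]; then
  printf 'timestamp_utc,elapsed_ms,exit_code,pid\n' >&200
fi
if command -v flock >/dev/null 2>&1; then
  flock -u 200
fi
exec 200>&-
```

Checking `! -s` (empty file) rather than `! -f` (missing file) also covers the case where another hook created the file but has not written the header yet.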

printf '%s,%s,%s,%s\n' \
"$timestamp_utc" \
"$elapsed_ms" \
"$exit_code" \
"$$" >>"$csv_file" 2>/dev/null || true
}
7 changes: 7 additions & 0 deletions .githooks/commit-msg
@@ -1,6 +1,13 @@
#!/usr/bin/env bash
# SPDX-License-Identifier: Apache-2.0
set -euo pipefail
HOOK_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$HOOK_DIR/.." && pwd)"
# shellcheck source=.githooks/_timing.sh
source "$HOOK_DIR/_timing.sh"
hook_timing_prepare "$REPO_ROOT" "commit-msg"
trap 'hook_timing_append $?' EXIT

FILE="${1:-}"
if [[ -z "$FILE" || ! -f "$FILE" ]]; then
echo "[commit-msg] Missing commit message file. Usage: commit-msg <path>" >&2
12 changes: 11 additions & 1 deletion .githooks/pre-commit
@@ -1,6 +1,13 @@
#!/usr/bin/env bash
# SPDX-License-Identifier: Apache-2.0
set -euo pipefail
HOOK_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$HOOK_DIR/.." && pwd)"
# shellcheck source=.githooks/_timing.sh
source "$HOOK_DIR/_timing.sh"
hook_timing_prepare "$REPO_ROOT" "pre-commit"
trap 'hook_timing_append $?' EXIT
cd "$REPO_ROOT"

# 0) Sweep stale build artifacts on odd hours (keeps target/ from ballooning)
# Manual: ./scripts/sweep-stale-artifacts.sh [days]
@@ -117,7 +124,10 @@ fi

# 9) Markdown formatting and linting for staged .md files
# Use -z and --diff-filter=ACMRT to exclude deletions and handle whitespace safely
mapfile -d '' MD_FILES < <(git diff --cached --name-only -z --diff-filter=ACMRT -- '*.md')
MD_FILES=()
while IFS= read -r -d '' md_file; do
MD_FILES+=("$md_file")
done < <(git diff --cached --name-only -z --diff-filter=ACMRT -- '*.md')
if [[ ${#MD_FILES[@]} -gt 0 ]] && command -v npx >/dev/null 2>&1; then
echo "pre-commit: linting markdown files"

10 changes: 8 additions & 2 deletions .githooks/pre-push
@@ -6,5 +6,11 @@
# for the full workspace gates only when the changed paths justify it.
set -euo pipefail

REPO_ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
exec "$REPO_ROOT/scripts/verify-local.sh" pre-push
HOOK_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$HOOK_DIR/.." && pwd)"
# shellcheck source=.githooks/_timing.sh
source "$HOOK_DIR/_timing.sh"
hook_timing_prepare "$REPO_ROOT" "pre-push"
trap 'hook_timing_append $?' EXIT
cd "$REPO_ROOT"
"$REPO_ROOT/scripts/verify-local.sh" pre-push
32 changes: 30 additions & 2 deletions .githooks/pre-push-parallel
@@ -2,6 +2,12 @@
# SPDX-License-Identifier: Apache-2.0
# Parallel pre-push hook - runs stages concurrently with isolated target dirs
set -euo pipefail
HOOK_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$HOOK_DIR/.." && pwd)"
# shellcheck source=.githooks/_timing.sh
source "$HOOK_DIR/_timing.sh"
hook_timing_prepare "$REPO_ROOT" "pre-push-parallel"
cd "$REPO_ROOT"

PINNED_FROM_FILE=$(awk -F '"' '/^channel/ {print $2}' rust-toolchain.toml 2>/dev/null || echo "")
PINNED="${PINNED:-${PINNED_FROM_FILE:-1.90.0}}"
@@ -31,16 +37,38 @@ fi

echo "🐰 BunBun 🐇 (parallel mode)"

LOGDIR=""
LOGDIR=$(mktemp -d)
# Trap EXIT, INT, TERM: kill background jobs then clean up
cleanup() {
local exit_code="${1:-0}"
# Kill any background jobs from this script
# Avoid GNU-only xargs -r by checking if pids is non-empty
pids=$(jobs -p 2>/dev/null) || true
[ -n "$pids" ] && echo "$pids" | xargs kill 2>/dev/null || true
rm -rf "$LOGDIR"
if [[ -n "${LOGDIR:-}" ]]; then
rm -rf "$LOGDIR"
fi
hook_timing_append "$exit_code"
}
on_exit() {
local exit_code=$?
cleanup "$exit_code"
}
on_signal() {
local signal_name="$1"
local exit_code=1
if [[ "$signal_name" == "INT" ]]; then
exit_code=130
elif [[ "$signal_name" == "TERM" ]]; then
exit_code=143
fi
cleanup "$exit_code"
exit "$exit_code"
}
trap cleanup EXIT INT TERM
trap on_exit EXIT
trap 'on_signal INT' INT
trap 'on_signal TERM' TERM
Comment on lines +54 to +71
⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

```bash
#!/usr/bin/env bash
set -euo pipefail

printf '[repo]\n'
sed -n '54,71p' .githooks/pre-push-parallel

printf '\n[demo]\n'
bash <<'BASH' || true
set -euo pipefail
count=0
cleanup() { count=$((count + 1)); printf 'cleanup:%s\n' "$count"; }
on_exit() { cleanup; }
on_signal() { cleanup; exit 130; }
trap on_exit EXIT
trap 'on_signal' INT
kill -INT $$
BASH
```

Repository: flyingrobots/echo

Length of output: 453


🏁 Script executed:

```bash
#!/bin/bash
# Examine the full cleanup function and timing append logic
cat -n .githooks/pre-push-parallel | sed -n '1,100p'
```

Repository: flyingrobots/echo

Length of output: 3715


🏁 Script executed:

```bash
#!/bin/bash
# Check for early exits before trap registration (line 69)
grep -n "exit\|return" .githooks/pre-push-parallel | head -20
```

Repository: flyingrobots/echo

Length of output: 536


🏁 Script executed:

```bash
#!/bin/bash
# Look for hook_timing_append to understand the timing side effects
rg "hook_timing_append|LOGDIR" .githooks/pre-push-parallel -A 2 -B 2
```

Repository: flyingrobots/echo

Length of output: 1428


🏁 Script executed:

```bash
#!/bin/bash
# Examine the timing module to understand append side effects
cat .githooks/_timing.sh
```

Repository: flyingrobots/echo

Length of output: 1561


Disarm the EXIT trap in the signal path.

Line 66 calls cleanup, and line 67 immediately exits into the EXIT trap on line 69. Interrupted runs therefore execute cleanup() twice. While the timing guard in _timing.sh prevents duplicate CSV rows, the double cleanup still wastes cycles killing already-dead jobs and deleting the already-removed temp directory.

🔧 Proposed fix
```diff
 on_signal() {
   local signal_name="$1"
   local exit_code=1
   if [[ "$signal_name" == "INT" ]]; then
     exit_code=130
   elif [[ "$signal_name" == "TERM" ]]; then
     exit_code=143
   fi
+  trap - EXIT
   cleanup "$exit_code"
   exit "$exit_code"
 }
```

Run the demo in the original comment to confirm Bash trap behavior.
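A minimal stand-alone repro of the fixed behavior (hypothetical demo, not repo code): with `trap - EXIT` inside the signal handler, cleanup fires exactly once on SIGINT instead of twice.

```shell
# Signal handler disarms the EXIT trap before cleanup, so the subsequent
# `exit` does not run cleanup a second time via on_exit.
out=$(bash <<'BASH'
count=0
cleanup() { count=$((count + 1)); printf 'cleanup:%s\n' "$count"; }
on_exit() { cleanup; }
on_signal() { trap - EXIT; cleanup; exit 130; }
trap on_exit EXIT
trap on_signal INT
kill -INT $$
BASH
) || true
printf '%s\n' "$out"   # cleanup:1
```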

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change

```bash
on_exit() {
  local exit_code=$?
  cleanup "$exit_code"
}
on_signal() {
  local signal_name="$1"
  local exit_code=1
  if [[ "$signal_name" == "INT" ]]; then
    exit_code=130
  elif [[ "$signal_name" == "TERM" ]]; then
    exit_code=143
  fi
  trap - EXIT
  cleanup "$exit_code"
  exit "$exit_code"
}
trap on_exit EXIT
trap 'on_signal INT' INT
trap 'on_signal TERM' TERM
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.githooks/pre-push-parallel around lines 54 - 71, The on_signal handler
calls cleanup() and then exits, which re-triggers the on_exit EXIT trap and
causes cleanup to run twice; in the on_signal function (referencing on_signal,
on_exit, and cleanup) disarm the EXIT trap before invoking cleanup (e.g., call
trap - EXIT) so cleanup is only executed once, then proceed to call cleanup
"$exit_code" and exit "$exit_code".


# Stage functions - each uses its own target dir and the pinned toolchain
run_fmt() {
7 changes: 7 additions & 0 deletions .githooks/pre-push-sequential
@@ -1,6 +1,13 @@
#!/usr/bin/env bash
# SPDX-License-Identifier: Apache-2.0
set -euo pipefail
HOOK_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$HOOK_DIR/.." && pwd)"
# shellcheck source=.githooks/_timing.sh
source "$HOOK_DIR/_timing.sh"
hook_timing_prepare "$REPO_ROOT" "pre-push-sequential"
trap 'hook_timing_append $?' EXIT
cd "$REPO_ROOT"
# Resolve the pinned toolchain from rust-toolchain.toml, fallback to explicit env or a sane default
PINNED_FROM_FILE=$(awk -F '"' '/^channel/ {print $2}' rust-toolchain.toml 2>/dev/null || echo "")
PINNED="${PINNED:-${PINNED_FROM_FILE:-1.90.0}}"
8 changes: 7 additions & 1 deletion .githooks/pre-rebase
@@ -1,5 +1,11 @@
#!/usr/bin/env bash
# SPDX-License-Identifier: Apache-2.0
set -euo pipefail
HOOK_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$HOOK_DIR/.." && pwd)"
# shellcheck source=.githooks/_timing.sh
source "$HOOK_DIR/_timing.sh"
hook_timing_prepare "$REPO_ROOT" "pre-rebase"
trap 'hook_timing_append $?' EXIT
echo "[pre-rebase] Rebase is disallowed for this repository. Use merge instead." >&2
exit 1

2 changes: 2 additions & 0 deletions .gitignore
@@ -11,6 +11,8 @@ target-clippy
target-test
target-doc
.githooks/timing.jsonl
.dx-debug/*
blog/*
Comment on lines +14 to +15
🧹 Nitpick | 🔵 Trivial

Gitignore patterns may not cover nested subdirectories.

.dx-debug/* and blog/* only match entries directly under those directories. If nested paths like .dx-debug/2026/timing.csv exist, they won't be ignored. Use .dx-debug/ (trailing slash, no asterisk) or .dx-debug/** for recursive coverage.

```diff
-.dx-debug/*
-blog/*
+.dx-debug/
+blog/
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.gitignore around lines 14 - 15, The .gitignore entries `.dx-debug/*` and
`blog/*` only ignore top-level files; change them to recursive directory
patterns to cover nested files (e.g., replace `.dx-debug/*` with `.dx-debug/` or
`.dx-debug/**` and `blog/*` with `blog/` or `blog/**`) so paths like
`.dx-debug/2026/timing.csv` and nested files under `blog/` are ignored.

docs/.vitepress/cache

# Editor cruft
29 changes: 29 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,24 @@

## Unreleased

### feat(tooling): harden the post-Phase-5 dev loop

- **Changed** local verification success stamps now key off the actual checked
tree instead of `HEAD`, so commit-only churn can reuse the same clean proof
across manual and hook-triggered runs without ignoring unstaged or untracked
changes.
- **Added** per-lane and per-run timing records under
`.git/verify-local/timing.jsonl`, keeping local timing artifacts off the
tracked repo while making verifier cost visible.
- **Added** `scripts/pr-status.sh` plus `make pr-status` as a one-shot GitHub
summary for PR number, head SHA, unresolved thread count, review decision,
merge state, and grouped checks.
- **Fixed** `cargo nextest` targeted verification now respects crate target
shape instead of hardcoding `--lib --tests`, which keeps bin-only crates from
tripping the local fast path.
- **Changed** the full local tooling lane now runs all hook regression scripts
under `tests/hooks/test_*.sh` instead of one hardcoded verifier test.
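The tree-keyed stamp in the first bullet could be derived along these lines (hypothetical sketch; `verify_stamp_key` is an illustrative name, not the script's actual implementation):

```shell
# Digest the content of every tracked and untracked file in the worktree.
# Commits that do not change file content (amends, message edits) keep the
# same key, while any staged, unstaged, or untracked edit changes it.
verify_stamp_key() {
  git ls-files --cached --others --exclude-standard -z \
    | sort -z \
    | xargs -0 git hash-object -- 2>/dev/null \
    | git hash-object --stdin
}
```

This keys off content only (a pure rename would not change the key); the real script may also fold paths or mode bits into the digest.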

### feat(tooling): split local verification into parallel lanes

- **Changed** the local full verifier now runs as curated parallel lanes with
@@ -60,6 +78,17 @@
- **Changed** `ttd-browser` migrated to the entry-based provenance API after
the Phase 4 hard cut removed the old provenance convenience methods.

### docs(post-phase5): sync roadmap truth after the merge

- **Changed** the ADR-0008 / ADR-0009 implementation plan now marks Phases 0-5
implemented and records Phase 5 as shipped.
- **Changed** ADR-0010 is now accepted, aligning the observational/admin split
with the implemented observation contract rather than leaving it in a
hypothetical state.
- **Added** `docs/march-16.plan.md` as the post-merge execution bridge for
dev-loop hardening, Phase 6 adapter deletion, explicit tick types, and the
later replay / transport horizons.

### fix(warp-core): close final Phase 3 PR review threads

- **Fixed** `Engine::commit_with_state()` now restores both the engine-owned
5 changes: 4 additions & 1 deletion Makefile
@@ -7,7 +7,7 @@ SHELL := /bin/bash
PORT ?= 5173
BENCH_PORT ?= 8000

.PHONY: hooks verify-ultra-fast verify-fast verify-pr verify-full verify-full-sequential docs docs-build docs-ci
.PHONY: hooks verify-ultra-fast verify-fast verify-pr verify-full verify-full-sequential pr-status docs docs-build docs-ci
hooks:
@git config core.hooksPath .githooks
@chmod +x .githooks/* 2>/dev/null || true
@@ -28,6 +28,9 @@ verify-full:
verify-full-sequential:
@VERIFY_LANE_MODE=sequential ./scripts/verify-local.sh full

pr-status:
@./scripts/pr-status.sh "$(PR)"

.PHONY: dags dags-fetch
dags:
@cargo xtask dags
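The new `pr-status` target shells out to `scripts/pr-status.sh`, which is not shown in this diff. A minimal sketch of such a script using the GitHub CLI might look like this (field names follow `gh pr view --json`; the real script's output format may differ):

```shell
# Hypothetical pr-status helper built on the GitHub CLI.
pr_status() {
  local pr="${1:-}"   # PR number; empty means "the current branch's PR"
  gh pr view ${pr:+"$pr"} \
    --json number,headRefOid,reviewDecision,mergeStateStatus \
    --jq '"PR #\(.number)  head=\(.headRefOid[0:7])  review=\(.reviewDecision)  merge=\(.mergeStateStatus)"'
}
```

Invoked as `pr_status 123`, or bare `pr_status` from a checked-out PR branch; requires an authenticated `gh`.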