Notes for maintainers and contributors.
Three-layer Docker image strategy for fast iteration:
```
ocaml-devcontainer-base              (~20-30 min, compiler + system tools, rebuild rare)
└── ocaml-devcontainer               (~15-20 min, opam packages, rebuild when tools update)
    ├── [tutorial-specific]          (seconds, user-created FROM image)
    ├── ocaml-devcontainer-rocq      (~10-15 min, adds Rocq proof assistant)
    └── ocaml-devcontainer-tsan      (~20-30 min, adds ocaml+tsan switch)

oxcaml-devcontainer-base             (~25-35 min, OxCaml compiler + system tools, rebuild rare)
└── oxcaml-devcontainer              (~10-15 min, opam packages, rebuild when tools update)
    └── [tutorial-specific]          (seconds, user-created FROM image)
```
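A tutorial-specific layer is just a thin image built `FROM` the published dev image, which is why it takes seconds rather than minutes. A minimal sketch; the extra opam package (`yojson`) and the workspace path are illustrative assumptions, not part of the published images:

```dockerfile
# Hypothetical tutorial-specific layer. The compiler, switch, and platform
# tools all come from the pre-built dev image, so only the delta is built.
FROM ocaml-devcontainer:latest

# Install only what this tutorial needs (example package, not required by the repo)
RUN opam install -y yojson

COPY . /workspace
WORKDIR /workspace
```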
- Base image (`base/Dockerfile`): Ubuntu 24.04, opam, single `ocaml` switch (OCaml 5.4.0), platform tools (dune, ocaml-lsp-server, merlin, utop), editors (vim, emacs-nox), debugging tools (gdb, lldb, valgrind, rr, perf, strace, ltrace, bpftrace, hyperfine).
- Dev image (`dev/Dockerfile`): additional opam packages — testing (alcotest, ppx_inline_test, ppx_expect, qcheck), profiling (landmarks, memtrace, runtime_events_tools, printbox), libraries (base), concurrency (backoff).
- Rocq image (`rocq/Dockerfile`): adds Rocq (formerly Coq) on top of the dev image. Same `ocaml` switch, no special build requirements.
- TSan image (`tsan/Dockerfile`): adds an `ocaml+tsan` switch with all the same tools. Requires `vm.mmap_rnd_bits=28` at build time.
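The `ocaml+tsan` switch exists to catch unsynchronized access to shared state across OCaml 5 domains. A minimal sketch of the pattern involved; this example uses `Atomic` so it is race-free and deterministic, while the commented alternative (a plain `ref`) is the kind of code ThreadSanitizer would flag:

```ocaml
(* Two domains incrementing a shared counter. With a plain [ref] and [incr],
   this would be a data race that the ocaml+tsan runtime reports; with
   [Atomic] the updates are synchronized and the result is deterministic. *)
let () =
  let a = Atomic.make 0 in
  let work () = for _ = 1 to 1000 do Atomic.incr a done in
  (* racy alternative TSan would report:
     let r = ref 0 in ... incr r ... *)
  let d = Domain.spawn work in
  work ();
  Domain.join d;
  assert (Atomic.get a = 2000);
  print_endline "atomic counter: no data race"
```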
- Base image (`oxcaml-base/Dockerfile`): same system dependencies as the OCaml base image. Single `oxcaml` switch (OxCaml 5.2.0+ox via oxcaml/opam-repository) with platform tools.
- Dev image (`oxcaml-dev/Dockerfile`): core, base, `await`, `parallel`, alcotest, ppx_inline_test, ppx_expect.
All images are published to Docker Hub and GHCR as multi-arch (amd64 + arm64) manifests.
```shell
# Build OCaml base image (compiler — takes ~20-30 min)
docker build -t ocaml-devcontainer-base base/

# Build OCaml dev image (tools — takes ~15-20 min)
docker build -t ocaml-devcontainer dev/

# Build OCaml Rocq image (adds Rocq — takes ~10-15 min)
docker build -t ocaml-devcontainer-rocq rocq/

# Build OCaml TSan image (adds ocaml+tsan switch — takes ~20-30 min)
# IMPORTANT: TSan requires reduced ASLR entropy on the build host
sudo sysctl -w vm.mmap_rnd_bits=28
docker build -t ocaml-devcontainer-tsan tsan/

# Build OxCaml base image (OxCaml compiler — takes ~25-35 min)
docker build -t oxcaml-devcontainer-base oxcaml-base/

# Build OxCaml dev image (tools — takes ~10-15 min)
docker build -t oxcaml-devcontainer --build-arg BASE_IMAGE=oxcaml-devcontainer-base oxcaml-dev/
```

To limit memory usage during opam installs, pass `--build-arg OPAMJOBS=2`.
Note: Only the TSan image build requires the `vm.mmap_rnd_bits=28` sysctl. The base and dev images build without it. See google/sanitizers#1716.
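Before a TSan build it can be useful to check whether the host actually needs the sysctl change. A sketch; the `/proc` path is Linux-specific, and on hosts without it the check simply falls back to 0:

```shell
# Only lower ASLR entropy when the host exceeds 28 bits of mmap randomness.
bits=$(cat /proc/sys/vm/mmap_rnd_bits 2>/dev/null || echo 0)
if [ "$bits" -gt 28 ]; then
  echo "vm.mmap_rnd_bits is $bits; TSan builds need <= 28, run:"
  echo "  sudo sysctl -w vm.mmap_rnd_bits=28"
else
  echo "ASLR entropy ($bits bits) is already compatible with TSan"
fi
```

The sysctl does not persist across reboots; re-run it (or add it to `/etc/sysctl.d/`) before each TSan build on a fresh host.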
The `-from-scratch` configurations build from source instead of pulling pre-built images:
```shell
# OCaml
devcontainer up --workspace-folder . --config .devcontainer-from-scratch/devcontainer.json

# OCaml + Rocq
devcontainer up --workspace-folder . --config .devcontainer-rocq-from-scratch/devcontainer.json

# OCaml + TSan
devcontainer up --workspace-folder . --config .devcontainer-tsan-from-scratch/devcontainer.json

# OxCaml
devcontainer up --workspace-folder . --config .devcontainer-oxcaml-from-scratch/devcontainer.json
```

Test scripts live in test/ and run inside the container:
| Script | What it tests |
|---|---|
| `test-ocaml.sh [switch]` | Compiler + tools verification |
| `test-lsp.sh [switch]` | Full LSP protocol testing |
| `test-profiling.sh [switch]` | landmarks, memtrace, olly |
| `test-rocq.sh` | Rocq installation and compilation |
| `test-dune-pkg.sh` | Dune package management workflow |
| `test-vscode.sh` | VS Code devcontainer integration |
| `test-neovim.sh` | Neovim exec pathway + LSP |
| `test-emacs.sh` | Emacs TRAMP + eglot integration |
| `test-claude.sh` | Claude Code installation |
| `test-oxcaml-switch.sh` | OxCaml compiler, packages, `local_` allocations |
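The scripts above can also be run in one pass. A sketch, assuming you are inside the container at the repo root; it simply iterates over whatever `test-*.sh` scripts exist and stops at the first failure:

```shell
#!/bin/sh
# Run every test script in test/ against the default switch.
set -e
for t in test/test-*.sh; do
  [ -e "$t" ] || continue   # no matches: the glob stays literal, skip it
  echo "== running $t"
  sh "$t"
done
echo "all tests passed"
```

Scripts that need a specific image (e.g. the TSan or OxCaml tests) will fail on the wrong image, so in practice you would filter the list per image.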
Default switch is `ocaml`. To test the TSan switch, run against the TSan image:

```shell
./test/test-ocaml.sh ocaml+tsan
```

The main build workflow (build-push.yml) is triggered by pushes to main that touch base/ or dev/, version tags, or manual dispatch.
```
changes ──► build-base-{amd64,arm64} ──► merge-base (multi-arch manifest)
              │
              └──► build-dev-{amd64,arm64} ──► merge-dev (multi-arch manifest)
```
Each architecture builds on a native runner (no cross-compilation). Dev image builds depend only on their own architecture's base image, so amd64 and arm64 pipelines run in parallel.
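The fan-out/fan-in shape above maps onto GitHub Actions job dependencies. A hypothetical sketch of the job graph; job names mirror the diagram, while the runner labels are assumptions and not taken from the actual workflow file:

```yaml
# Sketch of the dependency graph in build-push.yml (names illustrative).
jobs:
  changes:
    runs-on: ubuntu-latest        # detects which paths changed
  build-base-amd64:
    needs: changes
    runs-on: ubuntu-latest        # native amd64 runner, no cross-compilation
  build-base-arm64:
    needs: changes
    runs-on: ubuntu-24.04-arm     # native arm64 runner (assumed label)
  build-dev-amd64:
    needs: build-base-amd64       # depends only on its own arch's base
  build-dev-arm64:
    needs: build-base-arm64
  merge-base:
    needs: [build-base-amd64, build-base-arm64]   # multi-arch manifest
  merge-dev:
    needs: [build-dev-amd64, build-dev-arm64]     # multi-arch manifest
```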
The Rocq image workflow is triggered after successful build-push.yml completion, pushes to main that touch rocq/, or manual dispatch. It builds on top of the dev image and has no special build requirements.

The TSan image workflow is triggered after successful build-push.yml completion, pushes to main that touch tsan/, or manual dispatch. It also builds on top of the dev image.
The OxCaml build workflow uses the same fan-out/fan-in pattern as the OCaml pipeline. It is triggered by pushes to main that touch oxcaml-base/ or oxcaml-dev/.
The main test workflow is triggered on push/PR to main and after successful image builds.
- Main image tests: `test-ocaml`, `test-lsp`, `test-profiling` — single `ocaml` switch, no `--privileged`.
- TSan image tests: `test-ocaml-tsan`, `test-lsp-tsan`, `test-profiling-tsan` — matrix `['ocaml', 'ocaml+tsan']`, `--privileged` for TSan, ASLR sysctl.
- Other tests run once against the default switch.
The Rocq test workflow is triggered on push/PR to main and after successful Rocq image builds. Tests run on the `ocaml` switch: `test-rocq.sh`, `test-ocaml.sh`, `test-lsp.sh`, `test-profiling.sh`, plus editor tests.
The OxCaml test workflow is triggered on push/PR to main and after successful OxCaml image builds. Tests run on the single `oxcaml` switch: `test-oxcaml-switch` and `test-lsp`.
| Secret | Purpose |
|---|---|
| `DOCKERHUB_USERNAME` | Docker Hub username |
| `DOCKERHUB_TOKEN` | Docker Hub access token |
| `GITHUB_TOKEN` | Automatic — used for GHCR push |
Mount a persistent dune cache to speed up rebuilds across container restarts:
```json
{
  "mounts": [
    "source=dune-cache,target=/home/vscode/.cache/dune,type=volume"
  ]
}
```

Enable caching via environment:

```shell
export DUNE_CACHE=enabled
```

The base image includes several debugging tools:
```shell
# gdb
dune build
gdb _build/default/src/main.exe

# valgrind
valgrind --leak-check=full ./_build/default/src/main.exe
```

**rr (Record & Replay)**
Requires hardware perf counters — works on bare metal and some VMs, not in most cloud containers.
```shell
rr record ./_build/default/src/main.exe
rr replay
```
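`rr replay` drops you into a gdb session on the recorded execution, where gdb's reverse-execution commands work deterministically. A sketch of a typical session; the breakpoint target and watch expression are illustrative:

```
(rr) break main
(rr) continue
(rr) watch -l my_var        # hypothetical variable of interest
(rr) reverse-continue       # run backwards to the last write to my_var
(rr) backtrace              # inspect the code that performed the write
```

Because the recording is deterministic, the same replay can be re-run any number of times and will hit the same events in the same order.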