
# Hacking on ocaml-devcontainer

Notes for maintainers and contributors.

## Image architecture

Three-layer Docker image strategy for fast iteration:

```
ocaml-devcontainer-base   (~20-30 min, compiler + system tools, rebuild rare)
    └── ocaml-devcontainer  (~15-20 min, opam packages, rebuild when tools update)
        ├── [tutorial-specific]  (seconds, user-created FROM image)
        ├── ocaml-devcontainer-rocq  (~10-15 min, adds Rocq proof assistant)
        └── ocaml-devcontainer-tsan  (~20-30 min, adds ocaml+tsan switch)

oxcaml-devcontainer-base  (~25-35 min, OxCaml compiler + system tools, rebuild rare)
    └── oxcaml-devcontainer (~10-15 min, opam packages, rebuild when tools update)
        └── [tutorial-specific]  (seconds, user-created FROM image)
```


All images are published to Docker Hub and GHCR as multi-arch (amd64 + arm64) manifests.

## Building images locally

### Build commands

```sh
# Build OCaml base image (compiler — takes ~20-30 min)
docker build -t ocaml-devcontainer-base base/

# Build OCaml dev image (tools — takes ~15-20 min)
docker build -t ocaml-devcontainer dev/

# Build OCaml Rocq image (adds Rocq — takes ~10-15 min)
docker build -t ocaml-devcontainer-rocq rocq/

# Build OCaml TSan image (adds ocaml+tsan switch — takes ~20-30 min)
# IMPORTANT: TSan requires reduced ASLR entropy on the build host
sudo sysctl -w vm.mmap_rnd_bits=28
docker build -t ocaml-devcontainer-tsan tsan/

# Build OxCaml base image (OxCaml compiler — takes ~25-35 min)
docker build -t oxcaml-devcontainer-base oxcaml-base/

# Build OxCaml dev image (tools — takes ~10-15 min)
docker build -t oxcaml-devcontainer --build-arg BASE_IMAGE=oxcaml-devcontainer-base oxcaml-dev/
```

To limit memory usage during opam installs, pass `--build-arg OPAMJOBS=2`.
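For example, to rebuild the dev image with opam limited to two parallel jobs (a sketch; the same flag can be combined with any of the build commands above):

```sh
# Cap opam at 2 parallel build jobs to reduce peak memory usage during the build.
docker build --build-arg OPAMJOBS=2 -t ocaml-devcontainer dev/
```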

Note: only the TSan image build requires the `vm.mmap_rnd_bits=28` sysctl. The base and dev images build without it. See google/sanitizers#1716.
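To confirm the sysctl took effect before starting a TSan build, you can read the value back (Linux hosts only; the setting does not persist across reboots):

```sh
# Should print 28 after running the sudo sysctl command above.
sysctl -n vm.mmap_rnd_bits
```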

### Using the local build

The `-from-scratch` configurations build from source instead of pulling pre-built images:

```sh
# OCaml
devcontainer up --workspace-folder . --config .devcontainer-from-scratch/devcontainer.json

# OCaml + Rocq
devcontainer up --workspace-folder . --config .devcontainer-rocq-from-scratch/devcontainer.json

# OCaml + TSan
devcontainer up --workspace-folder . --config .devcontainer-tsan-from-scratch/devcontainer.json

# OxCaml
devcontainer up --workspace-folder . --config .devcontainer-oxcaml-from-scratch/devcontainer.json
```

## Running tests

Test scripts live in `test/` and run inside the container:

| Script | What it tests |
| --- | --- |
| `test-ocaml.sh [switch]` | Compiler + tools verification |
| `test-lsp.sh [switch]` | Full LSP protocol testing |
| `test-profiling.sh [switch]` | landmarks, memtrace, olly |
| `test-rocq.sh` | Rocq installation and compilation |
| `test-dune-pkg.sh` | Dune package management workflow |
| `test-vscode.sh` | VS Code devcontainer integration |
| `test-neovim.sh` | Neovim exec pathway + LSP |
| `test-emacs.sh` | Emacs TRAMP + eglot integration |
| `test-claude.sh` | Claude Code installation |
| `test-oxcaml-switch.sh` | OxCaml compiler, packages, `local_` allocations |

The default switch is `ocaml`. To test the TSan switch, run against the TSan image:

```sh
./test/test-ocaml.sh ocaml+tsan
```

## CI/CD

### OCaml build pipeline (`build-push.yml`)

Triggered by pushes to `main` that touch `base/` or `dev/`, version tags, or manual dispatch.

```
changes ──► build-base-{amd64,arm64} ──► merge-base (multi-arch manifest)
              │
              └──► build-dev-{amd64,arm64} ──► merge-dev (multi-arch manifest)
```

Each architecture builds on a native runner (no cross-compilation). Dev image builds depend only on their own architecture's base image, so amd64 and arm64 pipelines run in parallel.
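The merge jobs stitch the per-architecture images into a single multi-arch manifest. A minimal sketch of that step using `docker buildx imagetools` (the image names and tags here are illustrative, not the pipeline's actual ones):

```sh
# Combine per-arch images into one multi-arch manifest and push it.
docker buildx imagetools create \
  -t example/ocaml-devcontainer-base:latest \
  example/ocaml-devcontainer-base:amd64 \
  example/ocaml-devcontainer-base:arm64
```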

### Rocq build pipeline (`build-push-rocq.yml`)

Triggered after successful `build-push.yml` completion, pushes to `main` that touch `rocq/`, or manual dispatch. Builds on top of the dev image. No special build requirements.

### TSan build pipeline (`build-push-tsan.yml`)

Triggered after successful `build-push.yml` completion, pushes to `main` that touch `tsan/`, or manual dispatch. Builds on top of the dev image.

### OxCaml build pipeline (`build-push-oxcaml.yml`)

Same fan-out/fan-in pattern as the OCaml pipeline. Triggered by pushes to `main` that touch `oxcaml-base/` or `oxcaml-dev/`.

### OCaml test pipeline (`test.yml`)

Triggered on push/PR to `main` and after successful image builds.

- Main image tests (`test-ocaml`, `test-lsp`, `test-profiling`): run on the single `ocaml` switch, without `--privileged`.
- TSan image tests (`test-ocaml-tsan`, `test-lsp-tsan`, `test-profiling-tsan`): run over the matrix `['ocaml', 'ocaml+tsan']`, with `--privileged` and the ASLR sysctl required by TSan.
- All other tests run once against the default switch.

### Rocq test pipeline (`test-rocq-image.yml`)

Triggered on push/PR to `main` and after successful Rocq image builds.

Tests run on the `ocaml` switch: `test-rocq.sh`, `test-ocaml.sh`, `test-lsp.sh`, `test-profiling.sh`, plus editor tests.

### OxCaml test pipeline (`test-oxcaml-image.yml`)

Triggered on push/PR to `main` and after successful OxCaml image builds.

Tests run on the single `oxcaml` switch: `test-oxcaml-switch` and `test-lsp`.

### Required secrets

| Secret | Purpose |
| --- | --- |
| `DOCKERHUB_USERNAME` | Docker Hub username |
| `DOCKERHUB_TOKEN` | Docker Hub access token |
| `GITHUB_TOKEN` | Automatic; used for GHCR push |

## Performance tuning

Mount a persistent dune cache to speed up rebuilds across container restarts:

```json
{
  "mounts": [
    "source=dune-cache,target=/home/vscode/.cache/dune,type=volume"
  ]
}
```

Enable caching via the environment:

```sh
export DUNE_CACHE=enabled
```
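Alternatively, the cache can be enabled persistently in dune's user config file rather than per shell. A sketch, assuming dune's default XDG config location and no pre-existing config (the heredoc overwrites the file):

```sh
# Enable the dune cache for all builds via the user config file.
# Note: this overwrites any existing ~/.config/dune/config.
mkdir -p ~/.config/dune
cat > ~/.config/dune/config <<'EOF'
(lang dune 3.0)
(cache enabled)
EOF
```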

## Debugging tools

The base image includes several debugging tools:

```sh
dune build
gdb _build/default/src/main.exe
valgrind --leak-check=full ./_build/default/src/main.exe
```

### rr (Record & Replay)

Requires hardware perf counters: works on bare metal and some VMs, but not in most cloud containers.

```sh
rr record ./_build/default/src/main.exe
rr replay
```
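A quick way to check whether the host exposes usable perf counters before trying `rr record` (a sketch; rr generally needs `kernel.perf_event_paranoid` to be at most 1):

```sh
# rr relies on hardware performance counters via perf_event_open.
# Prints "unavailable" on hosts without the sysctl (e.g. non-Linux).
paranoid=$(cat /proc/sys/kernel/perf_event_paranoid 2>/dev/null || echo "unavailable")
echo "perf_event_paranoid: $paranoid"
```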