
Add mktestdocs to catch docs/code drift #1096

Open
cianc wants to merge 5 commits into master from test/add-mktestdocs

Conversation

@cianc
Contributor

@cianc cianc commented Mar 3, 2026

Description

Adds automated testing of Python code examples in the documentation to prevent examples from drifting out of sync with the library.

  • Add mktestdocs and pytest to the doc dependency group in pyproject.toml so they are available alongside the other doc-build tools without pulling in the full dev group.

  • Add scripts/check-docs-drift.py: a pytest-based script that uses mktestdocs.grab_code_blocks() to collect every ```python fenced block under docs/, skips any block whose first line is `# skip`, and executes the rest via `exec_python()`. A new taskipy task `docs-check-drift` runs it with `pytest scripts/check-docs-drift.py -v`.

  • Fix all ```python code blocks across docs/ so they are correctly picked up by mktestdocs. Some are marked `# skip` because it is infeasible to run them in CI.

  • Set save_to_file=False, log_level="error" via environment variables in check-docs-drift.py to avoid creating CSV files or noisy output during CI runs.

  • Update .github/workflows/build-docs.yml to run docs-check-drift as a step before the docs build, triggered on changes to docs/**, mkdocs.yml, or scripts/check-docs-drift.py.

  • Document the drift check and the # skip convention in CONTRIBUTING.md under the "Build Documentation" section.
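
As a rough illustration of the collect-and-skip behavior described above (the actual script relies on mktestdocs.grab_code_blocks() and exec_python(); everything below is a hypothetical re-implementation for clarity, not mktestdocs's API):

```python
import re

TICKS = "`" * 3  # built programmatically so this example's fences don't nest

# Hypothetical stand-in for mktestdocs.grab_code_blocks(): find every
# ```python fenced block in a markdown string and capture its body.
BLOCK = re.compile(TICKS + r"python\n(.*?)" + TICKS, re.DOTALL)

def collect_runnable_blocks(markdown: str) -> list[str]:
    """Return the bodies of all python blocks not marked with `# skip`."""
    blocks = BLOCK.findall(markdown)
    return [
        b for b in blocks
        if b.strip() and b.splitlines()[0].strip() != "# skip"
    ]

def run_blocks(markdown: str) -> None:
    for block in collect_runnable_blocks(markdown):
        exec(block, {})  # each block executes in a fresh namespace
```

In the real setup each surviving block would become its own pytest case rather than running in a single loop, so one failing example does not mask the others.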

Related Issue

Please link to the issue this PR resolves: #1083

Motivation and Context

Helps prevent drift of code blocks in documentation

How Has This Been Tested?

Ran all tests including new mktestdocs tests.

Screenshots (if appropriate):

Types of changes

What types of changes does your code introduce? Put an x in all the boxes that apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Checklist:

Go over all the following points, and put an x in all the boxes that apply.

  • My code follows the code style of this project.
  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.
  • I have read the CONTRIBUTING.md document.
  • I have added tests to cover my changes.
  • All new and existing tests passed.

@cianc cianc marked this pull request as ready for review March 3, 2026 21:51
@cianc cianc requested a review from a team as a code owner March 3, 2026 21:51

codecov bot commented Mar 3, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 80.74%. Comparing base (46ac6dd) to head (987ecdf).

Additional details and impacted files
@@           Coverage Diff           @@
##           master    #1096   +/-   ##
=======================================
  Coverage   80.74%   80.74%           
=======================================
  Files          41       41           
  Lines        3921     3921           
=======================================
  Hits         3166     3166           
  Misses        755      755           

☔ View full report in Codecov by Sentry.

@davidberenstein1957
Collaborator

Hi @cianc,

Thanks a lot for this PR. Do you have time to also move some of the documentation examples into docstring examples in the Python code? We could then load them from there, keeping a single source of truth in the Python code while also exposing the examples in the docstrings :)
If not, we can focus on keeping the current approach.
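
A sketch of what that could look like, assuming mktestdocs-style collection from docstrings (the `add` function and its docstring are made up for illustration; the fenced example is assembled with string concatenation only so it does not nest inside this page's own fences):

```python
import re

TICKS = "`" * 3  # assembled so the fenced example doesn't nest in this page

def add(a, b):
    """Add two numbers."""
    return a + b

# In real code the fenced example would sit directly in the docstring.
add.__doc__ += (
    "\n\nExample:\n\n"
    + TICKS + "python\n"
    + "assert add(2, 3) == 5\n"
    + TICKS + "\n"
)

# The same grab-and-exec pass that runs over docs/ could then run over
# docstrings, keeping one source of truth.
block = re.search(
    TICKS + r"python\n(.*?)" + TICKS, add.__doc__, re.DOTALL
).group(1)
exec(block, {"add": add})
```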

Also, I feel that we might not need a separate scripts directory; we could just keep the Python file as part of the tests directory.

Anyhow, thanks a lot for tackling this :)


@davidberenstein1957 davidberenstein1957 left a comment


I left some small remarks and a comment in the main thread. Happy to hear your thoughts and thanks a lot for the help :)


(review context: a docs fence changed from ``` python to ```python, with `# skip` as its first line)
Collaborator


we can actually run this for a small subset of the data, right?

Contributor Author


Same comment as below, I'm not sure we want to run something this heavy in CI.


(review context: a docs fence changed from ``` python to ```python, with `# skip` as its first line)
Collaborator


Same here, we could run this with a small subset of the data, right?

Contributor Author


I'm not sure; this is quite a lot of code to run in CI. I updated the comment, what do you think?

@davidberenstein1957 davidberenstein1957 self-assigned this Mar 4, 2026
@davidberenstein1957
Collaborator

@cianc just following up here to see if the extended contribution fits into your schedule; if not, feel free to let us know and we'll handle the rest :)

@cianc
Contributor Author

cianc commented Mar 9, 2026 via email

@davidberenstein1957
Collaborator

@cianc no problem, happy to iterate on the solution and wait for a while from our side :)

cianc added 2 commits March 20, 2026 15:29
Adds automated testing of Python code examples in the documentation
to prevent examples from drifting out of sync with the library.

Changes:

- Add `mktestdocs` and `pytest` to the `doc` dependency group in
  `pyproject.toml` so they are available alongside the other doc-build
  tools without pulling in the full `dev` group.

- Add `scripts/check-docs-drift.py`: a pytest-based script that uses
  `mktestdocs.grab_code_blocks()` to collect every ```python fenced
  block under `docs/`, skips any block whose first line is `# skip`,
  and executes the rest via `exec_python()`. A new taskipy task
  `docs-check-drift` runs it with `pytest scripts/check-docs-drift.py -v`.

- Fix all ```python code blocks across `docs/` so they are correctly
  picked up by mktestdocs:
  - Remove the stray space in ``` python fences (changed to ```python)
    so that mktestdocs can identify them (it matches on the exact string
    "python" immediately after the backticks).
  - Add `save_to_file=False, log_level="error"` to `EmissionsTracker`
    and `OfflineEmissionsTracker` instantiations to avoid creating CSV
    files or noisy output during CI runs.
  - Add `# skip` as the first line of blocks that cannot run in CI
    because they depend on external services or optional heavy
    dependencies (TensorFlow, Prometheus, Logfire, Google Cloud,
    Comet ML, live CodeCarbon API).
  - Correct a `pip install` command that was incorrectly fenced as
    ```python in `comet.md`; changed to ```console.

- Update `.github/workflows/build-docs.yml` to run `docs-check-drift`
  as a step before the docs build, triggered on changes to `docs/**`,
  `mkdocs.yml`, or `scripts/check-docs-drift.py`.

- Document the drift check and the `# skip` convention in
  `CONTRIBUTING.md` under the "Build Documentation" section.
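
The fence-matching point above can be checked with a small regex in the same spirit (the pattern here is illustrative, not mktestdocs's actual matcher; the backticks are assembled programmatically only to avoid a literal fence inside this example):

```python
import re

TICKS = "`" * 3  # avoids placing a literal fence inside this fenced example

# A strict matcher that, like the behavior described above, only
# recognizes "python" immediately after the backticks.
FENCE = re.compile("^" + re.escape(TICKS) + "python$", re.MULTILINE)

good = TICKS + "python\nprint(1)\n" + TICKS
bad = TICKS + " python\nprint(1)\n" + TICKS   # stray space: not picked up

assert FENCE.search(good) is not None
assert FENCE.search(bad) is None
```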
…heck-docs-drift.py and the CI workflow triggers on changes to that path.

2. Replaced inline save_to_file=False/log_level="error" arguments with environment variables — CODECARBON_SAVE_TO_FILE=false and CODECARBON_LOG_LEVEL=error are now set at the top of the test script, so the doc examples themselves stay clean and uncluttered.

3. Reduced the number of skipped blocks — the task manager example in usage.md was skippable only because it was missing an import; adding from codecarbon import EmissionsTracker made it runnable. The remaining # skip blocks (TensorFlow examples, Comet integration, Prometheus/Logfire output) are unavoidable due to missing dependencies. Skip comments now explain why each block is excluded.
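
Setting those variables from the top of the test script might look like this (variable names taken from the description above; codecarbon reads its CODECARBON_* settings from the environment):

```python
import os

# Set before any tracker is constructed so every doc example inherits
# quiet, file-free behavior during the CI run.
os.environ["CODECARBON_SAVE_TO_FILE"] = "false"
os.environ["CODECARBON_LOG_LEVEL"] = "error"
```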
@cianc cianc force-pushed the test/add-mktestdocs branch from 9057de3 to 3c99b8e Compare March 20, 2026 15:30
@cianc
Copy link
Contributor Author

cianc commented Mar 20, 2026

I think I've addressed all comments except for the one on running tensorflow modeling in CI - let me know what you think.

Clean up the #skip comments for some python blocks

Co-authored-by: David Berenstein <david.m.berenstein@gmail.com>


Development

Successfully merging this pull request may close these issues.

Add docstring examples and tests for mkdocstrings and mktestdocs

3 participants