Conversation
Codecov Report

✅ All modified and coverable lines are covered by tests.

```
@@           Coverage Diff           @@
##           master    #1096   +/-   ##
=======================================
  Coverage   80.74%   80.74%
=======================================
  Files          41       41
  Lines        3921     3921
=======================================
  Hits         3166     3166
  Misses        755      755
```

☔ View full report in Codecov by Sentry.
Hi @cianc, thanks a lot for this PR. Do you have time to also move some of the documentation examples to docstring examples in the Python code? We would then be able to load them from here, keeping a single source of truth directly in the Python code while also exposing examples in the docstrings :) Also, I feel we might not need a separate scripts directory, but rather just keep the Python file as part of the tests directory. Anyhow, thanks a lot for tackling this :)
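The docstring-examples idea above can be sketched with the standard library's doctest module. This is a minimal illustration only: `add_numbers` is a hypothetical function, not part of codecarbon, and mktestdocs offers comparable checking for code blocks inside docstrings.

```python
import doctest

def add_numbers(a, b):
    """Add two numbers.

    The example below is executable documentation: it can be rendered
    in the docs and also runs as a test, so it cannot drift out of sync.

    >>> add_numbers(2, 3)
    5
    """
    return a + b

# Collect and run every docstring example attached to this function.
finder = doctest.DocTestFinder()
runner = doctest.DocTestRunner()
for test in finder.find(add_numbers):
    runner.run(test)
assert runner.failures == 0
```

The same collect-and-run loop generalizes to a whole module, which is what keeps docstring examples honest.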
davidberenstein1957 left a comment
I left some small remarks and a comment in the main thread. Happy to hear your thoughts and thanks a lot for the help :)
docs/getting-started/examples.md
Outdated
Suggested change:

````diff
-``` python
+```python
+# skip
````
we can actually run this for a small subset of the data, right?
Same comment as below, I'm not sure we want to run something this heavy in CI.
docs/getting-started/examples.md
Outdated
Suggested change:

````diff
-``` python
+```python
+# skip
````
same here, we could run this with a small subset of the data, right?
I'm not sure, this is quite a lot of code to run in CI - I updated the comment, what do you think?
@cianc just following up here to see if the extended contribution fits in your schedule. If not, feel free to let us know and we'll handle the rest :)
Yes, sorry, work has been quite busy. These suggestions seem reasonable and I hope to get to them soon.
@cianc no problem, happy to iterate on the solution and wait for a while from our side :)
Adds automated testing of Python code examples in the documentation
to prevent examples from drifting out of sync with the library.
Changes:
- Add `mktestdocs` and `pytest` to the `doc` dependency group in
`pyproject.toml` so they are available alongside the other doc-build
tools without pulling in the full `dev` group.
- Add `scripts/check-docs-drift.py`: a pytest-based script that uses
`mktestdocs.grab_code_blocks()` to collect every ```python fenced
block under `docs/`, skips any block whose first line is `# skip`,
and executes the rest via `exec_python()`. A new taskipy task
`docs-check-drift` runs it with `pytest scripts/check-docs-drift.py -v`.
- Fix all ```python code blocks across `docs/` so they are correctly
picked up by mktestdocs:
- Remove the stray space in ``` python fences (changed to ```python)
so that mktestdocs can identify them (it matches on the exact string
"python" immediately after the backticks).
- Add `save_to_file=False, log_level="error"` to `EmissionsTracker`
and `OfflineEmissionsTracker` instantiations to avoid creating CSV
files or noisy output during CI runs.
- Add `# skip` as the first line of blocks that cannot run in CI
because they depend on external services or optional heavy
dependencies (TensorFlow, Prometheus, Logfire, Google Cloud,
Comet ML, live CodeCarbon API).
- Correct a `pip install` command that was incorrectly fenced as
```python in `comet.md`; changed to ```console.
- Update `.github/workflows/build-docs.yml` to run `docs-check-drift`
as a step before the docs build, triggered on changes to `docs/**`,
`mkdocs.yml`, or `scripts/check-docs-drift.py`.
- Document the drift check and the `# skip` convention in
`CONTRIBUTING.md` under the "Build Documentation" section.
…heck-docs-drift.py and the CI workflow triggers on changes to that path.
2. Replaced inline `save_to_file=False`/`log_level="error"` arguments with environment variables: `CODECARBON_SAVE_TO_FILE=false` and `CODECARBON_LOG_LEVEL=error` are now set at the top of the test script, so the doc examples themselves stay clean and uncluttered.
3. Reduced the number of skipped blocks: the task manager example in usage.md was skippable only because it was missing an import; adding `from codecarbon import EmissionsTracker` made it runnable. The remaining `# skip` blocks (TensorFlow examples, Comet integration, Prometheus/Logfire output) are unavoidable due to missing dependencies. Skip comments now explain why each block is excluded.
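For context, wiring up the `docs-check-drift` task with taskipy would look roughly like this in `pyproject.toml` (a sketch; the actual table likely carries other tasks as well):

```toml
[tool.taskipy.tasks]
docs-check-drift = "pytest scripts/check-docs-drift.py -v"
```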
9057de3 to 3c99b8e
… set via env vars in test script
…g, and skip pseudocode blocks
I think I've addressed all comments except for the one on running TensorFlow modeling in CI. Let me know what you think.
Clean up the #skip comments for some python blocks
Co-authored-by: David Berenstein <david.m.berenstein@gmail.com>
Description
Adds automated testing of Python code examples in the documentation to prevent examples from drifting out of sync with the library.
- Add `mktestdocs` and `pytest` to the `doc` dependency group in `pyproject.toml` so they are available alongside the other doc-build tools without pulling in the full `dev` group.
- Add `tests/check-docs-drift.py`: a pytest-based script that uses `mktestdocs.grab_code_blocks()` to collect every ```python fenced block under `docs/`, skips any block whose first line is `# skip`, and executes the rest via `exec_python()`. A new taskipy task `docs-check-drift` runs it with `pytest scripts/check-docs-drift.py -v`.
- Fix all ```python code blocks across `docs/` so they are correctly picked up by mktestdocs. Some are marked `# skip` because it's infeasible to run them in CI.
- Set `save_to_file=False, log_level="error"` via environment variables in check-docs-drift.py to avoid creating CSV files or noisy output during CI runs.
- Update `.github/workflows/build-docs.yml` to run `docs-check-drift` as a step before the docs build, triggered on changes to `docs/**`, `mkdocs.yml`, or `scripts/check-docs-drift.py`.
- Document the drift check and the `# skip` convention in `CONTRIBUTING.md` under the "Build Documentation" section.

Related Issue
Please link to the issue this PR resolves: #1083
Motivation and Context
Helps prevent drift of code blocks in documentation
How Has This Been Tested?
Ran all tests including new mktestdocs tests.
Screenshots (if appropriate):
Types of changes
What types of changes does your code introduce? Put an `x` in all the boxes that apply:

Checklist:
Go over all the following points, and put an `x` in all the boxes that apply.