Regression testing to ensure consistent CDISC Rules Engine reports before and after updates.
This repository contains automatically generated regression tests that validate rule behavior using `core.py`. These tests ensure results remain consistent after any code update or logic change.
```
├── rule_regression_tests/   # Generated pytest test scripts
├── published_rules/         # Rule JSON files (organized by subfolder)
├── json_datasets/           # Dataset JSONs and optional XML files (organized by subfolder)
└── README.md                # This file
```
- Python 3.12 or later
- The CDISC Rules Engine: either a pip-installed copy of the engine or a local checkout. Either way, `core.py` from the CDISC Rules Engine (or the engine executable) must be in the root directory of this repository.
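As a quick, purely illustrative sanity check, you can confirm the engine entry point is in place before invoking pytest (the executable name `core` is an assumption):

```python
from pathlib import Path

# Illustrative check: the regression tests expect core.py (or the engine
# executable) to sit in the repository root alongside this README.
if not (Path("core.py").exists() or Path("core").exists()):
    raise SystemExit("Place core.py or the CDISC Rules Engine executable in the repository root")
```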
Make sure you're in the root directory of the repository. Then run:

```
python -m pytest rule_regression_tests/
```

Each test will:
- Re-run `core.py validate` with the associated rule and dataset (see the sketch after the notes below).
- Generate a new Excel result.
- Compare it with the previously captured expected result.
- Raise an error if there's any difference in sheet names, columns, or data.
- Timestamp columns are ignored during comparison.
- `NaN` values are treated as `None` for consistency.
- Make sure any new Excel files are removed or managed if you're regenerating, to avoid picking up old outputs.
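For reference, each generated test is roughly equivalent to the sketch below. The file paths, the `load_report` helper, and the `core.py validate` flags shown are illustrative assumptions rather than the exact generated code; the real scripts embed whatever rule, dataset, and command were captured when the expected report was produced.

```python
import subprocess
from pathlib import Path

import pandas as pd

# Illustrative names: a real generated test embeds the specific rule, dataset,
# and expected report it was generated from.
RULE = Path("published_rules/some_subfolder/rule_A.json")
DATASET = Path("json_datasets/some_subfolder/dataset_A.json")
EXPECTED = Path("rule_regression_tests/expected/rule_A.xlsx")


def load_report(path: Path) -> dict[str, pd.DataFrame]:
    """Read every sheet of an Excel report, treating NaN as None for comparison."""
    sheets = pd.read_excel(path, sheet_name=None)
    return {name: df.astype(object).where(df.notna(), None) for name, df in sheets.items()}


def test_rule_A(tmp_path):
    output = tmp_path / "rule_A_actual"
    # Re-run the engine against the rule and dataset. The flag names below are
    # assumptions; check `python core.py validate --help` for the options your
    # engine version supports.
    subprocess.run(
        [
            "python", "core.py", "validate",
            "-s", "sdtmig", "-v", "3-4",
            "-lr", str(RULE),
            "-dp", str(DATASET),
            "-o", str(output),
            "-of", "XLSX",
        ],
        check=True,
    )

    # Assumes the engine appends the .xlsx extension to the output name.
    actual = load_report(output.with_suffix(".xlsx"))
    expected = load_report(EXPECTED)

    # Sheet names must match exactly.
    assert actual.keys() == expected.keys()

    for name in expected:
        exp, act = expected[name], actual[name]
        # Columns must match; timestamp columns are ignored for the data comparison.
        assert list(act.columns) == list(exp.columns)
        cols = [c for c in exp.columns if "timestamp" not in str(c).lower()]
        pd.testing.assert_frame_equal(act[cols], exp[cols], check_dtype=False)
```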
```
============================= test session starts =============================
collected 120 items

rule_regression_tests/test_rule_A.py ............. [ 10%]
rule_regression_tests/test_rule_B.py ............. [ 20%]
...
========================== 120 passed in 25.31s ==============================
```