
Part 1 of autograd backend: DifferentiationInterface extension #158

Open
yebai wants to merge 15 commits into main from evaluator-interface-di-ext

Conversation

@yebai
Member

@yebai yebai commented May 4, 2026

No description provided.

yebai and others added 3 commits May 4, 2026 18:21
Layer the DifferentiationInterface AD extension onto the simplified evaluator
interface introduced on the `evaluators` branch. This is the catch-all path
for any `<:AbstractADType` backend without a native AbstractPPL extension.

- ext/AbstractPPLDifferentiationInterfaceExt.jl: DI-backed `prepare`,
  `value_and_gradient!!`, and `value_and_jacobian!!` for vector-input
  evaluators. The evaluator is passed as a `DI.Constant` so DynamicPPL-style
  problem state stays constant across calls. Compiled `AutoReverseDiff{true}`
  takes the one-argument tape path (the `DI.Constant` route would invalidate
  the compiled tape across calls). Empty inputs short-circuit before
  `DI.prepare_*` since many backends fail on length-zero arrays.
- Project.toml: DifferentiationInterface added as a weakdep with extension
  trigger and compat bound.
- test/autograd_tests.jl: shared problem definitions and `run_autograd_tests`
  entry point with separate gradient / jacobian / empty-input helpers.
- test/ext/differentiationinterface/: isolated test env using a local
  `DummyADType <: AbstractADType` to exercise the catch-all dispatch without
  pulling in a real AD package.
- .github/workflows/CI.yml: extend the ext matrix with
  `ext/differentiationinterface` alongside the existing
  `ext/logdensityproblems`.
- test/run_extras.jl: register the new label.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
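The catch-all path described above can be sketched roughly as follows. This is a minimal illustration, not the PR's actual code: the helper names (`_call_evaluator`, `di_prepare_gradient`, `di_value_and_gradient`) and the empty-input return shape are assumptions; only the DifferentiationInterface calls (`DI.prepare_gradient`, `DI.value_and_gradient`, `DI.Constant`) are real API.

```julia
using DifferentiationInterface
const DI = DifferentiationInterface

# The evaluator closes over DynamicPPL-style problem state; passing it as a
# DI.Constant keeps that state out of the differentiated arguments.
_call_evaluator(x, evaluator) = evaluator(x)

function di_prepare_gradient(backend, evaluator, x::AbstractVector)
    # Many backends fail on length-zero arrays, so empty inputs
    # short-circuit before DI.prepare_gradient.
    isempty(x) && return nothing
    # Compiled AutoReverseDiff{true} would instead take a one-argument
    # closure path here (not shown), since the DI.Constant route would
    # invalidate the compiled tape across calls.
    return DI.prepare_gradient(_call_evaluator, backend, x, DI.Constant(evaluator))
end

function di_value_and_gradient(prep, backend, evaluator, x::AbstractVector)
    isempty(x) && return (evaluator(x), similar(x))
    return DI.value_and_gradient(_call_evaluator, prep, backend, x, DI.Constant(evaluator))
end
```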
The upstream `Simplify Evaluators module after review` commit dropped
`_assert_supported_output` / `_assert_jacobian_output` from
`Evaluators.jl`, expecting the AD-extension PRs that actually call them
to bring them back. Move both helpers into the DI extension, the sole
caller after rebase.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Replace `_assert_supported_output` / `_assert_jacobian_output` with an
`if y isa Number / elseif y isa AbstractVector / else throw` cascade.
The jacobian assertion was unreachable after the scalar branch, and
the supported-output helper had a single call site; inlining the
remaining check as the `else` arm is clearer than two named helpers.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
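The inlined cascade might look like the sketch below. `_ad_output_arity` is named in a later commit in this PR, but this body and the error wording are illustrative, not the actual implementation.

```julia
# Classify the prep-time output of the evaluator: scalar outputs take the
# gradient path, vector outputs the jacobian path, everything else errors.
function _ad_output_arity(y)
    if y isa Number
        return :scalar
    elseif y isa AbstractVector
        return :vector
    else
        throw(ArgumentError(
            "AD entry points support scalar- or vector-valued outputs; " *
            "got $(typeof(y)). Flatten matrix/tuple outputs to a vector."))
    end
end
```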
@yebai yebai force-pushed the evaluator-interface-di-ext branch from e7f18de to 6bc564a on May 4, 2026 17:23
@github-actions
Contributor

github-actions Bot commented May 4, 2026

AbstractPPL.jl documentation for PR #158 is available at:
https://TuringLang.github.io/AbstractPPL.jl/previews/PR158/

@codecov

codecov Bot commented May 4, 2026

Codecov Report

❌ Patch coverage is 5.74713% with 82 lines in your changes missing coverage. Please review.
✅ Project coverage is 77.53%. Comparing base (ceec39b) to head (17ebe0e).

Files with missing lines                          Patch %   Lines
ext/AbstractPPLTestExt.jl                         0.00%     46 Missing ⚠️
ext/AbstractPPLDifferentiationInterfaceExt.jl     0.00%     36 Missing ⚠️

❗ There is a different number of reports uploaded between BASE (ceec39b) and HEAD (17ebe0e).

HEAD has 1 upload less than BASE
Flag   BASE (ceec39b)   HEAD (17ebe0e)
       6                5
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #158      +/-   ##
==========================================
- Coverage   85.10%   77.53%   -7.58%     
==========================================
  Files          13       14       +1     
  Lines         705      779      +74     
==========================================
+ Hits          600      604       +4     
- Misses        105      175      +70     

☔ View full report in Codecov by Sentry.

yebai and others added 2 commits May 4, 2026 20:20
Expose shared AD backend fixtures through TestResources so extension test environments can reuse the same cases without including files from test/.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
Exercise the DifferentiationInterface catch-all with a real backend instead of a stub so the extension test covers the integration path used by downstream AD packages.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
Comment thread ext/AbstractPPLDifferentiationInterfaceExt.jl Outdated
Comment thread test/ext/differentiationinterface/main.jl
Comment thread test/ext/differentiationinterface/main.jl Outdated
yebai and others added 6 commits May 5, 2026 16:51
… extension

Replace `prepare(adtype, problem, x)`'s two-step `prepare` + `VectorEvaluator`
wrap with a single delegated call to `AbstractPPL.prepare(problem, x; check_dims)`,
eliminating a latent double-wrap when `check_dims=false`.

Move `TestResources` (callable problems, test cases, runner) out of `src/` and
into a new `AbstractPPLTestExt` package extension, so `Test` is loaded only when
downstream code opts in. Split the unified `TestCase` into `ValueCase` /
`ErrorCase`, drop the unused `:namedtuple` fixtures, and store the error
operation as a callable instead of a `:call`/`:gradient` symbol.

Flatten the DI extension test file to two `run_testcases(Val(...); ...)` calls.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- value_and_gradient!! / value_and_jacobian!! now check arity before the
  length-zero short-circuit, so empty-input vector-valued / scalar-valued
  functions surface the correct ArgumentError instead of silently returning
  the wrong shape. Uses Val(0) as the cache sentinel for empty inputs.
- New AbstractPPLDifferentiationInterfaceLogDensityProblemsExt advertises
  LogDensityOrder{1} for any DI-prepared evaluator, registering the method
  in __init__ to side-step extension precompile-visibility constraints.
- Empty-input arity errors and the new capability advertisement are
  exercised in the DI test environment, which now also loads LDP.
- Project.toml [compat] alphabetised.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Drop the triple extension and have AbstractPPLLogDensityProblemsExt
advertise LogDensityOrder{1} for any Prepared{<:Any,<:VectorEvaluator}.
The contract is now uniform: anything from `prepare(adtype, …)` claims
gradient capability; a bare VectorEvaluator from `prepare(problem, x)`
stays at order 0. Backends that don't implement value_and_gradient!! or
that wrap a vector-output function surface a runtime error at call time
rather than via a capability downgrade.

Documents the rule in docs/src/evaluators.md.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
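The uniform contract at this point in the PR could be expressed as in the sketch below. `LogDensityProblems.capabilities` and `LogDensityOrder` are real LogDensityProblems.jl API; `Prepared` and `VectorEvaluator` are AbstractPPL internals not defined here, so this fragment only illustrates the rule (and note a later commit drops this extension again).

```julia
using LogDensityProblems

# Anything coming out of prepare(adtype, …) claims gradient capability;
# backends that can't deliver surface a runtime error at call time.
LogDensityProblems.capabilities(::Type{<:Prepared{<:Any,<:VectorEvaluator}}) =
    LogDensityProblems.LogDensityOrder{1}()

# A bare VectorEvaluator from prepare(problem, x) stays at order 0.
LogDensityProblems.capabilities(::Type{<:VectorEvaluator}) =
    LogDensityProblems.LogDensityOrder{0}()
```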
…tras

Aqua's persistent_tasks check spawns a wrapper subprocess that runs
Pkg.precompile() on a package depending on AbstractPPL. On Julia 1.10
this hits "Declaring __precompile__(false) is not allowed in files
that are being precompiled" inside the wrapper's extension precompile
path. The dedicated Ext CI jobs load and exercise every extension on
min Julia and pass, so this is a Julia 1.10 / Aqua interaction, not a
defect in our extensions. Re-enable when min is bumped past 1.10.

The CI extras step also had a redundant --project=. that was overridden
by run_extras.jl's own Pkg.activate; drop it.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- Mark generate_testcases / run_testcases as public (1.11+) so downstream
  AD-backend packages can reuse the conformance suite without reaching
  into private API.
- Note in the Prepared docstring that the two-arg constructor is for
  backends that allocate fresh storage per call.
- Expand the _ad_output_arity error message to call out matrix/tuple
  outputs explicitly and recommend flattening.
- Add a MethodError hint on value_and_gradient!! / value_and_jacobian!!
  that fires when no AD backend extension is loaded — also surfaces
  through LogDensityProblems.logdensity_and_gradient, which delegates
  to value_and_gradient!!. Suppressed once any backend registers a
  method, where the standard candidate list is more informative.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
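The MethodError hint could be registered roughly as below. `Base.Experimental.register_error_hint` is real Base API; the hint wording and the suppression logic are illustrative (the PR suppresses the hint once any backend registers a method, which this sketch does not show).

```julia
function __init__()
    Base.Experimental.register_error_hint(MethodError) do io, exc, argtypes, kwargs
        # Fires when no AD backend extension has provided a method; also
        # surfaces through LogDensityProblems.logdensity_and_gradient,
        # which delegates to value_and_gradient!!.
        if exc.f === value_and_gradient!! || exc.f === value_and_jacobian!!
            print(io, "\nNo AD backend extension is loaded. Load an AD package ",
                      "(e.g. via DifferentiationInterface) to enable this call.")
        end
    end
end
```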
@yebai yebai requested a review from shravanngoswamii May 6, 2026 22:46
@yebai
Member Author

yebai commented May 6, 2026

@shravanngoswamii, could you take a look at this PR, since you reviewed #157?

Comment thread ext/AbstractPPLLogDensityProblemsExt.jl Outdated
Comment thread ext/AbstractPPLDifferentiationInterfaceExt.jl Outdated
Comment thread ext/AbstractPPLDifferentiationInterfaceExt.jl
Comment thread docs/src/evaluators.md Outdated
@yebai
Member Author

yebai commented May 7, 2026

Thanks, @shravanngoswamii, for the review. I'll address your comments later.

yebai and others added 3 commits May 7, 2026 17:59
- Drop AbstractPPLLogDensityProblemsExt and its test launcher; downstream
  packages can define LDP methods on `Prepared` directly when needed.
- Add `_check_vector_length` to enforce dim mismatch on
  `value_and_gradient!!`/`value_and_jacobian!!` before the `Val(0)`
  short-circuit and DI call (matches `VectorEvaluator{true}` behaviour).
- Unify gradient/jacobian DI prep into `_prepare_di(prep, ...)`; the
  `AutoReverseDiff{true}` carve-out now also fixes Jacobian tape recording.
- Tidy DI tests: remove LDP block, drop redundant constants and edge cases,
  tighten gradient-of-vector-valued case to `ArgumentError`.
- Document prep-time `evaluator(x)` call and reserved `Val{:vector}` /
  `Val{:edge}` testcase keys.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
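`_check_vector_length` as described above could look like this sketch (the exact error message is illustrative):

```julia
# Enforce the dimension match before the Val(0) short-circuit and the DI
# call, mirroring VectorEvaluator{true} behaviour.
function _check_vector_length(dim::Int, x::AbstractVector)
    length(x) == dim || throw(DimensionMismatch(
        "expected input of length $dim, got length $(length(x))"))
    return nothing
end
```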
@yebai
Member Author

yebai commented May 7, 2026

Addressed all review feedback in 5797c83:

  • value_and_gradient!! / value_and_jacobian!! now run a _check_vector_length before the Val(0) short-circuit and the DI call, so a wrong-length x (including a non-empty x against a length-0 Prepared) raises DimensionMismatch matching prepared(x).
  • AutoReverseDiff(; compile=true) Jacobian path: unified gradient/jacobian DI prep into _prepare_di(prep, ...); the no-context carve-out now applies to both, restoring tape recording for vector outputs.
  • Docs wording tweak applied.

Cleanup

  • Tightened the gradient-of-vector-valued-output edge case from `Exception` to `ArgumentError`.
  • Documented the prep-time evaluator(x) call and reserved Val{:vector}/Val{:edge} testcase keys.

EDIT: the DynamicPPL failures are unrelated.

) where {T<:Real}
p.cache.gradient_prep === nothing &&
throw(ArgumentError("`value_and_gradient!!` requires a scalar-valued function."))
Evaluators._check_vector_length(p.evaluator.dim, x)
Member


I think value_and_gradient!! / value_and_jacobian!! still bypass the other VectorEvaluator validation for integer vectors.

julia> using AbstractPPL

julia> using ADTypes: AutoForwardDiff

julia> using DifferentiationInterface

julia> using ForwardDiff

julia> p = AbstractPPL.prepare(AutoForwardDiff(), x -> sum(abs2, x), zeros(3));

julia> p([1, 2, 3])
ERROR: ArgumentError: VectorEvaluator requires a vector of floating-point values, but received an `Vector{Int64}`. Convert to a floating-point vector (e.g. `Float64.(x)`) before calling.
Stacktrace:
 [1] _reject_integer_input(x::Vector{Int64})
   @ AbstractPPL.Evaluators ~/Work/vectorly-ai/AbstractPPL.jl/src/evaluators/Evaluators.jl:167
 [2] (::AbstractPPL.Evaluators.VectorEvaluator{true, var"#2#3"})(x::Vector{Int64})
   @ AbstractPPL.Evaluators ~/Work/vectorly-ai/AbstractPPL.jl/src/evaluators/Evaluators.jl:182
 [3] (::AbstractPPL.Evaluators.Prepared{…})(x::Vector{…})
   @ AbstractPPL.Evaluators ~/Work/vectorly-ai/AbstractPPL.jl/src/evaluators/Evaluators.jl:41
 [4] top-level scope
   @ REPL[6]:1
Some type information was truncated. Use `show(err)` to see complete types.


But the derivative call goes into DI and gives a backend preparation error:

julia> AbstractPPL.value_and_gradient!!(p, [1, 2, 3])
ERROR: PreparationMismatchError (inconsistent types between preparation and execution):
  - f: ✅
  - backend: ✅
  - x:
    - prep: Vector{Float64}
    - exec: Vector{Int64}
  - contexts: ✅
If you are confident that this check is superfluous, you can disable it by running preparation with the keyword argument `strict=Val(false)` inside DifferentiationInterface.

Stacktrace:
 [1] check_prep(f::Function, ::DifferentiationInterfaceForwardDiffExt.ForwardDiffGradientPrep{…}, backend::AutoForwardDiff{…}, x::Vector{…}, contexts::Constant{…})
   @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/IS0Dg/src/utils/prep.jl:177
 [2] value_and_gradient(f::typeof(AbstractPPLDifferentiationInterfaceExt._call_evaluator), prep::DifferentiationInterfaceForwardDiffExt.ForwardDiffGradientPrep{…}, backend::AutoForwardDiff{…}, x::Vector{…}, contexts::Constant{…})
   @ DifferentiationInterfaceForwardDiffExt ~/.julia/packages/DifferentiationInterface/IS0Dg/ext/DifferentiationInterfaceForwardDiffExt/onearg.jl:411
 [3] value_and_gradient!!(p::AbstractPPL.Evaluators.Prepared{…}, x::Vector{…})
   @ AbstractPPLDifferentiationInterfaceExt ~/Work/vectorly-ai/AbstractPPL.jl/ext/AbstractPPLDifferentiationInterfaceExt.jl:74
 [4] top-level scope
   @ REPL[7]:1
Some type information was truncated. Use `show(err)` to see complete types.


Same thing happens for value_and_jacobian!! on a vector-valued function.

Can we reuse the same input validation as VectorEvaluator{true} before calling DI, not only the length check?

Member Author


Good catch.

Member


Just a personal opinion: I don't think those emojis in error messages look good.

Member Author


These are from DifferentiationInterface, not AbstractPPL.

Comment thread ext/AbstractPPLLogDensityProblemsExt.jl Outdated
The bare `VectorEvaluator` rejected integer vectors with a clear
`ArgumentError`, but the DI derivative entry points went straight to
DI prep and surfaced an opaque `PreparationMismatchError`. Apply the
same compile-time `T <: Integer` check before dispatching to DI, and
add matching `:edge` testcases.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
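The integer-input check applied before dispatching to DI could look like this sketch. The helper name `_reject_integer_input` is taken from the stacktrace earlier in the thread; the body and message here are abbreviated illustrations.

```julia
# Reject integer element types at the type level, so the user sees a clear
# ArgumentError instead of DI's opaque PreparationMismatchError.
function _reject_integer_input(x::AbstractVector{T}) where {T}
    T <: Integer && throw(ArgumentError(
        "VectorEvaluator requires a vector of floating-point values, but " *
        "received a `$(typeof(x))`. Convert with `Float64.(x)` first."))
    return nothing
end
```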