Part 1 of autograd backend: DifferentiationInterface extension #158
yebai wants to merge 15 commits into
Conversation
Layer the DifferentiationInterface AD extension onto the simplified evaluator
interface introduced on the `evaluators` branch. This is the catch-all path
for any `<:AbstractADType` backend without a native AbstractPPL extension.
- ext/AbstractPPLDifferentiationInterfaceExt.jl: DI-backed `prepare`,
`value_and_gradient!!`, and `value_and_jacobian!!` for vector-input
evaluators. The evaluator is passed as a `DI.Constant` so DynamicPPL-style
problem state stays constant across calls. Compiled `AutoReverseDiff{true}`
takes the one-argument tape path (the `DI.Constant` route would invalidate
the compiled tape across calls). Empty inputs short-circuit before
`DI.prepare_*` since many backends fail on length-zero arrays. (The `DI.Constant` wiring is sketched below.)
- Project.toml: DifferentiationInterface added as a weakdep with extension
trigger and compat bound.
- test/autograd_tests.jl: shared problem definitions and `run_autograd_tests`
entry point with separate gradient / jacobian / empty-input helpers.
- test/ext/differentiationinterface/: isolated test env using a local
`DummyADType <: AbstractADType` to exercise the catch-all dispatch without
pulling in a real AD package.
- .github/workflows/CI.yml: extend the ext matrix with
`ext/differentiationinterface` alongside the existing
`ext/logdensityproblems`.
- test/run_extras.jl: register the new label.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
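A minimal sketch of the `DI.Constant` wiring described in the first bullet, assuming a scalar-valued evaluator. `_call_evaluator` matches the helper name visible in the stacktraces later in this thread; the surrounding glue (`sketch_value_and_gradient`) is illustrative, not the extension source:

```julia
using DifferentiationInterface
const DI = DifferentiationInterface

# Problem state lives inside the evaluator; passing it as a DI.Constant
# context keeps it fixed (and undifferentiated) across calls.
_call_evaluator(x, evaluator) = evaluator(x)

function sketch_value_and_gradient(backend, evaluator, x::AbstractVector)
    prep = DI.prepare_gradient(_call_evaluator, backend, x, DI.Constant(evaluator))
    return DI.value_and_gradient(_call_evaluator, prep, backend, x, DI.Constant(evaluator))
end
```

Compiled `AutoReverseDiff{true}` cannot use this route: a `Constant` context would invalidate the recorded tape, hence the one-argument tape path noted above.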
The upstream `Simplify Evaluators module after review` commit dropped `_assert_supported_output` / `_assert_jacobian_output` from `Evaluators.jl`, expecting the AD-extension PRs that actually call them to bring them back. Move both helpers into the DI extension, the sole caller after rebase.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Replace `_assert_supported_output` / `_assert_jacobian_output` with an `if y isa Number / elseif y isa AbstractVector / else throw` cascade. The jacobian assertion was unreachable after the scalar branch, and the supported-output helper had a single call site; inlining the remaining check as the `else` arm is clearer than two named helpers.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
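An illustrative shape of that cascade (the real check lives in the DI extension and its error wording differs):

```julia
y = evaluator(x)                  # probe output once at prepare time
if y isa Number
    # scalar output: gradient path
elseif y isa AbstractVector
    # vector output: jacobian path
else
    throw(ArgumentError("unsupported output type $(typeof(y)); flatten to a scalar or vector"))
end
```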
Force-pushed from e7f18de to 6bc564a.
AbstractPPL.jl documentation for PR #158 is available at:
Codecov Report
❌ Patch coverage is

Additional details and impacted files

@@            Coverage Diff             @@
##             main     #158      +/-   ##
==========================================
- Coverage   85.10%   77.53%   -7.58%
==========================================
  Files          13       14       +1
  Lines         705      779      +74
==========================================
+ Hits          600      604       +4
- Misses        105      175      +70

☔ View full report in Codecov by Sentry.
Expose shared AD backend fixtures through TestResources so extension test environments can reuse the same cases without including files from test/.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
Exercise the DifferentiationInterface catch-all with a real backend instead of a stub so the extension test covers the integration path used by downstream AD packages.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
… extension

Replace `prepare(adtype, problem, x)`'s two-step `prepare` + `VectorEvaluator` wrap with a single delegated call to `AbstractPPL.prepare(problem, x; check_dims)`, eliminating a latent double-wrap when `check_dims=false`.

Move `TestResources` (callable problems, test cases, runner) out of `src/` and into a new `AbstractPPLTestExt` package extension, so `Test` is loaded only when downstream code opts in. Split the unified `TestCase` into `ValueCase` / `ErrorCase`, drop the unused `:namedtuple` fixtures, and store the error operation as a callable instead of a `:call`/`:gradient` symbol. Flatten the DI extension test file to two `run_testcases(Val(...); ...)` calls.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
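A hedged sketch of the delegation this commit describes; `_prepare_backend` is a hypothetical stand-in for the extension's backend-specific preparation step:

```julia
function AbstractPPL.prepare(adtype::AbstractADType, problem, x::AbstractVector; check_dims=true)
    # Delegate wrapping to the core method, so check_dims=false can no longer
    # produce a second VectorEvaluator layer around an existing one.
    evaluator = AbstractPPL.prepare(problem, x; check_dims)
    return _prepare_backend(adtype, evaluator, x)  # hypothetical helper name
end
```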
- value_and_gradient!! / value_and_jacobian!! now check arity before the
length-zero short-circuit, so empty-input vector-valued / scalar-valued
functions surface the correct ArgumentError instead of silently returning
the wrong shape. Uses Val(0) as the cache sentinel for empty inputs (see the sketch below).
- New AbstractPPLDifferentiationInterfaceLogDensityProblemsExt advertises
LogDensityOrder{1} for any DI-prepared evaluator, registering the method
in __init__ to side-step extension precompile-visibility constraints.
- Empty-input arity errors and the new capability advertisement are
exercised in the DI test environment, which now also loads LDP.
- Project.toml [compat] alphabetised.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
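A sketch of the resulting check order, reusing the guard line quoted in the review diff further down; `_di_value_and_gradient` is a hypothetical name for the DI hand-off, and the empty-input branch is simplified:

```julia
function value_and_gradient!!(p::Prepared, x::AbstractVector{T}) where {T<:Real}
    # Arity first: a vector-valued function errors even for empty input.
    p.cache.gradient_prep === nothing &&
        throw(ArgumentError("`value_and_gradient!!` requires a scalar-valued function."))
    # Length-zero short-circuit (the real code keys this off a Val(0) cache sentinel).
    isempty(x) && return p(x), similar(x)
    return _di_value_and_gradient(p, x)  # hypothetical DI hand-off
end
```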
Drop the triple extension and have AbstractPPLLogDensityProblemsExt
advertise LogDensityOrder{1} for any Prepared{<:Any,<:VectorEvaluator}.
The contract is now uniform: anything from `prepare(adtype, …)` claims
gradient capability; a bare VectorEvaluator from `prepare(problem, x)`
stays at order 0. Backends that don't implement value_and_gradient!! or
that wrap a vector-output function surface a runtime error at call time
rather than via a capability downgrade.
Documents the rule in docs/src/evaluators.md.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
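The capability rule boils down to a single method, sketched here against the standard LogDensityProblems API (the actual method lived in AbstractPPLLogDensityProblemsExt and is dropped later in this PR):

```julia
using LogDensityProblems

LogDensityProblems.capabilities(::Type{<:Prepared{<:Any,<:VectorEvaluator}}) =
    LogDensityProblems.LogDensityOrder{1}()
```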
…tras

Aqua's persistent_tasks check spawns a wrapper subprocess that runs Pkg.precompile() on a package depending on AbstractPPL. On Julia 1.10 this hits "Declaring __precompile__(false) is not allowed in files that are being precompiled" inside the wrapper's extension precompile path. The dedicated Ext CI jobs load and exercise every extension on min Julia and pass, so this is a Julia 1.10 / Aqua interaction, not a defect in our extensions. Re-enable when min is bumped past 1.10.

The CI extras step also had a redundant --project=. that was overridden by run_extras.jl's own Pkg.activate; drop it.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- Mark generate_testcases / run_testcases as public (1.11+) so downstream AD-backend packages can reuse the conformance suite without reaching into private API.
- Note in the Prepared docstring that the two-arg constructor is for backends that allocate fresh storage per call.
- Expand the _ad_output_arity error message to call out matrix/tuple outputs explicitly and recommend flattening.
- Add a MethodError hint on value_and_gradient!! / value_and_jacobian!! that fires when no AD backend extension is loaded — also surfaces through LogDensityProblems.logdensity_and_gradient, which delegates to value_and_gradient!!. Suppressed once any backend registers a method, where the standard candidate list is more informative.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
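A hedged sketch of the hint mechanism from the last bullet, via `Base.Experimental.register_error_hint`; the wording and the suppression condition are illustrative:

```julia
function __init__()
    Base.Experimental.register_error_hint(MethodError) do io, exc, argtypes, kwargs
        if exc.f === value_and_gradient!! || exc.f === value_and_jacobian!!
            print(io, "\nNo AD backend extension is loaded; load an AD package (e.g. ForwardDiff) first.")
        end
    end
end
```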
@shravanngoswamii, can you help take a look at this PR, as you reviewed #157?
Thanks, @shravanngoswamii, for the review; I'll address the comments later.
- Drop AbstractPPLLogDensityProblemsExt and its test launcher; downstream
packages can define LDP methods on `Prepared` directly when needed.
- Add `_check_vector_length` to enforce dim mismatch on
`value_and_gradient!!`/`value_and_jacobian!!` before the `Val(0)`
short-circuit and DI call (matches `VectorEvaluator{true}` behaviour; sketched below).
- Unify gradient/jacobian DI prep into `_prepare_di(prep, ...)`; the
`AutoReverseDiff{true}` carve-out now also fixes Jacobian tape recording.
- Tidy DI tests: remove LDP block, drop redundant constants and edge cases,
tighten gradient-of-vector-valued case to `ArgumentError`.
- Document prep-time `evaluator(x)` call and reserved `Val{:vector}` /
`Val{:edge}` testcase keys.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
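An illustrative `_check_vector_length`, assuming the behaviour described in the second bullet; the exception type is an assumption, and the real helper lives in src/evaluators/Evaluators.jl:

```julia
function _check_vector_length(dim::Integer, x::AbstractVector)
    # Reject dimension mismatches before the Val(0) short-circuit and DI call.
    length(x) == dim || throw(DimensionMismatch(  # exception type assumed here
        "expected input of length $dim, got $(length(x))"))
    return nothing
end
```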
Addressed all review feedback in 5797c83 (Cleanup).
EDIT: the DynamicPPL failures are unrelated.
) where {T<:Real}
    p.cache.gradient_prep === nothing &&
        throw(ArgumentError("`value_and_gradient!!` requires a scalar-valued function."))
    Evaluators._check_vector_length(p.evaluator.dim, x)
I think value_and_gradient!! / value_and_jacobian!! still bypass the other VectorEvaluator validation for integer vectors.
julia> using AbstractPPL
julia> using ADTypes: AutoForwardDiff
julia> using DifferentiationInterface
julia> using ForwardDiff
julia> p = AbstractPPL.prepare(AutoForwardDiff(), x -> sum(abs2, x), zeros(3));
julia> p([1, 2, 3])
ERROR: ArgumentError: VectorEvaluator requires a vector of floating-point values, but received an `Vector{Int64}`. Convert to a floating-point vector (e.g. `Float64.(x)`) before calling.
Stacktrace:
[1] _reject_integer_input(x::Vector{Int64})
@ AbstractPPL.Evaluators ~/Work/vectorly-ai/AbstractPPL.jl/src/evaluators/Evaluators.jl:167
[2] (::AbstractPPL.Evaluators.VectorEvaluator{true, var"#2#3"})(x::Vector{Int64})
@ AbstractPPL.Evaluators ~/Work/vectorly-ai/AbstractPPL.jl/src/evaluators/Evaluators.jl:182
[3] (::AbstractPPL.Evaluators.Prepared{…})(x::Vector{…})
@ AbstractPPL.Evaluators ~/Work/vectorly-ai/AbstractPPL.jl/src/evaluators/Evaluators.jl:41
[4] top-level scope
@ REPL[6]:1
Some type information was truncated. Use `show(err)` to see complete types.
But the derivative call goes into DI and gives a backend preparation error:
julia> AbstractPPL.value_and_gradient!!(p, [1, 2, 3])
ERROR: PreparationMismatchError (inconsistent types between preparation and execution):
- f: ✅
- backend: ✅
- x: ❌
- prep: Vector{Float64}
- exec: Vector{Int64}
- contexts: ✅
If you are confident that this check is superfluous, you can disable it by running preparation with the keyword argument `strict=Val(false)` inside DifferentiationInterface.
Stacktrace:
[1] check_prep(f::Function, ::DifferentiationInterfaceForwardDiffExt.ForwardDiffGradientPrep{…}, backend::AutoForwardDiff{…}, x::Vector{…}, contexts::Constant{…})
@ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/IS0Dg/src/utils/prep.jl:177
[2] value_and_gradient(f::typeof(AbstractPPLDifferentiationInterfaceExt._call_evaluator), prep::DifferentiationInterfaceForwardDiffExt.ForwardDiffGradientPrep{…}, backend::AutoForwardDiff{…}, x::Vector{…}, contexts::Constant{…})
@ DifferentiationInterfaceForwardDiffExt ~/.julia/packages/DifferentiationInterface/IS0Dg/ext/DifferentiationInterfaceForwardDiffExt/onearg.jl:411
[3] value_and_gradient!!(p::AbstractPPL.Evaluators.Prepared{…}, x::Vector{…})
@ AbstractPPLDifferentiationInterfaceExt ~/Work/vectorly-ai/AbstractPPL.jl/ext/AbstractPPLDifferentiationInterfaceExt.jl:74
[4] top-level scope
@ REPL[7]:1
Some type information was truncated. Use `show(err)` to see complete types.
Same thing happens for value_and_jacobian!! on a vector-valued function.
Can we reuse the same input validation as VectorEvaluator{true} before calling DI, not only the length check?
Just a personal opinion: I don't think those emojis in error messages look good.
These are from DifferentiationInterface, not AbstractPPL.
The bare `VectorEvaluator` rejected integer vectors with a clear `ArgumentError`, but the DI derivative entry points went straight to DI prep and surfaced an opaque `PreparationMismatchError`. Apply the same compile-time `T <: Integer` check before dispatching to DI, and add matching `:edge` testcases.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
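A sketch of the compile-time check this commit describes; the error wording is adapted from `_reject_integer_input` in the transcript above, and the placement inside `value_and_gradient!!` is illustrative:

```julia
function value_and_gradient!!(p::Prepared, x::AbstractVector{T}) where {T<:Real}
    # Resolved at compile time for concrete T, so the floating-point path pays nothing.
    T <: Integer && throw(ArgumentError(
        "VectorEvaluator requires a vector of floating-point values, but received " *
        "a `$(typeof(x))`. Convert to a floating-point vector (e.g. `Float64.(x)`) before calling."))
    # ... existing arity / length / DI path continues here ...
end
```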