Fix + test for compiled ReverseDiff without linking #2097
Conversation
Codecov Report
Additional details and impacted files

@@            Coverage Diff            @@
##           master   #2097   +/-   ##
=======================================
  Coverage    0.00%   0.00%
=======================================
  Files          21      21
  Lines        1451    1451
=======================================
  Misses       1451    1451

☔ View full report in Codecov by Sentry.
Interesting. If I assume that people use autodiff for gradient-based sampling algorithms, are there gradient-based algorithms that do not require an unconstrained space?
Yup, e.g. reflective HMC, though we don't currently have these implemented (there is interest, though: TuringLang/AdvancedHMC.jl#310).
@torfjelde actually, why does compiling with zero inputs cause wrong results? Is it because ReverseDiff uses the zero inputs for specialization?
In light of TuringLang/Turing.jl#2097, we know that computation with compiled `ReverseDiff` can sometimes be wrong because `LogDensityProblemsAD` uses a zeros array for the compilation process. This PR adds a function `getparams` similar to [`DynamicPPL.jl`'s](https://github.com/TuringLang/DynamicPPL.jl/blob/d204fcb658a889421525365808b9830be37d3fdb/src/logdensityfunction.jl#L89). The PR also updates the function `get_params_varinfo` so that we can return a DPPL-compatible `SimpleVarInfo` with values in unconstrained space.
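To make that concrete, here is a minimal sketch of what a `getparams`-style helper could look like. The name `getparams_sketch`, the `linked` keyword, and the exact DynamicPPL calls are assumptions for illustration, not the code added by that PR; the point is simply to extract the current parameter values, optionally in linked (unconstrained) space, so the tape can be compiled at them instead of at zeros.

```julia
# Minimal sketch, assuming a recent DynamicPPL; `getparams_sketch` is a
# hypothetical helper, not the function added by the PR.
using DynamicPPL

function getparams_sketch(model::DynamicPPL.Model, varinfo::DynamicPPL.AbstractVarInfo; linked::Bool=false)
    # Optionally move the VarInfo to linked (unconstrained) space first.
    vi = linked ? DynamicPPL.link!!(deepcopy(varinfo), model) : varinfo
    # Flat vector of the current parameter values.
    return vi[:]
end
```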
Currently, we have this (from https://turinglang.org/TuringBenchmarking.jl/dev/):
That is, compiled ReverseDiff is incorrect when not linking! Super-strange, right?
Weeeell, not so much; LogDensityProblemsAD.jl uses `zeros` as the default input for compiling the tape, which, in the case where we have not performed any linking, causes issues with models involving, say, positively constrained distributions à la `InverseGamma`: https://github.com/tpapp/LogDensityProblemsAD.jl/blob/e13061ff72ddedb1fccf4deeb69f713972300239/ext/LogDensityProblemsADReverseDiffExt.jl#L54-L58

Note that this is not LogDensityProblemsAD.jl's fault, as it assumes we're working in unconstrained space.

This PR addresses this issue. It's not a very common use-case, but it's useful for identifying performance issues with transformations + it's also relevant if we want to work with `Float32` instead of `Float64`, as the current implementation would then compile the tape with `Float64` every time.
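To see why compiling at zeros goes wrong at all: ReverseDiff's compiled tapes freeze whatever control flow was taken at the input used for compilation. A toy sketch of the mechanism (this is not the actual `InverseGamma` code path, just an illustration):

```julia
using ReverseDiff

# Toy "log-density" with a branch on the parameter value, standing in for the
# support checks hit by constrained distributions in unlinked space.
f(x) = x[1] > 1 ? log(x[1]) : log1p(x[1]^2)

# Compile the tape at zeros, as LogDensityProblemsAD.jl does by default.
tape = ReverseDiff.compile(ReverseDiff.GradientTape(f, zeros(1)))

x = [2.0]
ReverseDiff.gradient!(similar(x), tape, x)  # ≈ [0.8]: replays the x ≤ 1 branch baked in at compile time
ReverseDiff.gradient(f, x)                  # ≈ [0.5]: the correct gradient, 1/x
```

Once we've linked, zeros is just an ordinary point in unconstrained space, which is exactly the assumption LogDensityProblemsAD.jl makes.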