Consider the following MWE:

```julia
using ReverseDiff
using FiniteDifferences
using LinearAlgebra

p = [1.0, 2.0, 3.0]
q = [4.0, 5.0, 6.0]

function f(x)
    return FiniteDifferences.jvp(FiniteDifferences.central_fdm(3, 1), norm, (p, x))
end

ReverseDiff.gradient(f, q)
```
This fails because of a variety of issues. First off, `to_vec` fails to produce an `AbstractVector{<:Real}` for `_jvp` to consume:
```
ERROR: MethodError: no method matching jvp(::FiniteDifferences.AdaptedFiniteDifferenceMethod{…}, ::typeof(norm), ::Vector{…}, ::ReverseDiff.TrackedArray{…})
The function `jvp` exists, but no method is defined for this combination of argument types.

Closest candidates are:
  jvp(::Any, ::Any, ::Tuple{Any, Any})
   @ FiniteDifferences ~/.julia/packages/FiniteDifferences/IPGFN/src/grad.jl:57
  jvp(::Any, ::Any, ::Tuple{Any, Any}...)
   @ FiniteDifferences ~/.julia/packages/FiniteDifferences/IPGFN/src/grad.jl:62
```
This can be remedied in this simple example by just directly using `_jvp` (evaluating `f` on `q` will then still work entirely without errors).
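For concreteness, a minimal sketch of that change (the helper name `f2` is only for illustration, and `_jvp` is an internal, non-exported function):

```julia
# Hypothetical rewrite of `f` that calls the internal `_jvp` directly,
# passing the primal point and the direction as separate arguments
# instead of the `(p, x)` tuple that the public `jvp` expects.
function f2(x)
    return FiniteDifferences._jvp(FiniteDifferences.central_fdm(3, 1), norm, p, x)
end

f2(q)                          # plain evaluation works
# ReverseDiff.gradient(f2, q)  # still fails, as described below
```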
But the more important issue is that the closure that `FiniteDifferences._jvp` then generates is not type stable. That seems to cause the estimated step size to become tracked by ReverseDiff, and the conversion inside `_eval_function` (to the `eltype` of the point at which the derivative is taken) then fails:
```
ERROR: MethodError: no method matching Float64(::ReverseDiff.TrackedReal{Float64, Float64, Nothing})
The type `Float64` exists, but no method is defined for this combination of argument types when trying to construct it.

Closest candidates are:
  (::Type{T})(::Real, ::RoundingMode) where T<:AbstractFloat
   @ Base rounding.jl:265
  (::Type{T})(::T) where T<:Number
   @ Core boot.jl:900
  Float64(::Irrational{:SQRT_HALF})
   @ Random irrationals.jl:251
  ...

Stacktrace:
  [1] _eval_function(m::FiniteDifferences.AdaptedFiniteDifferenceMethod{…}, f::FiniteDifferences.var"#86#87"{…}, x::Float64, step::ReverseDiff.TrackedReal{…})
    @ FiniteDifferences ~/.julia/packages/FiniteDifferences/IPGFN/src/methods.jl:249
  [2] (::FiniteDifferences.AdaptedFiniteDifferenceMethod{…})(f::FiniteDifferences.var"#86#87"{…}, x::Float64, step::ReverseDiff.TrackedReal{…})
    @ FiniteDifferences ~/.julia/packages/FiniteDifferences/IPGFN/src/methods.jl:240
  [3] (::FiniteDifferences.AdaptedFiniteDifferenceMethod{…})(f::FiniteDifferences.var"#86#87"{…}, x::Float64)
    @ FiniteDifferences ~/.julia/packages/FiniteDifferences/IPGFN/src/methods.jl:194
  [4] _jvp(fdm::FiniteDifferences.AdaptedFiniteDifferenceMethod{…}, f::typeof(norm), x::Vector{…}, ẋ::ReverseDiff.TrackedArray{…})
    @ FiniteDifferences ~/.julia/packages/FiniteDifferences/IPGFN/src/grad.jl:48
  [5] f(x::ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}})
    @ Main ./REPL[26]:2
  [6] ReverseDiff.GradientTape(f::typeof(f), input::Vector{…}, cfg::ReverseDiff.GradientConfig{…})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/rKZaG/src/api/tape.jl:199
  [7] gradient(f::Function, input::Vector{Float64}, cfg::ReverseDiff.GradientConfig{ReverseDiff.TrackedArray{…}})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/rKZaG/src/api/gradients.jl:22
  [8] top-level scope
    @ REPL[27]:1
```
Changing to `adapt=0` in `central_fdm` solves this issue entirely (though this still requires using `_jvp`), as in the sketch below.
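A minimal sketch of this workaround (again, `f3` is just an illustrative name and `_jvp` is internal):

```julia
# Hypothetical variant with adaptation disabled: with `adapt=0` no runtime
# step-size estimation happens, so the step does not get tracked by ReverseDiff.
fdm = FiniteDifferences.central_fdm(3, 1; adapt=0)

function f3(x)
    return FiniteDifferences._jvp(fdm, norm, p, x)
end

ReverseDiff.gradient(f3, q)  # runs without the conversion error
```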
Is this use case something that should work? Should `jvp` be changed to accept more generic inputs for the direction? Maybe one could add the `adapt=0` restriction/solution as a remark somewhere in the docs?