The goal of the Mooncake.jl project is to produce an AD package that is written entirely in Julia, improves over ForwardDiff.jl, ReverseDiff.jl, and Zygote.jl in several ways, and is competitive with Enzyme.jl.
Please refer to the docs for more info.

Check that you're running a version of Julia that Mooncake.jl supports; see the SUPPORT_POLICY.md file for details.
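One quick way to check is to print the version of the Julia process you are running and compare it against the versions listed in SUPPORT_POLICY.md:

```julia
# Print the version of the running Julia process, to compare against the
# supported versions listed in SUPPORT_POLICY.md.
println(VERSION)
```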
There are several ways to interact with Mooncake.jl. We recommend that people interact with Mooncake.jl via DifferentiationInterface.jl. For example, use it as follows to compute the gradient of a function mapping a Vector{Float64} to Float64:
```julia
using DifferentiationInterface
import Mooncake

f(x) = sum(cos, x)
backend = AutoMooncake() # Reverse-mode AD. For forward-mode AD, use `AutoMooncakeForward()`.
x = ones(1_000)
prep = prepare_gradient(f, backend, x)
gradient(f, prep, backend, x)
```
You should expect prepare_gradient to take a little while to run, but gradient itself to be fast. We are committed to maintaining support for DifferentiationInterface.jl, which is why we recommend using it.
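As a rough sketch (continuing from the snippet above), the same prep object can be reused for repeated gradient evaluations, and DifferentiationInterface.jl also provides an in-place gradient!; consult the DifferentiationInterface.jl documentation for the precise conditions under which preparation may be reused:

```julia
# Continuing from the snippet above: reuse `prep` for new inputs.
# (DifferentiationInterface.jl allows reusing preparation for inputs similar
# to the one it was prepared with; see its docs for the precise conditions.)
x2 = randn(1_000)
gradient(f, prep, backend, x2)        # fast: no re-preparation needed

# In-place variant: write the gradient into a pre-allocated vector.
grad = similar(x2)
gradient!(f, grad, prep, backend, x2)
```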
If you are interested in interacting more directly with Mooncake.jl, you should consider Mooncake.value_and_gradient!!. See its docstring for more info.
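As a minimal sketch of what direct use can look like (the exact way to construct the rule or cache may differ between Mooncake.jl versions, so treat the docstring as authoritative):

```julia
import Mooncake

f(x) = sum(cos, x)
x = ones(1_000)

# Build a reverse-mode rule for the call `f(x)` once up front (this plays the
# same role as `prepare_gradient` above), then reuse it for repeated calls.
rule = Mooncake.build_rrule(f, x)

# Returns the primal value together with gradients with respect to each
# argument, including the function itself (typically a no-tangent value).
val, (df, dx) = Mooncake.value_and_gradient!!(rule, f, x)
```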