At @torfjelde's suggestion, I am testing the AbstractPPL interface to see what it is missing. Here the test case is simulation-based calibration. From the interface spec, I got to this implementation:
```julia
using AbstractPPL, AbstractMCMC

# returns a collection of traces, where the variable names correspond to
# those produced by the generative model, except those in data_vars, and the
# corresponding values are ranks of the prior draws in the posterior
function calibrate(rng, model, sampler, data_vars; nreps=1_000, ndraws=100)
    joint_model = AbstractPPL.decondition(model)  # decondition just in case
    ranks = map(1:nreps) do _
        step_rank(rng, joint_model, sampler, ndraws, data_vars)
    end
    return ranks
end

function step_rank(rng, joint_model, sampler, ndraws, data_vars)
    θ̃_ỹ, _ = AbstractMCMC.step(rng, joint_model)  # NOTE: method does not exist
    ỹ, θ̃ = split_trace(θ̃_ỹ, data_vars)
    posterior_model = AbstractPPL.condition(joint_model, ỹ)
    θ = AbstractMCMC.sample(rng, posterior_model, sampler, ndraws)
    return rank_draw_in_sample(θ̃, θ)
end

function split_trace(draw, vars)
    # split draw into the part whose variable names match vars and the part
    # whose variable names do not, handling indices in names correctly.
    # e.g. if draw=@T(x=10, y[1]=10, y[2]=5) and vars=(:y,),
    # then this returns @T(y[1]=10, y[2]=5), @T(x=10,)
end

function rank_draw_in_sample(draw, sample)
    # compute the element-wise rank of all variables in draw within sample.
    # i.e. if draw=@T(x[1]=10) and sample=[@T(x[1]=40), @T(x[1]=10), @T(x[1]=0)],
    # then this returns @T(x[1]=3)
end
```
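For concreteness, here is a minimal sketch of what `split_trace` and `rank_draw_in_sample` could look like, using plain `Dict`s with flattened variable names like `"y[1]"` as a hypothetical stand-in for an AbstractPPL trace (a real implementation would work with `VarName`s; the rank convention is chosen to match the docstring example above, i.e. the draw is ranked within the combined set of itself and the sample):

```julia
# Hypothetical sketch: Dict{String,<:Real} stands in for a trace,
# with keys as flattened variable names like "x" or "y[1]".

# base variable name of a flattened key, e.g. "y[1]" -> :y
basename_of(key) = Symbol(first(split(key, '[')))

# split `draw` into (variables whose base name is in `vars`, the rest)
function split_trace(draw, vars)
    matching = Dict(k => v for (k, v) in draw if basename_of(k) in vars)
    rest = Dict(k => v for (k, v) in draw if basename_of(k) ∉ vars)
    return matching, rest
end

# rank of each value in `draw` among the corresponding values in `sample`,
# counting the draw itself, so ranks run from 1 to length(sample) + 1
function rank_draw_in_sample(draw, sample)
    return Dict(k => 1 + count(s -> s[k] <= v, sample) for (k, v) in draw)
end
```

With these stand-ins, `split_trace(Dict("x" => 10.0, "y[1]" => 10.0, "y[2]" => 5.0), (:y,))` returns the `y` entries first and the `x` entry second, mirroring the `@T` example above.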
Here is an example of how one might use this with Turing:
```julia
using Turing, Random

@model function model(y)
    μ ~ Normal(0, 1)
    σ ~ truncated(Normal(0, 1), 0, Inf)
    y ~ Normal(μ, σ)
end

rng = MersenneTwister(42)
calibrate(rng, model(1.5), NUTS(), (:y,))
```
What we are missing (so far):
- A method like `AbstractMCMC.step(rng, joint_model)` to exactly sample from the joint prior and prior-predictive distribution.
- Functionality to manipulate traces, e.g. splitting a trace into two traces based on variable names.
- Functionality to map over the variable names and values of a trace, constructing a new trace with different values.
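On the first point, a DynamicPPL-specific (not AbstractPPL-generic) stopgap may already exist: depending on the DynamicPPL version, calling `rand` on a model draws all variables exactly from the joint prior and prior predictive, returning a `NamedTuple`. A sketch, assuming that method is available:

```julia
using Turing, Random

@model function demo()
    μ ~ Normal(0, 1)
    σ ~ truncated(Normal(0, 1), 0, Inf)
    y ~ Normal(μ, σ)
end

rng = MersenneTwister(42)
# exact draw of all variables (prior and prior predictive) as a NamedTuple;
# DynamicPPL-specific, not part of the AbstractPPL interface
θ̃_ỹ = rand(rng, demo())
```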
Additionally (this is more of an AbstractMCMC comment): if `sample` could take an array of models rather than just one model, then we could also use the `MCMCThreads`, `MCMCDistributed`, and `MCMCSerial` parallelization options.
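Until `sample` accepts multiple models, the replications could be parallelized manually with task-based threading. A hypothetical sketch (assuming the `step_rank` above, and that model evaluation and sampling are thread-safe):

```julia
using AbstractPPL, Random

# run the SBC replications on multiple threads instead of relying on
# AbstractMCMC's parallel sample methods; assumes step_rank as defined above
function calibrate_threaded(rng, model, sampler, data_vars; nreps=1_000, ndraws=100)
    joint_model = AbstractPPL.decondition(model)
    # independent per-task RNGs seeded from the caller's rng
    rngs = [MersenneTwister(rand(rng, UInt)) for _ in 1:nreps]
    tasks = [Threads.@spawn step_rank(rngs[i], joint_model, sampler, ndraws, data_vars)
             for i in 1:nreps]
    return fetch.(tasks)
end
```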
Thanks, @sethaxen, these are really helpful! These features would also be useful for simulation-based inference algorithms, e.g. particle-based sampling algorithms (see https://github.com/TuringLang/AdvancedPS.jl).