Use subsumes in subset #587

Merged 3 commits into master on Apr 16, 2024
Conversation

@torfjelde (Member) commented Apr 15, 2024

While using the new (experimental) Gibbs sampler in Turing.jl (TuringLang/Turing.jl#2099), I realized that the current subset functionality, which enables the new Gibbs sampler, is not quite flexible enough to handle all the cases that the current Gibbs sampler can.

In particular, we need to use subsumes in subset rather than requiring the explicit presence of the VarName we're interested in.

The reason is to support models whose support can change size across executions. For example:

julia> using Distributions, DynamicPPL

julia> @model function demo()
           n ~ Categorical(10)
           x = Vector{Float64}(undef, n)
           for i = 1:n
               x[i] ~ Normal()
           end
       end
demo (generic function with 2 methods)

julia> model = demo();

julia> varinfo = VarInfo();

julia> DynamicPPL.evaluate!!(model, varinfo)

Suppose I now want to do:

subset(varinfo, @varname(x))

That is currently not possible, since we only allow passing varnames that are explicitly present in the varinfo. In the above example, we therefore have to repeatedly query the varinfo for which x[i] are present (which we'd have to do using subsumes) and then call subset with those.
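For illustration, here is a minimal sketch of that user-side workaround (my example, not from the PR; it assumes `keys(varinfo)` returns the concrete varnames and that `subset` accepts a vector of them, and recall that `subsumes(@varname(x), @varname(x[1]))` is true):

julia> # Query the varinfo for the concrete varnames subsumed by `x`,
       # e.g. x[1], ..., x[n], then subset on those explicitly.
       vns = filter(vn -> subsumes(@varname(x), vn), collect(keys(varinfo)));

julia> varinfo_x = subset(varinfo, vns);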

It seems a common enough pattern to me that it's worth using subsumes in the implementation of subset itself, so we avoid these annoyances on the user side.

Note that this change is not strictly required for feature parity with the existing Gibbs sampler, as we could implement this subsumes filtering in the Gibbs sampler itself before calling subset; doing it in subset just seems cleaner IMO. A sketch of the idea follows.
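To make the proposal concrete, here is a minimal sketch of the filtering idea as a standalone helper (hypothetical name and shape; the actual change in this PR lives inside `subset` itself):

julia> # Keep every varname present in the varinfo that is subsumed by
       # at least one of the requested (possibly non-concrete) varnames.
       filter_subsumed(vns, vns_present) =
           filter(vn -> any(parent -> subsumes(parent, vn), vns), vns_present);

julia> # `n` is not subsumed by `x`, so only x[1] and x[2] survive.
       vns_kept = filter_subsumed([@varname(x)], [@varname(n), @varname(x[1]), @varname(x[2])]);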

@coveralls commented Apr 16, 2024

Pull Request Test Coverage Report for Build 8701844296

Details

  • 10 of 10 (100.0%) changed or added relevant lines in 3 files are covered.
  • 2 unchanged lines in 2 files lost coverage.
  • Overall coverage increased (+0.2%) to 84.973%

Files with coverage reduction:

File                    New missed lines    %
src/compiler.jl         1                   93.69%
src/simple_varinfo.jl   1                   85.87%

Totals:
Change from base Build 8111716086: +0.2%
Covered Lines: 2720
Relevant Lines: 3201

💛 - Coveralls

@codecov bot commented Apr 16, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 83.56%. Comparing base (6a2454f) to head (8bdf5f6).

Additional details and impacted files
@@            Coverage Diff             @@
##           master     #587      +/-   ##
==========================================
- Coverage   84.47%   83.56%   -0.91%     
==========================================
  Files          28       28              
  Lines        3207     3212       +5     
==========================================
- Hits         2709     2684      -25     
- Misses        498      528      +30     


@torfjelde (Member, Author) commented:

@devmotion @yebai might want to have a look at this.

@yebai (Member) left a comment:

Looks good except for a clarity question below.

@@ -456,7 +459,7 @@ function _subset(x::NamedTuple, vns)
     end

     syms = map(getsym, vns)
-    return NamedTuple{Tuple(syms)}(Tuple(map(Base.Fix2(getindex, x), syms)))
+    return NamedTuple{Tuple(syms)}(Tuple(map(Base.Fix1(getindex, x), syms)))
@yebai (Member) commented on this change:

I don't get this. Is this related to this PR, or just an unrelated bugfix?

@torfjelde (Member, Author) replied:

Unrelated bugfix. Had a typo in the tests so we weren't actually testing SimpleVarInfo at all before; discovered the bug when I fixed the tests.
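For context on the fix itself (my note, not from the thread): `Base.Fix1(f, a)` fixes the first argument of `f`, so `Fix1(getindex, x)(sym)` evaluates `getindex(x, sym)`, i.e. `x[sym]`, which is what `_subset` needs here; `Fix2(getindex, x)(sym)` would instead evaluate `getindex(sym, x)`, which has no method for a Symbol and a NamedTuple.

julia> x = (a = 1, b = 2);

julia> map(Base.Fix1(getindex, x), (:a, :b))  # getindex(x, sym), i.e. x[sym]
(1, 2)

julia> map(Base.Fix2(getindex, x), (:a, :b))  # getindex(sym, x): throws a MethodError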

@yebai enabled auto-merge Apr 16, 2024 20:47
@yebai added this pull request to the merge queue Apr 16, 2024
Merged via the queue into master with commit 9f5266a, Apr 16, 2024
12 of 14 checks passed
@yebai deleted the torfjelde/subsumes-improvement branch Apr 16, 2024 22:26
torfjelde added a commit to TuringLang/Turing.jl that referenced this pull request Apr 17, 2024.

torfjelde added a commit to TuringLang/Turing.jl that referenced this pull request Apr 21, 2024, with the following commit message:
* initial work on the new gibbs sampler

* added tests for the new Gibbs sampler

* added tests for new Gibbs

* new Gibbs is now sampling (correctly) sequentially

* let's not overload merge just yet

* export GibbsV2 + added more samplers to the tests

* added TODO comment

* removed lots of varinfo related merging functionality that is now
available in DynamicPPL

* shifting some code around

* removed redundant constructor for GibbsV2

* added GibbsContext which is similar to FixContext but also computes
the log-prob of the fixed variables

* adopted the rerun mechanism in Gibbs for GibbsV2, thus fixing the
issues with some of the tests for GibbsV2

* broken tests are no longer broken

* fix issues with dot_tilde_* impls for GibbsContext

* fix for dot_tilde_assume when using GibbsContext

* fixed re-running of models for Gibbs sampling properly this time

* added new gibbs to tests

* added some further comments on why we need `GibbsContext`

* went back to using `DynamicPPL.condition` rather than using custom
`GibbsContext` while we wait for
TuringLang/DynamicPPL.jl#563 to be merged

* add concrete comment about reverting changes for `gibbs_condition`

* Update test/mcmc/gibbs_new.jl

Co-authored-by: Hong Ge <[email protected]>

* fixed recursive definition of `condition` varinfos

* use `fix` instead of `condition`

* Revert "use `fix` instead of `condition`"

This reverts commit f87e2d1.

* removed unused symbol

* Revert "went back to using `DynamicPPL.condition` rather than using custom"

This reverts commit 53bd707.

* bump compat entry of DynamicPPL so we can overload acclogp!

* update assume for SMC samplers to make use of new `acclogp!`

* added proper impl of acclogp!! for SMC samplers + made accessing
task local varinfo and rng a bit nicer

* added experimental module and moved gibbs to it

* fixed now-incorrect references in new gibbs file

* updated gibbs tests

* moved experimental gibbs tests

* updated tests to include experimental tests

* removed references to previous tests of experimental Gibbs sampler

* removed solved TODO

* added a comment on `reconstruct_getvalue` usage

* bump patch version

* added comments on future work

* Update test/experimental/gibbs.jl

* fixed bug where particle samplers didn't properly account for
weightings of logpdf, etc.

* relax atol for a numerical test with Gibbs a bit

* fixed bug with `AbstractDict` constructor for experimental `Gibbs`

* always link the varinfo in the new Gibbs sampler, just to be sure

* add test to cover recent improvement to `DynamicPPL.subset`

ref: TuringLang/DynamicPPL.jl#587

* bump compat entry for DynamicPPL

* added some docstrings

* fixed test

* fixed import

* another attempt at fixing tests

* another attempt at fixing tests

* attempt at fixing tests

* forgot something in previous commit

* cleaned up the experimental Gibbs sampler a bit

* removed accidental pseudocode inclusion

* Apply suggestions from code review

* relaxed tolerance in one MH test a bit

* bump patch version

---------

Co-authored-by: Hong Ge <[email protected]>