Merge branch 'master' into tor/state-transition-related
torfjelde committed Oct 10, 2024
2 parents d9480d1 + fc8cfa6 commit 3f861bf
Showing 10 changed files with 465 additions and 103 deletions.
49 changes: 49 additions & 0 deletions .github/workflows/DocsNav.yml
@@ -0,0 +1,49 @@
name: Add Navbar

on:
  page_build: # Triggers the workflow on push events to gh-pages branch
  workflow_dispatch: # Allows manual triggering
  schedule:
    - cron: '0 0 * * 0' # Runs every week on Sunday at midnight (UTC)

jobs:
  add-navbar:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    steps:
      - name: Checkout gh-pages
        uses: actions/checkout@v4
        with:
          ref: gh-pages
          fetch-depth: 0

      - name: Download insert_navbar.sh
        run: |
          curl -O https://raw.githubusercontent.com/TuringLang/turinglang.github.io/main/assets/scripts/insert_navbar.sh
          chmod +x insert_navbar.sh

      - name: Update Navbar
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          git config user.name github-actions[bot]
          git config user.email github-actions[bot]@users.noreply.github.com
          # Define the URL of the navbar to be used
          NAVBAR_URL="https://raw.githubusercontent.com/TuringLang/turinglang.github.io/main/assets/scripts/TuringNavbar.html"
          # Update all HTML files in the current directory (gh-pages root)
          ./insert_navbar.sh . $NAVBAR_URL
          # Remove the insert_navbar.sh file
          rm insert_navbar.sh
          # Check if there are any changes
          if [[ -n $(git status -s) ]]; then
            git add .
            git commit -m "Added navbar and removed insert_navbar.sh"
            git push "https://${GITHUB_ACTOR}:${GITHUB_TOKEN}@github.com/${GITHUB_REPOSITORY}.git" gh-pages
          else
            echo "No changes to commit"
          fi
8 changes: 5 additions & 3 deletions Project.toml
@@ -1,14 +1,15 @@
name = "AbstractMCMC"
uuid = "80f14c24-f653-4e6a-9b94-39d6b0f70001"
keywords = ["markov chain monte carlo", "probablistic programming"]
keywords = ["markov chain monte carlo", "probabilistic programming"]
license = "MIT"
desc = "A lightweight interface for common MCMC methods."
version = "4.5.0"
version = "5.4.0"

[deps]
BangBang = "198e06fe-97b7-11e9-32a5-e1d131e6ad66"
ConsoleProgressMonitor = "88cd18e8-d9cc-4ea6-8889-5259c0d15c8b"
Distributed = "8ba89e20-285c-5b6f-9357-94700520ee1b"
FillArrays = "1a297f60-69ca-5386-bcde-b61e274b549b"
LogDensityProblems = "6fdf6af0-433a-55f7-b3ed-c6c6e0b8df7c"
Logging = "56ddb016-857b-54e1-b83d-db4d58db5568"
LoggingExtras = "e6f89c97-d47a-5376-807f-9c37f3926c36"
@@ -19,8 +20,9 @@ TerminalLoggers = "5d786b92-1e48-4d6f-9151-6b4477ca9bed"
Transducers = "28d57a85-8fef-5791-bfe6-a80928e7c999"

[compat]
BangBang = "0.3.19"
BangBang = "0.3.19, 0.4"
ConsoleProgressMonitor = "0.1"
FillArrays = "1"
LogDensityProblems = "2"
LoggingExtras = "0.4, 0.5, 1"
ProgressLogging = "0.1"
18 changes: 12 additions & 6 deletions docs/src/api.md
@@ -71,18 +71,24 @@ Common keyword arguments for regular and parallel sampling are:
- `progress` (default: `AbstractMCMC.PROGRESS[]` which is `true` initially): toggles progress logging
- `chain_type` (default: `Any`): determines the type of the returned chain
- `callback` (default: `nothing`): if `callback !== nothing`, then
`callback(rng, model, sampler, sample, state, iteration)` is called after every sampling step,
where `sample` is the most recent sample of the Markov chain and `state` and `iteration` are the current state and iteration of the sampler
- `discard_initial` (default: `0`): number of initial samples that are discarded
`callback(rng, model, sampler, sample, iteration)` is called after every sampling step,
where `sample` is the most recent sample of the Markov chain and `iteration` is the current iteration
- `num_warmup` (default: `0`): number of "warm-up" steps to take before the first "regular" step,
i.e. number of times to call [`AbstractMCMC.step_warmup`](@ref) before the first call to
[`AbstractMCMC.step`](@ref).
- `discard_initial` (default: `num_warmup`): number of initial samples that are discarded. Note that
if `discard_initial < num_warmup`, warm-up samples will also be included in the resulting samples.
- `thinning` (default: `1`): factor by which to thin samples.
- `initial_state` (default: `nothing`): if `initial_state !== nothing`, the first call to [`AbstractMCMC.step`](@ref)
is passed `initial_state` as the `state` argument.

!!! info
    The common keyword arguments `progress`, `chain_type`, and `callback` are not supported by the iterator [`AbstractMCMC.steps`](@ref) and the transducer [`AbstractMCMC.Sample`](@ref).

There is no "official" way for providing initial parameter values yet.
However, multiple packages such as [EllipticalSliceSampling.jl](https://github.com/TuringLang/EllipticalSliceSampling.jl) and [AdvancedMH.jl](https://github.com/TuringLang/AdvancedMH.jl) support an `init_params` keyword argument for setting the initial values when sampling a single chain.
To ensure that sampling multiple chains "just works" when sampling of a single chain is implemented, [we decided to support `init_params` in the default implementations of the ensemble methods](https://github.com/TuringLang/AbstractMCMC.jl/pull/94):
- `init_params` (default: `nothing`): if `init_params isa AbstractArray`, then the `i`th element of `init_params` is used as initial parameters of the `i`th chain. If one wants to use the same initial parameters `x` for every chain, one can specify e.g. `init_params = FillArrays.Fill(x, N)`.
However, multiple packages such as [EllipticalSliceSampling.jl](https://github.com/TuringLang/EllipticalSliceSampling.jl) and [AdvancedMH.jl](https://github.com/TuringLang/AdvancedMH.jl) support an `initial_params` keyword argument for setting the initial values when sampling a single chain.
To ensure that sampling multiple chains "just works" when sampling of a single chain is implemented, [we decided to support `initial_params` in the default implementations of the ensemble methods](https://github.com/TuringLang/AbstractMCMC.jl/pull/94):
- `initial_params` (default: `nothing`): if `initial_params isa AbstractArray`, then the `i`th element of `initial_params` is used as initial parameters of the `i`th chain. If one wants to use the same initial parameters `x` for every chain, one can specify e.g. `initial_params = FillArrays.Fill(x, N)`.

Progress logging can be enabled and disabled globally with `AbstractMCMC.setprogress!(progress)`.
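
For orientation, here is a minimal sketch of how these keyword arguments might be combined in a call to `sample`. `MyModel` and `MySampler` are hypothetical placeholders for types implementing the AbstractMCMC interface; they are not provided by the package.

```julia
using AbstractMCMC
using FillArrays
using Random

# Hypothetical model and sampler implementing the AbstractMCMC interface.
model = MyModel()
sampler = MySampler()
rng = Random.default_rng()

# Single chain: 100 warm-up steps (discarded by default, since `discard_initial`
# defaults to `num_warmup`), keeping every 2nd sample afterwards.
chain = sample(rng, model, sampler, 1_000; num_warmup=100, thinning=2)

# Four chains sampled on separate threads, all starting from the same initial
# parameters `x0`.
x0 = zeros(2)
chains = sample(
    rng, model, sampler, MCMCThreads(), 1_000, 4;
    initial_params=FillArrays.Fill(x0, 4),
)
```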

9 changes: 9 additions & 0 deletions docs/src/design.md
@@ -63,6 +63,15 @@ the sampling step of the inference method.
AbstractMCMC.step
```

If one also has some special handling of the warmup-stage of sampling, then this can be specified by overloading

```@docs
AbstractMCMC.step_warmup
```

which will be used for the first `num_warmup` iterations, as specified as a keyword argument to [`AbstractMCMC.sample`](@ref).
Note that this is optional; by default it simply calls [`AbstractMCMC.step`](@ref) from above.
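
As a rough illustration of such an overload, here is a sketch for a sampler with a dedicated adaptation phase. `MySampler`, `MyState`, and the adaptation rule are hypothetical and not part of AbstractMCMC, and the sketch assumes `AbstractMCMC.step` is also implemented for `MySampler`:

```julia
using AbstractMCMC
using Random

# Hypothetical sampler whose step size is only adapted during warm-up.
struct MySampler <: AbstractMCMC.AbstractSampler end

struct MyState
    position::Vector{Float64}
    stepsize::Float64
end

function AbstractMCMC.step_warmup(
    rng::Random.AbstractRNG,
    model::AbstractMCMC.AbstractModel,
    sampler::MySampler,
    state::MyState;
    kwargs...,
)
    # Reuse the regular step, then apply a (toy) step-size adaptation that is
    # only active during the first `num_warmup` iterations.
    sample, newstate = AbstractMCMC.step(rng, model, sampler, state; kwargs...)
    return sample, MyState(newstate.position, 0.9 * newstate.stepsize)
end
```

Regular iterations continue to use the unmodified `AbstractMCMC.step`, so the adaptation stops automatically once warm-up is over.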

## Collecting samples

!!! note
15 changes: 15 additions & 0 deletions src/AbstractMCMC.jl
@@ -8,6 +8,7 @@ using ProgressLogging: ProgressLogging
using StatsBase: StatsBase
using TerminalLoggers: TerminalLoggers
using Transducers: Transducers
using FillArrays: FillArrays

using Distributed: Distributed
using Logging: Logging
@@ -107,4 +108,18 @@ include("stepper.jl")
include("transducer.jl")
include("logdensityproblems.jl")

if isdefined(Base.Experimental, :register_error_hint)
    function __init__()
        Base.Experimental.register_error_hint(MethodError) do io, exc, argtypes, _
            if Base.parentmodule(exc.f) == LogDensityProblems &&
                any(a -> a <: LogDensityModel, argtypes)
                print(
                    io,
                    "\n`AbstractMCMC.LogDensityModel` is a wrapper and does not itself implement the LogDensityProblems.jl interface. To use LogDensityProblems.jl methods, access the inner type with (e.g.) `logdensity(model.logdensity, params)` instead of `logdensity(model, params)`.",
                )
            end
        end
    end
end

end # module AbstractMCMC
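
For context, the hint above fires when a LogDensityProblems.jl function is called on the wrapper directly. A small self-contained sketch of the intended usage, where `ToyDensity` is made up for illustration:

```julia
using AbstractMCMC
using LogDensityProblems

# A toy log-density implementing the LogDensityProblems.jl interface.
struct ToyDensity end
LogDensityProblems.logdensity(::ToyDensity, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::ToyDensity) = 2
LogDensityProblems.capabilities(::Type{ToyDensity}) = LogDensityProblems.LogDensityOrder{0}()

model = AbstractMCMC.LogDensityModel(ToyDensity())

# LogDensityProblems.logdensity(model, [0.0, 0.0])    # MethodError; now augmented with the hint
LogDensityProblems.logdensity(model.logdensity, [0.0, 0.0])  # correct: unwrap the model first
```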
17 changes: 17 additions & 0 deletions src/interface.jl
@@ -73,6 +73,23 @@ current `state` of the sampler.
"""
function step end

"""
    step_warmup(rng, model, sampler[, state; kwargs...])

Return a 2-tuple of the next sample and the next state of the MCMC `sampler` for `model`.

When sampling using [`sample`](@ref), this takes the place of [`AbstractMCMC.step`](@ref) in the first
`num_warmup` number of iterations, as specified by the `num_warmup` keyword to [`sample`](@ref).
This is useful if the sampler has an initial "warmup"-stage that is different from the
standard iteration.

By default, this simply calls [`AbstractMCMC.step`](@ref).
"""
step_warmup(rng, model, sampler; kwargs...) = step(rng, model, sampler; kwargs...)
function step_warmup(rng, model, sampler, state; kwargs...)
    return step(rng, model, sampler, state; kwargs...)
end

"""
    samples(sample, model, sampler[, N; kwargs...])