Merged
Changes from 17 commits
2 changes: 1 addition & 1 deletion Project.toml
@@ -34,4 +34,4 @@ julia = "1.6"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["Test"]
test = ["Test"]
Collaborator commented:
this is the sort of whitespace change it's nice to not include in a code review...

1 change: 0 additions & 1 deletion docs/.gitignore

This file was deleted.

3 changes: 2 additions & 1 deletion docs/Project.toml
@@ -1,6 +1,7 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
Gen = "ea4f424c-a589-11e8-07c0-fd5c91b9da4a"
Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"

[compat]
Documenter = "0.27"
Documenter = "1"
25 changes: 25 additions & 0 deletions docs/README.md
@@ -0,0 +1,25 @@
# Website Docs
- `pages.jl` defines the skeleton (page tree) of the website.
- `make.jl` builds the website.

The docs are divided into roughly four sections:
- Getting Started + Tutorials
- How-to Guides
- API = Modeling API + Inference API
- Explanations + Internals

# Build Docs Locally
To build the docs, run `julia --project make.jl`, or alternatively start up the Julia REPL and include `make.jl`.
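For example, a local build session might look like this (a sketch; it assumes the dependencies from `docs/Project.toml` are not yet installed):

```julia
# From the docs/ directory, in a REPL started with `julia --project`:
using Pkg
Pkg.instantiate()   # installs Documenter, Gen, and Plots from docs/Project.toml
include("make.jl")  # runs makedocs and writes the site under docs/build/
```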

# Add Tutorial Code
Currently you must write the tutorial directly in the docs rather than in a source file (e.g. Quarto). See `getting_started` or `tutorials` for examples.

Code snippets must use triple backticks with a label to run. The environment carries over between blocks so long as the labels match. Example:

```@example tutorial_1
x = rand()
```

```@example tutorial_1
print(x)
```
6 changes: 0 additions & 6 deletions docs/build_docs_locally.sh

This file was deleted.

41 changes: 11 additions & 30 deletions docs/make.jl
@@ -1,37 +1,18 @@
# Run: julia --project make.jl
using Documenter, Gen

include("pages.jl")
makedocs(
    sitename = "Gen",
    modules = [Gen],
    pages = [
        "Home" => "index.md",
        "Getting Started" => "getting_started.md",
        "Tutorials" => "tutorials.md",
        "Modeling Languages and APIs" => [
            "Generative Functions" => "ref/gfi.md",
            "Probability Distributions" => "ref/distributions.md",
            "Built-in Modeling Language" => "ref/modeling.md",
            "Generative Function Combinators" => "ref/combinators.md",
            "Choice Maps" => "ref/choice_maps.md",
            "Selections" => "ref/selections.md",
            "Optimizing Trainable Parameters" => "ref/parameter_optimization.md",
            "Trace Translators" => "ref/trace_translators.md",
            "Extending Gen" => "ref/extending.md"
        ],
        "Standard Inference Library" => [
            "Importance Sampling" => "ref/importance.md",
            "MAP Optimization" => "ref/map.md",
            "Markov chain Monte Carlo" => "ref/mcmc.md",
            "MAP Optimization" => "ref/map.md",
            "Particle Filtering" => "ref/pf.md",
            "Variational Inference" => "ref/vi.md",
            "Learning Generative Functions" => "ref/learning.md"
        ],
        "Internals" => [
            "Optimizing Trainable Parameters" => "ref/internals/parameter_optimization.md",
            "Modeling Language Implementation" => "ref/internals/language_implementation.md"
        ]
    ]
    doctest = false,
    clean = true,
    warnonly = true,
    format = Documenter.HTML(;
        assets = String["assets/header.js", "assets/header.css", "assets/theme.css"],
        collapselevel=1,
    ),
    sitename = "Gen.jl",
    pages = pages,
)

deploydocs(
53 changes: 53 additions & 0 deletions docs/pages.jl
@@ -0,0 +1,53 @@
pages = [
    "Home" => "index.md",
    "Getting Started" => [
        "Example 1: Linear Regression" => "getting_started/linear_regression.md",
    ],
    "Tutorials" => [
        "Basics" => [
            "tutorials/modeling_in_gen.md",
            "tutorials/gfi.md",
            "tutorials/combinators.md",
            "tutorials/mcmc.md",
            "tutorials/vi.md"
        ],
        "Advanced" => [
            "tutorials/modeling_in_gen.md",
            "tutorials/trace_translators.md",
        ],
        "Modeling Languages" => [
        ],
    ],
    "How-to Guides" => [
        "Custom Distributions" => "how_to/custom_distributions.md",
        "Custom Modeling Languages" => "how_to/custom_dsl.md",
        "Custom Gradients" => "how_to/custom_derivatives.md",
        "Incremental Computation" => "how_to/custom_incremental_computation.md",
    ],
    "API Reference" => [
        "Modeling Library" => [
            "Generative Functions" => "api/model/gfi.md",
            "Probability Distributions" => "api/model/distributions.md",
            "Built-in Modeling Languages" => "api/model/modeling.md",
            "Combinators" => "api/model/combinators.md",
            "Choice Maps" => "api/model/choice_maps.md",
            "Selections" => "api/model/selections.md",
            "Optimizing Trainable Parameters" => "api/model/parameter_optimization.md",
            "Trace Translators" => "api/model/trace_translators.md",
            "Differential Programming" => "api/model/differential_programming.md"
Member commented:
This should be "Differentiable programming"! In both the label and the filename.

        ],
        "Standard Inference Library" => [
            "Importance Sampling" => "api/inference/importance.md",
            "MAP Optimization" => "api/inference/map.md",
            "Markov chain Monte Carlo" => "api/inference/mcmc.md",
            "Particle Filtering" => "api/inference/pf.md",
            "Variational Inference" => "api/inference/vi.md",
            "Learning Generative Functions" => "api/inference/learning.md"
        ],
    ],
    "Explanation and Internals" => [
        "Optimizing Trainable Parameters" => "explanations/parameter_optimization.md",
        "Modeling Language Implementation" => "explanations/language_implementation.md"
    ]
]
File renamed without changes.
@@ -209,7 +209,7 @@ Then, the traces of the model can be obtained by simulating from the variational
Instead of fitting the variational approximation from scratch for each observation, it is possible instead to fit an *inference model* that takes the observation as input and generates a distribution on latent variables as output (as in the wake-sleep algorithm).
When we train the variational approximation by minimizing the evidence lower bound (ELBO), this is called amortized variational inference.
Variational autoencoders are an example.
It is possible to perform amortized variational inference using [`black_box_vi`](@ref) or [`black_box_vimco!`](@ref).
It is possible to perform amortized variational inference using [`black_box_vi!`](@ref) or [`black_box_vimco!`](@ref).
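As an illustrative aside, a minimal sketch of this amortized pattern; the model, the linear inference network, and all hyperparameters here are hypothetical, and the `black_box_vi!` signature is assumed from the inference library:

```julia
using Gen

@gen function model()
    mu ~ normal(0.0, 10.0)
    x ~ normal(mu, 1.0)
end

# The variational family takes the observation as an argument, so a single
# set of trainable parameters serves every observation (amortization).
@gen function amortized_approx(x::Float64)
    @param w::Float64
    @param b::Float64
    mu ~ normal(w * x + b, 1.0)
end
init_param!(amortized_approx, :w, 0.0)
init_param!(amortized_approx, :b, 0.0)

update = ParamUpdate(FixedStepGradientDescent(0.001), amortized_approx)
for x_obs in [1.2, -0.5, 3.1]  # illustrative observations
    black_box_vi!(model, (), choicemap((:x, x_obs)),
                  amortized_approx, (x_obs,), update;
                  iters=10, samples_per_iter=10)
end
```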

## References

File renamed without changes.
13 changes: 13 additions & 0 deletions docs/src/api/inference/mcmc.md
@@ -0,0 +1,13 @@
# Markov chain Monte Carlo (MCMC)
```@docs
metropolis_hastings
mh
mala
hmc
elliptical_slice
@pkern
@kern
@rkern
reversal
involutive_mcmc
```
File renamed without changes.
7 changes: 7 additions & 0 deletions docs/src/api/inference/vi.md
@@ -0,0 +1,7 @@
## Variational inference
There are two procedures in the inference library for performing black box variational inference.
Each of these procedures can also train the model using stochastic gradient descent, as in a variational autoencoder.
```@docs
black_box_vi!
black_box_vimco!
```
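A minimal usage sketch (hypothetical model, names, and hyperparameters; the `black_box_vi!` signature is as documented above):

```julia
using Gen

@gen function model()
    mu ~ normal(0.0, 10.0)
    x ~ normal(mu, 1.0)
end

# Gaussian variational family over the latent address :mu.
@gen function approx()
    @param mu_est::Float64
    @param log_std::Float64
    mu ~ normal(mu_est, exp(log_std))
end
init_param!(approx, :mu_est, 0.0)
init_param!(approx, :log_std, 0.0)

observations = choicemap((:x, 3.0))
update = ParamUpdate(FixedStepGradientDescent(0.001), approx)
black_box_vi!(model, (), observations, approx, (), update;
              iters=500, samples_per_iter=100)
get_param(approx, :mu_est)  # the fitted variational mean
```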
@@ -50,3 +50,8 @@ choicemap
set_value!
set_submap!
```

```@docs
DynamicChoiceMap
EmptyChoiceMap
```
File renamed without changes.
@@ -1,4 +1,11 @@
# Probability Distributions
# [Probability Distributions](@id distributions)

```@docs
random
logpdf
has_output_grad
logpdf_grad
```

Gen provides a library of built-in probability distributions, and four ways of
defining custom distributions, each of which is explained below:
46 changes: 46 additions & 0 deletions docs/src/api/model/gfi.md
@@ -0,0 +1,46 @@
## [Generative Functions](@id gfi_api)

```@docs
GenerativeFunction
Trace
```

The complete set of methods in the generative function interface (GFI) is:

```@docs
simulate
generate
update
regenerate
get_args
get_retval
get_choices
get_score
get_gen_fn
Base.getindex
project
propose
assess
has_argument_grads
accepts_output_grad
accumulate_param_gradients!
choice_gradients
get_params
```
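For orientation, a sketch of how a few of these methods fit together, using a hypothetical single-choice model:

```julia
using Gen

@gen function coin(p::Float64)
    heads ~ bernoulli(p)
    return heads
end

trace = simulate(coin, (0.3,))   # sample a full execution from the prior
get_choices(trace)               # choice map containing the address :heads
get_score(trace)                 # log probability of the sampled choices
get_retval(trace)                # return value of this execution

# generate samples subject to constraints and also returns a log weight.
(ctrace, weight) = generate(coin, (0.3,), choicemap((:heads, true)))
```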

```@docs
NoChange
UnknownChange
```

```@docs
CustomUpdateGF
apply_with_state
update_with_state
```

```@docs
CustomGradientGF
apply
gradient
```
34 changes: 17 additions & 17 deletions docs/src/ref/modeling.md → docs/src/api/model/modeling.md
@@ -1,4 +1,4 @@
# Built-in Modeling Language
# [The Dynamic Modeling Language](@id dynamic_modeling_language)

Gen provides a built-in embedded modeling language for defining generative functions.
The language uses a syntax that extends Julia's syntax for defining regular Julia functions, and is also referred to as the **Dynamic Modeling Language**.
@@ -29,14 +29,14 @@ We can also trace its execution:
```
Optional arguments can be left out of the above operations, and default values will be filled in automatically:
```julia
julia> (trace, _) = generate(foo, (,));
julia> (trace, _) = generate(foo, ())
julia> get_args(trace)
(0.1,)
```
See [Generative Functions](@ref) for the full set of operations supported by a generative function.
See [Generative Functions](@ref gfi_api) for the full set of operations supported by a generative function.
Note that the built-in modeling language described in this section is only one of many ways of defining a generative function -- generative functions can also be constructed using other embedded languages, or by directly implementing the methods of the generative function interface.
However, the built-in modeling language is intended to be flexible enough to cover a wide range of use cases.
In the remainder of this section, we refer to generative functions defined using the built-in modeling language as `@gen` functions. Details about the implementation of `@gen` functions can be found in the [Modeling Language Implementation](@ref) section.
In the remainder of this section, we refer to generative functions defined using the built-in modeling language as `@gen` functions. Details about the implementation of `@gen` functions can be found in the [Modeling Language Implementation](@ref language-implementation) section.

## Annotations

@@ -57,7 +57,7 @@ Each argument can have the following different syntactic forms:

Currently, the possible argument annotations are:

- `grad` (see [Differentiable programming](@ref)).
- `grad` (see [Differentiable programming](@ref differentiable_modeling)).

**Function annotations.** The `@gen` function itself can also be optionally associated with zero or more annotations, which are separate from the per-argument annotations.
Function-level annotations use the following different syntactic forms:
@@ -70,19 +70,19 @@

Currently the possible function annotations are:

- `grad` (see [Differentiable programming](@ref)).
- `grad` (see [Differentiable programming](@ref differentiable_modeling)).

- `static` (see [Static Modeling Language](@ref)).
- `static` (see [Static Modeling Language](@ref sml)).

- `nojuliacache` (see [Static Modeling Language](@ref)).
- `nojuliacache` (see [Static Modeling Language](@ref sml)).

## Making random choices

Random choices are made by calling a probability distribution on some arguments:
```julia
val::Bool = bernoulli(0.5)
```
See [Probability Distributions](@ref) for the set of built-in probability distributions, and for information on implementing new probability distributions.
See [Probability Distributions](@ref distributions) for the set of built-in probability distributions, and for information on implementing new probability distributions.

In the body of a `@gen` function, wrapping a call to a random choice with an `@trace` expression associates the random choice with an *address*, and evaluates to the value of the random choice.
The syntax is:
@@ -145,7 +145,7 @@ It is recommended to write disciplined generative functions when possible.

**Untraced call**:
If `foo` is a generative function, we can invoke `foo` from within the body of a `@gen` function using regular call syntax.
The random choices made within the call are not given addresses in our trace, and are therefore *untraced* random choices (see [Generative Function Interface](@ref) for details on untraced random choices).
The random choices made within the call are not given addresses in our trace, and are therefore *untraced* random choices (see [Generative Function Interface](@ref gfi) for details on untraced random choices).
```julia
val = foo(0.5)
```
@@ -247,10 +247,10 @@ Note that `~` is also defined in `Base` as a unary operator that performs the bitwise

Like regular Julia functions, `@gen` functions return either the expression used in a `return` keyword, or by evaluating the last expression in the function body.
Note that the return value of a `@gen` function is different from a trace of `@gen` function, which contains the return value associated with an execution as well as the assignment to each random choice made during the execution.
See [Generative Function Interface](@ref) for more information about traces.
See [Generative Function Interface](@ref gfi) for more information about traces.


## Trainable parameters
## [Trainable Parameters](@id trainable_parameters_modeling)

A `@gen` function may begin with an optional block of *trainable parameter declarations*.
The block consists of a sequence of statements, beginning with `@param`, that declare the name and Julia type for each trainable parameter.
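A minimal sketch of the pattern (hypothetical function and parameter names):

```julia
@gen function scaled_noise(x::Float64)
    @param theta::Float64       # trainable parameter declaration
    y ~ normal(theta * x, 1.0)  # theta participates like an ordinary value
    return y
end
init_param!(scaled_noise, :theta, 0.0)  # initialize before use
```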
@@ -281,7 +281,7 @@ zero_param_grad!
Trainable parameters are designed to be trained using gradient-based methods.
This is discussed in the next section.

## Differentiable programming
## [Differentiable Programming](@id differentiable_modeling)

Given a trace of a `@gen` function, Gen supports automatic differentiation of the log probability (density) of all of the random choices made in the trace with respect to the following types of inputs:

@@ -371,7 +371,7 @@ See [ReverseDiff](https://github.com/JuliaDiff/ReverseDiff.jl) for more details.
When making a random choice, each argument is either a tracked value or not.
If the argument is a tracked value, then the probability distribution must support differentiation of the log probability (density) with respect to that argument.
Otherwise, an error is thrown.
The [`has_argument_grads`](@ref) function indicates which arguments support differentiation for a given distribution (see [Probability Distributions](@ref)).
The [`has_argument_grads`](@ref) function indicates which arguments support differentiation for a given distribution (see [Probability Distributions](@ref distributions)).
If the gradient is required for the *value* of a random choice, the distribution must support differentiation of the log probability (density) with respect to the value.
This is indicated by the [`has_output_grad`](@ref) function.

@@ -381,7 +381,7 @@ It is an error if a tracked value is passed as an argument of a generative funct
If a generative function `gen_fn` has `accepts_output_grad(gen_fn) = true`, then the return value of the generative function call will be tracked and will propagate further through the caller `@gen` function's computation.


## Static Modeling Language
## [Static Modeling Language](@id sml)

The *static modeling language* is a restricted variant of the built-in modeling language.
Models written in the static modeling language can result in better inference performance (more inference operations per second and less memory consumption) than the full built-in modeling language, especially for models used with iterative inference algorithms like Markov chain Monte Carlo.
@@ -399,7 +399,7 @@ end
```
After running this code, `foo` is a Julia value whose type is a subtype of `StaticIRGenerativeFunction`, which is a subtype of [`GenerativeFunction`](@ref).

### Static computation graph
### Static Computation Graphs
Using the `static` annotation instructs Gen to statically construct a directed acyclic graph for the computation represented by the body of the function.
For the function `foo` above, the static graph looks like:
```@raw html
@@ -431,7 +431,7 @@ First, the definition of a `(static)` generative function is always expected to
Next, in order to be able to construct the static graph, Gen restricts the permitted syntax that can be used in functions annotated with `static`.
In particular, each statement in the body must be one of the following:

- A `@param` statement specifying any [Trainable parameters](@ref), e.g.:
- A `@param` statement specifying any [trainable parameters](@ref trainable_parameters_modeling), e.g.:

```julia
@param theta::Float64
@@ -1,4 +1,4 @@
# Optimizing Trainable Parameters
# [Trainable Parameters](@id trainable_parameter_optimization)

Trainable parameters of generative functions are initialized differently depending on the type of generative function.
Trainable parameters of the built-in modeling language are initialized with [`init_param!`](@ref).
@@ -23,7 +23,7 @@ If we use this selection in the context of a trace of the function `bar` below,
@trace(normal(0, 1), :z)
@trace(normal(0, 1), :w)
end
end

@gen function bar()
@trace(bernoulli(0.5), :x)