## Conventions
`BayesianLinearRegressors` is consistent with the `AbstractGPs` interface. Consequently, a `BayesianLinearRegressor` in `D` dimensions can work with the following input types:

1. `ColVecs` -- a wrapper around a `D x N` matrix of `Real`s, indicating that each column should be interpreted as one input.
2. `RowVecs` -- a wrapper around an `N x D` matrix of `Real`s, indicating that each row should be interpreted as one input.
3. `Matrix{<:Real}` -- must be `D x N`. Prefer `ColVecs` or `RowVecs` for the sake of being explicit.

Consult the `Design` section of the [KernelFunctions.jl](https://juliagaussianprocesses.github.io/KernelFunctions.jl/dev/design/) docs for more information on these conventions; a short sketch of constructing each input type is shown below.

Outputs for a `BayesianLinearRegressor` should be an `AbstractVector{<:Real}` of length `N`, one element per input.
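As a hedged illustration of these input conventions, the following sketch constructs the same `N` inputs in each of the three forms. It assumes `ColVecs` and `RowVecs` are imported from KernelFunctions.jl, where they are defined; the variable names are purely illustrative.

```julia
using KernelFunctions: ColVecs, RowVecs

D, N = 2, 4
A = randn(D, N)        # a `D x N` matrix: one column per input

x_cols = ColVecs(A)    # each of the `N` columns of `A` is one input
x_rows = RowVecs(A')   # equivalently, each of the `N` rows of `A'` is one input
x_mat  = A             # a bare `D x N` `Matrix{<:Real}` also works, but is less explicit

length(x_cols) == length(x_rows) == N   # true
x_cols[1] == x_rows[1]                  # true -- both are the first column of `A`
```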
## Example Usage
```julia
f = BayesianLinearRegressor(mw, Λw)

# Index into the regressor and assume heteroscedastic observation noise `Σ_noise`.
N = 10
X = ColVecs(collect(hcat(collect(range(-5.0, 5.0, length=N)), ones(N))'))
Σ_noise = Diagonal(exp.(randn(N)))
fX = f(X, Σ_noise)
```
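The fragment below evaluates `logpdf` under a posterior regressor `f′`. Here is a minimal, hedged sketch of how `f′` and the observations `y` might be obtained first; the explicit RNG handling and the `posterior` call follow the AbstractGPs-style interface and should be treated as assumptions rather than the exact lines of the full example.

```julia
using Random

rng = MersenneTwister(123)

# Draw synthetic outputs from the prior predictive at `X`, then condition on them.
y = rand(rng, fX)       # an `N`-vector of outputs, one per input
f′ = posterior(fX, y)   # posterior `BayesianLinearRegressor` given `(X, y)`
```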
```julia
logpdf(f′(X, Σ_noise), y)

# Sample from the posterior predictive distribution.
```
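A hedged sketch of that sampling step follows; the single-sample and vectorised `rand` signatures are assumptions based on the AbstractGPs-style interface, not necessarily the exact lines of the full example.

```julia
# Draw joint samples from the posterior predictive at the inputs `X`.
y_new = rand(rng, f′(X, Σ_noise))       # one length-`N` sample
Y_new = rand(rng, f′(X, Σ_noise), 100)  # 100 samples, one per column
```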