From 6a5da97bf5690145dc813cc2c4f7c46efb9f6eb8 Mon Sep 17 00:00:00 2001 From: Gabriel Gress Date: Fri, 24 May 2024 16:27:29 +0900 Subject: [PATCH 1/2] Updated retcode success check to the current version used by SciMLBase. Introduced new noisy observations that are better suited for the problem, and corrected the model to calculate the likelihood based on multiple trajectories rather than a single trajectory. --- .../index.qmd | 28 +++++++++++++------ 1 file changed, 19 insertions(+), 9 deletions(-) diff --git a/tutorials/10-bayesian-stochastic-differential-equations/index.qmd b/tutorials/10-bayesian-stochastic-differential-equations/index.qmd index 6166ec270..215bbfed8 100644 --- a/tutorials/10-bayesian-stochastic-differential-equations/index.qmd +++ b/tutorials/10-bayesian-stochastic-differential-equations/index.qmd @@ -17,6 +17,7 @@ Pkg.instantiate(); ```{julia} using Turing using DifferentialEquations +using DifferentialEquations.EnsembleAnalysis # Load StatsPlots for visualizations and diagnostics. using StatsPlots @@ -117,6 +118,10 @@ prob_sde = SDEProblem(lotka_volterra!, multiplicative_noise!, u0, tspan, p) ensembleprob = EnsembleProblem(prob_sde) data = solve(ensembleprob, SOSRI(); saveat=0.1, trajectories=1000) plot(EnsembleSummary(data)) + +# We generate new noisy observations based on the stochastic model for the parameter estimation tasks in this tutorial. +# We create our observations by adding random normally distributed noise to the mean of the ensemble simulation. +sdedata = reduce(hcat, timeseries_steps_mean(data).u) + 0.8 * randn(size(reduce(hcat, timeseries_steps_mean(data).u))) ``` ```{julia} @@ -132,17 +137,23 @@ plot(EnsembleSummary(data)) # Simulate stochastic Lotka-Volterra model. 
p = [α, β, γ, δ, ϕ1, ϕ2] - predicted = solve(prob, SOSRI(); p=p, saveat=0.1) + prob = remake(prob, p=p) + ensembleprob = EnsembleProblem(prob) + predicted = solve(ensembleprob, SOSRI(); saveat=0.1, trajectories=1000) # Early exit if simulation could not be computed successfully. - if predicted.retcode !== :Success - Turing.@addlogprob! -Inf - return nothing + for i in 1:length(predicted) + if !SciMLBase.successful_retcode(predicted[i]) + Turing.@addlogprob! -Inf + return nothing + end end # Observations. - for i in 1:length(predicted) - data[:, i] ~ MvNormal(predicted[i], σ^2 * I) + for j in 1:length(predicted) + for i in 1:length(predicted[j]) + data[:, i] ~ MvNormal(predicted[j][i], σ^2 * I) + end end return nothing @@ -154,9 +165,8 @@ Therefore we use NUTS with a low target acceptance rate of `0.25` and specify a SGHMC might be a more suitable algorithm to be used here. ```{julia} -model_sde = fitlv_sde(odedata, prob_sde) +model_sde = fitlv_sde(sdedata, prob_sde) -setadbackend(:forwarddiff) chain_sde = sample( model_sde, NUTS(0.25), @@ -165,4 +175,4 @@ chain_sde = sample( progress=false, ) plot(chain_sde) -``` +``` \ No newline at end of file From 44885f2a9e78c5565af4c424badf872c5823b7f2 Mon Sep 17 00:00:00 2001 From: Gabriel Gress Date: Mon, 27 May 2024 12:38:24 +0900 Subject: [PATCH 2/2] Added more explanation on the likelihood calculation --- .../10-bayesian-stochastic-differential-equations/index.qmd | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/tutorials/10-bayesian-stochastic-differential-equations/index.qmd b/tutorials/10-bayesian-stochastic-differential-equations/index.qmd index 215bbfed8..785d77f26 100644 --- a/tutorials/10-bayesian-stochastic-differential-equations/index.qmd +++ b/tutorials/10-bayesian-stochastic-differential-equations/index.qmd @@ -150,6 +150,7 @@ sdedata = reduce(hcat, timeseries_steps_mean(data).u) + 0.8 * randn(size(reduce( end # Observations. 
+ # We compute the likelihood for each trajectory of the ensemble, so that the overall likelihood reflects the distribution of simulated trajectories rather than a single realization. for j in 1:length(predicted) for i in 1:length(predicted[j]) data[:, i] ~ MvNormal(predicted[j][i], σ^2 * I) @@ -175,4 +176,4 @@ chain_sde = sample( progress=false, ) plot(chain_sde) -``` \ No newline at end of file +```
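For reference, the observation-generation step these patches introduce (take the per-time-step mean of the ensemble, then add Gaussian noise) can be sketched with plain arrays standing in for the SDE ensemble. This is a toy illustration under stated assumptions: `trajectories` is hypothetical data replacing the `EnsembleSolution`, and the explicit reduction mimics what `timeseries_steps_mean` computes; it is not the tutorial's actual simulation.

```julia
using Statistics, Random

Random.seed!(42)

# Toy stand-in for an SDE ensemble: 1000 trajectories,
# each a 2-state solution saved at 101 time points.
trajectories = [randn(2, 101) .+ 5.0 for _ in 1:1000]

# Mean across trajectories at each saved time step
# (the role played by timeseries_steps_mean in the patch).
ensemble_mean = reduce(+, trajectories) ./ length(trajectories)

# Noisy observations: ensemble mean plus N(0, 0.8^2) noise,
# matching the sdedata construction in the patch.
sdedata = ensemble_mean + 0.8 * randn(size(ensemble_mean))

size(sdedata)  # (2, 101)
```

Computing the mean once and reusing it, as above, avoids the doubled `reduce(hcat, timeseries_steps_mean(data).u)` call in the patched one-liner.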