
Commit d2619ba

✨ Add ELSA
1 parent 6e8f6b6 commit d2619ba

File tree

2 files changed: +29 -10 lines changed


assets/elsa_logo.png

16.9 KB

index.html

Lines changed: 29 additions & 10 deletions
@@ -97,19 +97,22 @@ <h2 style="text-align: center">Outline</h2>
 <h3 style="text-align: left">Introduction: Why & where is UQ helpful?</h3>
 <p>
 Initial exploration into the critical role of uncertainty quantification (UQ) within the realm
-of computer vision (CV): participants will gain an understanding of why it’s essential to consider uncertainty in CV, especially concerning decision-making in complex
+of computer vision (CV): participants will gain an understanding of why it’s essential to consider
+uncertainty in CV, especially concerning decision-making in complex
 environments. We will introduce real-world scenarios where uncertainty can profoundly
 impact model performance and safety, setting the stage for deeper exploration throughout the tutorial.
 </p>
 <h3 style="text-align: left">From maximum a posteriori to BNNs.</h3>
 <p>
 In this part, we will journey through the evolution of UQ techniques, starting
-from classic approaches such as maximum a posteriori estimation to the more elaborate Bayesian Neural Networks. The participants will grasp the conceptual foundations
+from classic approaches such as maximum a posteriori estimation to the more elaborate Bayesian Neural
+Networks. The participants will grasp the conceptual foundations
 of UQ, laying the groundwork for the subsequent discussions of Bayesian methods.
 </p>
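The segment above moves from maximum a posteriori (MAP) estimation to Bayesian Neural Networks. As a minimal, hedged illustration of that step (not part of the committed page; the toy model, batch, and weight samples are placeholders), MAP training with weight decay corresponds to a point estimate under a Gaussian prior, while a BNN averages predictions over sampled weights:

import torch
import torch.nn as nn

# Placeholder model and batch, only to make the contrast concrete.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
x, y = torch.randn(128, 32), torch.randint(0, 10, (128,))

# MAP: a single point estimate; L2 weight decay plays the role of a Gaussian prior log p(w).
opt = torch.optim.SGD(model.parameters(), lr=1e-2, weight_decay=1e-4)
loss = nn.functional.cross_entropy(model(x), y)   # negative log-likelihood
loss.backward()
opt.step()

# BNN: the predictive distribution averages p(y|x, w) over weight samples w ~ p(w | data).
def bnn_predict(weight_samples, inputs):
    probs = []
    for state_dict in weight_samples:             # each entry: one sampled set of weights
        model.load_state_dict(state_dict)
        with torch.no_grad():
            probs.append(model(inputs).softmax(dim=-1))
    return torch.stack(probs).mean(dim=0)         # mean over samples ≈ E_w[p(y|x,w)]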
 <h3 style="text-align: left">Strategies for BNN posterior inference.</h3>
 <p>
-This is the core part, which will dive into the process of estimating the posterior distribution of BNNs. The participants
+This is the core part, which will dive into the process of estimating the posterior distribution of BNNs.
+The participants
 will gain insights into the computational complexities involved in modeling uncertainty
 through a comprehensive overview of techniques such as Variational Inference (VI),
 Hamiltonian Monte Carlo (HMC), and Langevin Dynamics. Moreover, we will explore
@@ -118,19 +121,21 @@ <h3 style="text-align: left">Strategies for BNN posterior inference.</h3>
 </p>
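The posterior-inference segment above lists Variational Inference, HMC, and Langevin Dynamics. As one hedged sketch of the variational route only (layer sizes, prior scale, and the assumed dataset size are illustrative, not taken from the tutorial), a mean-field Gaussian layer trained with the reparameterization trick and an ELBO-style loss could look like this:

import torch
import torch.nn as nn

class MeanFieldLinear(nn.Module):
    """Linear layer with a factorized Gaussian posterior q(w) = N(mu, sigma^2)."""
    def __init__(self, d_in, d_out, prior_std=1.0):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(d_out, d_in))
        self.log_sigma = nn.Parameter(torch.full((d_out, d_in), -3.0))
        self.prior_std = prior_std

    def forward(self, x):
        sigma = self.log_sigma.exp()
        w = self.mu + sigma * torch.randn_like(sigma)   # reparameterization trick
        return x @ w.t()

    def kl(self):
        # KL(q(w) || p(w)) between the Gaussian posterior and a zero-mean Gaussian prior
        sigma = self.log_sigma.exp()
        return (torch.log(self.prior_std / sigma)
                + (sigma ** 2 + self.mu ** 2) / (2 * self.prior_std ** 2) - 0.5).sum()

# ELBO-style objective: negative log-likelihood plus KL, the latter scaled by the dataset size.
layer = MeanFieldLinear(32, 10)
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(layer(x), y) + layer.kl() / 1000  # 1000 = assumed dataset size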
 <h3 style="text-align: left">Computationally-efficient BNNs for CV.</h3>
 <p>
-Here, we will present recent techniques to improve the computational efficiency of BNNs for computer vision tasks.
+Here, we will present recent techniques to improve the computational efficiency of BNNs for computer vision
+tasks.
 We will cover different ways of obtaining BNNs from intermediate checkpoints,
 weight trajectories during a training run, different types of variational subnetworks,
 etc., along with their main strengths and limitations.
 </p>
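The efficiency segment above mentions building Bayesian-flavoured predictors from intermediate checkpoints and weight trajectories of a single run. A rough sketch of the checkpoint-ensemble idea (the checkpoint paths and the entropy-based uncertainty readout are assumptions for illustration, not the tutorial's code):

import torch

def ensemble_from_checkpoints(model, ckpt_paths, x):
    """Average softmax predictions over weights saved at different points of one training run."""
    probs = []
    for path in ckpt_paths:                                   # e.g. snapshots taken every few epochs
        state = torch.load(path, map_location="cpu")
        model.load_state_dict(state)
        model.eval()
        with torch.no_grad():
            probs.append(model(x).softmax(dim=-1))
    mean_prob = torch.stack(probs).mean(dim=0)                # predictive mean over the ensemble
    entropy = -(mean_prob * mean_prob.clamp_min(1e-12).log()).sum(dim=-1)  # predictive uncertainty
    return mean_prob, entropy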
 <h3 style="text-align: left">Convert your DNN into a BNN: post-hoc BNN inference.</h3>
-<p>
-This segment focuses on post-hoc inference techniques, in particular the Laplace approximation. The participants
+<p>
+This segment focuses on post-hoc inference techniques, in particular the Laplace approximation. The
+participants
 will learn how the Laplace approximation serves as a computationally efficient method for
 approximating the posterior distribution of Bayesian Neural Networks.
 </p>
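The post-hoc segment above centres on the Laplace approximation. A very rough sketch of one diagonal, last-layer variant: estimate per-weight curvature with summed squared gradients as a crude empirical-Fisher stand-in for the Hessian, then sample weights around the trained (MAP) solution. All names, the prior precision, and the curvature estimate are assumptions; practical implementations differ:

import torch
import torch.nn as nn

def diag_laplace_last_layer(model, last_layer, loader, prior_precision=1.0, n_samples=20):
    """Post-hoc diagonal Laplace over the last layer of an already-trained model."""
    fisher = {n: torch.zeros_like(p) for n, p in last_layer.named_parameters()}
    for x, y in loader:                            # one pass over the training data
        model.zero_grad()
        nn.functional.cross_entropy(model(x), y).backward()
        for n, p in last_layer.named_parameters():
            fisher[n] += p.grad.detach() ** 2      # squared batch gradients: crude curvature proxy
    # Posterior variance ≈ 1 / (prior precision + curvature); the trained weights are the mean.
    var = {n: 1.0 / (prior_precision + f) for n, f in fisher.items()}
    mean = {n: p.detach().clone() for n, p in last_layer.named_parameters()}
    samples = []
    for _ in range(n_samples):
        samples.append({n: mean[n] + var[n].sqrt() * torch.randn_like(mean[n]) for n in mean})
    return samples   # load each sample into the last layer to obtain predictive samples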
 <h3 style="text-align: left">Quality of estimated uncertainty and practical examples.</h3>
-<p>
+<p>
 In the final session, participants will learn how to evaluate the quality of UQ in practical
 settings. We will develop multiple approaches to assess the reliability and calibration
 of uncertainty estimates, equipping participants with the tools to gauge the robust-
@@ -142,9 +147,10 @@ <h3 style="text-align: left">Quality of estimated uncertainty and practical exam
 </p>
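The closing segment above is about judging the reliability and calibration of uncertainty estimates. One standard diagnostic is the expected calibration error; a compact version with equal-width confidence bins is sketched below (the bin count is a common default, not something specified on this page):

import torch

def expected_calibration_error(probs, labels, n_bins=15):
    """ECE: average |accuracy - confidence| over confidence bins, weighted by bin size."""
    conf, pred = probs.max(dim=-1)                # top-1 confidence and predicted class
    correct = pred.eq(labels).float()
    edges = torch.linspace(0, 1, n_bins + 1)
    ece = torch.zeros(())
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            gap = (correct[in_bin].mean() - conf[in_bin].mean()).abs()
            ece += in_bin.float().mean() * gap    # weight by the fraction of samples in the bin
    return ece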
 
 <h3 style="text-align: left">Uncertainty Quantification Framework.</h3>
-<p>
-This tutorial will also briefly introduce the <a href="https://github.com/ensta-u2is-ai/torch-uncertainty">TorchUncertainty
-library</a>, an uncertainty-aware open-source framework for training models in PyTorch.
+<p>
+This tutorial will also briefly introduce the <a
+href="https://github.com/ensta-u2is-ai/torch-uncertainty">TorchUncertainty
+library</a>, an uncertainty-aware open-source framework for training models in PyTorch.
 </p>
 </div>
 
@@ -202,8 +208,21 @@ <h2 style="text-align: center">Selected References</h2>
 href="https://github.com/ensta-u2is-ai/awesome-uncertainty-deeplearning">Awesome Uncertainty in deep
 learning.</a>
 </div>
+
+<br>
+
+<div class="containertext">
+<h3 style="text-align: center">Andrei Bursuc is supported by</h3>
+
+<center>
+<a href="https://elsa-ai.eu/" target="_blank"><img src="assets/elsa_logo.png" width="10%" hspace="2%" />
+</a>
+</center>
+</div>
 </div>
 </div>
+
+
 </section>
 
 <script src="https://code.jquery.com/jquery-3.3.1.slim.min.js"
