<h1 class="project-name">A Bayesian Odyssey in Uncertainty: from Theoretical
Foundations to Real-World Applications</h1>
<h2 class="project-tagline">ECCV 2024 - Room: <strong>Suite 7</strong><br><strong>30 Sept</strong> - 8:30 AM to
- 12:30 PM<br>This tutorial will be
- <strong>available online and recorded</strong>
+ 12:30 PM<br>The recording of the tutorial will be
+ <strong>available online</strong>.
</h2>
</section>

@@ -97,146 +97,172 @@ <h2 style="text-align: center">Overview</h2>
</p>
</div>

- <br>
-
<div class="containertext" style="max-width:50rem">
- <h2 style="text-align: center">Outline</h2>
-
- <h3 style="text-align: left">Introduction: Why & where is UQ helpful?</h3>
- <p>
- Initial exploration into the critical role of uncertainty quantification (UQ) within the realm
- of computer vision (CV): participants will gain an understanding of why it’s essential to consider
- uncertainty in CV, especially concerning decision-making in complex
- environments. We will introduce real-world scenarios where uncertainty can profoundly
- impact model performance and safety, setting the stage for deeper exploration through out the tutorial.
- </p>
- <h3 style="text-align: left">From maximum a posteriori to BNNs.</h3>
- <p>
- In this part, we will journey through the evolution of UQ techniques, starting
- from classic approaches such as maximum a posteriori estimation to the more ellaborate Bayesian Neural
- Networks. The participants will grasp the conceptual foundations
- of UQ, laying the groundwork for the subsequent discussions of Bayesian methods.
- </p>
- <h3 style="text-align: left">Strategies for BNN posterior inference.</h3>
- <p>
- This is the core part, which will dive into the process of estimating the posterior distribution of BNNs.
- The participants
- will gain insights into the computational complexities involved in modeling uncertainty
- through a comprehensive overview of techniques such as Variational Inference (VI),
- Hamiltonian Monte Carlo (HMC), and Langevin Dynamics. Moreover, we will explore
- the characteristics and visual representation of posterior distributions, providing a better
- understanding of Bayesian inference.
- </p>
- <h3 style="text-align: left">Computationally-efficient BNNs for CV.</h3>
+ <h2 style="text-align: center">Schedule</h2>
<p>
- Here, we will present recent techniques to improve the computational efficiency of BNNs for computer vision
- tasks.
- We will present different forms of obtaining BNNs from a intermediate checkpoints,
- weight trajectories during a training run, different types of variational subnetworks,
- etc., along with their main strenghts and limitations.
- </p>
- <h3 style="text-align: left">Convert your DNN into a BNN: post-hoc BNN inference.</h3>
- <p>
- This segment focuses on post-hoc inference techniques, with a focus on Laplace approximation. The
- participants
- will learn how Laplace approximation serves as a computationally efficient method for
- approximating the posterior distribution of Bayesian Neural Networks.
- </p>
- <h3 style="text-align: left">Quality of estimated uncertainty and practical examples.</h3>
- <p>
- In the final session, participants will learn how to evaluate the quality of UQ in practi-
- cal settings. We will develop multiple approaches to assess the reliability and calibra-
- tion of uncertainty estimates, equipping participants with the tools to gauge the robust-
- ness of their models. Additionally, we will dive into real-world examples and applica-
- tions, showcasing how UQ can enhance the reliability
- and performance of computer vision systems in diverse scenarios. Through interactive
- discussions and case studies, participants will gain practical insights into deploying
- uncertainty-aware models in real-world applications.
- </p>
+ <ul>
+   <li>8:45-9:15: Opening - Andrei</li>
+   <li>9:15-10:05: Uncertainty quantification: from maximum a posteriori to BNNs - Pavel (remotely)</li>
+   <li>10:05-10:30: Computationally-efficient BNNs for computer vision - Gianni</li>
+   <li>10:35-11:00: Coffee</li>
+   <li>11:00-11:50: Convert your DNN into a BNN - Alexander</li>
+   <li>11:50-12:20: Quality of estimated uncertainty and practical examples - Adrien (remotely) & Gianni</li>
+   <li>12:20-12:40: Closing remarks + Q&A - Andrei, Alex, Pavel & Gianni</li>
+ </ul>

- <h3 style="text-align: left">Uncertainty Quantification Framework.</h3>
- <p>
- This tutorial will also very quickly introduce the <a
- href="https://github.com/ensta-u2is-ai/torch-uncertainty">TorchUncertainty
- library</a>, an uncertainty-aware open-source framework for training models in PyTorch.
- </p>
- </div>

- <a href="https://torch-uncertainty.github.io/" target="_blank">
- <div><img src="assets/logoTU_full.png" width="20%" hspace="2%"> </div>
- </a>

- <br>

- <div class="containertext" style="max-width:50rem">
- <h2 style="text-align: center">Relation to prior tutorials and short courses</h2>
- <p> This tutorial is affiliated with the <a href="https://uncv2023.github.io/">UNCV Workshop</a>,
- which had its inaugural edition at ECCV 2022, a subsequent one at ICCV, and is back at ECCV this year.
- In constrast to the workshop, the tutorial puts its primary emphasis on the theoretical facets. </p>
- <p> UQ has received some attention
- in recent times, as evidenced by its inclusion in
- the tutorial <a href="https://abursuc.github.io/many-faces-reliability/">'Many Faces of Reliability of Deep
- Learning for Real-World Deployment'</a>. While this tutorial explored various applications associated with
- uncertainty,
- it did not place a specific emphasis on probabilistic models and Bayesian Neural Networks. Our tutorial aims
- to provide a more in-depth exploration of uncertainty theory, accompanied by the introduction of practical
- applications, including the presentation of the library, <a
- href="https://github.com/ensta-u2is-ai/torch-uncertainty">TorchUncertainty</a>.</p>
- </div>

- <div class="containertext" style="max-width:50rem">
- <h2 style="text-align: center">Selected References</h2>
- <ol>
- <li><b>Immer, A.</b>, Palumbo, E., Marx, A., & Vogt, J. E. E<a
- href="https://proceedings.neurips.cc/paper_files/paper/2023/file/a901d5540789a086ee0881a82211b63d-Paper-Conference.pdf">
- Effective Bayesian Heteroscedastic Regres-
- sion with Deep Neural Networks</a>. In NeurIPS, 2023.</li>
- <li><b>Franchi, G., Bursuc, A.,</b> Aldea, E., Dubuisson, S.,
- & Bloch, I. <a href="https://arxiv.org/pdf/2012.02818">Encoding the latent posterior of
- Bayesian Neural Networks for uncertainty quantification</a>. IEEE TPAMI, 2023.</li>
- <li><b>Franchi, G.</b>, Yu, X., <b>Bursuc, A.</b>, Aldea, E., Dubuisson,
- S., & Filliat, D. <a href="https://arxiv.org/pdf/2207.10130">Latent Discriminant
- deterministic Uncertainty</a>. In ECCV 2022.</li>
- <li><b>Laurent, O.</b>, <b>Lafage, A.</b>, Tartaglione, E., Daniel, G.,
- Martinez, J. M., <b>Bursuc, A.</b>, & <b>Franchi, G.</b>
- <a href="https://arxiv.org/pdf/2210.09184">Packed-Ensembles for Efficient Uncertainty Estimation</a>. In
- ICLR 2023.
- </li>
- <li><b>Izmailov, P.</b>, Vikram, S., Hoffman, M. D., & Wilson, A. G. <a
- href="https://arxiv.org/pdf/2104.14421">What are Bayesian neural network
- posteriors really like?</a> In ICML, 2021.</li>
- <li><b>Izmailov, P.</b>, Maddox, W. J., Kirichenko, P., Garipov, T., Vetrov, D., & Wilson, A. G. <a
- href="https://arxiv.org/pdf/1907.07504">Subspace inference for Bayesian deep learning</a>. In UAI, 2020.
- </li>
- <li><b>Franchi, G.</b>, <b>Bursuc, A.</b>, Aldea, E., Dubuisson, S., &
- Bloch, I. <a href="https://arxiv.org/pdf/1912.11316">TRADI: Tracking deep neural
- network weight distributions</a>. In ECCV 2020.</li>
- <li>Wilson, A. G., & <b>Izmailov, P</b>. <a href="https://arxiv.org/pdf/2002.08791">Bayesian deep
- learning and a probabilistic perspective of generalization</a>. In NeurIPS, 2020.</li>
- <li>Hendrycks, D., Dietterich, T. <a href="https://arxiv.org/pdf/1903.12261">Benchmarking Neural Network
- Robustness to Common Corruptions and
- Perturbations</a>. In ICLR 2019.</li>
- <li><b> Izmailov, P.</b>, Podoprikhin, D., Garipov, T., Vetrov, D., & Wilson, A. G. <a
- href="https://arxiv.org/pdf/1803.05407">Averaging weights
- leads to wider optima and better generalization</a>. In UAI, 2018. </li>
- </ol>
- You will find more references in the <a
- href="https://github.com/ensta-u2is-ai/awesome-uncertainty-deeplearning">Awesome Uncertainty in deep
- learning.</a>
- </div>

- <br>
+ </p>
+ <br>

- <div class="containertext">
- <h3 style="text-align: center">Andrei Bursuc is supported by ELSA:</h3>
+ <div class="containertext" style="max-width:50rem">
+ <h2 style="text-align: center">Outline</h2>

- <center>
- <a href="https://elsa-ai.eu/" target="_blank"><img src="assets/elsa_logo.png" width="10%" hspace="2%" />
- </center>
+ <h3 style="text-align: left">Introduction: Why & where is UQ helpful?</h3>
+ <p>
+ An initial exploration of the critical role of uncertainty quantification (UQ) within the realm
+ of computer vision (CV): participants will gain an understanding of why it is essential to consider
+ uncertainty in CV, especially for decision-making in complex
+ environments. We will introduce real-world scenarios where uncertainty can profoundly
+ impact model performance and safety, setting the stage for deeper exploration throughout the tutorial.
+ </p>
+ <h3 style="text-align: left">From maximum a posteriori to BNNs.</h3>
+ <p>
+ In this part, we will journey through the evolution of UQ techniques, starting
+ from classic approaches such as maximum a posteriori estimation to the more elaborate Bayesian Neural
+ Networks (BNNs). Participants will grasp the conceptual foundations
+ of UQ, laying the groundwork for the subsequent discussions of Bayesian methods.
+ </p>
+ <h3 style="text-align: left">Strategies for BNN posterior inference.</h3>
+ <p>
+ This core part dives into the process of estimating the posterior distribution of BNNs.
+ Participants will gain insights into the computational complexities involved in modeling uncertainty
+ through a comprehensive overview of techniques such as Variational Inference (VI),
+ Hamiltonian Monte Carlo (HMC), and Langevin Dynamics. Moreover, we will explore
+ the characteristics and visual representation of posterior distributions, providing a better
+ understanding of Bayesian inference.
+ </p>
+ <h3 style="text-align: left">Computationally-efficient BNNs for CV.</h3>
+ <p>
+ Here, we will present recent techniques that improve the computational efficiency of BNNs for
+ computer vision tasks.
+ We will present different ways of obtaining BNNs from intermediate checkpoints,
+ from weight trajectories during a training run, from different types of variational subnetworks,
+ etc., along with their main strengths and limitations.
+ </p>
+ <h3 style="text-align: left">Convert your DNN into a BNN: post-hoc BNN inference.</h3>
+ <p>
+ This segment covers post-hoc inference techniques, focusing on the Laplace approximation.
+ Participants will learn how the Laplace approximation serves as a computationally efficient method for
+ approximating the posterior distribution of Bayesian Neural Networks.
+ </p>
+ <h3 style="text-align: left">Quality of estimated uncertainty and practical examples.</h3>
+ <p>
+ In the final session, participants will learn how to evaluate the quality of UQ in practical
+ settings. We will develop multiple approaches to assess the reliability and calibration
+ of uncertainty estimates, equipping participants with the tools to gauge the robustness
+ of their models. Additionally, we will dive into real-world examples and applications,
+ showcasing how UQ can enhance the reliability
+ and performance of computer vision systems in diverse scenarios. Through interactive
+ discussions and case studies, participants will gain practical insights into deploying
+ uncertainty-aware models in real-world applications.
+ </p>
+
+ <h3 style="text-align: left">Uncertainty Quantification Framework.</h3>
+ <p>
+ This tutorial will also briefly introduce the <a
+ href="https://github.com/ensta-u2is-ai/torch-uncertainty">TorchUncertainty
+ library</a>, an uncertainty-aware open-source framework for training models in PyTorch.
+ </p>
+ </div>
+
+ <a href="https://torch-uncertainty.github.io/" target="_blank">
+ <div><img src="assets/logoTU_full.png" width="20%" hspace="2%"> </div>
</a>
+
+ <br>
+
+ <div class="containertext" style="max-width:50rem">
+ <h2 style="text-align: center">Relation to prior tutorials and short courses</h2>
+ <p> This tutorial is affiliated with the <a href="https://uncv2023.github.io/">UNCV Workshop</a>,
+ which had its inaugural edition at ECCV 2022, a subsequent one at ICCV, and is back at ECCV this year.
+ In contrast to the workshop, the tutorial puts its primary emphasis on the theoretical facets. </p>
+ <p> UQ has received growing attention
+ in recent years, as evidenced by its inclusion in
+ the tutorial <a href="https://abursuc.github.io/many-faces-reliability/">'Many Faces of Reliability of Deep
+ Learning for Real-World Deployment'</a>. While that tutorial explored various applications associated with
+ uncertainty,
+ it did not place a specific emphasis on probabilistic models and Bayesian Neural Networks. Our tutorial aims
+ to provide a more in-depth exploration of uncertainty theory, accompanied by practical
+ applications, including the presentation of the <a
+ href="https://github.com/ensta-u2is-ai/torch-uncertainty">TorchUncertainty</a> library.</p>
+ </div>
+
+ <div class="containertext" style="max-width:50rem">
+ <h2 style="text-align: center">Selected References</h2>
+ <ol>
+ <li><b>Immer, A.</b>, Palumbo, E., Marx, A., & Vogt, J. E. <a
+ href="https://proceedings.neurips.cc/paper_files/paper/2023/file/a901d5540789a086ee0881a82211b63d-Paper-Conference.pdf">
+ Effective Bayesian Heteroscedastic Regression with Deep Neural Networks</a>. In NeurIPS, 2023.</li>
+ <li><b>Franchi, G., Bursuc, A.,</b> Aldea, E., Dubuisson, S.,
+ & Bloch, I. <a href="https://arxiv.org/pdf/2012.02818">Encoding the latent posterior of
+ Bayesian Neural Networks for uncertainty quantification</a>. In IEEE TPAMI, 2023.</li>
+ <li><b>Franchi, G.</b>, Yu, X., <b>Bursuc, A.</b>, Aldea, E., Dubuisson,
+ S., & Filliat, D. <a href="https://arxiv.org/pdf/2207.10130">Latent Discriminant
+ deterministic Uncertainty</a>. In ECCV, 2022.</li>
+ <li><b>Laurent, O.</b>, <b>Lafage, A.</b>, Tartaglione, E., Daniel, G.,
+ Martinez, J. M., <b>Bursuc, A.</b>, & <b>Franchi, G.</b>
+ <a href="https://arxiv.org/pdf/2210.09184">Packed-Ensembles for Efficient Uncertainty Estimation</a>. In
+ ICLR, 2023.
+ </li>
+ <li><b>Izmailov, P.</b>, Vikram, S., Hoffman, M. D., & Wilson, A. G. <a
+ href="https://arxiv.org/pdf/2104.14421">What are Bayesian neural network
+ posteriors really like?</a> In ICML, 2021.</li>
+ <li><b>Izmailov, P.</b>, Maddox, W. J., Kirichenko, P., Garipov, T., Vetrov, D., & Wilson, A. G. <a
+ href="https://arxiv.org/pdf/1907.07504">Subspace inference for Bayesian deep learning</a>. In UAI, 2020.
+ </li>
+ <li><b>Franchi, G.</b>, <b>Bursuc, A.</b>, Aldea, E., Dubuisson, S., &
+ Bloch, I. <a href="https://arxiv.org/pdf/1912.11316">TRADI: Tracking deep neural
+ network weight distributions</a>. In ECCV, 2020.</li>
+ <li>Wilson, A. G., & <b>Izmailov, P.</b> <a href="https://arxiv.org/pdf/2002.08791">Bayesian deep
+ learning and a probabilistic perspective of generalization</a>. In NeurIPS, 2020.</li>
+ <li>Hendrycks, D., & Dietterich, T. <a href="https://arxiv.org/pdf/1903.12261">Benchmarking Neural Network
+ Robustness to Common Corruptions and
+ Perturbations</a>. In ICLR, 2019.</li>
+ <li><b>Izmailov, P.</b>, Podoprikhin, D., Garipov, T., Vetrov, D., & Wilson, A. G. <a
+ href="https://arxiv.org/pdf/1803.05407">Averaging weights
+ leads to wider optima and better generalization</a>. In UAI, 2018.</li>
+ </ol>
+ You will find more references in <a
+ href="https://github.com/ensta-u2is-ai/awesome-uncertainty-deeplearning">Awesome Uncertainty in Deep
+ Learning</a>.
+ </div>
+
+ <br>
+
+ <div class="containertext">
+ <h3 style="text-align: center">Andrei Bursuc is supported by ELSA:</h3>
+
+ <center>
+ <a href="https://elsa-ai.eu/" target="_blank"><img src="assets/elsa_logo.png" width="10%" hspace="2%" /></a>
+ </center>
+ </div>
</div>
</div>
- </div>


</section>