From f65a778a16b77d3bbf7f413fd0f55b7bc31a5f31 Mon Sep 17 00:00:00 2001
From: Matthew Feickert
Date: Wed, 24 Jan 2024 02:12:17 -0600
Subject: [PATCH] Update talk slides just to introduce neos

---
 talk.md | 48 ++++++++++++++++++++++++++++++------------------
 1 file changed, 30 insertions(+), 18 deletions(-)

diff --git a/talk.md b/talk.md
index cb46f22..dddfe3f 100644
--- a/talk.md
+++ b/talk.md
@@ -765,29 +765,41 @@ $$
 .bold.center[Having access to the gradients can make the fit orders of magnitude faster than finite difference]

 ---
-# Enable new techniques with autodiff
-
-.kol-2-3[
-* Familiar (toy) example: Optimizing selection "cut" for an analysis.<br>
-Place discriminate selection cut on observable $x$ to maximize significance.
-* Traditionally, step along values in $x$ and calculate significance at each selection. Keep maximum.
-* Need differentiable analogue to non-differentiable "cut".<br>
-Weight events using activation function of sigmoid
-
-.center[$w=\left(1 + e^{-\alpha(x-c)}\right)^{-1}$]
-
-* Most importantly though, with the differentiable model we have access to the gradient $\partial_{x} f(x)$
-* So can find the maximum significance at the point where the gradient of the significance is zero $\partial_{x} f(x) = 0$
-* With a simple gradient descent algorithm can easily automate the significance optimization
+# Enabling new tools with autodiff
+.kol-1-1[
+.kol-1-3[
+[figure]
+]
+.kol-1-3[
+[figure]
+]
+.kol-1-3[
+[figure]
 ]
+]
+
+.kol-1-3[
+* Counting experiment for the presence of a signal process
+* Place a discriminating selection cut on observable $x$ to maximize the significance
+* Step along cut values in $x$ and calculate the significance at each cut; keep the maximum
+]
+.kol-1-3[
+* Need a differentiable analogue to the non-differentiable cut
+* Weight events using a sigmoid activation function
+
+.center[$w=\left(1 + e^{-\alpha(x-c)}\right)^{-1}$]
+]
+.kol-1-3[
+* With a simple gradient descent algorithm the significance optimization can easily be automated (sketched below)
+* Allows the "cut" to become a parameter that can be differentiated through for the larger analysis
+]

 ---
 # New Art: Analysis as a Differentiable Program
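
A minimal JAX sketch of the toy example the new slide describes: soften the hard cut $x > c$ into sigmoid event weights so that a significance-like figure of merit becomes differentiable in the cut position. The Gaussian toy shapes, the $s/\sqrt{s+b}$ figure of merit, and all names here are illustrative assumptions (plain JAX, not the neos package).

```python
# A differentiable analogue of a hard selection cut x > c: weight each event
# with a sigmoid w = 1 / (1 + exp(-alpha * (x - c))) and build the significance
# from the weighted yields, so it can be differentiated with respect to c.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
key_sig, key_bkg = jax.random.split(key)

# Toy observable: narrow Gaussian signal over a wider Gaussian background (assumed shapes)
signal_x = 1.0 + 0.5 * jax.random.normal(key_sig, (1_000,))
background_x = 0.0 + 1.5 * jax.random.normal(key_bkg, (10_000,))


def sigmoid_weights(x, cut, alpha=5.0):
    """Soft event weights approximating the hard cut x > cut."""
    return 1.0 / (1.0 + jnp.exp(-alpha * (x - cut)))


def significance(cut, alpha=5.0):
    """Toy figure of merit f(cut) = s / sqrt(s + b) built from sigmoid-weighted yields."""
    s = jnp.sum(sigmoid_weights(signal_x, cut, alpha))
    b = jnp.sum(sigmoid_weights(background_x, cut, alpha))
    return s / jnp.sqrt(s + b)


# Autodiff gives the gradient of the significance with respect to the cut position
grad_significance = jax.grad(significance)
print(significance(0.5), grad_significance(0.5))
```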
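
Continuing the same sketch, the contrast drawn across the slide's columns: the traditional scan over cut values versus simple gradient ascent on the same differentiable objective, which settles near the point where the gradient of the significance vanishes. The grid range, learning rate, and iteration count are arbitrary choices for illustration.

```python
# Traditional approach: step along cut values, evaluate the significance at each, keep the maximum.
cut_grid = jnp.linspace(-1.0, 3.0, 81)
scan_optimum = cut_grid[jnp.argmax(jax.vmap(significance)(cut_grid))]

# Differentiable approach: gradient ascent on the same objective.
cut = 0.0
learning_rate = 0.05
for _ in range(300):
    cut = cut + learning_rate * grad_significance(cut)

print(f"scan optimum:     {float(scan_optimum):.3f}")
print(f"gradient optimum: {float(cut):.3f}")
```

As the slide's last bullet notes, the payoff is not the toy optimization itself but that the cut becomes an ordinary differentiable parameter, so it can be optimized through the larger analysis chain rather than in isolation.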