<!doctype html>
<html class="no-js" lang="en">
<head>
<meta charset="utf-8"/>
<meta name="viewport" content="width=device-width, initial-scale=1.0"/>
<meta name="description" content="Fruit Fly Brain Observatory">
<meta name="author" content="FFBO Team">
<title>Get Involved | Workshops</title>
<!-- Bootstrap Core CSS -->
<link href="css/bootstrap.min.css" rel="stylesheet">
<link href="css/ffbo.css" rel="stylesheet">
<link rel="shortcut icon" href="ico/favicon.ico" />
</head>
<body>
<!-- PRELOADING -->
<!--div id="preload">
<div class="preload">
<div class="loader">
</div>
<h2>loading ...</h2>
</div>
</div-->
<!-- Navigation -->
<nav class="navbar navbar-fixed-top navbar-ffbo top-nav-collapse" role="navigation">
<div class="container">
<div data-scroll-header class="navbar-header">
<button type="button" class="navbar-toggle" data-toggle="collapse" data-target="#nav">
<span class="sr-only">Toggle navigation</span>
<span class="icon-bar"></span>
<span class="icon-bar"></span>
<span class="icon-bar"></span>
</button>
<a class="navbar-brand" href="index.html"><span><img width="50px" height="50px" src="img/ffbo_logo.png"></span> Fruit Fly Brain Observatory</a>
</div>
<div class="collapse navbar-collapse" id="nav">
<ul class="nav navbar-nav navbar-right uppercase">
<li>
<a class="dropdown-toggle" role="button" data-toggle="dropdown" href="#">Get Started</a>
<ul class="dropdown-menu" role="menu">
<li class="dropdown-submenu">
<a href="#" class="dropdown-toggle" data-toggle="dropdown">About</a>
<ul class="dropdown-menu" role="menu">
<li><a tabindex="-1" data-scroll href="http://fruitflybrain.org#mission-2">Background</a></li>
<li><a tabindex="-1" data-scroll href="http://fruitflybrain.org#overview-2">Overview</a></li>
<li><a tabindex="-1" data-scroll href="http://fruitflybrain.org#neuronlp-2">NeuroNLP</a></li>
<li><a tabindex="-1" data-scroll href="http://fruitflybrain.org#neurogfx-2">NeuroGFX</a></li>
<li><a tabindex="-1" data-scroll href="http://fruitflybrain.org#neuroapp-2">NeuroAPPs</a></li>
<li><a tabindex="-1" data-scroll href="http://fruitflybrain.org#under-the-hood-2">Under The Hood</a></li>
<li><a tabindex="-1" data-scroll href="http://fruitflybrain.org#team-2">About Us</a></li>
</ul>
</li>
<li class="divider"></li>
<li class="dropdown-submenu">
<a tabindex="-1" href="#" class="dropdown-toggle" data-toggle="dropdown">Get Involved</a>
<ul class="dropdown-menu" role="menu">
<li><a tabindex="-1" href="faq.html" target="_blank">FAQs</a></li>
<li><a tabindex="-1" href="code.html" target="_blank">Code</a></li>
<li><a tabindex="-1" href="hackathons.html" target="_blank">Hackathons</a></li>
<li><a tabindex="-1" href="https://lists.columbia.edu/mailman/listinfo/ffbo" target="_blank">Mailing List</a></li>
<li><a tabindex="-1" href="workshops.html" target="_blank">Workshops</a></li>
<li><a tabindex="-1" href="https://github.com/NeuralEnsemble/NeuroinformaticsTutorial/blob/master/Exercises/Exercise6_FruitFlyBrainObservatory.md" target="_blank">Tutorials</a></li>
</ul>
</li>
</ul>
</li>
<li><a class="dropdown-toggle" role="button" data-toggle="dropdown">NeuroNLP</a>
<ul class="dropdown-menu" role="menu">
<li><a tabindex="-1" data-scroll href="index.html#neuronlp-2">What is NeuroNLP</a></li>
<li class="divider"></li>
<li><a href="https://neuronlp.fruitflybrain.org" target="_blank">Launch NeuroNLP.Adult (with FlyCircuit and Janelia Medulla 7-column Data)</a></li>
<li class="divider"></li>
<li><a href="https://neuronlp.larva.fruitflybrain.org" target="_blank">Launch NeuroNLP.Larva (with Janelia Larva Olfaction Data)</a></li>
<li class="divider"></li>
<li><a href="http://hemibrain.fruitflybrain.org/" target="_blank">Download NeuroNLP.EM (with Hemibrain and VNC data)</a></li>
</ul>
</li>
<li><a class="dropdown-toggle" role="button" data-toggle="dropdown">NeuroGFX</a>
<ul class="dropdown-menu" role="menu">
<li><a tabindex="-1" data-scroll href="index.html#neurogfx-2">What is NeuroGFX</a></li>
<li class="divider"></li>
<li><a href="https://neurogfx.fruitflybrain.org" target="_blank">Launch NeuroGFX</a></li>
</ul>
</li>
<li>
<a class="dropdown-toggle" role="button" data-toggle="dropdown">NeuroAPPs</a>
<ul class="dropdown-menu" role="menu">
<li><a tabindex="-1" data-scroll href="index.html#neuroapp-2">What are NeuroAPPs</a></li>
<li class="divider"></li>
<li><a href="https://neuroapps.fruitflybrain.org/retinal_degeneration" target="_blank">Retinal Degeneration</a></li>
<li><a href="https://neuroapps.fruitflybrain.org/parkinsons/olfaction" target="_blank">Parkinson's Disease: Olfaction</a></li>
<li><a href="https://neuroapps.fruitflybrain.org/parkinsons/vision" target="_blank">Parkinson's Disease: Vision</a></li>
<li><a href="https://neuroapps.fruitflybrain.org/epilepsy" target="_blank">Epilepsy</a></li>
</ul>
</li>
</ul>
</div>
<!-- /.navbar-collapse -->
</div>
<!-- /.container -->
</nav>
<div class="container" style="padding-top:100px;padding-left:40px">
<div class="col-lg-12">
<h2>Columbia Workshop on Brain Circuits, Memory and Computation 2020</h2>
<h4>BCMC 2020
<br>March 16-17, 2020
<br>Zoom Webinar
<br>Columbia University, New York, NY, USA</h4><br>
<h4>Update on Recorded Presentations</h4>
<p>The organizer would like to thank all the speakers and attendees once again for participating in the virtual workshop.
The presentations from the workshop can be accessed from this
<a href="https://www.youtube.com/playlist?list=PLw4dBUPYQ7UtfarNApx9hSIGUJ4ALETbv">YouTube playlist</a>.
</p>
<hr />
<h4>Update on COVID-19</h4>
Due to the COVID-19 emergency, the Columbia Workshop on Brain Circuits, Memory and Computation will be hosted as a virtual workshop using <a href="https://zoom.us/">Zoom</a> Webinar.
Please read this page carefully for registration instructions and the modified schedule.
<hr />
<h4>Update on Program</h4>
<p>Due to a few canceled talks, the virtual meeting on Tuesday, March 17, 2020, will start at 1:30 PM; the webinar will be open to join starting at 1:00 PM.
Please see the updated schedule below. The virtual meeting on Monday still starts at 9:00 AM.</p>
<hr />
<h4>Overview</h4>
<p>The goal of the workshop is to bring together researchers interested in developing executable models of
neural computation/processing of the brain of model organisms.
Of interest are models of computation that consist of elementary units
of processing using brain circuits and memory elements. Elementary units of computation/processing
include population encoding/decoding circuits with biophysically-grounded neuron models, non-linear dendritic processors for motion detection/direction selectivity, spike processing and pattern recognition neural circuits,
movement control and decision-making circuits, etc.
Memory units include models of spatio-temporal memory circuits, circuit models for memory access and
storage, etc.
A major aim of the workshop is to explore the integration of various sensory and control circuits in higher brain centers.
</p>
<hr />
<h4>Organizer and Program Chair</h4>
<p><a href="http://www.ee.columbia.edu/~aurel">Aurel A. Lazar</a>, Columbia University</p>
<hr />
<h4>Sponsorship</h4>
<p>The 2020 Columbia Workshop on Brain Circuits, Memory and Computation is supported by the</p>
<p><a class="reference external" href="http://www.ee.columbia.edu">Department of Electrical Engineering</a>, Columbia University</p>
<p><a class="reference external" href="https://datascience.columbia.edu/computing-systems">Center for Computing Systems for Data-Driven Science</a>, Data Science Institute, Columbia University</p>
<p><a class="reference external" href="http://www.engineering.columbia.edu">School of Engineering and Applied Science</a>, Columbia University</p>
<hr />
<h4>Registration</h4>
<p>1. Registration link: <a href="https://zoom.us/webinar/register/2015839536140/WN_SSos0Cn2RsqcV3XX81cNcw">https://zoom.us/webinar/register/2015839536140/WN_SSos0Cn2RsqcV3XX81cNcw</a>.</p>
<p>2. Using Zoom: You can register for a free Zoom account <a href="https://zoom.us/signup">here</a>, and download a <a href="https://zoom.us/download#client_4meeting">Desktop App</a> or a <a href="https://zoom.us/download#client_4meeting">Mobile App</a> to join the meeting.</p>
<hr />
<h4>Details about Zoom Webinar</h4>
<p>1. The workshop will be hosted as a Webinar. Speakers will join as panelists, who can speak and share their camera and screen. The audience joins in view-only mode.</p>
<p>2. Each speaker will have 45 minutes for their talk, followed by a 15-minute Q&A session during which any attendee can type in questions. Due to the unusual format of the workshop, we ask the audience to refrain from asking questions (including via the "Raise Hand" functionality) during the talk (note that some of the talks will be pre-recorded). Speakers can choose to answer questions during the Q&A session and, optionally, by text after the talk.</p>
<p>3. <strong>How to structure your question:</strong> Please type your name and organization, followed by your question.</p>
<hr />
<h4>Program Overview (All times EDT, GMT-4:00)</h4>
<!-- (<a href="http://bionet.github.io/papers/bcmc_program20.pdf">PDF version including the list of close by restaurants</a>) -->
<hr class="docutils" />
<h4>Monday, 9:00 AM - 5:30 PM, March 16, 2020</h4>
<hr class="docutils" />
<h5>Morning Session: Connectomics and Synaptomics (9:00 AM - 10:00 AM)</h5>
<hr class="docutils" />
<p>9:00 AM - 10:00 AM</p>
<p><strong>A Connectome of the Fly Central Brain and Implications for Analysis</strong></p>
<p><a href="https://www.janelia.org/people/stephen-plaza">Stephen Plaza</a>, Janelia Research Campus, Ashburn, VA.</p>
<p>Janelia FlyEM and Google produced the largest dense connectome ever consisting of around 25,000 neurons and 20 million synaptic connections in the central fly brain. In this talk, I will first highlight the various methods that made this reconstruction possible and then discuss the feasibility and economics of future connectomic efforts. This large connectome poses many challenges for data interpretation. The second part of this talk will discuss various considerations for using this data for different types of biological questions. To simplify data analysis, our team introduces compact data representations and many tools for navigating the dataset.</p>
<hr class="docutils" />
<h5>Morning Session: Cell Classification and the Functional Organization of the Mouse Visual Cortex (10:00 AM - 12:00 PM)</h5>
<hr class="docutils" />
<p>10:00 AM - 11:00 AM</p>
<p><strong>Cell Type Classification and Multi-modal Correspondence in Mouse Visual Cortex</strong></p>
<p><a href="https://alleninstitute.org/what-we-do/brain-science/about/team/staff-profiles/nathan-gouwens/">Nathan Gouwens</a>, Allen Institute of Brain Science, Seattle, WA.</p>
<p>Addressing the complexity of neuronal circuits by classifying cellular components into meaningful types is an ongoing challenge. Recent single-cell transcriptomic studies have provided high-dimensional, high-throughput classifications of neocortical neurons. However, it is unclear to what extent these types are consistent in other domains, such as in their intrinsic electrophysiological and morphological properties. To investigate this, we have systematically recorded from mouse visual cortical interneurons labeled by transgenic lines, followed by morphological reconstruction and transcriptomic analysis when possible. Our analyses inform cell type classification by characterizing patterns of cell-by-cell co-variation in gene expression, local morphology, and electrophysiological properties. We have also used these data to build biophysical models that can populate circuit models reflecting our current understanding of cortical cell types.</p>
<hr class="docutils" />
<p>11:00 AM - 12:00 PM</p>
<p><strong>A Large-Scale Standardized Physiological Pipeline Reveals Functional Organization of the Mouse Visual Cortex</strong></p>
<p><a href="https://alleninstitute.org/what-we-do/brain-science/about/team/staff-profiles/saskia-de-vries/">Saskia E.J. de Vries</a>, Allen Institute for Brain Science, Seattle, WA.</p>
<p>An important open question in visual neuroscience is how visual information is represented in cortex. Important results characterized neural coding by assessing the responses to artificial stimuli, with the assumption that responses to gratings, for example, capture the key features of neural responses, and that deviations, such as extra-classical effects, are relatively minor. The failure of these responses to have strong predictive power has renewed these questions. It has been suggested that this characterization of visual responses has been strongly influenced by the biases inherent in recording methods and the limited stimuli used in experiments. In creating the Allen Brain Observatory, we sought to reduce these biases by recording large populations of neurons in the mouse visual cortex using a broad array of stimuli, both artificial and natural. This open dataset is a large-scale, systematic survey of physiological activity in the awake mouse cortex recorded using 2-photon calcium imaging. Neural activity was recorded in cortical neurons of awake mice that were presented a variety of visual stimuli, including gratings, noise, natural images, and natural movies. This dataset consists of over 63,000 neurons recorded in over 1,300 imaging sessions, surveying 6 cortical areas, 4 cortical layers, and 14 transgenically defined cell types (Cre lines). We found that visual responses throughout the mouse cortex are highly variable. Using the joint reliabilities of responses to multiple stimuli, we classify neurons into functional classes and validate this classification with models of visual responses. Only 10% of neurons in the mouse visual cortex show reliable responses to all of the stimuli used and are reasonably well predicted by linear-nonlinear models. The remaining neurons fall into classes characterized by responses to specific subsets of the stimuli, and the neurons in the largest class are not reliably responsive to any of the stimuli. These classes reveal a functional organization within the mouse visual cortex wherein putative dorsal areas show specialization for visual motion signals.</p>
<hr class="docutils" />
<h5>Lunch Break 12:00 PM - 1:30 PM</h5>
<hr class="docutils" />
<h5>Afternoon Session: Vision Circuits and Visual Perception in the Fruit Fly Brain (1:30 PM - 3:30 PM)</h5>
<hr class="docutils" />
<p>1:30 PM - 2:30 PM</p>
<p><strong>Binocular Photoreceptor Microsaccades Give Fruit Fly Hyperacute 3D-Vision</strong></p>
<p><a href="http://cognition.group.shef.ac.uk">Mikko Juusola</a>, Centre for Cognition in Small Brains, The University of Sheffield.</p>
<p>Neural mechanisms behind stereovision, which requires simultaneous disparity inputs from two eyes, have remained mysterious. Here we show how ultrafast synchronous mirror-symmetric photomechanical contractions in the frontal forward-facing left and right eye photoreceptors give Drosophila super-resolution 3D-vision. By combining in vivo 100-nm-resolution x-ray imaging with electrophysiology and fly genetics, in vivo high-speed optical imaging, mathematical modelling and behavioural paradigms, we reveal how these photoreceptor microsaccades - by verging and narrowing the eyes’ overlapping receptive fields - channel depth information, as phasic binocular image motion disparity signals in time, to hyperacute stereovision and learning. We further show how peripherally, outside the stereoscopic sampling, photoreceptor microsaccades match a forward flying fly’s optic flow field to better resolve the world in motion. These results change our understanding of how insect compound eyes work, highlight the importance of fast photoreceptor vergence for enhancing 3D perception, and suggest coding strategies to improve man-made sensors.</p>
<hr class="docutils" />
<p>2:30 PM - 3:30 PM</p>
<p><strong>An evolutionarily convergent circuit for color vision in Drosophila</strong></p>
<a href="http://behnialab.neuroscience.columbia.edu" target="_blank">Rudy Behnia</a>, Columbia University, New York
<br />
</p>Spectral information is commonly processed in the brain through generation of antagonistic responses to different wavelengths. In many species, these color opponent signals arise as early as photoreceptor terminals. Here, we measure the spectral tuning of photoreceptors in Drosophila. In addition to a previously described pathway comparing wavelengths at each point in space, we find a horizontal-cell-mediated pathway similar to that found in mammals. This pathway enables additional spectral comparisons through lateral inhibition, expanding the range of chromatic encoding in the fly. Together, these two pathways enable efficient decorrelation and dimensionality reduction of photoreceptor signals while retaining maximal chromatic information. A biologically constrained model accounts for our findings and predicts a spatio-chromatic receptive field for fly photoreceptor outputs, with a color opponent center and broadband surround. This dual mechanism combines motifs of both an insect-specific visual circuit and an evolutionarily convergent circuit architecture, endowing flies with the ability to extract chromatic information at distinct spatial resolutions.</p>
<hr class="docutils" />
<h5>Afternoon Session: Connectomics and Cognition (3:30 PM - 5:30 PM)</h5>
<hr class="docutils" />
<p>3:30 PM - 4:30 PM</p>
<p><strong>Vertebrate-like Eye Movements in a Compound Eye</strong></p>
<p><a href="https://maimonlab.rockefeller.edu">Gaby Maimon</a>, Laboratory of Integrative Brain Function, The Rockefeller University.</p>
<!-- <p>I will discuss a neural circuit that begins to explain how flies compare the angle in which they are currently oriented in the world with the angle in which they wish to be oriented - a goal heading angle that they can flexibly change - to determine which way to turn, how hard to turn, and how fast to walk forward. This detailed circuit in a small brain should inspire one to think more clearly about how larger mammalian brains, like our own, set goals and then compel behaviors to achieve those goals.</p> -->
<hr class="docutils" />
<p>4:30 PM - 5:30 PM</p>
<p><strong>Functional Modules of Brainstem Interneuron Circuits Revealed by Analysis of Local Connectivity</strong></p>
<p><a href="http://seunglab.org">H. Sebastian Seung</a>, Princeton Neuroscience Institute and Computer Science Department, Princeton University.</p>
<p>While sensory and motor nuclei in the brainstem tend to be spatially localized, interneurons have often appeared relatively "diffuse" in their anatomical organization. Electrophysiological and molecular studies have provided evidence for distinct interneuron populations that are functionally specialized but may overlap in space. Here we demonstrate that interneurons possess a modular organization that is defined by local connectivity and lacks strong spatial structure. We reconstructed 3000 cells from a 3D electron microscopic image of a larval zebrafish brainstem, and identified a "core" population with strong recurrent connectivity. We divided the core into two modules with strong connectivity within a module, and weak connectivity between modules. One module synapses onto the abducens motor nucleus, and is identified with the velocity-to-position neural integrator for horizontal eye movements. The other module contains reticulospinal neurons, and is likely to be involved in body movements. The integrator module is divided into two submodules that appear specialized for control of the two eyes, judging from their connectivity with the abducens nucleus. A network model of the integrator based on the observed connectivity reproduces some empirical aspects of eye position encoding.</p>
<hr class="docutils" />
<h4>Tuesday, 1:30 PM - 5:50 PM, March 17, 2020</h4>
<hr class="docutils" />
<h5>Morning Session: Processing and Perception of Odor and Taste in Drosophila (9:00 AM - 11:00 AM)</h5>
<hr class="docutils" />
<p>9:00 AM - 10:00 AM</p>
<p><strong>How States and Needs Shape Odor Perception and Behavior (Canceled)</strong></p>
<p><a href="http://www.neuro.wzw.tum.de">Ilona Grunwald Kadow</a>, School of Life Sciences, Technical University of Munich.</p>
<p>Neuromodulation permits flexibility of synapses, neural circuits and ultimately behavior. One neuromodulator, dopamine, has been studied extensively in its role as reward signal during learning and memory across animal species. Newer evidence suggests that dopaminergic neurons (DANs) can modulate sensory perception acutely, thereby allowing an animal to adapt its behavior and decision-making to its internal and behavioral state. In addition, some data indicate that DANs are heterogeneous and convey different types of information as a population. We have investigated DAN population activity and how it could encode relevant information about sensory stimuli and state by taking advantage of the confined anatomy of DANs innervating the mushroom body (MB) of the fly Drosophila melanogaster. Using in vivo calcium imaging and a custom 3D image registration method, we find that the activity of the population of MB DANs is predictive of the innate valence of an odor as well as the metabolic and behavioral state of the animal, suggesting that distinct DAN population activities encode innate odor valence, movement and physiological state in a MB compartment specific manner. This information could influence perception and state-dependent decision making as suggested by behavioral analysis. We propose that dopamine shapes innate odor perception through combinatorial population coding of sensory valence, physiological and behavioral context.</p>
<hr class="docutils" />
<p>10:00 AM - 11:00 AM</p>
<p><strong>Dietary Sugar Inhibits Satiation by Decreasing the Central Processing of Sweet Taste (Canceled)</strong></p>
<p><a href="https://sites.lsa.umich.edu/dus-lab/">Monica Dus</a>, Dept. of Molecular, Cellular, and Developmental Biology, University of Michigan, Ann Arbor, MI.</p>
<p>From humans to flies, exposure to diets rich in sugar and fat lowers taste sensation, changes food choices, and promotes feeding. However, how these peripheral alterations influence eating is unknown. Here we used the genetically tractable organism D. melanogaster to define the neural mechanisms through which this occurs. We characterized a population of protocerebral anterior medial dopaminergic neurons (PAM DANs) that innervates the β’2 compartment of the mushroom body and responds to sweet taste. In animals fed a high sugar diet, the response of PAM-β’2 to sweet stimuli was reduced and delayed, and sensitive to the strength of the signal transmission out of the sensory neurons. We found that PAM-β’2 DANs activity controls feeding rate and satiation: closed-loop optogenetic activation of β’2 DANs restored normal eating in animals fed high sucrose. These data argue that diet-dependent alterations in taste weaken satiation by impairing the central processing of sensory signals.</p>
<hr class="docutils" />
<hr class="docutils" />
<h5>Morning Session: Computational Tools for Analyzing the Function of Neural Circuits (11:00 AM - 2:30 PM)</h5>
<hr class="docutils" />
<p>11:00 AM - 12:00 PM</p>
<p><strong>Covert Sleep-Related Biological Processes Are Revealed by Probabilistic Analysis in Drosophila (Canceled)</strong></p>
<p><a href="http://www.bio.brandeis.edu/griffithlab/index.html">Leslie C. Griffith</a>, Volen National Center for Complex Systems, Brandeis University.</p>
<p>Sleep pressure and sleep depth are key regulators of wake and sleep. Current methods of measuring these parameters in Drosophila melanogaster have low temporal resolution and/or require disrupting sleep. Here we report novel analysis tools for high-resolution, non-invasive measurement of sleep pressure and depth from movement data. Probability of initiating activity, P(Wake), measures sleep depth while probability of ceasing activity, P(Doze), measures sleep pressure. In vivo and computational analyses show that P(Wake) and P(Doze) are largely independent and control the amount of total sleep. We also develop a Hidden Markov Model that allows visualization of distinct sleep/wake substates. These hidden states have a predictable relationship with P(Doze) and P(Wake) suggesting the methods capture the same behaviors. Importantly, we demonstrate that both the Doze/Wake probabilities and sleep/wake substates are tied to specific biological processes. These new metrics provide greater mechanistic insight into behavior than measuring the amount of sleep alone.</p>
<hr class="docutils" />
<h5>Lunch Break 12:00 PM - 1:30 PM</h5>
<hr class="docutils" />
<p>1:30 PM - 2:30 PM</p>
<p><strong>A Similarity-preserving Neural Network Trained on Transformed Images Recapitulates Salient Features of the Fly Motion Detection Circuit</strong></p>
<p><a href="https://www.simonsfoundation.org/team/dmitri-mitya-chklovskii/">Dmitri ‘Mitya’ Chklovskii</a>, Flatiron Institute, Simons Foundation.</p>
<p>Learning to detect content-independent transformations from data is one of the central problems in biological and artificial intelligence. An example of such problem is unsupervised learning of a visual motion detector from pairs of consecutive video frames. Rao and Ruderman formulated this problem in terms of learning infinitesimal transformation operators (Lie group generators) via minimizing image reconstruction error. Unfortunately, it is difficult to map their model onto a biologically plausible neural network (NN) with local learning rules. Here we propose a biologically plausible model of motion detection. We also adopt the transformation-operator approach but, instead of reconstruction-error minimization, start with a similarity-preserving objective function. An online algorithm that optimizes such an objective function naturally maps onto an NN with biologically plausible learning rules. The trained NN recapitulates major features of the well-studied motion detector in the fly. In particular, it is consistent with the experimental observation that local motion detectors combine information from at least three adjacent pixels, something that contradicts the celebrated Hassenstein-Reichardt model.</p>
<p>Joint work with Yanis Bahroun and Anirvan Sengupta.</p>
<hr class="docutils" />
<!-- <h5>Lunch Break 1:00 PM - 2:30 PM</h5> -->
<hr class="docutils" />
<h5>Afternoon Session: Neural Circuits: From Structure to Function (2:00 PM - 3:30 PM)</h5>
<hr class="docutils" />
<p>2:30 PM - 3:30 PM</p>
<p><strong>Biased Randomness, a Connectivity Mechanism for Associative Brain Centers</strong></p>
<p><a href="http://www.thecaronlab.com">Sophie Caron</a>, Department of Biology, University of Utah, Salt Lake City, UT.</p>
<p>Uncovering fundamental mechanisms of neuronal connectivity that enable associative brain centers to learn efficiently is an important goal of neuroscience. In the Drosophila melanogaster mushroom body, the constituent Kenyon cells receive input from olfactory projection neurons. Each projection neuron connects to one of the fifty-one glomeruli in the antennal lobe, an olfactory processing center. We and others have shown that these connections are unstructured in that there are no sets of glomeruli converging preferentially onto a given Kenyon cell. However, we found that the glomeruli are not represented with equal frequency among Kenyon cell inputs. Overrepresented glomeruli form many more connections than expected under a uniform distribution of inputs, whereas underrepresented glomeruli form far fewer connections than expected. We hypothesize that this non-uniform distribution, which we termed ‘biased randomness’, serves an important biological function. We are testing this hypothesis using two strategies. First, using a mathematical model of the mushroom body as well as in vivo calcium imaging, we demonstrate that, although biases do not affect the way most odors are represented in the mushroom body, they affect the way odors detected by only one glomerulus — monoglomerular odors — are represented. Monoglomerular odors with strong innate valence detected by underrepresented glomeruli fail to activate Kenyon cells. In contrast, monoglomerular odors with neutral valence detected by overrepresented glomeruli activate large ensembles of Kenyon cells. From these results, we hypothesize that biases serve two important biological functions: they disable the representation of odors with strong innate valence while they enable the representation of neutral odors. Second, we are determining whether and how the biases in Kenyon cell connectivity we detected in D. melanogaster shift across closely related species. 
We mapped the Kenyon cell inputs in Drosophila sechellia, a species that feeds exclusively on Noni, a fruit with a unique chemical makeup. We observe that, although most biases are conserved in both species, some biases shift significantly. Notably, the glomeruli that show the largest shifts detect odors associated with food sources and oviposition sites. Glomeruli known to be activated by volatile chemicals found in Noni are overrepresented in D. sechellia, whereas glomeruli known to be activated by volatile chemicals found in citrus are overrepresented in D. melanogaster. Altogether, our work supports the idea that ‘biased randomness’ is a wiring mechanism that predisposes associative brain centers to learn efficiently.</p>
<p>Joint work with Ellis K., Amematsro E., Zavitz D. and Borisyuk A.</p>
<hr class="docutils" />
<p>3:30 PM - 4:30 PM</p>
<p><strong>Universality of Information Encoding in Brain Regions Using a Specific Combinatorial Code</strong></p>
<p><a href="https://www.salk.edu/scientist/charles-f-stevens/">Charles F. Stevens</a>, Salk Institute, La Jolla, CA.</p>
<p>Information in the brain is believed to be usually encoded by which neurons are activated by a stimulus. For example, in the primary visual cortex, the slopes of lines or edges at a particular location in the visual scene are encoded by which orientation selective neurons are active in response to the line or edge. In other brain areas, however, many of the same neurons are activated by every stimulus, so information is encoded not by which neurons are active but by a pattern of activity in a population of neurons that respond to most stimuli. Such brain regions use a combinatorial code to distinguish between alternative stimuli.</p>
<p>An example of such a brain region is the population of projection neurons in the fruit fly antennal lobe. About a dozen copies of each of about 50 genetically distinct types of odorant receptor neurons (ORNs) are present in the fly’s nose, and all of the neurons of the same type project to one of about fifty glomeruli in the antennal lobe. The output of each antennal lobe glomerulus is one of about 50 types of projection neurons that send olfactory information about odor type to Kenyon cells in the mushroom body. Most of the 50 types of antennal lobe projection neurons fire in response to most odors.</p>
<p>The distribution of firing rates across the responsive antennal lobe projection neurons is the same for almost all odors: it is an exponential distribution with a mean that is the same across all odor types. What differs from one odor to the next is not the distribution of firing rates, but rather which neurons have which rates in the distribution, so that odor type is encoded by the pattern of firing rates across projection neurons.</p>
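<p>This coding scheme — every odor evoking the same exponential distribution of rates, with odor identity carried only by which neuron fires at which rate — can be sketched as follows. The mean rate and neuron count are assumed values for illustration, and the odor-specific permutation is a simplification of the real biology.</p>

```python
import random

random.seed(1)

N_PN = 50          # projection neuron types
MEAN_RATE = 10.0   # mean firing rate in Hz (assumed value)

# One shared multiset of firing rates, drawn from an exponential
# distribution; this distribution is the same for every odor.
rates = [random.expovariate(1.0 / MEAN_RATE) for _ in range(N_PN)]

def encode(odor_id):
    """Return the population response to one odor.

    Every odor produces the same distribution of rates; only the
    permutation (which neuron fires at which rate) differs.
    """
    rng = random.Random(odor_id)  # odor-specific assignment of rates
    perm = list(range(N_PN))
    rng.shuffle(perm)
    return [rates[perm[i]] for i in range(N_PN)]

odor_a = encode(42)
odor_b = encode(99)

# Same multiset of rates (same distribution), different patterns:
assert sorted(odor_a) == sorted(odor_b)
assert odor_a != odor_b
```

<p>The two assertions at the end capture the combinatorial code: the rate distribution carries no odor information, while the assignment of rates to neurons carries all of it.</p>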
<p>This same combinatorial code is known to be used in two additional brain regions: one is the mouse olfactory system, and the other is the code for faces in the monkey inferotemporal cortex. I will discuss these additional regions and describe the universality of the code they use.</p>
<hr class="docutils" />
<h5>Afternoon Session: Evolution and Conserved Neural Circuit Function (4:30 PM - 5:50 PM)</h5>
<!-- <hr class="docutils" />
</p>4:00 PM - 4:45 PM<p>
<p><strong>Microinsects: Extreme Miniaturization or Perfect Optimization?</strong></p>
<p>Alexey A. Polilov, Department of Entomology, Faculty of Biology, Lomonosov Moscow State University.</p>
<p>Miniaturization is one of the principal directions of insect evolution. As a result of miniaturization, many insects have become smaller than some unicellular organisms. Such insects have repeatedly evolved in many orders; on the one hand, they are fascinating as model organisms for studying scaling; on the other hand, they are great examples of the perfect optimization of the structure of many systems, especially the nervous system and sensory organs.</p>
<p>In my talk I will discuss the smallest insect. My attention will be focused on the structure and ultrastructure of the nervous system and sensory organs, the radical transformations implemented in them by miniaturization in some microinsects, and how these systems retain their basic functions, despite their extremely small sizes. The early results of the study of the microinsect connectome will also be presented. The minute size and extreme simplicity and functionality of these insects suggest that they could be used as ideal model organisms for connectomics and other areas of neuroscience.</p>
-->
<hr class="docutils" />
<p>4:30 PM - 5:50 PM</p>
<p><strong>Panel Discussion: Are there Evolutionarily-Conserved Computations Mediated by Biological Neural Networks?</strong></p>
<p>Moderated by <a href="https://physiology.med.cornell.edu/people/daniel-gardner-ph-d/"> Daniel Gardner</a>, Department of Physiology and Biophysics, Weill Cornell Medicine.</p>
<p>In spite of the many individual successes of computational neuroscience, we incompletely understand three core, general principles: what and how neuronal circuits compute, whether there is a small number of evolutionarily conserved microcircuits and algorithms they use, and which nervous system properties are computationally essential. This panel will bring together investigators working on different facets of this core question and offer relevant examples, each informed by an individual neurobiological perspective of neurons and networks. We’ll try to generate a set of plausible and testable mechanisms that enable algorithmic computation, and propose a collaborative effort to examine and test these through a community Neuromorphic Neural Networks (N^3) Initiative. We will also consider alternate views, including the idea that all microcircuits are ad hoc, with no conserved algorithmic or computational principles.</p>
<p>We’ll begin with a 20-minute introduction by Dan Gardner, who will offer a set of candidate neuromorphic mechanisms common to almost all neural circuits, as well as a complementary set that are excluded from primary consideration because many neural circuits function without them. He will also define the two distinct credit assignment problems faced by biological neural nets.</p>
<hr class="docutils" />
<br>
<!--<hr />
<h4>More information about BCMC 2020 can be found <a href="http://www.bionet.ee.columbia.edu/workshops/bcmc/2020">here.</a></h4>-->
<br>
<h4>Past BCMC workshops</h4>
<p><a href="http://www.bionet.ee.columbia.edu/workshops/bcmc/2019">BCMC 2019</a></p>
<p><a href="http://www.bionet.ee.columbia.edu/workshops/bcmc/2018">BCMC 2018</a></p>
<p><a href="http://www.bionet.ee.columbia.edu/workshops/bcmc/2017">BCMC 2017</a></p>
<p><a href="http://www.bionet.ee.columbia.edu/workshops/bcmc/2016">BCMC 2016</a></p>
<p><a href="http://www.bionet.ee.columbia.edu/workshops/bcmc/2015">BCMC 2015</a></p>
<br>
</div>
</div>
<footer class="footer">
<div class="row">
<p>Copyright © 2019 <a href="licenses.html#credits">FFBO Teams</a>. All Rights Reserved. <a href="licenses.html">Licenses and Credits</a></p>
</div>
</footer>
<!-- jQuery Version 1.12.2 -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.12.2/jquery.min.js"></script>
<!-- Bootstrap Core JavaScript -->
<script src="js/bootstrap.min.js"></script>
<!-- Custom Modernizr JavaScript -->
<script src="js/modernizr.custom.js"></script>
<script src="js/smooth-scroll.js"></script>
<script src="js/jquery.stellar.min.js"></script>
<script src="js/submenu.js"></script>
<!-- <script src="js/ffbo.js"></script>-->
</body>
</html>