My changes #684
Conversation
Thanks @salah-daddi-nounou for the PR, I will review later on.
# language=rst
"""
Bi_sigmoid STDP rule involving only post-synaptic spiking activity. The weight update
quantity is poisitive if the post-synaptic spike occures shortly after the presynatpic spike,
poisitive --> positive
bindsnet/network/nodes.py
For performance, please create a new neuron type with the relevant parameters for your model, e.g. `LIF_Bi-sigmoid`.
@@ -389,6 +389,11 @@ def _connection_update(self, **kwargs) -> None:
    """
    Post-pre learning rule for ``Connection`` subclass of ``AbstractConnection``
    class.
This needs to be generalized; MNIST is only an example.
    Bi_sigmoid learning rule for ``Connection`` subclass of ``AbstractConnection``
    class.

    self.source.s : 28 * 28 array of 0 and 1
    source_s : array converted to 1D vector (784 * 1)
Same as before, the description needs to be generalized; MNIST is only one way to use BindsNET.
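A minimal sketch of the generalization the reviewer asks for, assuming a PyTorch tensor `source_s` as a stand-in for `self.source.s` (names here are illustrative, not the PR's actual code):

```python
import torch

# Avoid hard-coding MNIST's 28 * 28 shape: view(-1) flattens any input
# shape to a 1-D vector, so the rule works beyond 784-pixel images.
source_s = torch.randint(0, 2, (28, 28))   # example binary spike array
source_s_flat = source_s.view(-1)          # shape (784,) for MNIST,
                                           # but general for any input
print(source_s_flat.shape)
```

The same `view(-1)` call handles, say, a 32 * 32 input without code changes, which addresses the review comment directly.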
Bi-sigmoid rule integration in BindsNET
Our work involved extending the BindsNET framework to incorporate the Bi-sigmoid STDP rule.
This unique learning rule is directly derived from the behavior of physical spintronic synapses. It is the result of accurate electrical simulations that allowed us to determine the change in the conductance of the synapse with respect to the relative delay between the electrical pulses at its terminals.
The rule was formalized by an equation fitted to the simulation data (normalized conductance).
[equation and fitted parameter values not reproduced here]
Before hardware implementation, simulating the SNN with this rule in a functional framework like BindsNET is crucial. With it, the network achieved more than 91% accuracy on MNIST.
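Since the fitted equation is not reproduced above, the following is only a hypothetical sketch of what a "bi-sigmoid" window can look like: the product of a rising and a falling sigmoid of the pre-post delay, giving a positive bump when the post-synaptic spike follows the pre-synaptic spike closely. All parameter names and values here are assumptions for illustration:

```python
import torch

def bisigmoid_trace(t_: torch.Tensor,
                    a1: float = 1.0, t1: float = 2.0,
                    a2: float = 1.0, t2: float = 20.0) -> torch.Tensor:
    # Illustrative bi-sigmoid window, NOT the PR's fitted equation:
    # a rising sigmoid after the pre spike times a falling sigmoid
    # that suppresses long delays.
    rise = torch.sigmoid(a1 * (t_ - t1))
    fall = torch.sigmoid(-a2 * (t_ - t2))
    return rise * fall

delays = torch.tensor([0.0, 5.0, 10.0, 50.0])
print(bisigmoid_trace(delays))  # peaks for short positive delays
```

The real rule's shape and constants come from the electrical simulations of the spintronic synapse described above.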
The process of integrating the Bi-sigmoid STDP rule into the BindsNET framework is as follows:
1. We developed a custom training script named `salah_example.py`, inspired by the BindsNET example `eth_mnist.py`. This script trains the SNN and subsequently saves the trained weights and network parameters as PyTorch objects (`.pt` files).

2. Within the `nodes.py` base class, we introduced a new trace, `x2`, defined by the `bisigmoid_trace(t_)` function. We added the trace `x2` to the block that manages the traces, enabling all neuron types to record this novel trace as they spike.

3. In `learning.py`, the Bi-sigmoid learning rule is introduced, updating synaptic weights based on the `x2` trace. The rule uses `x2`, defined by the bi-sigmoid function, and `s`, indicating spikes. Here, `source` represents the input neuron while `target` refers to the output (excitatory) neuron. The update takes place when the output neuron (`target.s`) spikes. At this moment, the connection between the input and output neurons is updated based on the value of `source.x2` at the time of `target.s` spiking.

4. A new network model, `Salah_model`, was introduced in `models.py`, similar to the model `DiehlAndCook2015` but notably incorporating the Bi-sigmoid rule as the `update_rule`.

5. Finally, for the purpose of inference, `evaluate_plot.py` was created. This script loads the trained weights, performs model evaluation, and generates plots to visually represent the network's performance post-training. We separated the training and testing files; saving the weights after training is crucial for later analysis.

To go further on the physical simulations of the spintronic synapse, check out our article.
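The gating described above (synapses updated only when the output neuron spikes, by an amount set by the pre-synaptic `x2` trace) can be sketched as follows. This is a simplified stand-in, not the PR's actual `learning.py` code; `source_x2` plays the role of `source.x2` and `target_s` the role of `target.s`, and the learning rate `nu` is an assumed value:

```python
import torch

def bisigmoid_update(w: torch.Tensor,
                     source_x2: torch.Tensor,
                     target_s: torch.Tensor,
                     nu: float = 0.05) -> torch.Tensor:
    # Outer product: synapse (i, j) changes only if target neuron j spiked,
    # by an amount proportional to the pre-synaptic trace x2 of neuron i.
    dw = nu * torch.outer(source_x2, target_s.float())
    return w + dw

w = torch.zeros(784, 100)      # e.g. input -> excitatory weights
source_x2 = torch.rand(784)    # bi-sigmoid trace of the input neurons
target_s = torch.zeros(100)
target_s[3] = 1.0              # output neuron 3 spikes this timestep
w = bisigmoid_update(w, source_x2, target_s)
```

Only the column of `w` belonging to the spiking output neuron is modified, which matches the post-synaptic trigger described in step 3.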