🚨 Excited to share our new short SNN paper! 🧵 One of the advantages of spiking neural networks is the ability to use temporal coding for efficient processing. Inspired by recent neurobiological evidence of myelin adaptation on short timescales, Edoardo studied the limits of using ONLY synaptic delays with temporal representations for processing information in SNNs.

We fixed the network weights to scaled random ternary (+x/0/-x) values and trained ONLY the synaptic delays in the network. Does this work at all⁉️

Turns out it does! ❗️ We show that delay-only training can work as well as training the weights on image classification: we tested it on MNIST and Fashion-MNIST. This has interesting implications for neuromorphic applications.
Link to paper: arxiv.org/abs/2306.06237

arXiv.org — Beyond Weights: Deep learning in Spiking Neural Networks with pure synaptic-delay training
Biological evidence suggests that adaptation of synaptic delays on short to medium timescales plays an important role in learning in the brain. Inspired by biology, we explore the feasibility and power of using synaptic delays to solve challenging tasks even when the synaptic weights are not trained but kept at randomly chosen fixed values. We show that training ONLY the delays in feed-forward spiking networks using backpropagation can achieve performance comparable to the more conventional weight training. Moreover, further constraining the weights to ternary values does not significantly affect the networks' ability to solve the tasks using only the synaptic delays. We demonstrate the task performance of delay-only training on MNIST and Fashion-MNIST datasets in preliminary experiments. This demonstrates a new paradigm for training spiking neural networks and sets the stage for models that can be more efficient than the ones that use weights for computation.
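The setup described above (fixed random ternary weights, per-synapse delays as the only free quantity) can be sketched as a toy forward pass in pure NumPy. This is an illustrative sketch only: the function names, the threshold rule, and the integer-delay shift are my assumptions, and the paper trains the delays with backpropagation through a differentiable spiking model, which this forward-only sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

def ternary_weights(n_out, n_in, scale=0.1):
    # Fixed random ternary weights in {-scale, 0, +scale}; never trained.
    return scale * (rng.integers(0, 3, size=(n_out, n_in)) - 1).astype(float)

def delay_forward(spikes, weights, delays, threshold=0.5):
    # spikes:  [T, n_in] binary spike trains over T time steps
    # weights: [n_out, n_in] fixed ternary matrix
    # delays:  [n_out, n_in] per-synapse integer delays (the trained quantity)
    T, n_in = spikes.shape
    n_out = weights.shape[0]
    current = np.zeros((T, n_out))
    for o in range(n_out):
        for i in range(n_in):
            d = int(delays[o, i])
            if d < T:
                # input i's spike train, shifted later in time by d steps
                current[d:, o] += weights[o, i] * spikes[: T - d, i]
    # simple threshold nonlinearity standing in for a spiking neuron model
    out_spikes = (current > threshold).astype(float)
    return out_spikes, current

# toy example: both inputs spike at t=0, one output neuron
spikes = np.zeros((5, 2))
spikes[0, :] = 1.0
W = np.array([[0.6, -0.6]])   # fixed ternary-style weights
D = np.array([[1, 2]])        # the +0.6 synapse arrives at t=1, the -0.6 at t=2
out, cur = delay_forward(spikes, W, D)
# → out has a single spike, at t=1, where only the +0.6 input has arrived
```

The point of the sketch is that with the weights frozen, the only way to change which inputs coincide at an output neuron is to move their arrival times, which is exactly the quantity the paper optimizes.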
Anand Subramoney

If you're attending ICONS (icons.ornl.gov) today, check out the talk 🎙️ by Edoardo at 15:30 local time titled "Beyond Weights". We'll be putting up the talk video soon as well.

icons.ornl.gov — ICONS Conference @ Oak Ridge National Laboratory