Excited to share our new short SNN paper!
One of the advantages of spiking neural networks is that they can use temporal coding to process information efficiently. Inspired by recent neurobiological evidence of myelin adaptation on short timescales, Edoardo studied the limits of using ONLY synaptic delays, combined with temporal representations, to process information in SNNs.
We fixed the network weights to scaled random ternary (+x/0/-x) values and trained ONLY the synaptic delays in the network. Does this work at all?
Turns out it does! We show that training only the delays can match training the weights on image classification: we tested this on MNIST and Fashion-MNIST. This has interesting implications for neuromorphic applications.
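To make the idea concrete, here is a minimal sketch (not the paper's exact implementation, see the repo below for that) of a layer with fixed random ternary weights where only per-synapse delays are learned. Discrete delays are relaxed to a softmax over delay taps so they stay differentiable; layer sizes, max_delay, and w_scale are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftDelayLayer(nn.Module):
    """Fixed ternary weights; only per-synapse delay parameters are trained.
    Each synapse's delay is a softmax over discrete taps (a differentiable relaxation)."""

    def __init__(self, n_in, n_out, max_delay=25, w_scale=0.1):
        super().__init__()
        # Fixed random ternary weights in {-w_scale, 0, +w_scale}, never updated.
        w = torch.randint(-1, 2, (n_out, n_in)).float() * w_scale
        self.register_buffer("weight", w)
        # The only trainable parameters: logits over max_delay+1 delay taps per synapse.
        self.delay_logits = nn.Parameter(torch.randn(n_out, n_in, max_delay + 1))

    def forward(self, spikes):
        # spikes: (T, B, n_in) binary spike trains -> returns (T, B, n_out)
        T, B, N = spikes.shape
        D = self.delay_logits.shape[-1]
        kernel = F.softmax(self.delay_logits, dim=-1)        # soft delay per synapse
        padded = torch.cat([spikes.new_zeros(D - 1, B, N), spikes], dim=0)
        out = []
        for t in range(T):
            # window[d] = input spikes at time t - d (zero-padded before t = 0)
            window = padded[t:t + D].flip(0)                 # (D, B, n_in)
            # Delayed input seen by synapse (j, i), weighted by its delay kernel.
            delayed = torch.einsum('jid,dbi->jib', kernel, window)
            out.append(torch.einsum('jib,ji->bj', delayed, self.weight))
        return torch.stack(out)                              # (T, B, n_out)
```

In this toy version, only `delay_logits` receives gradients, so an optimizer over `layer.parameters()` updates delays while the ternary weights stay frozen as a buffer.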
Link to paper: https://arxiv.org/abs/2306.06237
If you're attending #ACM ICONS (https://icons.ornl.gov) today, check out the talk by Edoardo at 15:30 local time titled "Beyond Weights". We'll be putting up the talk video soon as well. #ICONS2023
You can find the code for this paper here: https://github.com/Efficient-Scalable-Machine-Learning/beyond-weights