#spikingneuralnetwork


New preprint for #neuromorphic and #SpikingNeuralNetwork folk (with Pengfei Sun and awesome MSc student Ziqiao Yu).

arxiv.org/abs/2507.16043

Surrogate gradients are popular for training SNNs, but some have questioned whether they can really learn complex temporal spike codes. TL;DR: we tested this, and yes, they can!

We also find that delay-based spiking neural networks seem to degrade in more human-like ways than networks without delays.

Check the next post for links to the code and dataset, which you can easily use to test your own spike-based learning algorithms and models.

Beyond Rate Coding: Surrogate Gradients Enable Spike Timing Learning in Spiking Neural Networks (arXiv.org)

We investigate the extent to which Spiking Neural Networks (SNNs) trained with Surrogate Gradient Descent (Surrogate GD), with and without delay learning, can learn from precise spike timing beyond firing rates. We first design synthetic tasks isolating intra-neuron inter-spike intervals and cross-neuron synchrony under matched spike counts. On more complex spike-based speech recognition datasets (Spiking Heidelberg Digits (SHD) and Spiking Speech Commands (SSC)), we construct variants where spike count information is eliminated and only timing information remains, and show that Surrogate GD-trained SNNs are able to perform significantly above chance, whereas purely rate-based models perform at chance level. We further evaluate robustness under biologically inspired perturbations (Gaussian jitter per spike or per neuron, and spike deletion), revealing consistent but perturbation-specific degradation. Networks show a sharp performance drop when spike sequences are reversed in time, with a larger drop in performance from SNNs trained with delays, indicating that these networks are more human-like in terms of behaviour. To facilitate further studies of temporal coding, we have released our modified SHD and SSC datasets.
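For anyone unfamiliar with the technique: this is not the paper's code (that is linked in the thread), just a minimal sketch of how surrogate gradient training typically works in PyTorch, assuming a fast-sigmoid surrogate and a simple leaky integrate-and-fire (LIF) neuron. The forward pass keeps the hard spike threshold; the backward pass substitutes a smooth derivative so gradients can flow through the spike.

import torch

class SurrogateSpike(torch.autograd.Function):
    # Heaviside spike in the forward pass, smooth surrogate in the backward pass.

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()  # spike if membrane potential crosses threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        beta = 10.0  # steepness of the fast-sigmoid surrogate (one common choice)
        surrogate = 1.0 / (beta * v.abs() + 1.0) ** 2
        return grad_output * surrogate

def lif_step(v, x, tau=0.9, v_th=1.0):
    # One Euler step of a leaky integrate-and-fire neuron.
    v = tau * v + x                     # leaky integration of input current
    s = SurrogateSpike.apply(v - v_th)  # non-differentiable spike, surrogate gradient
    v = v * (1.0 - s.detach())          # reset after a spike (reset kept out of the graph)
    return v, s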

Proud to have managed to finish a #neuromorphic manuscript, with Chiara De Luca, Mirco Tincani and Elisa Donati just before the end of the year!

It demonstrates the benefits of using #braininspired principles of computation to achieve robust processing across multiple time-scales, despite the inherent variability of the underlying computational substrate (silicon neurons that faithfully emulate biological ones):
A neuromorphic multi-scale approach for heart rate and state detection
doi.org/10.21203/rs.3.rs-57373
#neuromorphic #wearable #neuroai #SpikingNeuralNetwork

A neuromorphic multi-scale approach for heart rate and state detection (doi.org)

With the advent of novel sensor and machine learning technologies, it is becoming possible to develop wearable systems that perform continuous recording and processing of biosignals for health or body state assessment. For example, modern smartwatches can already track physiological functions, in...
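As a toy illustration of the multi-timescale idea (my own sketch, not the paper's method; the signal and time constants below are made up for illustration), a bank of leaky integrators with different time constants can extract structure at different scales from the same input stream, e.g. individual heartbeats versus slower changes in body state:

import numpy as np

def leaky_integrators(signal, taus, dt=1.0):
    # Filter one input signal through leaky integrators with different time
    # constants: fast taus track beat-to-beat events, slow taus track state.
    taus = np.asarray(taus, dtype=float)
    states = np.zeros(len(taus))
    trace = np.zeros((len(signal), len(taus)))
    for t, x in enumerate(signal):
        states += dt * (-states + x) / taus  # Euler update: dv/dt = (-v + x)/tau
        trace[t] = states
    return trace

# Hypothetical usage: a synthetic pulse train standing in for a heartbeat signal
sig = (np.arange(2000) % 80 == 0).astype(float)
trace = leaky_integrators(sig, taus=[5.0, 50.0, 500.0])  # fast to slow units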

Mmm...
Once I was surfing the net for papers in CNS when I suddenly came across a paper about Natural Language Processing, and I realized there aren't any real biologically plausible Spiking Neural Networks for NLP.

I started reading some papers and books to learn more about language comprehension and language generation, but I haven't found any real model suggestions yet.

Does anyone know of any labs working on NLP in computational neuroscience, or how to connect with them?

A Thousand Brains: A New Theory of Intelligence

Hi :)
I'm a new member of this amazing community, and I'd like my first post to be about Jeff Hawkins's breakthrough.

Before giving my opinion, I'd like everyone to tell me whether they know the theory or have read the book or the original papers, and if so, what their insights on them are.

I think the material in this research is fascinating, and I would like to engage with it and talk about it more.

#cns #SpikingNeuralNetwork #theory_of_brain #jeff_hawkins
#neuroscience #computationalneuroscience