My first Mastodon post!
Interested in #neuralinterfaces #deeplearning and #brainimplants for the blind? Sharing our latest PRE-PRINT, "Biologically plausible phosphene simulation for the differentiable optimization of visual cortical prostheses", w/ Maureen van der Grinten, @deRuyterJaap et al. 1/
As far back as the last century, scientists and engineers have worked hard to understand how to create artificial visual percepts in people who have lost their vision by electrically stimulating the brain (here, Brindley et al.) 2/
Percepts created by these means are called "phosphenes". Reports of them are very diverse, although they are most commonly described as "blobs of light". Since then, a wide array of neurophysiological and clinical descriptions of phosphene perception has accumulated 3/
To accelerate the study of the cortical visual percepts created by brain implants, research groups have developed Simulated Prosthetic Vision using #VR. However, some of these simulations lack realism, or are not suitable for #deeplearning optimization of scene representations. 4/
To address this challenge, we release an #opensource, phenomenologically realistic #cortical phosphene simulator and #machinelearning pipeline!! Our simulator takes into account most of the common #neurostimulation parameters used by researchers, as well as biological constraints 5/
We integrate several models to reproduce classic and modern #psychophysics results on phosphene vision, accounting for the effects of stimulation parameters on phosphene thresholds, brightness, size, and accommodation (more detail in the manuscript). Example: http://shorturl.at/bty69 6/
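To give a flavour of what "accounting for stimulation parameters" means in code, here is a generic, hand-wavy sketch (explicitly NOT the models from the manuscript, and all constants are placeholders) where brightness saturates above a detection threshold and size grows with the injected current:

```python
import torch

# Generic illustration (not the models from the manuscript): brightness
# saturates with amplitude above a detection threshold, and phosphene size
# grows with the stimulation current. All constants are placeholders.
def phosphene_appearance(amplitude_ua, threshold_ua=50.0, ua_per_deg=200.0):
    drive = torch.relu(amplitude_ua - threshold_ua)   # sub-threshold -> no percept
    brightness = 1.0 - torch.exp(-drive / 100.0)      # saturating brightness in [0, 1)
    size_deg = drive / ua_per_deg                     # larger currents -> larger blobs
    return brightness, size_deg

b, s = phosphene_appearance(torch.tensor([20.0, 80.0, 300.0]))
```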
Since it's implemented in #pytorch, it allows for end-to-end optimization of the phosphene representation *and* the electrical stimulation parameters directly via deep neural networks (previous work at http://shorturl.at/kMNR6). This is important given the limited resolution of these kinds of systems 7/
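For the curious: here is a minimal, purely illustrative PyTorch sketch of what such an end-to-end loop can look like. The toy simulator, the tiny encoder and all parameter names are my own stand-ins for illustration, NOT the actual dynaphos API (see the repo below for the real interface):

```python
import torch
import torch.nn as nn

# Toy stand-in for a differentiable phosphene simulator: maps per-electrode
# stimulation amplitudes to a rendered image by weighting fixed Gaussian
# "phosphene" kernels (NOT the dynaphos API).
class ToyPhospheneSimulator(nn.Module):
    def __init__(self, n_electrodes=256, img_size=64):
        super().__init__()
        ys, xs = torch.meshgrid(
            torch.linspace(0, 1, img_size),
            torch.linspace(0, 1, img_size),
            indexing="ij",
        )
        centers = torch.rand(n_electrodes, 2)            # random phosphene locations
        d2 = (xs[None] - centers[:, 0, None, None]) ** 2 \
           + (ys[None] - centers[:, 1, None, None]) ** 2
        self.register_buffer("kernels", torch.exp(-d2 / 0.001))  # (E, H, W)

    def forward(self, amplitudes):                        # (B, E) in [0, 1]
        # Brightness scales with amplitude; everything stays differentiable.
        return torch.einsum("be,ehw->bhw", amplitudes, self.kernels).clamp(0, 1)

# Simple CNN encoder: input image -> per-electrode stimulation amplitudes.
encoder = nn.Sequential(
    nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 256), nn.Sigmoid(),
)
simulator = ToyPhospheneSimulator()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)

images = torch.rand(8, 1, 64, 64)                         # dummy batch
for _ in range(10):
    amplitudes = encoder(images)                           # stimulation parameters
    percept = simulator(amplitudes)                        # simulated phosphene image
    loss = nn.functional.mse_loss(percept, images[:, 0])   # reconstruct the input scene
    optimizer.zero_grad()
    loss.backward()                                        # gradients flow through the simulator
    optimizer.step()
```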
The differentiable nature of our pipeline allows for flexible optimization. See how the phosphene representation changes when using semantic boundary labels and electrical stimulation safety constraints!!! Related cool work (on retinal implants): https://arxiv.org/abs/2205.13623 8/
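As a rough illustration of how such a safety constraint can enter the optimization, here is a hedged sketch where charge per phase above a placeholder limit is penalized in the loss. The pulse width and charge limit are made-up example values, not the ones used in the manuscript:

```python
import torch

# Purely illustrative safety penalty (placeholder numbers, not the paper's):
# treat per-electrode amplitudes as currents in microamps and penalize
# charge per phase (amplitude x pulse width) above a chosen limit.
def safety_penalty(amplitude_ua, pulse_width_us=170.0, max_charge_nc=30.0):
    charge_nc = amplitude_ua * pulse_width_us * 1e-3   # uA * us -> nC
    excess = torch.relu(charge_nc - max_charge_nc)     # only over-limit charge costs
    return excess.mean()

# In the training loop this simply adds to the task loss, e.g.
#   loss = reconstruction_loss + 0.1 * safety_penalty(amplitude_ua)
# so gradient descent trades off percept quality against stimulation safety.
```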
Our simulator runs in real time on a #GPU, and the results are very cool! Its modular nature allows new psychophysics and biophysical models to be incorporated. An extra bonus of making it #opensource: any researcher in the world can contribute to making it a better pipeline! 9/
I want to highlight one of the big inspirations for this work: the excellent work previously done by Prof. Michael Beyeler at https://bionicvisionlab.org on phosphene simulation and computer vision optimization for #retinalimplants. Check out their amazing work here: https://github.com/pulse2percept/pulse2percept 10/
Thanks to all our amazing co-authors!!! Maureen van der Grinten, Jaap de Ruyter van Steveninck, Laura Pijnacker, Bodo Rückauer, Pieter Roelfsema, Marcel van Gerven, Richard van Wezel, Umut Güçlü, Yağmur Güçlütürk
You can experiment with our code:
https://github.com/neuralcodinglab/dynaphos
And read our pre-print here:
https://www.biorxiv.org/content/10.1101/2022.12.23.521749v1
We expect this work to increase the translational impact of research on artificial vision for the blind.
Thanks for reading!!! #NeuroAI #neuroscience
Fin.