#ion_channels

First of all, the application deadline for our 2 PhD positions in ion channel research has been extended!
If you are interested, you can apply until February 16th!
Please find the job descriptions and instructions on how to apply here:

uni-kiel.de/personal/de/stelle

uni-kiel.de/personal/de/stelle

#Phd #phdPosition
#ion_channels
#academia

📢 2 PhD positions in ion channel research available!

I want to share our latest job offers for a career in science: we have 2 open positions for PhD students starting at the beginning of next year. The students will learn and employ the patch-clamp technique to investigate structure-function relationships and the pharmacology of ion channels, mainly potassium channels.
Enthusiastic about science?
Maybe something to think about over the holidays?
Have a look (the English text can be found below the official German version in the documents)!

uni-kiel.de/personal/de/stelle

uni-kiel.de/personal/de/stelle

#academia
#science
#PhD
#phdPosition
#ion_channels
#patchclamp
#PotassiumChannel
#electrophysiology

Hi folks,

so that came out in the wrong order. 😜

I wanted to introduce myself first before posting our preprint!
I am a female scientist currently working as a postdoc at Kiel University.
I am fascinated by ion channels, and my work focuses on potassium channels of the K2P channel family. I'm especially interested in pharmacology, regulation, and heteromerization, and I mostly do patch clamp!

#introduction
#ion_channels
#academia
#unikiel
#postdoc
#postdoclife
#academicchatter

journals.plos.org: Short-term Hebbian learning can implement transformer-like attention

Author summary: Many of the most impressive recent advances in machine learning, from generating images from text to human-like chatbots, are based on a neural network architecture known as the transformer. Transformers are built from so-called attention layers, which perform large numbers of comparisons between the vector outputs of the previous layers, allowing information to flow through the network in a more dynamic way than in previous designs. This large number of comparisons is computationally expensive and has no known analogue in the brain. Here, we show that a variation on a learning mechanism familiar in neuroscience, Hebbian learning, can implement a transformer-like attention computation if the synaptic weight changes are large and rapidly induced. We call our method the match-and-control principle: it proposes that when presynaptic and postsynaptic spike trains match up, small groups of synapses can be transiently potentiated, allowing a few presynaptic axons to control the activity of a neuron. To demonstrate the principle, we build a model of a pyramidal neuron and use it to illustrate the power and limitations of the idea.
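To make the summary's core claim concrete, here is a minimal NumPy sketch of the well-known correspondence between rapid Hebbian outer-product weight changes and unnormalized linear attention. This is my own illustration, not the paper's spiking match-and-control model: storing each key-value pair with a fast Hebbian update lets a later query retrieve the same value-weighted sum an attention layer with a linear kernel would compute.

```python
import numpy as np

# Sketch (not the paper's model): fast Hebbian weights vs. linear attention.
rng = np.random.default_rng(0)
d = 8                              # feature dimension
keys = rng.normal(size=(5, d))     # "presynaptic" key patterns
values = rng.normal(size=(5, d))   # associated value patterns

# Rapid Hebbian potentiation: accumulate outer products of value and key,
# i.e. large, quickly induced synaptic weight changes W += v k^T.
W = np.zeros((d, d))
for k, v in zip(keys, values):
    W += np.outer(v, k)

# A query now reads out sum_i v_i * (k_i . q) in both formulations.
q = keys[2]
attn_out = values.T @ (keys @ q)   # explicit (unnormalized) linear attention
hebb_out = W @ q                   # same readout via the Hebbian fast weights
print(np.allclose(attn_out, hebb_out))  # True
```

The sketch omits what makes the paper's contribution biologically interesting (spike trains, transient decay of the potentiation, the pyramidal-neuron model); it only shows why fast Hebbian weight changes can, in principle, stand in for an attention computation.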