sigmoid.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A social space for people researching, working with, or just interested in AI!

Server stats: 649 active users

#eacl2023

Tom Stafford<p>Chatbots for Good and Evil</p><p><a href="https://underline.io/lecture/72154-chatbots-for-good-and-evil" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">underline.io/lecture/72154-cha</span><span class="invisible">tbots-for-good-and-evil</span></a> </p><p>Kevin Munger keynote at <a href="https://mastodon.online/tags/EACL2023" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>EACL2023</span></a></p>
Venkat<p>I’m at <a href="https://sigmoid.social/tags/eacl2023" class="mention hashtag" rel="tag">#<span>eacl2023</span></a> presenting my work on intergroup bias :) Come by and say hi <a href="https://underline.io/events/383/posters/14966/poster/71341-how-people-talk-about-each-other-modeling-generalized-intergroup-bias-and-emotion" target="_blank" rel="nofollow noopener noreferrer" translate="no"><span class="invisible">https://</span><span class="ellipsis">underline.io/events/383/poster</span><span class="invisible">s/14966/poster/71341-how-people-talk-about-each-other-modeling-generalized-intergroup-bias-and-emotion</span></a><br /><a href="https://aclanthology.org/2023.eacl-main.183/" target="_blank" rel="nofollow noopener noreferrer" translate="no"><span class="invisible">https://</span><span class="ellipsis">aclanthology.org/2023.eacl-mai</span><span class="invisible">n.183/</span></a></p>
marco<p>I&#39;ll be attending (and presenting a poster at) <a href="https://sigmoid.social/tags/EACL" class="mention hashtag" rel="tag">#<span>EACL</span></a> this week. Definitely reach out if you want to grab coffee!</p><p>My poster is</p><p>`Parameter-Efficient Korean Character-Level Language Modeling`,</p><p>where we describe a way to efficiently encode and decode (Korean) syllable-level representations without requiring an embedding for each possible syllable in the Korean writing system (more than 11k).</p><p><a href="https://sigmoid.social/tags/nlp" class="mention hashtag" rel="tag">#<span>nlp</span></a> <a href="https://sigmoid.social/tags/nlproc" class="mention hashtag" rel="tag">#<span>nlproc</span></a> <a href="https://sigmoid.social/tags/eacl2023" class="mention hashtag" rel="tag">#<span>eacl2023</span></a> <a href="https://sigmoid.social/tags/cjk" class="mention hashtag" rel="tag">#<span>cjk</span></a></p>
Venkat<p>Super happy to announce that my paper studying interpersonal bias with <span class="h-card" translate="no"><a href="https://sigmoid.social/@jessyjli" class="u-url mention">@<span>jessyjli</span></a></span>, David Beaver, Malihe Alikhani, Katherine Atwell and Barea Sinno has been accepted to <a href="https://sigmoid.social/tags/eacl2023" class="mention hashtag" rel="tag">#<span>eacl2023</span></a>. I’m really excited for this first step in my research thesis, and for what I&#39;ll find in the months to come.<br />Our goal is to study how the relationship between the speaker and the target of an utterance influences the language itself. Language is biased regardless of intent, and studying the effect of interpersonal…🧵1/3</p>
eaclmeeting<p>Official dates for <a href="https://sigmoid.social/tags/eacl2023" class="mention hashtag" rel="tag">#<span>eacl2023</span></a> !</p><p>May 2, 3, 4 - Main Conference<br />May 5 &amp; 6 - Workshops and Tutorials </p><p>Looking forward to seeing everyone in three months!</p><p><a href="https://sigmoid.social/tags/NLProc" class="mention hashtag" rel="tag">#<span>NLProc</span></a></p>
Hosein Mohebbi<p>🥳 Thrilled to announce our paper got accepted to <a href="https://sigmoid.social/tags/EACL2023" class="mention hashtag" rel="tag">#<span>EACL2023</span></a>!<br />We introduce *Value Zeroing*, a new interpretability method for quantifying context mixing in Transformers.</p><p>Joint work with <span class="h-card" translate="no"><a href="https://sigmoid.social/@wzuidema" class="u-url mention">@<span>wzuidema</span></a></span>, <span class="h-card" translate="no"><a href="https://sigmoid.social/@gchrupala" class="u-url mention">@<span>gchrupala</span></a></span>, and Afra</p><p>📑Paper: <a href="https://arxiv.org/abs/2301.12971" target="_blank" rel="nofollow noopener noreferrer" translate="no"><span class="invisible">https://</span><span class="">arxiv.org/abs/2301.12971</span><span class="invisible"></span></a><br />☕Code: <a href="https://github.com/hmohebbi/ValueZeroing" target="_blank" rel="nofollow noopener noreferrer" translate="no"><span class="invisible">https://</span><span class="ellipsis">github.com/hmohebbi/ValueZeroi</span><span class="invisible">ng</span></a></p><p><a href="https://sigmoid.social/tags/NLProc" class="mention hashtag" rel="tag">#<span>NLProc</span></a> <a href="https://sigmoid.social/tags/InDeep" class="mention hashtag" rel="tag">#<span>InDeep</span></a></p>
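The core idea behind Value Zeroing can be illustrated with a toy single-head attention sketch. This is not the authors' implementation (see their linked repo for that); the dimensions, random weights, and scoring loop here are illustrative assumptions. The idea: zero out token j's value vector, recompute the layer outputs with the attention weights unchanged, and take the cosine distance between each token's original and altered output as j's context-mixing contribution.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 8                        # toy sequence length and head dimension
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

A = softmax(Q @ K.T / np.sqrt(d))  # attention weights (n, n)
out = A @ V                        # original head outputs (n, d)

def value_zeroing_scores(A, V, out):
    """scores[i, j] = how much zeroing token j's value vector changes token i's output."""
    n = A.shape[0]
    scores = np.zeros((n, n))
    for j in range(n):
        Vz = V.copy()
        Vz[j] = 0.0                # zero token j's value vector only
        out_z = A @ Vz             # attention weights stay fixed
        for i in range(n):
            a, b = out[i], out_z[i]
            cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
            scores[i, j] = 1.0 - cos   # cosine distance as mixing score
    return scores

scores = value_zeroing_scores(A, V, out)
```

Because the attention weights are computed from queries and keys only, zeroing a value vector leaves them intact, which is what isolates the value pathway's contribution.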
UKP Lab<p>Six papers authored or co-authored by UKP members have been accepted for publication at <a href="https://sigmoid.social/tags/EACL2023" class="mention hashtag" rel="tag">#<span>EACL2023</span></a>, the Conference of the European Chapter of the Association for Computational Linguistics.</p><p>The conference will be held online and in Dubrovnik, Croatia, from 2 to 6 May 2023.</p><p>Congratulations to all involved! 💐 </p><p><a href="https://www.informatik.tu-darmstadt.de/ukp/ukp_home/news_ukp/ukp_newsdetails_273984.en.jsp" target="_blank" rel="nofollow noopener noreferrer" translate="no"><span class="invisible">https://www.</span><span class="ellipsis">informatik.tu-darmstadt.de/ukp</span><span class="invisible">/ukp_home/news_ukp/ukp_newsdetails_273984.en.jsp</span></a></p>
Zdeněk Kasner<p>Our paper with <span class="h-card" translate="no"><a href="https://sigmoid.social/@sinantie" class="u-url mention">@<span>sinantie</span></a></span> and <span class="h-card" translate="no"><a href="https://sigmoid.social/@tuetschek" class="u-url mention">@<span>tuetschek</span></a></span> got accepted to <a href="https://sigmoid.social/tags/eacl2023" class="mention hashtag" rel="tag">#<span>eacl2023</span></a>!</p><p>TL;DR: If you want pretrained models to describe data from a novel domain, use 𝘂𝗻𝗮𝗺𝗯𝗶𝗴𝘂𝗼𝘂𝘀 𝗵𝘂𝗺𝗮𝗻-𝗿𝗲𝗮𝗱𝗮𝗯𝗹𝗲 𝗹𝗮𝗯𝗲𝗹𝘀💡</p><p><a href="https://arxiv.org/abs/2210.07373" target="_blank" rel="nofollow noopener noreferrer" translate="no"><span class="invisible">https://</span><span class="">arxiv.org/abs/2210.07373</span><span class="invisible"></span></a></p><p><span class="h-card" translate="no"><a href="https://birdsite.wilde.cloud/users/ufal_cuni" class="u-url mention">@<span>ufal_cuni</span></a></span></p>
marco<p>Happy to say my first paper since moving to Tokyo to start my PhD was accepted at <a href="https://sigmoid.social/tags/EACL2023" class="mention hashtag" rel="tag">#<span>EACL2023</span></a></p><p>`Parameter-Efficient Korean Character-Level Language Modeling`</p><p>This was joint work with Sangwhan Moon (<span class="h-card" translate="no"><a href="https://hachyderm.io/@s" class="u-url mention">@<span>s</span></a></span>), Lawrence Wolf-Sonkin, and my advisor, Naoaki Okazaki.</p><p>In it, we describe a method to factor the character (= Hangul syllables) embedding and output layers in a generic neural LM to reduce the embedding/output parameter counts by up to &gt;99% with no loss in modeling quality!</p>
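The factoring described above rests on a property of the Korean writing system: every Hangul syllable block is an arithmetic composition of at most three jamo (19 leads × 21 vowels × 28 tails), so embeddings for 68 jamo can stand in for ~11k syllable embeddings. The sketch below shows the decomposition and one simple compositional embedding (summing jamo embeddings); it is a hypothetical illustration of the idea, not the paper's method, and the embedding dimension and combination rule are assumptions.

```python
import numpy as np

# Unicode Hangul syllable layout: code point = S_BASE + (lead*21 + vowel)*28 + tail
S_BASE, L_COUNT, V_COUNT, T_COUNT = 0xAC00, 19, 21, 28

def decompose(syllable: str):
    """Return (lead, vowel, tail) jamo indices for one precomposed Hangul syllable."""
    idx = ord(syllable) - S_BASE
    lead, rem = divmod(idx, V_COUNT * T_COUNT)
    vowel, tail = divmod(rem, T_COUNT)
    return lead, vowel, tail

rng = np.random.default_rng(0)
dim = 8                                   # toy embedding dimension
E_lead = rng.normal(size=(L_COUNT, dim))  # 19 + 21 + 28 = 68 rows in total,
E_vowel = rng.normal(size=(V_COUNT, dim)) # versus ~11,172 possible syllables
E_tail = rng.normal(size=(T_COUNT, dim))

def syllable_embedding(syllable: str) -> np.ndarray:
    """Compose a syllable embedding from its jamo embeddings."""
    l, v, t = decompose(syllable)
    return E_lead[l] + E_vowel[v] + E_tail[t]

emb = syllable_embedding("한")  # 한 decomposes into ㅎ + ㅏ + ㄴ
```

The same factoring can be applied in reverse at the output layer: predict lead, vowel, and tail separately instead of scoring all ~11k syllables at once.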
NEJLT<p>NEJLT welcomes NLP papers, and offers accelerated review if you include your <a href="https://sigmoid.social/tags/EACL2023" class="mention hashtag" rel="tag">#<span>EACL2023</span></a> or <a href="https://sigmoid.social/tags/ICLR2023" class="mention hashtag" rel="tag">#<span>ICLR2023</span></a> reviews.</p><p>➡️ indexing in the ACL Anthology<br />➡️ journal-format articles<br />➡️ no length limit<br />➡️ fast, open, free review &amp; free publishing</p><p>Read more here: <a href="https://www.nejlt.org/" target="_blank" rel="nofollow noopener noreferrer" translate="no"><span class="invisible">https://www.</span><span class="">nejlt.org/</span><span class="invisible"></span></a></p>
Ivan Habernal<p>Dear <a href="https://sigmoid.social/tags/eacl2023" class="mention hashtag" rel="tag">#<span>eacl2023</span></a> reviewers - I&#39;m one of you, but I hate when you **promise to deliver** reviews after being asked several times and then nothing happens. Either say &quot;no, I&#39;m sorry&quot; or do the job. Is it that complicated?</p>