sigmoid.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A social space for people researching, working with, or just interested in AI!

Server stats:

587 active users

#pruned

JMLR
'Random Pruning Over-parameterized Neural Networks Can Improve Generalization: A Training Dynamics Analysis', by Hongru Yang, Yingbin Liang, Xiaojie Guo, Lingfei Wu, Zhangyang Wang.
http://jmlr.org/papers/v26/23-0832.html
#pruning #pruned #generalization

Kevin Karhan :verified:
@jean_dupont @MoneroTalk already has been, see the #pruned blockchain for #Monero...

New Submissions to TMLR
AP: Selective Activation for De-sparsifying Pruned Networks
https://openreview.net/forum?id=EGQSpkUDdD
#pruning #pruned #neuron

JMLR
'Connectivity Matters: Neural Network Pruning Through the Lens of Effective Sparsity', by Artem Vysogorets, Julia Kempe.
http://jmlr.org/papers/v24/22-0415.html
#pruning #pruned #sparser

Published papers at TMLR
Max-Affine Spline Insights Into Deep Network Pruning
Haoran You, Randall Balestriero, Zhihan Lu et al.
https://openreview.net/forum?id=bMar2OkxVu
#pruning #prune #pruned