
#AlgorithmicComplexity

Peter Bloem

Solving ARC (well... up to 20%) with inference-time compression. Another indication that #MDL and #AlgorithmicComplexity are becoming relevant in the age of Deep Learning.

https://iliao2345.github.io/blog_posts/arc_agi_without_pretraining/arc_agi_without_pretraining.html
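
For intuition, here is a minimal sketch of the MDL idea behind compression-based inference, not the linked ARC method itself: use the zlib-compressed length as a crude upper bound on algorithmic complexity and prefer the continuation that keeps the whole string most compressible. The prefix and candidate strings are invented for illustration.

import zlib

def complexity(s: bytes) -> int:
    # Compressed length as a crude, compressor-dependent upper bound
    # on the algorithmic complexity of s.
    return len(zlib.compress(s, 9))

def predict_by_compression(prefix: str, candidates: list[str]) -> str:
    # MDL-style choice: prefer the continuation that keeps the whole
    # string most compressible, i.e. the "simplest explanation".
    return min(candidates, key=lambda c: complexity((prefix + c).encode()))

if __name__ == "__main__":
    prefix = "abab" * 32
    for c in ["ab", "ba", "zz"]:
        print(c, complexity((prefix + c).encode()))
    print("prediction:", predict_by_compression(prefix, ["ab", "ba", "zz"]))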
Marcel Fröhlich

Zenil et al. describe an algorithmic causal deconvolution method that disentangles the dimensions which originally gave the sequence its meaning, purely by analysing the sequence through algorithmic information dynamics.

https://arxiv.org/pdf/2303.16045

But how does that relate to transformers?

#AlgorithmicInformationTheory #AlgorithmicComplexity #CausalDeconvolution
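
As a toy illustration of the perturbation calculus underlying algorithmic information dynamics, and not Zenil et al.'s actual estimator (which uses CTM/BDM rather than a general-purpose compressor): delete each element of a sequence and record how the complexity estimate changes; elements whose removal shifts the estimate in similar ways can be grouped into candidate generative components. The example sequence is invented.

import zlib

def complexity(s: str) -> int:
    # zlib-compressed length as a rough stand-in for the algorithmic
    # complexity estimators (CTM/BDM) used in algorithmic information dynamics.
    return len(zlib.compress(s.encode(), 9))

def element_contributions(seq: str) -> list[tuple[int, int]]:
    # Perturbation analysis: delete each position and record how much the
    # complexity estimate changes. Elements whose removal moves the estimate
    # by similar amounts can be grouped into candidate generative components,
    # a toy version of "deconvolving" the mechanisms behind the sequence.
    base = complexity(seq)
    return [(i, base - complexity(seq[:i] + seq[i + 1:])) for i in range(len(seq))]

if __name__ == "__main__":
    # Invented example: a periodic signal with two noise-like insertions.
    seq = "ababababXababababYabababab"
    for i, delta in element_contributions(seq):
        print(i, seq[i], delta)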