#cscs

olеg lаvrоvsky<p>In response to my question on <a href="https://hachyderm.io/tags/openhardware" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>openhardware</span></a> and alternatives to CUDA, he hopes that <a href="https://hachyderm.io/tags/OpenCL" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OpenCL</span></a> will rise to the challenge of demanding 64-bit+ models rather than the current focus on 4-bit quantized builds that are insufficient for HPC science.</p><p>See also my notes and links on MLP like <a href="https://hachyderm.io/tags/Apertus" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Apertus</span></a> at <a href="https://hachyderm.io/tags/CSCS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CSCS</span></a> at <a href="https://swissai.dribdat.cc/project/14" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">swissai.dribdat.cc/project/14</span><span class="invisible"></span></a></p>
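The precision gap mentioned above can be illustrated with a quick sketch (a hypothetical example, not from the post): a symmetric 4-bit quantizer has only 16 representable levels, so its rounding error is on the order of the step size, far from the ~1e-16 relative precision of float64 that HPC workloads expect.

```python
import numpy as np

# Hypothetical illustration of why 4-bit quantization is too coarse for
# HPC science: 16 signed levels (-8..7) at a fixed scale.
def quantize_4bit(x, scale):
    """Round x to the nearest of 16 signed levels at the given scale."""
    q = np.clip(np.round(x / scale), -8, 7)
    return q * scale

x = np.array([0.1234567890123456, 3.141592653589793])
scale = 0.5                                # assumed quantization step
x4 = quantize_4bit(x, scale)
err4 = np.abs(x - x4)                      # error ~ the step size (0.1+)
err64 = np.abs(x - x.astype(np.float64))   # float64 round-trip: exact here
```

The step size (`scale`) is chosen arbitrarily for illustration; real quantization schemes calibrate it per tensor, but the 16-level ceiling remains.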
olеg lаvrоvsky<p>How much power does 3 months of training <a href="https://hachyderm.io/tags/Apertus" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Apertus</span></a> on the most powerful open research cloud-native supercomputer in Europe take? </p><p>⚡️About the same as raising 5 people to adulthood, or 50% of the peak power (10 MW) of a locomotive. Keep industrial power usage in perspective, and acknowledge the efforts toward climate neutrality, asks Joost VandeVondele, who aims to scale up the infrastructure to support thousands of AI experiments and hundreds of SMEs in the future at <a href="https://hachyderm.io/tags/CSCS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CSCS</span></a></p>
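The energy figure implied above can be sanity-checked with back-of-envelope arithmetic. The 10 MW locomotive peak comes from the post; the assumption that training drew roughly 50% of that, sustained over ~3 months, is mine.

```python
# Back-of-envelope estimate (assumptions, not official CSCS figures):
# a locomotive peaks around 10 MW; assume ~50% of that sustained
# over the ~3-month training run.
PEAK_LOCOMOTIVE_W = 10e6                 # 10 MW peak (figure from the post)
avg_power_w = 0.5 * PEAK_LOCOMOTIVE_W    # assumed 5 MW average draw
seconds = 90 * 24 * 3600                 # ~3 months

energy_j = avg_power_w * seconds
energy_gwh = energy_j / 3.6e12           # 1 GWh = 3.6e12 J
print(f"{energy_gwh:.1f} GWh")           # -> 10.8 GWh under these assumptions
```

Around 10 GWh is the scale at which per-run energy accounting (and the climate-neutrality efforts mentioned) becomes meaningful.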
Neff :manjaro:<p><a href="https://mastodon.uno/tags/Apertus" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Apertus</span></a>: a Swiss LLM, developed by the two federal institutes of technology <a href="https://mastodon.uno/tags/ETHZ" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ETHZ</span></a> and <a href="https://mastodon.uno/tags/EPFL" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EPFL</span></a> and the Swiss National Supercomputing Centre <a href="https://mastodon.uno/tags/CSCS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CSCS</span></a>, promises transparent, open-source AI applications. Let's see how it works! <a href="https://www.threads.com/@corrieredelticino" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="">threads.com/@corrieredelticino</span><span class="invisible"></span></a></p>
Benjamin Carr, Ph.D. 👨🏻‍💻🧬<p><a href="https://hachyderm.io/tags/Switzerland" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Switzerland</span></a> launches transparent <a href="https://hachyderm.io/tags/ChatGPT" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ChatGPT</span></a> alternative<br><a href="https://hachyderm.io/tags/Apertus" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Apertus</span></a>: A fully <a href="https://hachyderm.io/tags/opensource" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>opensource</span></a>, transparent, multilingual language model<br>The Apertus <a href="https://hachyderm.io/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> is comparable to Meta’s <a href="https://hachyderm.io/tags/Llama3" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Llama3</span></a> model from 2024 <br>The development of the <a href="https://hachyderm.io/tags/largelanguagemodel" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>largelanguagemodel</span></a> (<a href="https://hachyderm.io/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a>) and the research into other domain-specific foundation models is funded by an investment of over 10 million GPU hours on the “<a href="https://hachyderm.io/tags/Alps" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Alps</span></a>” <a href="https://hachyderm.io/tags/Supercomputer" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Supercomputer</span></a> by <a href="https://hachyderm.io/tags/CSCS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CSCS</span></a>, and by the ETH Board <br><a href="https://www.cscs.ch/science/computer-science-hpc/2025/apertus-a-fully-open-transparent-multilingual-language-model" rel="nofollow noopener" 
translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">cscs.ch/science/computer-scien</span><span class="invisible">ce-hpc/2025/apertus-a-fully-open-transparent-multilingual-language-model</span></a></p>
Niclas<p>I recently visited the Swiss National Supercomputing Centre (<a href="https://fosstodon.org/tags/CSCS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CSCS</span></a>) in Lugano.</p><p>Now <a href="https://fosstodon.org/tags/ETH" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ETH</span></a> and <a href="https://fosstodon.org/tags/EPFL" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EPFL</span></a> have publicly announced that they are training an <a href="https://fosstodon.org/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> on the "Alps" supercomputer. The best part is that they are doing it <a href="https://fosstodon.org/tags/OpenSource" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OpenSource</span></a>.</p><p>"Alps" will compute at 0.435 * 10^18 FLOPS. This means that "Alps" can calculate more in one day than a modern laptop in 2025 could calculate in 40,000 years.
Amazing!</p><p><a href="https://ethz.ch/en/news-and-events/eth-news/news/2025/07/a-language-model-built-for-the-public-good.html" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">ethz.ch/en/news-and-events/eth</span><span class="invisible">-news/news/2025/07/a-language-model-built-for-the-public-good.html</span></a></p><p><a href="https://fosstodon.org/tags/Switzerland" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Switzerland</span></a> <a href="https://fosstodon.org/tags/SwissAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SwissAI</span></a> <a href="https://fosstodon.org/tags/Europe" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Europe</span></a> <a href="https://fosstodon.org/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://fosstodon.org/tags/supercomputer" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>supercomputer</span></a></p>
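The "one day vs. 40,000 years" claim above checks out arithmetically. The laptop throughput is an assumption on my part (~30 GFLOP/s sustained is plausible for a 2025 laptop CPU); the 0.435 * 10^18 FLOP/s figure is from the post.

```python
# Sanity-check of the "one day vs 40,000 years" comparison.
ALPS_FLOPS = 0.435e18              # 0.435 EFLOP/s, as quoted in the post
LAPTOP_FLOPS = 30e9                # assumed laptop sustained throughput

ops_alps_one_day = ALPS_FLOPS * 86_400
laptop_seconds = ops_alps_one_day / LAPTOP_FLOPS
laptop_years = laptop_seconds / (365 * 24 * 3600)
print(f"{laptop_years:,.0f} years")  # on the order of 40,000 years
```

The result scales linearly with the assumed laptop speed, so the post's round figure holds for any laptop in the tens-of-GFLOP/s range.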
🧿🪬🍄🌈🎮💻🚲🥓🎃💀🏴🛻🇺🇸<p><a href="https://mastodon.social/tags/ETH" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ETH</span></a> Zurich &amp; <a href="https://mastodon.social/tags/EPFL" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EPFL</span></a> (w/ <a href="https://mastodon.social/tags/CSCS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CSCS</span></a>) will drop a fully open large-language model trained on the “Alps” supercomputer. Built from scratch on public infrastructure, fluent in 1,000+ languages, Apache 2.0-licensed. Landing in late summer 2025.</p><p>&gt; A language model built for the public good</p><p><a href="https://ethz.ch/en/news-and-events/eth-news/news/2025/07/a-language-model-built-for-the-public-good.html" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">ethz.ch/en/news-and-events/eth</span><span class="invisible">-news/news/2025/07/a-language-model-built-for-the-public-good.html</span></a></p><p><a href="https://mastodon.social/tags/OpenSource" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OpenSource</span></a> <a href="https://mastodon.social/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> <a href="https://mastodon.social/tags/AIForGood" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIForGood</span></a> <a href="https://mastodon.social/tags/ai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ai</span></a> <a href="https://mastodon.social/tags/llms" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llms</span></a> <a href="https://mastodon.social/tags/switzerland" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>switzerland</span></a></p>