sigmoid.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A social space for people researching, working with, or just interested in AI!


#fp8


#JackDongarra Makes a Stand for Traditional #HPC: "US still doesn't have a clear, long-term plan for what comes next ... U.S. risks falling behind."

Challenges to high-performance computing threaten #US #innovation

The #AI boom has led chip makers to focus on #FP16 and #FP8, not the #FP64 used by scientific research. If chip companies stop making the parts that #scientists need, then it could become harder to do important research.
theconversation.com/challenges

Link preview, The Conversation: "Challenges to high-performance computing threaten US innovation." Today's supercomputers are enormously powerful, but the work they do, running AI and tackling difficult science, is pushing them to their limits. Building bigger supercomputers won't be easy.
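As a rough illustration of why scientists care about FP64 (a toy NumPy sketch, not from the article): accumulating many small values in half precision stalls once the running total's spacing exceeds the increment, while double precision stays accurate.

```python
import numpy as np

# Toy illustration (assumption: not from the article): sum 10,000
# increments of 1e-4. In float64 the total comes out ~1.0; in float16
# the running total stalls once its spacing (~2.4e-4 near 0.25)
# exceeds the increment, so further additions round away to nothing.
n = 10_000
step = 1e-4

total64 = np.float64(0.0)
total16 = np.float16(0.0)
for _ in range(n):
    total64 += np.float64(step)
    total16 += np.float16(step)

print("float64 sum:", float(total64))  # close to 1.0
print("float16 sum:", float(total16))  # stalls far below 1.0
```

The same effect is why low-precision training uses tricks like loss scaling and higher-precision accumulators, and why FP64-heavy scientific codes can't simply switch to AI-oriented chips.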

Introducing Phind-405B and faster, high-quality #AI answers for everyone

🚀 Phind-405B: New flagship #llm, based on Meta Llama 3.1 405B, designed for programming & technical tasks. #Phind405B

⚡ 128K-token context (32K available at launch), 92% on HumanEval, great for web app design. #Programming #AIModel

💡 Trained on 256 H100 GPUs with FP8 mixed precision, 40% memory reduction. #DeepSpeed #FP8

⚡ Phind Instant Model: Super fast, 350 tokens/sec, based on Meta Llama 3.1 8B. #PhindInstant

🚀 Runs on NVIDIA TensorRT-LLM with flash decoding, fused CUDA kernels. #NVIDIA #GPUs

🔍 Faster Search: Prefetches results, saves up to 800ms latency, better embeddings. #FastSearch

👨‍💻 Goal: Help developers experiment faster, new features coming soon! #DevTools #Innovation

phind.com/blog/introducing-phi
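For readers curious what FP8 storage actually means, here is a toy sketch of rounding a value to the E4M3 format (4 exponent bits, 3 mantissa bits, saturating at 448 per the OCP FP8 spec). This is an illustration only, not Phind's or DeepSpeed's implementation; `quantize_e4m3` is a hypothetical helper.

```python
import math

# Hypothetical helper (illustration only, not DeepSpeed's code):
# round a float to the nearest FP8 E4M3 value. E4M3 has 3 mantissa
# bits, a minimum exponent of -6, and a maximum magnitude of 448.
def quantize_e4m3(x: float) -> float:
    if x == 0.0:
        return 0.0
    sign = math.copysign(1.0, x)
    mag = min(abs(x), 448.0)                    # saturate at E4M3 max
    exp = max(math.floor(math.log2(mag)), -6)   # clamp to subnormal range
    ulp = 2.0 ** (exp - 3)                      # spacing from 3 mantissa bits
    return sign * round(mag / ulp) * ulp

print(quantize_e4m3(0.3))     # 0.3125, the nearest E4M3 value
print(quantize_e4m3(1000.0))  # 448.0, saturated at the format's max
```

With only 8 bits per weight, each parameter takes half the storage of FP16/BF16, which is where training-time memory savings like the ~40% figure above come from (the reduction is less than half because activations, gradients, and optimizer state are typically kept in higher precision).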