Zyphra has released ZAYA1 8B, an open reasoning mixture-of-experts model with 8 billion total parameters and just 760 million active per token. It matches larger rivals on benchmarks including AIME 2025, and was trained end to end on AMD Instinct MI300 GPUs. The model uses “Markovian RSA” to reason for longer without overflowing its context window, and it ships under the Apache 2.0 license for immediate commercial use.