DeepSeek-R1-0528
PulseAugur coverage of DeepSeek-R1-0528 — every cluster mentioning DeepSeek-R1-0528 across labs, papers, and developer communities, ranked by signal.
-
Zyphra's ZAYA1-8B model matches larger rivals with 700M active parameters
Zyphra has released ZAYA1-8B, a reasoning-focused mixture-of-experts model with 700 million active parameters. The model was trained from scratch on an AMD compute platform and uses a novel four-stage reinforcement …
-
Zyphra's ZAYA1-8B MoE model trained on AMD hardware outperforms larger rivals
Zyphra AI has released ZAYA1-8B, a Mixture-of-Experts (MoE) language model with 760 million active parameters and 8.4 billion total parameters. Trained on AMD hardware, the model demonstrates competitive performance ag…
-
DeepSeek releases R1-0528, an open-weights model rivaling Gemini 2.5 Pro
DeepSeek has released DeepSeek-R1-0528, an open-weights model whose performance rivals Gemini 2.5 Pro. The release marks a significant advance in publicly available AI models, offering a powerful alternative for …