PulseAugur

DeepSeek-R1-0528

PulseAugur coverage of DeepSeek-R1-0528 — every cluster mentioning the model across labs, papers, and developer communities, ranked by signal.

Total · 3 over 30d · 3 over 90d
Releases · 0 over 30d · 0 over 90d
Papers · 2 over 30d · 2 over 90d
TIER MIX · 90D
RECENT · PAGE 1/1 · 3 TOTAL
  1. TOOL · CL_22192 ·

    Zyphra's ZAYA1-8B model matches larger rivals with 700M active parameters

    Zyphra has released ZAYA1-8B, a reasoning-focused mixture-of-experts model with 700 million active parameters. The model was trained from scratch on an AMD compute platform and utilizes a novel four-stage reinforcement …

  2. TOOL · CL_20870 ·

    Zyphra's ZAYA1-8B MoE model trained on AMD hardware outperforms larger rivals

    Zyphra AI has released ZAYA1-8B, a Mixture of Experts (MoE) language model with 760 million active parameters and 8.4 billion total parameters. Trained on AMD hardware, this model demonstrates competitive performance ag…

  3. FRONTIER RELEASE · CL_01837 ·

    DeepSeek releases R1-0528, an open-weights model rivaling Gemini 2.5 Pro

    DeepSeek has released DeepSeek-R1-0528, an open-weights model that rivals Gemini 2.5 Pro in performance. This release marks a significant advancement in publicly available AI models, offering a powerful alternative for …