PulseAugur

DeepSeek Sparse Attention

PulseAugur coverage of DeepSeek Sparse Attention — every cluster mentioning DeepSeek Sparse Attention across labs, papers, and developer communities, ranked by signal.

Total · 2 over 30d · 2 over 90d
Releases · 0 over 30d · 0 over 90d
Papers · 1 over 30d · 1 over 90d
RECENT · 4 TOTAL
  1. RESEARCH · CL_14427

    Disentangled Safety Adapters offer efficient AI guardrails and flexible alignment

    Researchers have developed Disentangled Safety Adapters (DSA), a new framework designed to improve AI safety and alignment without sacrificing inference efficiency or flexibility. DSA works by using lightweight adapters… (a generic adapter sketch follows after the list)

  2. RESEARCH · CL_13821

    EU launches AI Resources site with glossary and cross-references to nine digital acts

    A new website, AI resources.eu, has been launched to serve as an open, bilingual reference for the European Union's digital regulatory landscape. The site currently details nine key acts, including the AI Act and GDPR, …

  3. RESEARCH · CL_04296

    DeepSeek V3.2 model introduces Sparse Attention for improved long-context processing

    DeepSeek has introduced its V3.2 model, incorporating DeepSeek Sparse Attention (DSA). This innovation reduces attention complexity from O(L²) to O(Lk), significantly enhancing efficiency for processing long contexts. T… (a toy top-k sketch follows after the list)

  4. FRONTIER RELEASE · CL_01752

    MiniMax 2.7: GLM-5-level performance at 1/3 the cost, SOTA open model

    MiniMax has released MiniMax 2.7, an open-source model that matches the performance of Z.ai's GLM-5 on several benchmarks but at a significantly lower cost. The model is noted for its efficiency and claims to be the fir…
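
On item 1: the summary cuts off before the mechanism, but "lightweight adapters" usually denotes a small residual bottleneck trained on top of a frozen backbone. Below is a minimal sketch under that assumption only; the names SafetyAdapter, d_bottleneck, and the safety head are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

class SafetyAdapter(nn.Module):
    """Hypothetical bottleneck adapter: a small residual MLP applied to
    frozen backbone hidden states. Only these parameters train, so the
    base model is untouched and the guardrail can be detached."""
    def __init__(self, d_model: int, d_bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, d_bottleneck)   # project down
        self.up = nn.Linear(d_bottleneck, d_model)     # project back up
        self.act = nn.GELU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Residual form: removing the adapter recovers the base model exactly.
        return h + self.up(self.act(self.down(h)))

# Illustrative usage: a small safety head scores pooled, adapted states.
d_model = 768
adapter, safety_head = SafetyAdapter(d_model), nn.Linear(d_model, 2)
h = torch.randn(4, 128, d_model)              # stand-in for frozen activations
logits = safety_head(adapter(h).mean(dim=1))  # (batch, 2) safe/unsafe logits
```

The residual design is what would let such a guardrail stay "disentangled": switching the adapter off leaves the backbone's outputs unchanged.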
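On item 3: a minimal sketch of the O(L²) → O(Lk) idea, assuming each query's softmax is restricted to its k best-scoring keys. The selection step here still scores all keys for clarity; an end-to-end O(Lk) design needs a cheaper selector, and DeepSeek's actual mechanism may differ from this sketch.

```python
import math
import torch

def topk_sparse_attention(q: torch.Tensor, k: torch.Tensor,
                          v: torch.Tensor, top_k: int) -> torch.Tensor:
    """Each query attends only to its top_k best-scoring keys, so the
    softmax and value aggregation cost O(L*top_k) rather than O(L^2).
    q, k: (L, d); v: (L, d_v)."""
    scores = q @ k.T / math.sqrt(q.shape[-1])     # (L, L) selection scores
    # NOTE: scoring all keys is itself O(L^2); a true O(L*k) pipeline
    # replaces this with a cheap selector (how DSA does it is not
    # described in the cluster summary above).
    sel, idx = scores.topk(top_k, dim=-1)         # (L, top_k) values + key ids
    w = torch.softmax(sel, dim=-1)                # softmax over selected keys only
    return torch.einsum('lk,lkd->ld', w, v[idx])  # gather values, weighted sum

out = topk_sparse_attention(torch.randn(1024, 64), torch.randn(1024, 64),
                            torch.randn(1024, 64), top_k=32)  # (1024, 64)
```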