PulseAugur

100k context length

PulseAugur coverage of 100k context length — every cluster mentioning 100k context length across labs, papers, and developer communities, ranked by signal.

Total · 30d: 1 (1 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 0 (0 over 90d)
TIER MIX · 90D (chart not captured)
RECENT · PAGE 1/1 · 1 TOTAL
  1. RESEARCH · CL_03569

    Quantized Qwen3.6-27B model achieves 100k context on 16GB VRAM

    A user on Reddit's r/LocalLLaMA has detailed a method for running the Qwen3.6-27B model on a system with 16GB of VRAM, achieving a context length of 100,000 tokens. The process involves creating a custom GGUF quantizati…
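The headline figure can be sanity-checked with a rough KV-cache estimate. The architecture numbers below (layer count, GQA key/value heads, head dimension) are illustrative assumptions for a ~27B-parameter model, not the actual Qwen3.6-27B configuration:

```python
def kv_cache_gb(layers, kv_heads, head_dim, ctx_len, bytes_per_elem):
    """Approximate KV-cache size in GB: one K and one V tensor per layer."""
    return 2 * layers * kv_heads * head_dim * ctx_len * bytes_per_elem / 1e9

# Hypothetical GQA config for a ~27B model (NOT published Qwen3.6-27B specs)
LAYERS, KV_HEADS, HEAD_DIM = 48, 8, 128

fp16 = kv_cache_gb(LAYERS, KV_HEADS, HEAD_DIM, 100_000, 2)  # 16-bit cache
q8 = kv_cache_gb(LAYERS, KV_HEADS, HEAD_DIM, 100_000, 1)    # 8-bit cache
print(f"fp16 KV cache: {fp16:.1f} GB, q8: {q8:.1f} GB")
# → fp16 KV cache: 19.7 GB, q8: 9.8 GB
```

Under these assumed dimensions, a 16-bit KV cache alone would exceed the 16GB budget before any model weights are loaded, which is why a workflow like this typically pairs aggressive weight quantization with a quantized KV cache, as GGUF runtimes such as llama.cpp support.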