PulseAugur
ENTITY Gelu

Gelu

PulseAugur coverage of Gelu — every cluster mentioning Gelu across labs, papers, and developer communities, ranked by signal.

Total · 30d: 3 (3 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 3 (3 over 90d)
TIER MIX · 90D (chart not captured)
RECENT · PAGE 1/1 · 3 TOTAL
  1. RESEARCH · CL_18833

    Neural networks achieve super-fast convergence and represent complex functions with floating-point arithmetic

    Two new arXiv papers explore theoretical aspects of neural network convergence and representation capabilities. The first paper demonstrates that neural network classifiers can achieve super-fast convergence rates under…

  2. RESEARCH · CL_06782

    MLP skip connections can't be absorbed into residual-free models

    Researchers have investigated whether a skip connection around a single-hidden-layer MLP can be absorbed into a residual-free MLP of the same width. They found that for certain activation functions like ReLU^2 and ReGLU…
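    The setup behind this question can be sketched concretely. Below, a single-hidden-layer MLP with a skip connection computes y = x + W2·σ(W1·x), and the residual-free counterpart of the same width computes y = V2·σ(V1·x); absorption would mean finding V1, V2 that reproduce the residual network everywhere. This is only an illustrative sketch of the problem statement (weights, dimensions, and the ReLU choice are assumptions, not the paper's construction), and it merely tests one candidate rather than searching for an absorbing pair:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def relu(z):
        return np.maximum(z, 0.0)

    # Residual single-hidden-layer MLP: y = x + W2 @ relu(W1 @ x)
    d, h = 4, 4  # input dim and hidden width, kept equal ("same width")
    W1 = rng.normal(size=(h, d))
    W2 = rng.normal(size=(d, h))

    def residual_mlp(x):
        return x + W2 @ relu(W1 @ x)

    # Residual-free MLP of the same width: y = V2 @ relu(V1 @ x).
    # Absorption asks: do V1, V2 exist so this equals residual_mlp for
    # every x? This sketch only *evaluates* one naive candidate (V = W);
    # it does not attempt the existence question the paper studies.
    def residual_free_mlp(x, V1, V2):
        return V2 @ relu(V1 @ x)

    x = rng.normal(size=d)
    print(np.allclose(residual_mlp(x), residual_free_mlp(x, W1, W2)))
    # → False (the two outputs differ by exactly x)
    ```

    The naive candidate fails by construction; the paper's contribution is about when no candidate can succeed at the same width for activations such as ReLU^2 and ReGLU.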

  3. RESEARCH · CL_03012

    New GEM activation functions offer smoother, rational alternatives to ReLU

    Researchers have introduced Geometric Monomial (GEM), a new family of activation functions designed for deep neural networks. These functions utilize purely rational arithmetic and offer $C^{2N}$-smoothness, aiming to i…
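    To make the "smooth, purely rational" idea concrete: a piecewise-rational function can match a zero branch with 2N vanishing derivatives at the origin, giving C^{2N} smoothness while behaving like ReLU away from zero. The construction below is a hypothetical illustration of that principle only, not the paper's actual GEM family:

    ```python
    import numpy as np

    def smooth_rational_relu(x, N=2):
        """Piecewise-rational, ReLU-like activation that is C^{2N} at 0.

        Illustrative assumption, NOT the GEM definition from the paper.
        For x > 0 it returns x^(2N+1) / (1 + x^(2N)), which uses only
        rational arithmetic, behaves like x for large x, and has its
        first 2N derivatives equal to 0 at the origin, so it glues
        smoothly onto the zero branch for x <= 0.
        """
        x = np.asarray(x, dtype=float)
        out = np.zeros_like(x)
        pos = x > 0
        xp = x[pos]
        out[pos] = xp ** (2 * N + 1) / (1.0 + xp ** (2 * N))
        return out

    xs = np.array([-2.0, 0.0, 0.5, 10.0])
    print(smooth_rational_relu(xs))
    ```

    For N=2 this gives 0 on the non-positive inputs, a small positive value at 0.5, and approximately 9.999 at 10, close to ReLU but with C^4 smoothness at the kink.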