GELU
PulseAugur coverage of GELU: every cluster mentioning GELU across labs, papers, and developer communities, ranked by signal.
-
Neural networks achieve super-fast convergence and represent complex functions with floating-point arithmetic
Two new arXiv papers explore theoretical aspects of neural network convergence and representation capabilities. The first paper demonstrates that neural network classifiers can achieve super-fast convergence rates under…
-
MLP skip connections can't be absorbed into residual-free models
Researchers have investigated whether a skip connection around a single-hidden-layer MLP can be absorbed into a residual-free MLP of the same width. They found that for certain activation functions like ReLU^2 and ReGLU…
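A minimal sketch of why equal width can be an obstruction, using plain 1-D ReLU rather than the ReLU^2 / ReGLU cases the summary names (illustrative only, not the paper's argument): a width-1 residual net x + a·ReLU(wx + b) is piecewise linear with two nonzero slopes, while a width-1 residual-free ReLU net is flat on one half-line, so no parameter choice reproduces the residual net exactly.

```python
# Minimal 1-D illustration (not the paper's construction) of a skip
# connection that cannot be absorbed at equal width. The residual net
# f(x) = x + a*relu(w*x + b) has two linear pieces with slopes 1 and
# 1 + a*w, both nonzero for generic a, w. A residual-free width-1 net
# g(x) = c*relu(v*x + d) always has one flat piece (slope 0).
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def residual_net(x, w=1.0, b=0.0, a=0.5):
    return x + a * relu(w * x + b)

def plain_net(x, v, d, c):
    return c * relu(v * x + d)

x = np.linspace(-3, 3, 7)
# f has slopes 1.0 (left of the kink) and 1.5 (right of it):
print("slopes of f:", np.round(np.diff(residual_net(x)) / np.diff(x), 3))
# any width-1 residual-free ReLU net is constant on a half-line:
print("slopes of g:", np.round(np.diff(plain_net(x, 1.0, 0.0, 1.5)) / np.diff(x), 3))
```

With extra width the obstruction disappears for ReLU, since the identity decomposes as x = ReLU(x) - ReLU(-x); the question the researchers study is what happens when the residual-free net is held to the same width.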
-
New GEM activation functions offer smoother, rational alternatives to ReLU
Researchers have introduced Geometric Monomial (GEM), a new family of activation functions for deep neural networks. These functions use purely rational arithmetic and offer $C^{2N}$-smoothness, aiming to i…
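The excerpt does not quote GEM's actual definition, so the sketch below is only a hedged illustration of the design space it describes: a ReLU-like activation computed with purely rational arithmetic (no exp, erf, or tanh) whose smoothness order can be raised by construction. The name `smooth_relu` and the blend half-width `a` are this sketch's own choices, not the paper's.

```python
# Illustrative only: a piecewise-polynomial ReLU smoothing that uses purely
# rational arithmetic, in the same design space the summary describes.
# This is NOT the GEM formula from the paper, which is not quoted above.
import numpy as np

def smooth_relu(x, a=0.5):
    """C^1 ReLU surrogate: 0 for x <= -a, x for x >= a, and the
    quadratic blend (x + a)^2 / (4a) in between. Values and first
    derivatives match at x = -a and x = a, so the result is C^1;
    blends of higher degree matched to more derivatives give C^k."""
    x = np.asarray(x, dtype=float)
    blend = (x + a) ** 2 / (4.0 * a)
    return np.where(x <= -a, 0.0, np.where(x >= a, x, blend))

# Contrast with GELU's transcendental form (erf) to see the difference
# in arithmetic primitives:
from math import erf, sqrt
gelu = lambda t: 0.5 * t * (1.0 + erf(t / sqrt(2.0)))
for t in (-1.0, 0.0, 1.0):
    print(t, round(float(smooth_relu(t)), 4), round(gelu(t), 4))
```

The contrast printed at the end is the point of the comparison: the surrogate needs only additions, multiplications, and comparisons, whereas GELU evaluates erf.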