Researchers have introduced Geometric Monomial (GEM), a new family of activation functions for deep neural networks. These functions use purely rational arithmetic and offer $C^{2N}$-smoothness, aiming to address limitations of the standard ReLU. Experiments show that GEM variants can match or exceed established functions such as GELU on benchmarks including CIFAR-10, CIFAR-100, MNIST, GPT-2, and BERT-small.
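The summary does not give the GEM formula itself, but the two properties it names, purely rational arithmetic and finite $C^{2N}$ smoothness, can be illustrated with a hypothetical piecewise-rational activation. The function below is an assumption for illustration only, not the paper's GEM: on the positive side it behaves like $x^{2N+1}/(1+x^{2N})$, so near zero it vanishes like $x^{2N+1}$, its derivatives up to order $2N$ match the zero branch (hence $C^{2N}$), and for large inputs it approaches the identity, like ReLU.

```python
def rational_relu(x: float, N: int = 2) -> float:
    """Hypothetical piecewise-rational, ReLU-like activation (NOT the paper's GEM).

    Returns 0 for x <= 0 and x**(2N+1) / (1 + x**(2N)) for x > 0.
    Near 0 the positive branch is ~x**(2N+1), so derivatives up to order 2N
    vanish and match the zero branch, giving C^(2N) smoothness using only
    rational arithmetic (no exp, erf, or sqrt).
    """
    if x <= 0.0:
        return 0.0
    return x ** (2 * N + 1) / (1.0 + x ** (2 * N))


if __name__ == "__main__":
    # Negative inputs are zeroed; large positive inputs pass through almost unchanged.
    for x in (-2.0, -0.5, 0.0, 0.5, 2.0, 10.0):
        print(f"{x:6.1f} -> {rational_relu(x):.6f}")
```

Avoiding transcendental functions is the practical point of rational activations: every evaluation is a handful of multiplies, adds, and one divide, which is cheap and exactly reproducible across hardware, unlike GELU's `erf`/`tanh` approximations.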
Summary written by gemini-2.5-flash-lite from 1 source.
IMPACT Introduces a new activation function family that shows competitive performance against GELU on various benchmarks, potentially improving deep learning model optimization.
RANK_REASON This is a research paper introducing a novel activation function family with experimental results.