PulseAugur

Neural Networks

PulseAugur coverage of Neural Networks — every cluster mentioning Neural Networks across labs, papers, and developer communities, ranked by signal.

Total · 30d: 143 (143 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 137 (137 over 90d)
Tier mix · 90d / Sentiment · 30d: chart panels (5 day(s) with sentiment data)

RECENT · PAGE 1/2 · 27 TOTAL
  1. COMMENTARY · CL_30306 ·

    Connectionist AI research dominated by practitioners from the 1950s to the 1990s

    The history of connectionist AI research spans from the late 1950s, following the invention of neural networks, until the late 1990s, preceding the rise of deep learning. During this period, the field of connectionism w…

  2. TOOL · CL_29370 ·

    Random Matrix Theory detects overfitting in neural networks and LLMs

    Researchers have developed a novel method using Random Matrix Theory to detect overfitting in neural networks, particularly during the "anti-grokking" phase of long-horizon training. This technique identifies "Correlati…
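
The blurb is cut off before the paper's "Correlati…" criterion, so its exact test is unknown; a minimal sketch of one generic Random Matrix Theory overfitting diagnostic, assuming a Gaussian null and the Marchenko–Pastur bulk edge (all names here are illustrative, not the paper's API):

```python
import numpy as np

def mp_bulk_edge(n_rows: int, n_cols: int, sigma: float = 1.0) -> float:
    """Upper edge of the Marchenko-Pastur bulk for eigenvalues of W W^T / n_cols."""
    ratio = n_rows / n_cols
    return sigma**2 * (1.0 + np.sqrt(ratio)) ** 2

def spectral_outlier_fraction(W: np.ndarray, sigma: float = 1.0) -> float:
    """Fraction of eigenvalues of W W^T / n_cols above the MP bulk edge.

    Heuristic: a layer whose spectrum grows outliers relative to the
    Gaussian null carries learned (or memorized) structure; this is a
    generic RMT check, not the paper's specific anti-grokking criterion.
    """
    n_rows, n_cols = W.shape
    eigvals = np.linalg.eigvalsh(W @ W.T / n_cols)
    return float(np.mean(eigvals > mp_bulk_edge(n_rows, n_cols, sigma)))

rng = np.random.default_rng(0)
W_random = rng.standard_normal((100, 400))     # pure-noise weights
u = rng.standard_normal(100)
v = rng.standard_normal(400)
W_spiked = W_random + 0.25 * np.outer(u, v)    # add a rank-1 "signal" spike

print(spectral_outlier_fraction(W_random))     # noise: spectrum fits the null
print(spectral_outlier_fraction(W_spiked))     # spike escapes the MP bulk
```

The rank-1 spike pushes one eigenvalue well past the bulk edge, while the pure-noise matrix stays (almost entirely) inside it.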

  3. RESEARCH · CL_27694 ·

    New neural tilting framework improves AI safety inference

    Researchers have developed a new neural exponential tilting framework for variational inference in Lévy-driven stochastic differential equations. This method addresses the intractability of Bayesian inference for proces…

  4. TOOL · CL_28342 ·

    Gradient descent convergence proven for wide, shallow neural networks

    Researchers have theoretically analyzed the convergence properties of gradient descent in training wide, shallow neural networks with bounded nonlinearities. Their work extends previous findings beyond simple ReLU or si…
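
The summary names the setting (wide, shallow networks with bounded nonlinearities trained by gradient descent) but the proof details are cut off; a minimal numerical sketch of that setting, assuming tanh as the bounded nonlinearity and a toy 1-D regression target:

```python
import numpy as np

rng = np.random.default_rng(1)
width = 512                                   # "wide" hidden layer
X = np.linspace(-1, 1, 64)[:, None]           # 64 one-dimensional inputs
y = np.sin(3 * X)                             # toy regression target

# One hidden layer with a bounded nonlinearity (tanh), linear readout.
W1 = rng.standard_normal((1, width))
W2 = rng.standard_normal((width, 1)) / np.sqrt(width)

lr = 0.1
losses = []
for step in range(500):
    H = np.tanh(X @ W1)                       # (64, width) hidden activations
    err = H @ W2 - y
    losses.append(float(np.mean(err ** 2)))
    # Full-batch gradient descent on both layers.
    grad_W2 = 2 * H.T @ err / len(X)
    grad_H = 2 * err @ W2.T / len(X)
    grad_W1 = X.T @ (grad_H * (1 - H ** 2))   # tanh'(z) = 1 - tanh(z)^2
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2

print(losses[0], losses[-1])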

  5. COMMENTARY · CL_26744 ·

    AI Explained: Understanding Its Core to Grasp Its Dangers

    This article explores the fundamental nature of Artificial Intelligence, aiming to demystify the technology and highlight potential dangers. It delves into concepts like deep learning and neural networks to provide a fo…

  6. TOOL · CL_28355 ·

    New framework uses higher-order calculus for neural network verification

    Researchers have developed HiTaB, a new framework for verifying neural networks, which enhances safety and robustness in AI systems. This method systematically utilizes higher-order information, specifically the Hessian…

  7. TOOL · CL_25553 ·

    New DTSemNet method trains oblique decision trees without approximations

    Researchers have developed DTSemNet, a new method for training oblique decision trees without approximations. This approach uses a semantically equivalent and invertible neural network representation, allowing for end-t…

  8. TOOL · CL_23078 ·

    Neural networks possess structured inner worlds reflecting reality's geometry, enabling safer AI

    Researchers propose that neural networks possess internal geometric structures that mirror the real world's organization. Developing theories and methods that acknowledge this neural geometry could lead to enhanced inte…

  9. TOOL · CL_25640 ·

    Neural networks in physics are vulnerable to hidden systematic errors

    Researchers have identified a significant vulnerability in neural network models used for high-energy physics analyses. These models, while powerful, can be systematically misled by subtle input perturbations that remai…

  10. TOOL · CL_20488 ·

    Researchers draw parallels between Boltzmann machines and quantum physics path integrals

    This paper draws an analogy between Boltzmann machines used in machine learning and Feynman path integrals from quantum physics. The authors suggest that hidden layers in neural networks can be viewed as discrete versio…

  11. TOOL · CL_20442 ·

    AI models learn time-inhomogeneous Markov dynamics in financial time series

    Researchers have developed a new framework that uses neural networks to parameterize time-varying Markov transition matrices for financial time series. This approach aims to balance the representational power of deep le…
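
The blurb describes neural networks parameterizing time-varying Markov transition matrices; a minimal sketch under assumed details (a tiny MLP mapping a time feature to row-wise softmax logits — the class and names are illustrative, not the paper's framework):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

class TimeVaryingMarkov:
    """P_t = row-softmax(MLP(t)): a valid transition matrix at every t."""

    def __init__(self, n_states: int, hidden: int = 16, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.n = n_states
        self.W1 = rng.standard_normal((1, hidden))
        self.W2 = rng.standard_normal((hidden, n_states * n_states)) * 0.1

    def transition_matrix(self, t: float) -> np.ndarray:
        h = np.tanh(np.array([[t]]) @ self.W1)           # time feature -> hidden
        logits = (h @ self.W2).reshape(self.n, self.n)   # one logit per (i, j)
        return softmax(logits, axis=1)                   # each row sums to 1

model = TimeVaryingMarkov(n_states=3)
P_early = model.transition_matrix(0.0)
P_late = model.transition_matrix(1.0)
```

The row-wise softmax guarantees every P_t is a proper stochastic matrix while the network lets the dynamics drift with t, which is the time-inhomogeneity the cluster refers to.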

  12. COMMENTARY · CL_18912 ·

    AI's impact debated: replacing engineers, boosting productivity, and disrupting academia

    A YouTube video argues that the mathematical basis for AI replacing engineers is flawed, citing limitations in neural networks, hardware, and energy costs. Separately, an article discusses Eliyahu Goldratt's Theory of C…

  13. RESEARCH · CL_18817 ·

    New Conformalized Percentile Interval method improves AI prediction accuracy

    Researchers have developed a new method called Conformalized Percentile Interval to improve the accuracy and efficiency of predictive intervals. This technique calibrates responses using the probability integral transfo…
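
The description of the Conformalized Percentile Interval method is truncated; as context, a sketch of standard split conformal prediction, a well-known baseline the method builds on (not necessarily the paper's procedure):

```python
import numpy as np

def split_conformal_interval(residuals_cal, preds_test, alpha=0.1):
    """Symmetric split-conformal prediction intervals.

    residuals_cal: |y - yhat| on a held-out calibration set.
    Returns (lower, upper) arrays targeting ~(1 - alpha) coverage.
    """
    n = len(residuals_cal)
    # Finite-sample-corrected quantile of the calibration residuals.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(residuals_cal, level)
    return preds_test - q, preds_test + q

rng = np.random.default_rng(2)
y_cal = rng.normal(size=500)                 # toy setup: predictor outputs 0
residuals = np.abs(y_cal)
y_test = rng.normal(size=2000)
lo, hi = split_conformal_interval(residuals, np.zeros(2000), alpha=0.1)
coverage = float(np.mean((y_test >= lo) & (y_test <= hi)))
```

Empirical coverage on fresh data lands near the nominal 90% regardless of the underlying model, which is the distribution-free guarantee conformal methods trade on.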

  14. TOOL · CL_18770 ·

    Machine learning predicts topological properties using physics-informed neural networks

    Researchers have developed a novel machine learning technique to predict topological properties, specifically the Euler characteristic, from images. The model generates a unit vector field from an image, which is then i…

  15. RESEARCH · CL_16204 ·

    Review details multi-fidelity neural networks for composite mechanics modeling

    This paper reviews multi-fidelity surrogate modeling techniques for predicting the complex properties of composite materials. It covers methods ranging from Gaussian-process-based approaches like co-Kriging to multi-fid…

  16. RESEARCH · CL_16274 ·

    Researchers explore neural network complexity, computation, and graph theory connections

    Researchers are exploring new theoretical frameworks and computational models for neural networks. One paper introduces a unified framework to analyze and construct deep neural networks by modeling tensor operations, re…

  17. RESEARCH · CL_15546 ·

    EdgeLPR paper explores neural network precision vs performance trade-offs for LiDAR place recognition

    Researchers have developed EdgeLPR, a method for efficient LiDAR-based place recognition on edge devices. The approach utilizes Bird's Eye View representations to enable lightweight image-based networks for autonomous n…

  18. RESEARCH · CL_24187 ·

    New methods train neural networks with non-differentiable components

    Researchers have developed new methods for training neural networks that incorporate non-differentiable components, a common challenge in areas like spiking neurons or quantized layers. One approach, detailed in an arXi…
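
The cited arXiv approach is cut off; the classic workaround for the non-differentiable components mentioned (e.g., quantized layers) is the straight-through estimator, sketched here as a minimal reference point rather than the paper's method:

```python
import numpy as np

def binarize_forward(x):
    """Non-differentiable forward pass: hard sign quantization."""
    return np.where(x >= 0, 1.0, -1.0)

def binarize_backward_ste(x, grad_out, clip=1.0):
    """Straight-through estimator for the backward pass.

    The true derivative of sign() is 0 almost everywhere, which would
    kill all upstream gradients; the STE instead passes grad_out through
    unchanged, zeroed outside |x| <= clip where the surrogate is trusted.
    """
    return grad_out * (np.abs(x) <= clip)

x = np.array([-2.0, -0.5, 0.3, 1.7])
y = binarize_forward(x)                          # [-1., -1., 1., 1.]
g = binarize_backward_ste(x, np.ones_like(x))    # [0., 1., 1., 0.]
```

The mismatch between the true (zero) derivative and the surrogate is exactly what the newer methods in this cluster aim to handle more principledly.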

  19. RESEARCH · CL_14031 ·

    New research explores batch normalization's geometric impact on neural network partitions

    Two new research papers explore advancements in Batch Normalization (BN) for neural networks. One paper investigates how training-time BN affects the geometric partitioning of functions in piecewise-affine networks, sug…
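
The papers' geometric analysis is truncated; for context, the training-time batch normalization operation they study is the per-feature standardize-then-affine map below (a standard BN forward pass, not the papers' contribution):

```python
import numpy as np

def batchnorm_forward(X, gamma, beta, eps=1e-5):
    """Training-time batch normalization over the batch axis.

    X: (batch, features). Each feature is standardized with batch
    statistics, then rescaled by learnable gamma and shifted by beta;
    it is this data-dependent affine map that reshapes the piecewise-
    affine partition discussed in the cluster.
    """
    mu = X.mean(axis=0)
    var = X.var(axis=0)
    X_hat = (X - mu) / np.sqrt(var + eps)
    return gamma * X_hat + beta

rng = np.random.default_rng(3)
X = rng.normal(loc=5.0, scale=2.0, size=(256, 4))
out = batchnorm_forward(X, gamma=np.ones(4), beta=np.zeros(4))
```

With gamma = 1 and beta = 0, each output feature has (approximately) zero mean and unit variance over the batch.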

  20. RESEARCH · CL_14039 ·

    New regularization methods improve neural network performance and complexity control

    Researchers have developed novel norm-based regularization techniques for neural networks, aiming to improve predictive performance and complexity control. These methods extend classical ridge and lasso penalties by inc…
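
The blurb says the new techniques extend classical ridge and lasso penalties; those baselines, applied to a network's weight matrices, can be sketched as follows (the function names are illustrative, and the cluster's extensions are not shown since the text is cut off):

```python
import numpy as np

def ridge_penalty(weights, lam):
    """Classical L2 (ridge) penalty: lam * sum of squared weights."""
    return lam * sum(float(np.sum(W ** 2)) for W in weights)

def lasso_penalty(weights, lam):
    """Classical L1 (lasso) penalty: lam * sum of absolute weights."""
    return lam * sum(float(np.sum(np.abs(W))) for W in weights)

def regularized_loss(data_loss, weights, lam_l2=1e-4, lam_l1=0.0):
    """Total objective = data-fit loss + norm-based complexity penalties."""
    return data_loss + ridge_penalty(weights, lam_l2) + lasso_penalty(weights, lam_l1)

W = [np.array([[1.0, -2.0], [0.0, 3.0]])]
total = regularized_loss(0.5, W, lam_l2=0.1, lam_l1=0.1)
# ridge: 0.1 * (1 + 4 + 0 + 9) = 1.4; lasso: 0.1 * (1 + 2 + 0 + 3) = 0.6
print(total)  # 0.5 + 1.4 + 0.6 = 2.5
```

The L2 term shrinks all weights smoothly while the L1 term drives some exactly to zero; norm-based extensions typically mix or restructure these two effects for better complexity control.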