ICLR 2026
PulseAugur coverage of ICLR 2026 — every cluster mentioning ICLR 2026 across labs, papers, and developer communities, ranked by signal.
1 day with sentiment data
-
Google's TurboQuant cuts LLM memory use by 6x with no accuracy loss
Google researchers have developed a new technique called TurboQuant that cuts the memory required by large language models by a factor of six without accuracy loss. By employing a two-step process involving data rotation and scalar quantizatio…
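The summary only names the two steps, but the general rotate-then-quantize recipe can be sketched in a few lines. Below is a minimal, illustrative version using a random orthogonal rotation and 4-bit uniform scalar quantization; all names and parameters here are assumptions, not TurboQuant's actual algorithm.

```python
import numpy as np

def random_rotation(dim, seed=0):
    """Random orthogonal matrix via QR decomposition (an illustrative
    stand-in for the rotation step; not TurboQuant's actual rotation)."""
    rng = np.random.default_rng(seed)
    q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))
    return q

def quantize(x, bits=4):
    """Uniform scalar quantization to 2**bits levels."""
    levels = 2 ** bits - 1
    lo = x.min()
    scale = (x.max() - lo) / levels
    codes = np.round((x - lo) / scale).astype(np.uint8)
    return codes, lo, scale

def dequantize(codes, lo, scale):
    return codes * scale + lo

# A vector with one large outlier coordinate: rotating first spreads the
# outlier's mass across all coordinates, so a coarse scalar quantizer
# loses less information than it would on the raw vector.
rng = np.random.default_rng(1)
x = rng.standard_normal(256)
x[0] = 50.0
R = random_rotation(256)
codes, lo, scale = quantize(R @ x)
x_hat = R.T @ dequantize(codes, lo, scale)
print("relative error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```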
-
MIT research reveals superposition enables LLM scaling, ICLR 2026 sees open science surge
Researchers from MIT have identified "superposition" as the key mechanism enabling language models to scale effectively. This phenomenon, where shared neurons encode multiple features, explains the consistent performanc…
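As a toy illustration of superposition (not the MIT paper's setup), the sketch below stores 100 sparse features in only 20 neurons using near-orthogonal random directions; because few features are active at once, a simple linear readout still recovers which ones fired.

```python
import numpy as np

# Toy superposition: 100 features share 20 neurons. Nearly-orthogonal
# random directions let shared neurons encode many features at once,
# at the cost of small interference between them.
rng = np.random.default_rng(0)
n_features, n_neurons = 100, 20

W = rng.standard_normal((n_neurons, n_features))
W /= np.linalg.norm(W, axis=0)           # one unit direction per feature

f = np.zeros(n_features)
f[rng.choice(n_features, size=3, replace=False)] = 1.0  # sparse activation

h = W @ f                                # compressed 20-d representation
f_hat = W.T @ h                          # linear readout, with interference

print("active features:", sorted(np.flatnonzero(f)))
print("top readouts:   ", sorted(np.argsort(f_hat)[-3:]))
```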
-
Apple researchers unveil STARFlow-V, a normalizing flow video generator
Researchers from Apple and Cornell have introduced STARFlow-V, a novel video generation model utilizing normalizing flows. This approach offers an alternative to diffusion models, achieving comparable visual quality whi…
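For readers unfamiliar with normalizing flows, the sketch below shows their basic invertible building block, an affine coupling layer, along with the exact Jacobian log-determinant that makes likelihood training possible. This is a generic illustration, not STARFlow-V's architecture.

```python
import numpy as np

# Minimal affine coupling layer: half the vector passes through unchanged
# and parameterizes an invertible scale-and-shift of the other half.
def coupling_forward(x, w, b):
    x1, x2 = np.split(x, 2)
    s = np.tanh(w @ x1 + b)          # scale/shift predicted from x1
    y2 = x2 * np.exp(s) + s
    log_det = s.sum()                # exact log-determinant of the Jacobian
    return np.concatenate([x1, y2]), log_det

def coupling_inverse(y, w, b):
    y1, y2 = np.split(y, 2)
    s = np.tanh(w @ y1 + b)          # y1 == x1, so s is recomputable
    x2 = (y2 - s) * np.exp(-s)
    return np.concatenate([y1, x2])

rng = np.random.default_rng(0)
d = 8
w, b = rng.standard_normal((d // 2, d // 2)), rng.standard_normal(d // 2)
x = rng.standard_normal(d)
y, log_det = coupling_forward(x, w, b)
assert np.allclose(coupling_inverse(y, w, b), x)  # exactly invertible
```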
-
Apple showcases AI research at ICASSP and ICLR conferences
Apple is participating in the International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2026, presenting research on topics such as reducing multilingual gaps in speech models using audio-visual data …
-
MIT researchers unveil WRING, a new rotation-based method to debias AI vision models
Researchers from the MIT Jameel Clinic have developed a novel debiasing technique for AI vision models named WRING, presented at ICLR 2026. The method uses a rotation-based approach to address bias in vision models, aiming …
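The summary does not spell out WRING's algorithm, but rotation-style debiasing can be illustrated generically: rotate the embedding space so a learned bias direction lines up with one axis, zero that coordinate, and rotate back. Everything below (the bias axis, the Householder reflection) is an assumption for illustration, not the paper's method.

```python
import numpy as np

def householder_align(u):
    """Orthogonal reflection H with H @ u = e1 (u assumed unit-norm)."""
    e1 = np.zeros_like(u)
    e1[0] = 1.0
    v = u - e1
    if np.linalg.norm(v) < 1e-12:
        return np.eye(len(u))
    return np.eye(len(u)) - 2.0 * np.outer(v, v) / (v @ v)

rng = np.random.default_rng(0)
emb = rng.standard_normal((5, 16))           # hypothetical image embeddings
bias = rng.standard_normal(16)
bias /= np.linalg.norm(bias)                 # hypothetical learned bias axis

H = householder_align(bias)
rotated = emb @ H.T                          # bias axis -> first coordinate
rotated[:, 0] = 0.0                          # drop the bias coordinate
debiased = rotated @ H                       # rotate back (H is self-inverse)
print(np.allclose(debiased @ bias, 0.0))     # bias component removed: True
```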
-
Microsoft open-sources VibeVoice for long-form speech AI
Microsoft has open-sourced VibeVoice, a suite of advanced voice AI models. The VibeVoice family includes both Text-to-Speech (TTS) and Automatic Speech Recognition (ASR) capabilities. A key innovation is the use of cont…
-
Apple enables parallel RNN training, challenging transformer dominance
Apple researchers have developed ParaRNN, a new framework that enables parallel training of nonlinear Recurrent Neural Networks (RNNs). This advancement overcomes the historical sequential bottleneck in RNN training, ac…
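The sequential bottleneck, and the parallel primitive typically used to break it, can be illustrated with a linear recurrence evaluated by an associative scan, as sketched below. ParaRNN's actual contribution concerns nonlinear RNNs; this sketch only shows the linear building block that such methods commonly reduce each step to.

```python
import numpy as np

# Linear recurrence h_t = a_t * h_{t-1} + b_t evaluated with a
# Hillis-Steele scan over affine-map composition: O(log T) parallel
# steps instead of a T-step sequential loop.
def associative_scan(a, b):
    a, b = a.copy(), b.copy()
    T, step = len(a), 1
    while step < T:
        # Compose each element with the one `step` positions earlier;
        # all updates within a step are independent, hence parallelizable.
        a_prev, b_prev = a[:-step].copy(), b[:-step].copy()
        b[step:] = a[step:] * b_prev + b[step:]
        a[step:] = a[step:] * a_prev
        step *= 2
    return b  # b[t] now equals h_t, taking h_{-1} = 0

rng = np.random.default_rng(0)
T = 16
a, b = rng.uniform(0.5, 1.0, T), rng.standard_normal(T)

# Sequential reference loop (the historical bottleneck).
h, ref = 0.0, []
for t in range(T):
    h = a[t] * h + b[t]
    ref.append(h)

assert np.allclose(associative_scan(a, b), ref)
```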
-
Apple researchers unveil parallel RNN training and enhanced SSMs at ICLR 2026
Apple researchers are presenting new work at ICLR 2026, focusing on advancements in recurrent neural networks (RNNs) and state space models (SSMs). Their paper "ParaRNN" introduces a parallelized training framework that…