PulseAugur

New adapter TFM-Retouche improves tabular foundation models without fine-tuning

Researchers have developed TFM-Retouche, a novel adapter designed to enhance tabular foundation models (TFMs) without requiring computationally expensive full fine-tuning. This lightweight, architecture-agnostic adapter operates in the input space, learning a small residual correction that better aligns the data with the TFM's existing inductive biases. Applied to TabICLv2, the resulting framework, TabICLv2-Retouche, achieved top rankings on the TabArena-Lite benchmark, significantly improving aggregate Elo scores while maintaining efficiency.
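The core idea of an input-space residual adapter can be sketched in a few lines. The architecture below is an illustrative assumption, not the paper's actual implementation: a small MLP `g_phi` learns a correction added to the raw features, while the foundation model itself (here a stand-in linear scorer, `frozen_tfm`) stays frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

def frozen_tfm(x):
    # Stand-in for a frozen tabular foundation model: a fixed linear
    # scorer whose weights are never updated during adaptation.
    w = np.linspace(-1.0, 1.0, x.shape[1])
    return x @ w

class ResidualInputAdapter:
    """Hypothetical sketch: the model sees x + g_phi(x) instead of x."""

    def __init__(self, dim, hidden=8):
        self.W1 = rng.normal(0.0, 0.1, (dim, hidden))
        # Zero-initialized output layer: the adapter starts as the
        # identity map, so the frozen model's behavior is unchanged
        # until training moves the residual away from zero.
        self.W2 = np.zeros((hidden, dim))

    def __call__(self, x):
        h = np.tanh(x @ self.W1)
        return x + h @ self.W2  # residual correction in input space
```

Only the adapter's parameters would be trained, which is what keeps the method lightweight and architecture-agnostic relative to full fine-tuning.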

Summary written by gemini-2.5-flash-lite from 1 source.

IMPACT Introduces a more efficient method for adapting tabular foundation models, potentially improving their performance on diverse datasets without extensive retraining.

RANK_REASON This is a research paper introducing a new method for adapting tabular foundation models.

Read on arXiv cs.LG →

COVERAGE [1]

  1. arXiv cs.LG TIER_1 · Duong Nguyen, Mohammed Jawhar, Nicolas Chesneau

    TFM-Retouche: A Lightweight Input-Space Adapter for Tabular Foundation Models

    arXiv:2605.06047v1 · Abstract: Tabular foundation models (TFMs), such as TabPFN-2.6, TabICLv2, ConTextTab, Mitra, LimiX, and TabDPT, achieve strong zero-shot performance through in-context learning, but their inductive biases remain fixed at inference time. Adapt…