PulseAugur

PUICL transformer enables in-context positive-unlabeled learning without fitting

Researchers have developed PUICL, a pretrained transformer model that performs positive-unlabeled (PU) learning through in-context learning. The approach eliminates dataset-specific training and iterative optimization, enabling rapid task solving without any per-dataset fitting. PUICL was trained on synthetic PU datasets and outperformed four standard PU learning baselines in AUC and accuracy across 20 benchmarks.

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Extends in-context learning to semi-supervised PU classification, potentially enabling faster and more adaptable solutions for tasks with limited labeled data.

RANK_REASON Academic paper detailing a new machine learning method.

Read on arXiv stat.ML →

COVERAGE [2]

  1. arXiv cs.LG TIER_1 · Siyan Liu, Yi Chang, Manli Cheng, Qinglong Tian, Pengfei Li ·

    In-Context Positive-Unlabeled Learning

    arXiv:2605.05591v1 · Announce Type: cross · Abstract: Positive-unlabeled (PU) learning addresses binary classification when only a set of labeled positives is available alongside a pool of unlabeled samples drawn from a mixture of positives and negatives. Existing PU methods typicall…

  2. arXiv stat.ML TIER_1 · Pengfei Li ·

    In-Context Positive-Unlabeled Learning

    Positive-unlabeled (PU) learning addresses binary classification when only a set of labeled positives is available alongside a pool of unlabeled samples drawn from a mixture of positives and negatives. Existing PU methods typically require dataset-specific training or iterative o…
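The abstracts above define the PU setting: the learner sees only labeled positives plus an unlabeled pool mixing positives and negatives. A minimal sketch of that data setup, with a naive distance-to-positives baseline standing in for the kind of classical PU baselines PUICL is compared against (toy 1-D data and function names are illustrative assumptions, not the paper's PUICL model or experimental setup):

```python
# Illustrative sketch of the positive-unlabeled (PU) setting.
# Toy data; not the authors' PUICL transformer or benchmarks.
import random

random.seed(0)

def make_pu_dataset(n_pos_labeled=20, n_unlabeled=100, prior=0.5):
    """Build a toy 1-D PU dataset.

    Positives cluster around +1, negatives around -1. Only a subset of
    positives is labeled; the unlabeled pool mixes both latent classes.
    """
    labeled_pos = [random.gauss(1.0, 0.5) for _ in range(n_pos_labeled)]
    unlabeled, true_labels = [], []
    for _ in range(n_unlabeled):
        if random.random() < prior:          # latent positive
            unlabeled.append(random.gauss(1.0, 0.5))
            true_labels.append(1)
        else:                                # latent negative
            unlabeled.append(random.gauss(-1.0, 0.5))
            true_labels.append(0)
    return labeled_pos, unlabeled, true_labels

labeled_pos, unlabeled, true_labels = make_pu_dataset()

# Naive baseline: classify an unlabeled point as positive if it lies
# close to the mean of the labeled positives. True labels are used only
# for evaluation, mirroring how PU methods are scored on benchmarks.
pos_mean = sum(labeled_pos) / len(labeled_pos)
preds = [1 if abs(x - pos_mean) < 1.0 else 0 for x in unlabeled]
acc = sum(p == t for p, t in zip(preds, true_labels)) / len(true_labels)
print(f"toy baseline accuracy: {acc:.2f}")
```

PUICL's contribution, per the summary, is replacing this kind of per-dataset fitting with a single pretrained transformer that consumes the labeled positives and unlabeled pool as context and predicts labels in one pass.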