Cognitive-Uncertainty Guided Knowledge Distillation for Accurate Classification of Student Misconceptions
Researchers have developed a two-stage knowledge distillation framework that improves the accuracy of classifying student misconceptions, specifically addressing data scarcity and noisy labels. The method mines high-value training samples by leveraging the cognitive uncertainty of a teacher model, enabling smaller student models to achieve superior performance. In experiments on algebra misconception benchmarks, the approach delivered significant accuracy gains and outperformed larger state-of-the-art models.
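The summary does not specify how the teacher's cognitive uncertainty is measured, but a common choice is the entropy of the teacher's predictive distribution. The sketch below illustrates the sample-mining idea under that assumption: score each sample by teacher entropy and keep the most uncertain fraction as "high-value" samples for the distillation stage. Function names (`select_high_value_samples`) and the entropy criterion are illustrative, not taken from the paper.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the class axis."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def predictive_entropy(probs):
    """Entropy of the teacher's predictive distribution, one value per sample.
    High entropy = high teacher uncertainty about the misconception label."""
    return -(probs * np.log(probs + 1e-12)).sum(axis=1)

def select_high_value_samples(teacher_logits, fraction=0.5):
    """Illustrative mining stage: rank samples by teacher uncertainty
    and keep the most uncertain fraction for distilling the student."""
    ent = predictive_entropy(softmax(teacher_logits))
    k = max(1, int(len(ent) * fraction))
    return np.argsort(ent)[-k:]  # indices of the k most uncertain samples

# Toy teacher logits: 4 samples over 3 hypothetical misconception classes.
logits = np.array([
    [5.0, 0.1, 0.1],   # teacher confident
    [1.0, 0.9, 1.1],   # teacher uncertain
    [4.0, 0.2, 0.0],   # teacher confident
    [0.5, 0.6, 0.4],   # teacher uncertain
])
idx = select_high_value_samples(logits, fraction=0.5)
print(sorted(idx.tolist()))  # → [1, 3]
```

In a full pipeline these mined samples would then drive the second stage, where the student is trained against the teacher's soft labels on the selected subset.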
IMPACT: This research could lead to more effective AI tutors and educational tools by improving the classification of student learning difficulties.