Knowledge distillation with uncertainty quantification


Project number: 418
Status: Proposal
Academic supervisor:
Year: 2025

Project background:

Knowledge distillation refers to training a small model (the "student") on the knowledge acquired by a larger, computationally expensive model (the "teacher"). In classification, this means that the student learns to predict the teacher's logits vector rather than the (less informative) hard label. In many applications, a classifier must also quantify the uncertainty in its predictions. In this project, we will explore how uncertainty quantification methods can improve the knowledge distillation setting.
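
For concreteness, here is a minimal sketch of the standard distillation loss from Hinton et al. (the first source listed below). PyTorch is assumed, and the function name and hyperparameter values are illustrative, not part of the proposal:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard KD loss: a weighted sum of cross-entropy on the hard labels
    and KL divergence between the temperature-softened teacher and student
    distributions (Hinton et al., 2015)."""
    # Soften both distributions with temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    # The KL term is scaled by T^2 to keep its gradient magnitude
    # comparable to the cross-entropy term.
    kd_term = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```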

Project goal:

The goal of the project is to explore uncertainty quantification methods (e.g., confidence calibration and conformal prediction) in the knowledge distillation setting; a calibration sketch follows the list below. Specifically, we aim to:

  1. Devise algorithms that improve the student's performance using the teacher's uncertainty quantification;
  2. Devise algorithms that improve the student's uncertainty quantification using the teacher's extended knowledge.
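
As a concrete starting point for the calibration side, the sketch below shows temperature scaling (Guo et al., 2017), a simple post-hoc confidence calibration method: a single temperature is fit on held-out validation logits by minimizing the negative log-likelihood. PyTorch is assumed and the function name is illustrative:

```python
import torch
import torch.nn.functional as F

def fit_temperature(val_logits, val_labels, max_iter=100):
    """Fit a scalar temperature T on held-out validation logits by
    minimizing the negative log-likelihood (temperature scaling)."""
    # Optimize log T so that T = exp(log_T) stays positive.
    log_T = torch.zeros(1, requires_grad=True)
    optimizer = torch.optim.LBFGS([log_T], lr=0.1, max_iter=max_iter)

    def closure():
        optimizer.zero_grad()
        loss = F.cross_entropy(val_logits / log_T.exp(), val_labels)
        loss.backward()
        return loss

    optimizer.step(closure)
    return log_T.exp().item()  # divide test logits by this T before softmax
```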

Project scope:

  1. Understanding knowledge distillation in classification, confidence calibration, and conformal prediction (a split conformal prediction sketch follows this list).
  2. Devising algorithms that improve the student's performance using the teacher's uncertainty quantification.
  3. Devising algorithms that improve the student's uncertainty quantification using the teacher's extended knowledge.
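
To make item 1 concrete, below is a minimal sketch of split conformal prediction with the nonconformity score s(x, y) = 1 - p_hat_y(x), in the spirit of the second source listed under Sources. NumPy is assumed and all names are illustrative:

```python
import numpy as np

def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction: returns, for each test point, the set of
    labels whose score is below the calibrated threshold; the sets have
    marginal coverage of at least 1 - alpha."""
    n = len(cal_labels)
    # Nonconformity scores on a held-out calibration set:
    # one minus the probability assigned to the true label.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected empirical quantile of the scores.
    k = int(np.ceil((n + 1) * (1.0 - alpha))) - 1
    q_hat = np.sort(scores)[min(k, n - 1)]
    # A test point's prediction set contains every label scoring <= q_hat.
    return [np.flatnonzero(1.0 - p <= q_hat) for p in test_probs]
```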

Prerequisite courses:

Introduction to Machine Learning; registration for the Deep Learning course

Sources:

* Hinton, Vinyals, and Dean, "Distilling the Knowledge in a Neural Network": https://arxiv.org/abs/1503.02531
* Angelopoulos and Bates, "A Gentle Introduction to Conformal Prediction and Distribution-Free Uncertainty Quantification": https://arxiv.org/abs/2107.07511

Last updated: 30/09/2024