Conditional Knowledge Distillation for Model Specialization 


Vol. 48, No. 4, pp. 369-376, Apr. 2021
10.5626/JOK.2021.48.4.369



  Abstract

Many recent works on model compression for neural networks are based on knowledge distillation (KD). However, since the basic goal of KD is to transfer the entire knowledge of a teacher model to a student model, standard KD may not make the best use of the student model's capacity when a user wishes to classify only a small subset of classes. In addition, KD requires the teacher's original training dataset, which may not be available for various practical reasons, such as privacy. This paper therefore proposes conditional knowledge distillation (CKD), which distills only the specialized knowledge corresponding to a given subset of classes, as well as data-free CKD (DF-CKD), which does not require the original data. As a major extension, we devise Joint-CKD, which jointly performs DF-CKD and CKD using only a small additional dataset collected by the client. Experimental results show that both CKD and DF-CKD are superior to standard KD, and that the joint use of CKD and DF-CKD further improves the overall accuracy of the specialized model.
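
The exact formulation of CKD is given in the full text; as a rough illustration of the idea, the following PyTorch sketch restricts a standard temperature-scaled KD loss to a given class subset by slicing the teacher's logits before softening them. The function name ckd_loss, the temperature value, and the logit-slicing scheme are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def ckd_loss(student_logits, teacher_logits, subset_idx, T=4.0):
        # Hypothetical sketch: keep only the teacher logits for the
        # classes in the target subset, so the student (whose output
        # layer covers just that subset) mimics the teacher's relative
        # confidences among those classes alone.
        t_sub = teacher_logits[:, subset_idx]                # (B, |S|)
        # Soften both distributions with temperature T and match them
        # with a KL divergence scaled by T^2, as in standard KD.
        p_teacher = F.softmax(t_sub / T, dim=1)
        log_p_student = F.log_softmax(student_logits / T, dim=1)
        return F.kl_div(log_p_student, p_teacher,
                        reduction="batchmean") * T * T

    # Example: specializing a 100-class teacher to a 10-class student.
    subset = torch.arange(10)            # indices of the target classes
    teacher_out = torch.randn(32, 100)   # teacher logits for a batch of 32
    student_out = torch.randn(32, 10)    # student outputs the subset only
    loss = ckd_loss(student_out, teacher_out, subset)

In the data-free variant (DF-CKD), the inputs driving such a loss would have to be synthesized rather than drawn from the teacher's original training set; the precise construction, along with Joint-CKD, is described in the full text.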




  Cite this article

[IEEE Style]

H. Kim and D. Choi, "Conditional Knowledge Distillation for Model Specialization," Journal of KIISE, JOK, vol. 48, no. 4, pp. 369-376, 2021. DOI: 10.5626/JOK.2021.48.4.369.


[ACM Style]

Hakbin Kim and Dong-Wan Choi. 2021. Conditional Knowledge Distillation for Model Specialization. Journal of KIISE, JOK, 48, 4, (2021), 369-376. DOI: 10.5626/JOK.2021.48.4.369.


[KCI Style]

Hakbin Kim and Dong-Wan Choi, "Conditional Knowledge Distillation for Model Specialization," Journal of KIISE, vol. 48, no. 4, pp. 369-376, 2021. DOI: 10.5626/JOK.2021.48.4.369.

