Conditional Knowledge Distillation for Model Specialization
Vol. 48, No. 4, pp. 369-376, Apr. 2021

Cite this article
[IEEE Style]
H. Kim and D.-W. Choi, "Conditional Knowledge Distillation for Model Specialization," Journal of KIISE (JOK), vol. 48, no. 4, pp. 369-376, Apr. 2021. DOI: 10.5626/JOK.2021.48.4.369.
[ACM Style]
Hakbin Kim and Dong-Wan Choi. 2021. Conditional Knowledge Distillation for Model Specialization. Journal of KIISE (JOK) 48, 4 (2021), 369-376. DOI: 10.5626/JOK.2021.48.4.369.
[KCI Style]
Hakbin Kim and Dong-Wan Choi, "Conditional Knowledge Distillation for Model Specialization," Journal of KIISE (JOK), vol. 48, no. 4, pp. 369-376, 2021. DOI: 10.5626/JOK.2021.48.4.369.
[BibTeX]
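A BibTeX entry assembled from the citation data above; the entry key kim2021conditional is an arbitrary choice, not one supplied by the journal:

@article{kim2021conditional,
  author  = {Kim, Hakbin and Choi, Dong-Wan},
  title   = {Conditional Knowledge Distillation for Model Specialization},
  journal = {Journal of KIISE (JOK)},
  volume  = {48},
  number  = {4},
  pages   = {369--376},
  year    = {2021},
  doi     = {10.5626/JOK.2021.48.4.369}
}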

Journal of KIISE
- ISSN: 2383-630X (Print)
- ISSN: 2383-6296 (Electronic)
- KCI Accredited Journal