Paper | 22 February 2023
Class-wise knowledge distillation
Fei Li, Yifang Yang
Author Affiliations
Proceedings Volume 12587, Third International Seminar on Artificial Intelligence, Networking, and Information Technology (AINIT 2022); 125870T (2023) https://doi.org/10.1117/12.2667603
Event: Third International Seminar on Artificial Intelligence, Networking, and Information Technology (AINIT 2022), 2022, Shanghai, China
Abstract
Knowledge distillation (KD) transfers knowledge from a teacher model to improve the performance of a student model, which is usually equipped with lower capacity. The standard KD framework, however, neglects that DNNs exhibit a wide range of class-wise accuracy and that the performance on some classes even decreases after distillation. Motivated by these observations, we propose a novel Class-Wise Knowledge Distillation method that finds the hard classes with a simple yet effective technique and then makes the student devote more effort to learning these hard classes. In experiments on image classification with the CIFAR-100 dataset, we demonstrate that the proposed method outperforms other KD methods and achieves strong performance improvements on various networks.
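The abstract only outlines the mechanism (detect hard classes, then make the student focus on them), so below is a minimal, hypothetical sketch of how such a class-wise weighting could be attached to a standard Hinton-style KD loss. The accuracy-based weighting rule, the function names (class_weights_from_accuracy, class_wise_kd_loss), and the hyperparameters are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch: class-wise weighted knowledge distillation.
# The per-class weighting below is an illustrative assumption, not the
# paper's exact technique: classes on which the student currently does
# poorly ("hard" classes) receive a larger loss weight.
import torch
import torch.nn.functional as F


def class_weights_from_accuracy(per_class_acc: torch.Tensor, scale: float = 1.0) -> torch.Tensor:
    """Map per-class accuracy in [0, 1] to a per-class weight.

    Lower accuracy (a harder class) -> larger weight. The linear form is
    an assumption made for illustration only.
    """
    return 1.0 + scale * (1.0 - per_class_acc)


def class_wise_kd_loss(student_logits: torch.Tensor,
                       teacher_logits: torch.Tensor,
                       targets: torch.Tensor,
                       class_weights: torch.Tensor,
                       temperature: float = 4.0,
                       alpha: float = 0.5) -> torch.Tensor:
    """Standard KD objective (soft-target KL + hard-label CE), reweighted
    per sample by the weight of that sample's ground-truth class."""
    # Soft-target KL term, kept per sample so it can be reweighted.
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    kd_per_sample = F.kl_div(log_p_student, p_teacher,
                             reduction="none").sum(dim=1) * temperature ** 2

    # Hard-label cross-entropy term, also per sample.
    ce_per_sample = F.cross_entropy(student_logits, targets, reduction="none")

    # Weight each sample by the weight assigned to its ground-truth class.
    w = class_weights[targets]
    return (w * (alpha * kd_per_sample + (1.0 - alpha) * ce_per_sample)).mean()
```

In such a scheme, per_class_acc would be re-estimated periodically (e.g., on a validation split), so that the set of hard classes, and hence the weights, can change as the student improves.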
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Fei Li and Yifang Yang "Class-wise knowledge distillation", Proc. SPIE 12587, Third International Seminar on Artificial Intelligence, Networking, and Information Technology (AINIT 2022), 125870T (22 February 2023); https://doi.org/10.1117/12.2667603
KEYWORDS: Networks, Neural networks, Performance modeling, Visual process modeling, Visualization, RGB color model, Network architectures
