Global and Local Knowledge Distillation Method for Few-Shot Classification of Electrical Equipment
Abstract
1. Introduction
- We present a novel distillation approach that compresses the knowledge of teacher networks into a compact student network, enabling efficient few-shot classification. Incorporating global and local relationship strategies during distillation effectively guides the student network toward performance comparable to that of the teacher network.
- We contribute a new dataset containing 100 classes of electrical equipment with 4000 images. The dataset covers a wide range of electrical equipment, including power generation equipment, distribution equipment, industrial electrical equipment, and household electrical equipment.
- We demonstrate the effectiveness of the proposed method by validating it on three public datasets and by comparing it with SOTA methods on the electrical image dataset we introduce, where it achieves the best performance among all compared methods.
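The global and local relationship strategies described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes the global loss matches RKD-style pairwise-distance structure between teacher and student batch embeddings, and the local loss matches the spatial self-similarity of intermediate feature maps; the function names and the NumPy formulation are our own.

```python
import numpy as np

def pairwise_distances(emb):
    # emb: (n, d) batch embeddings; returns the (n, n) Euclidean distance matrix
    diff = emb[:, None, :] - emb[None, :, :]
    return np.sqrt((diff ** 2).sum(-1) + 1e-12)

def global_relation_loss(teacher_emb, student_emb):
    # Global relation: match the normalized pairwise-distance structure
    # of the teacher batch embeddings (RKD-style relational distillation).
    dt = pairwise_distances(teacher_emb)
    ds = pairwise_distances(student_emb)
    dt = dt / (dt.mean() + 1e-12)
    ds = ds / (ds.mean() + 1e-12)
    return np.mean((dt - ds) ** 2)

def local_relation_loss(teacher_feat, student_feat):
    # Local relation: match the cosine self-similarity between spatial
    # positions of a feature map; teacher/student may differ in channels
    # but must share the same spatial size (h, w).
    ct, h, w = teacher_feat.shape
    t = teacher_feat.reshape(ct, h * w)
    s = student_feat.reshape(student_feat.shape[0], h * w)
    t = t / (np.linalg.norm(t, axis=0, keepdims=True) + 1e-12)
    s = s / (np.linalg.norm(s, axis=0, keepdims=True) + 1e-12)
    # (h*w, h*w) self-similarity maps, compared elementwise
    return np.mean((t.T @ t - s.T @ s) ** 2)
```

In this sketch, both losses are zero when the student reproduces the teacher's relational structure exactly, which is the property the distillation objective exploits; in practice they would be combined with a task loss and weighted.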
2. Methodology
2.1. Problem Definition
2.2. FSC Network Based on Global and Local Knowledge Distillation
2.2.1. Pre-Train of Teacher Network
2.2.2. Global and Local Knowledge Distillation
2.2.3. Few-Shot Evaluation
3. Experiments
3.1. EEI-100 Dataset
3.2. Experiments on Public Datasets
3.2.1. Experiment Setup
3.2.2. Parametric Analysis Experiment
3.2.3. Ablation Studies
3.2.4. Comparison Experiment Compared with Existing Methods
- On the MiniImageNet dataset, our proposed method achieves the best classification performance. Compared with the best-performing method in the meta-learning-based category, HGNN, our method outperforms it by 2.23% and 0.9% on 1-shot and 5-shot classification tasks, respectively. In the transfer-learning-based category, compared with the best-performing method, CGCS, our method outperforms it by 2.33% and 1.26% on 1-shot and 5-shot classification tasks, respectively.
- On the CIFAR-FS dataset, our proposed method also achieves top performance. Our method outperforms the best-performing method, PSST, by 2.67% and 0.42% on 1-shot and 5-shot classification tasks, respectively.
- On the CUB-200-2011 dataset, our proposed method achieves the highest classification performance. Our method outperforms the best-performing method, HGNN, by 1.42% and 0.99% on 1-shot and 5-shot classification tasks, respectively.
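The 1-shot/5-shot accuracies above come from N-way K-shot episodic evaluation. The following is a generic sketch of that protocol using nearest-prototype classification (a common FSC evaluation scheme), not the authors' exact evaluation code; the function name and sampling details are our own assumptions.

```python
import numpy as np

def evaluate_episode(features, labels, n_way=5, k_shot=1, n_query=15, rng=None):
    """Sample one N-way K-shot episode and classify queries by nearest class prototype."""
    rng = rng if rng is not None else np.random.default_rng()
    classes = rng.choice(np.unique(labels), size=n_way, replace=False)
    protos, q_feats, q_labels = [], [], []
    for i, c in enumerate(classes):
        idx = rng.permutation(np.where(labels == c)[0])[: k_shot + n_query]
        support, query = features[idx[:k_shot]], features[idx[k_shot:]]
        protos.append(support.mean(axis=0))   # class prototype = mean support embedding
        q_feats.append(query)
        q_labels.append(np.full(len(query), i))
    protos = np.stack(protos)                 # (n_way, d)
    q = np.concatenate(q_feats)               # (n_way * n_query, d)
    y = np.concatenate(q_labels)
    # squared Euclidean distance of each query to each prototype
    d = ((q[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return float((d.argmin(axis=1) == y).mean())  # episode accuracy
```

Reported results are typically the mean accuracy over several hundred such episodes, with the ± values giving the 95% confidence interval.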
3.3. Experiments on EEI-100 Dataset
3.3.1. Parametric Analysis Experiment
3.3.2. Comparison Experiment with Existing Methods
4. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition
---|---
SOTA | State-of-the-art
UAV | Unmanned aerial vehicle
CNN | Convolutional neural network
FSL | Few-shot learning
FSC | Few-shot classification
Appendix A
References
- Peng, J.; Sun, L.; Wang, K.; Song, L. ED-YOLO power inspection UAV obstacle avoidance target detection algorithm based on model compression. Chin. J. Sci. Instrum. 2021, 10, 161–170.
- Bogdan, T.N.; Bruno, M.; Rafael, W.; Victor, B.G.; Vanderlei, Z.; Lourival, L. A Computer Vision System for Monitoring Disconnect Switches in Distribution Substations. IEEE Trans. Power Deliv. 2022, 37, 833–841.
- Zhang, Z.D.; Zhang, B.; Lan, Z.C.; Lu, H.C.; Li, D.Y.; Pei, L.; Yu, W.X. FINet: An Insulator Dataset and Detection Benchmark Based on Synthetic Fog and Improved YOLOv5. IEEE Trans. Instrum. Meas. 2022, 71, 6006508.
- Xu, Y.; Li, Y.; Wang, Y.; Zhong, D.; Zhang, G. Improved few-shot learning method for transformer fault diagnosis based on approximation space and belief functions. Expert Syst. Appl. 2021, 167, 114105.
- Yi, Y.; Chen, Z.; Wang, L. Intelligent Aging Diagnosis of Conductor in Smart Grid Using Label-Distribution Deep Convolutional Neural Networks. IEEE Trans. Instrum. Meas. 2022, 71, 3501308.
- Finn, C.; Abbeel, P.; Levine, S. Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks. In Proceedings of the 34th International Conference on Machine Learning (ICML), Sydney, Australia, 6–11 August 2017.
- Li, Z.; Zhou, F.; Chen, F.; Li, H. Meta-SGD: Learning to Learn Quickly for Few-Shot Learning. arXiv 2017, arXiv:1707.09835.
- Ravi, S.; Larochelle, H. Optimization as a model for few-shot learning. In Proceedings of the 5th International Conference on Learning Representations (ICLR), Toulon, France, 24–26 April 2017.
- Wu, Z.; Li, Y.; Guo, L.; Jia, K. PARN: Position-Aware Relation Networks for few-shot learning. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 27 October–2 November 2019.
- Gidaris, S.; Bursuc, A.; Komodakis, N.; Pérez, P.; Cord, M. Boosting few-shot visual learning with self-supervision. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 27 October–2 November 2019.
- Zhang, H.; Zhang, J.; Koniusz, P. Few-shot learning via saliency-guided hallucination of samples. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019.
- Hou, R.; Chang, H.; Ma, B.; Shan, S.; Chen, X. Cross attention network for few-shot classification. In Proceedings of the 33rd Conference on Neural Information Processing Systems (NeurIPS), Vancouver, BC, Canada, 8–14 December 2019.
- Guo, Y.; Cheung, N. Attentive weights generation for few shot learning via information maximization. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020.
- Li, H.; Eigen, D.; Dodge, S.; Zeiler, M.; Wang, X. Finding task-relevant features for few-shot learning by category traversal. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019.
- Nguyen, V.N.; Løkse, S.; Wickstrøm, K.; Kampffmeyer, M.; Roverso, D.; Jenssen, R. SEN: A novel feature normalization dissimilarity measure for prototypical few-shot learning networks. In Proceedings of the 16th European Conference on Computer Vision (ECCV), Glasgow, Scotland, 23–28 August 2020.
- Wertheimer, D.; Tang, L.; Hariharan, B. Few-shot classification with feature map reconstruction networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Virtual Conference, 19–25 June 2021.
- Li, W.; Wang, L.; Xu, J.; Huo, J.; Gao, Y.; Luo, J. Revisiting local descriptor based image-to-class measure for few-shot learning. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019.
- Zhang, C.; Cai, Y.; Lin, G.; Shen, C. DeepEMD: Few-shot image classification with differentiable Earth Mover’s distance and structured classifiers. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020.
- Chen, Y.; Liu, Y.; Kira, Z.; Wang, Y.F.; Huang, J. A closer look at few-shot classification. In Proceedings of the 7th International Conference on Learning Representations (ICLR), New Orleans, LA, USA, 6–9 May 2019.
- Liu, B.; Cao, Y.; Lin, Y.; Zhang, Z.; Long, M.; Hu, H. Negative margin matters: Understanding margin in few-shot classification. In Proceedings of the 16th European Conference on Computer Vision (ECCV), Glasgow, Scotland, 23–28 August 2020.
- Mangla, P.; Singh, M.; Sinha, A.; Kumari, N.; Balasubramanian, V.; Krishnamurthy, B. Charting the right manifold: Manifold mixup for few-shot learning. In Proceedings of the IEEE Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 3–7 January 2020.
- Su, J.; Maji, S.; Hariharan, B. When does self-supervision improve few-shot learning? In Proceedings of the 16th European Conference on Computer Vision (ECCV), Glasgow, Scotland, 23–28 August 2020.
- Shao, S.; Xing, L.; Wang, Y.; Xu, R.; Zhao, C.; Wang, Y.J.; Liu, B. MHFC: Multi-head feature collaboration for few-shot learning. In Proceedings of the 29th ACM International Conference on Multimedia (MM), Virtual Conference, 20–24 October 2021.
- Hinton, G.; Vinyals, O.; Dean, J. Distilling the Knowledge in a Neural Network. arXiv 2015, arXiv:1503.02531.
- Romero, A.; Ballas, N.; Kahou, S.E.; Chassang, A.; Gatta, C.; Bengio, Y. FitNets: Hints for Thin Deep Nets. arXiv 2014, arXiv:1412.6550.
- Zagoruyko, S.; Komodakis, N. Paying more attention to attention: Improving the performance of convolutional neural networks via attention transfer. In Proceedings of the 5th International Conference on Learning Representations (ICLR), Toulon, France, 24–26 April 2017.
- Park, W.; Kim, D.; Lu, Y.; Cho, M. Relational knowledge distillation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019.
- Peng, B.; Jin, X.; Liu, J.; Zhou, S.; Wu, Y.; Liu, Y.; Li, D.; Zhang, Z. Correlation congruence for knowledge distillation. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 27 October–2 November 2019.
- Zhou, B.; Zhang, X.; Zhao, J.; Zhao, F.; Yan, C.; Xu, Y.; Gu, J. Few-shot electric equipment classification via mutual learning of transfer-learning model. In Proceedings of the IEEE 5th International Electrical and Energy Conference (CIEEC), Nanjing, China, 27–29 May 2022.
Method | Backbone | MiniImageNet 1-Shot | MiniImageNet 5-Shot | CIFAR-FS 1-Shot | CIFAR-FS 5-Shot | CUB 1-Shot | CUB 5-Shot
---|---|---|---|---|---|---|---
Global | Conv4 | 57.32 ± 0.84 | 72.90 ± 0.64 | 66.40 ± 0.93 | 80.44 ± 0.67 | 70.20 ± 0.93 | 83.88 ± 0.57
Local | Conv4 | 57.65 ± 0.83 | 73.06 ± 0.64 | 66.63 ± 0.93 | 80.64 ± 0.67 | 70.12 ± 0.93 | 83.66 ± 0.57
Global-Local | Conv4 | 57.86 ± 0.83 | 73.38 ± 0.62 | 67.04 ± 0.91 | 80.84 ± 0.68 | 70.44 ± 0.92 | 84.19 ± 0.56
Method | Backbone | MiniImageNet 1-Shot | MiniImageNet 5-Shot | CIFAR-FS 1-Shot | CIFAR-FS 5-Shot | CUB 1-Shot | CUB 5-Shot
---|---|---|---|---|---|---|---
**Meta-learning** | | | | | | |
Relational | Conv4 | 50.44 ± 0.82 | 65.32 ± 0.70 | 55.00 ± 1.00 | 69.30 ± 0.80 | 62.45 ± 0.98 | 76.11 ± 0.69
MetaOpt SVM | Conv4 | 52.87 ± 0.57 | 68.76 ± 0.48 | - | - | - | -
PN+rot | Conv4 | 53.63 ± 0.43 | 71.70 ± 0.36 | - | - | - | -
CovaMNet | Conv4 | 51.19 ± 0.76 | 67.65 ± 0.63 | - | - | 52.42 ± 0.76 | 63.76 ± 0.64
DN4 | Conv4 | 51.24 ± 0.74 | 71.02 ± 0.64 | - | - | 46.84 ± 0.81 | 74.92 ± 0.64
MeTAL | Conv4 | 52.63 ± 0.37 | 70.52 ± 0.29 | - | - | - | -
HGNN | Conv4 | 55.63 ± 0.20 | 72.48 ± 0.16 | - | - | 69.02 ± 0.22 | 83.20 ± 0.15
DSFN | Conv4 | 50.21 ± 0.64 | 72.20 ± 0.51 | - | - | - | -
PSST | Conv4 | - | - | 64.37 ± 0.33 | 80.42 ± 0.32 | - | -
**Transfer-learning** | | | | | | |
Baseline++ | Conv4 | 48.24 ± 0.75 | 66.43 ± 0.63 | - | - | 60.53 ± 0.83 | 79.34 ± 0.61
Neg-Cosine | Conv4 | 52.84 ± 0.76 | 70.41 ± 0.66 | - | - | - | -
SKD | Conv4 | 48.14 | 66.36 | - | - | - | -
CGCS | Conv4 | 55.53 ± 0.20 | 72.12 ± 0.16 | - | - | - | -
Our method | Conv4 | 57.86 ± 0.83 | 73.38 ± 0.62 | 67.04 ± 0.91 | 80.84 ± 0.68 | 70.44 ± 0.92 | 84.19 ± 0.56
Method | 1-Shot | 5-Shot |
---|---|---|
CGCS | 72.85 ± 0.68 | 89.68 ± 0.27 |
Neg-Cosine | 74.57 ± 0.63 | 90.54 ± 0.25 |
HGNN | 75.61 ± 0.62 | 93.54 ± 0.24 |
Our method | 75.80 ± 0.67 | 94.12 ± 0.20 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Zhou, B.; Zhao, J.; Yan, C.; Zhang, X.; Gu, J. Global and Local Knowledge Distillation Method for Few-Shot Classification of Electrical Equipment. Appl. Sci. 2023, 13, 7016. https://doi.org/10.3390/app13127016