Article

Background-Filtering Feature-Enhanced Graph Neural Networks for Few-Shot Learning

Binbin Wang, Yuemao Wang and Yaoqun Xu
1 School of Computer and Information Engineering, Harbin University of Commerce, Harbin 150028, China
2 Institute of System Engineering, Harbin University of Commerce, Harbin 150028, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(15), 6571; https://doi.org/10.3390/app14156571
Submission received: 3 July 2024 / Revised: 21 July 2024 / Accepted: 25 July 2024 / Published: 27 July 2024

Abstract

Few-shot learning aims to handle novel tasks with only sparse labeled data, and most mainstream approaches rely on prior experience transferred from previously seen tasks. However, effective knowledge transfer is often hampered by the scarcity of new samples and the barriers between classes. To address these issues, this paper presents a novel background-filtering feature-enhanced graph network (BFFE-GNN) that builds relationships between graphs in order to explicitly model and transmit inter-class relationships. In classification tasks, background interference in the samples frequently leads to inadequate use of the available information; moreover, the relatively simple network structure of the original feature-extraction module makes it difficult to extract effective information from complex datasets. We therefore introduce a background-filtering module that strengthens spatial attention, which both improves the quality of the extracted features and reduces the influence of the image background on classification results. In addition, we refine the background-filtering-based feature gap calculation with a feature-enhancement module. To demonstrate the adaptability of the BFFE-GNN model, we ran experiments not only on two publicly available datasets, MiniImageNet and TieredImageNet, but also on a Tool dataset that we constructed ourselves. The experimental results show that the method significantly outperforms most existing comparable methods, demonstrating its strong performance and broad applicability in few-shot image classification. These findings offer fresh insights into few-shot learning and establish a solid basis for further research in the field.
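The implementation details are not included in this early access version, so the following PyTorch-style sketch only illustrates the kind of spatial-attention background filter the abstract describes. The module name, kernel size, and CBAM-style channel-pooling design are assumptions for illustration, not the authors' code.

import torch
import torch.nn as nn

class BackgroundFilter(nn.Module):
    # Hypothetical spatial-attention gate that down-weights background
    # regions of a convolutional feature map (CBAM-style pooling; the
    # paper's actual module may differ).
    def __init__(self, kernel_size=7):
        super().__init__()
        # One conv over the channel-pooled maps yields a per-pixel gate.
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        # x: (B, C, H, W) features from the backbone.
        avg_pool = x.mean(dim=1, keepdim=True)   # (B, 1, H, W)
        max_pool = x.amax(dim=1, keepdim=True)   # (B, 1, H, W)
        gate = torch.sigmoid(self.conv(torch.cat([avg_pool, max_pool], dim=1)))
        return x * gate                          # keep foreground, suppress background

# Example: filter backbone features before building the few-shot graph.
features = torch.randn(5, 64, 21, 21)            # toy 5-way support features
filtered = BackgroundFilter()(features)

In a graph-based few-shot pipeline of this kind, each filtered feature map would then be pooled into a node embedding, and the edge weights between nodes would encode the feature gaps that the feature-enhancement module refines.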
Keywords: few-shot learning; graph neural network (GNN); background filtering; feature enhancement; image classification

Share and Cite

MDPI and ACS Style

Wang, B.; Wang, Y.; Xu, Y. Background-Filtering Feature-Enhanced Graph Neural Networks for Few-Shot Learning. Appl. Sci. 2024, 14, 6571. https://doi.org/10.3390/app14156571

AMA Style

Wang B, Wang Y, Xu Y. Background-Filtering Feature-Enhanced Graph Neural Networks for Few-Shot Learning. Applied Sciences. 2024; 14(15):6571. https://doi.org/10.3390/app14156571

Chicago/Turabian Style

Wang, Binbin, Yuemao Wang, and Yaoqun Xu. 2024. "Background-Filtering Feature-Enhanced Graph Neural Networks for Few-Shot Learning" Applied Sciences 14, no. 15: 6571. https://doi.org/10.3390/app14156571

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
