Article

SAFP-YOLO: Enhanced Object Detection Speed Using Spatial Attention-Based Filter Pruning

Hanse Ahn, Seungwook Son, Jaehyeon Roh, Hwapyeong Baek, Sungju Lee, Yongwha Chung and Daihee Park
1 Department of Computer Convergence Software, Korea University, Sejong 30019, Republic of Korea
2 Info Valley Korea Co., Ltd., Anyang 14067, Republic of Korea
3 Department of Software, Sangmyung University, Cheonan 31066, Republic of Korea
* Authors to whom correspondence should be addressed.
Appl. Sci. 2023, 13(20), 11237; https://doi.org/10.3390/app132011237
Submission received: 19 August 2023 / Revised: 9 October 2023 / Accepted: 10 October 2023 / Published: 12 October 2023
(This article belongs to the Special Issue New Trends in Image Processing III)

Abstract

Because object detection accuracy has improved significantly with advancements in deep learning techniques, many real-time applications have adopted one-stage detectors such as You Only Look Once (YOLO), owing to their fast execution speed and accuracy. However, for practical deployment, the deployment cost must also be considered. In this paper, a method for pruning the unimportant filters of YOLO is proposed to satisfy the real-time requirements of a low-cost embedded board. Attention mechanisms have been widely used to improve the accuracy of deep learning models; in contrast, the proposed method uses spatial attention to improve the execution speed of YOLO by evaluating the importance of each YOLO filter. The feature maps before and after spatial attention are compared, and the unimportant filters of YOLO are then pruned based on this comparison. To the best of our knowledge, this is the first report considering both accuracy and speed with Spatial Attention-based Filter Pruning (SAFP) for lightweight object detectors. To demonstrate the effectiveness of the proposed method, it was applied to the YOLOv4 and YOLOv7 baseline models. With the pig (baseline YOLOv4 84.4%@3.9FPS vs. proposed SAFP-YOLO 78.6%@20.9FPS) and vehicle (baseline YOLOv7 81.8%@3.8FPS vs. proposed SAFP-YOLO 75.7%@20.0FPS) datasets, the proposed method significantly improved the execution speed of YOLOv4 and YOLOv7 (i.e., by a factor of five) on a low-cost embedded board, TX-2, with acceptable accuracy.
Keywords: object detection; deep learning; attention mechanism; filter pruning
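
The filter-importance idea described in the abstract can be illustrated with a short sketch. The following Python/PyTorch code is a minimal illustration, not the authors' released implementation: it applies a CBAM-style spatial attention module to one convolutional layer's feature maps, scores each filter by how strongly attention changes its channel (an assumed L1-difference criterion), and masks out the lowest-scoring filters. The module names, scoring function, and pruning ratio are illustrative assumptions.

# Minimal sketch (not the authors' code) of spatial-attention-based filter scoring.
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    """CBAM-style spatial attention: 7x7 conv over channel-wise avg/max maps."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg_map = x.mean(dim=1, keepdim=True)            # (N, 1, H, W)
        max_map = x.max(dim=1, keepdim=True).values      # (N, 1, H, W)
        attn = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn                                  # attention-reweighted feature maps

def filter_importance(features: torch.Tensor, attended: torch.Tensor) -> torch.Tensor:
    """Score each filter by the per-channel difference between the feature maps
    before and after spatial attention (assumed L1 difference, averaged over batch/space)."""
    diff = (features - attended).abs()
    return diff.mean(dim=(0, 2, 3))                      # one score per channel/filter

# Usage sketch: keep the filters whose scores exceed a chosen percentile threshold.
if __name__ == "__main__":
    x = torch.randn(4, 64, 52, 52)       # dummy activations from one YOLO conv layer
    sa = SpatialAttention()
    scores = filter_importance(x, sa(x))
    prune_ratio = 0.3                    # illustrative pruning ratio
    threshold = torch.quantile(scores, prune_ratio)
    keep_mask = scores > threshold       # filters to keep; the rest would be pruned
    print(f"keeping {int(keep_mask.sum())} of {scores.numel()} filters")

In a pipeline like the one the abstract describes, such per-filter scores would guide structured pruning of YOLOv4/YOLOv7 convolutional layers before fine-tuning; the specific scoring and threshold used by SAFP-YOLO are detailed in the full paper.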
