Article

AI-Based Analysis of Archery Shooting Time from Anchoring to Release Using Pose Estimation and Computer Vision

1 Department of AI & Informatics, Graduate School, Sangmyung University, Seoul 03016, Republic of Korea
2 Department of Sports ICT Convergence, Sangmyung University, Seoul 03016, Republic of Korea
3 Techars Company, Hwaseong-si 18392, Republic of Korea
4 Department of Human-Centered Artificial Intelligence, Sangmyung University, Seoul 03016, Republic of Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(24), 11838; https://doi.org/10.3390/app142411838
Submission received: 13 November 2024 / Revised: 13 December 2024 / Accepted: 16 December 2024 / Published: 18 December 2024
(This article belongs to the Special Issue Advances in Motion Monitoring System)

Abstract

This study presents a novel method for automatically analyzing archery shooting time using AI and computer vision technologies, with a particular focus on the critical anchoring-to-release phase, which directly influences performance. The proposed approach detects the start of the anchoring phase using pose estimation and accurately measures the shooting time by detecting the bowstring within the athlete’s facial bounding box, using Canny edge detection and the probabilistic Hough transform. To ensure stability, low-pass filtering was applied to both the facial bounding box and the pose estimation results, and an algorithm was implemented to handle intermittent bowstring detection caused by various external factors. The proposed method was validated by comparing its results with expert manual measurements obtained using Dartfish software v2022, achieving a mean absolute error (MAE) of 0.34 s and an R² score of 0.95. This is a significant improvement over the bowstring-only method, which yielded an MAE of 1.4 s and an R² score of 0.89. Previous research has demonstrated a correlation between shooting time and arrow accuracy; this method can therefore provide real-time feedback to athletes, overcoming the limitations of traditional manual measurement techniques. It enables immediate technical adjustments during training, which can contribute to overall performance improvement.

1. Introduction

Archery is a sport in which athletes shoot arrows at a target from a certain distance, with the objective of scoring points based on the accuracy of their shots. It is one of the oldest events in the Summer Olympic Games. In archery competitions, the entire process, from the preparation phase to releasing the arrow, must be completed within a set timeframe, and it is crucial to shoot the arrow as close to the center of the target as possible. The preparation phase for shooting an arrow involves drawing the bow, aiming, and releasing, and some experts have divided this process into six distinct stages: (1) bow holding; (2) drawing; (3) anchoring; (4) aiming; (5) release; and (6) full draw [1]. Bow holding refers to the stage where the arrow is nocked onto the bow, and drawing is the process of pulling the bowstring, with the balance of force between the pulling and pushing hands being important. The anchoring and aiming stages are interconnected, involving pulling the bow to aim at the target point. The release stage is the final phase of the shot, where the arrow is released. Finally, the full draw stage occurs after the release, where maintaining proper form is essential, as the posture immediately after the release can affect the trajectory of the arrow. Given the time constraints for performing these steps, proper management is required to ensure optimal performance [2,3].
Shooting time is one of the most critical factors influencing the accuracy and consistency of archers’ performance [4,5]. It refers to the duration from the preparation phase to the moment the arrow is released, with stability and consistency during this period being directly linked to performance outcomes. Although shooting time varies depending on individual habits and styles, maintaining an optimal shooting time is crucial for all archers.
Archery is also a representative sport that combines science and technology, where physical abilities, mental focus, and psychological stability are all critical to performance [6]. For instance, during shooting time, archers must effectively manage physiological signals such as heart rate, breathing, and posture. These factors play a vital role in minimizing psychological stress during competition and ensuring precision in both arrow trajectory and accuracy.
Existing studies have extensively examined the relationship between the time from aiming to arrow release and scoring performance. Shooting time, which can be quantitatively evaluated during the preparation phase, has been shown to significantly influence performance. Tinazci found that shorter drawing times improve arrow accuracy, while longer aiming times negatively impact accuracy, leading to lower scores for some archers [7]. Takai et al. further compared medalists and non-medalists, revealing that medalists generally have shorter shooting times than non-medalists [8]. These findings highlight that shooting time is closely related to performance and that accurate measurement of shooting time can help archers enhance their training and improve their performance. Therefore, precisely measuring shooting time during training and competitions and integrating these measurements into practice are essential for improving an archer’s proficiency.
In modern archery, new approaches to analyzing and improving performance have been introduced alongside advancements in technology. Motion analysis techniques, physiological signal measurements, artificial intelligence (AI), and machine learning-based data analysis are emerging as promising alternatives to traditional methods. These methods offer faster and more accurate data compared to manual approaches.
However, traditional methods of sports movement analysis, such as video analysis using specialized software, have relied heavily on manual processes. These methods are time-consuming, labor-intensive, and limited in their ability to provide real-time feedback [9]. Additionally, manual analysis is prone to errors and inconsistencies due to its dependence on the analyst’s skill level and the repetitive, detailed observations required. In archery, in particular, detecting subtle variations in shooting techniques, which differ from one athlete to another, often requires specialized equipment. As a result, research and interpretation of archery shooting techniques have been limited [10].
This study proposes a method to automatically analyze the shooting time from anchoring to release by detecting the anchoring phase based on the angle of the archer’s right arm in captured video data, and then recognizing the bowstring within the detected face region using Canny edge detection and the Hough transform. This approach overcomes the limitations of manual analysis, providing accurate and consistent results with less error. In addition, because it provides real-time feedback to the athlete rather than relying on post hoc analysis, the proposed method can be applied not only in training but also in real competition scenarios.

2. Related Works

In archery competitions, shooting time is a critical factor directly linked to performance, and effectively managing it plays a key role in determining athletes’ success [9,11,12]. As shown in Table 1, research highlights the critical role of archery performance analysis in enhancing athletic outcomes. It identifies key performance factors, such as reduced muscle activity and minimized postural sway, which are essential for designing effective training programs and performance optimization strategies. Furthermore, analyzing the timing components of successful teams provides valuable benchmarks for developing advanced training protocols and maximizing team performance. The importance of reducing release times and improving posture consistency is also underscored, enabling coaches and athletes to refine techniques and achieve superior results. Additionally, data-driven systems that integrate shooting data with biomechanics and psychological factors facilitate continuous monitoring and improvement, supporting the development of advanced equipment and athlete training programs. Finally, research on phase-specific time management helps archers and coaches optimize critical moments during performance, ensuring consistency and success. These findings affirm that archery performance analysis is a cornerstone for the sustained growth and achievement of both individuals and teams. With recent advancements in artificial intelligence (AI), the field of sports vision technology has rapidly expanded, and AI-based analysis and training systems are being introduced across various sports [13,14,15,16,17], offering a new paradigm for improving athletic performance and game analysis.
Specifically, computer vision technology enables real-time analysis of athletes’ movements and provides immediate feedback, facilitating more precise and customized training compared to traditional methods. In the field of archery, camera-based computer vision technologies are being widely implemented, with active research focused on improving the precision of competition analysis and refereeing systems. Studies in this area process data captured from cameras to evaluate scores or analyze athletes’ shooting postures, significantly contributing to enhancing the objectivity and accuracy of archery competitions. These technological innovations play a crucial role in elevating the overall standards of archery and are expected to drive further advancements in related research. This progress underscores the critical impact of sports vision technology in modern sports, enabling continuous improvements in performance analysis, training, and competition standards.
However, important variables such as shooting time, which can assess the consistency of movements in archery, are typically measured manually or require sensors attached to the athlete’s body, making on-field application inconvenient. Additionally, data collection is often conducted through post hoc analysis, making real-time feedback difficult. In various sports, AI-based analysis and training systems have already been implemented to precisely analyze athletes’ performance and provide customized training programs, achieving significant success. However, the application of such technologies in archery is still in its early stages, and there are few examples of automatically analyzing key variables like shooting time. Therefore, further research and development are needed to effectively utilize AI and computer vision technologies to improve archery performance. Automated shooting time measurement based on computer vision technology is expected to contribute significantly to enhancing athletes’ performance in archery.

3. Materials and Methods

3.1. Data Collection

In this study, a method for automatically measuring shooting time in archery was developed and evaluated using a dataset constructed from actual archery broadcast footage. The data were collected from publicly available archery match broadcasts on YouTube, including events from the 2024 Paris Olympics and the Korean National Archery Team Special Match, allowing for a comprehensive analysis of world-class competition environments and athlete movements. The URLs of the data sources used for testing are specified in the Data Availability Statement. The dataset consists of high-resolution (1080p) videos shot from the front, clearly showing the athletes’ upper bodies and faces, with the archers’ right arms fully contained within the frame. Video clips containing the entire sequence of drawing, anchoring, aiming, and release were cropped and used, and the shooting time was defined as the period from the anchoring to the release phase. Each clip contained the process of a single arrow shot. Figure 1 shows examples of the video clips and the defined shooting times used in this study; prior consent was obtained for the use of the athlete’s image in the paper. A total of 58 video clips were used in the experiment. Shooting time was calculated using Equation (1) by converting the number of frames (F) from the start of the anchoring phase to the release phase into actual time, with all video clips recorded at 30 frames per second (FPS).
Actual Shooting Time = F / FPS (s)    (1)
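Equation (1) is a direct frame-to-seconds conversion; as a minimal illustration (not the authors' code):

```python
def shooting_time_seconds(num_frames, fps=30.0):
    """Equation (1): convert the frame count F between anchoring start and
    release into seconds at the recording frame rate (30 FPS in this study)."""
    return num_frames / fps

# For example, 45 frames at 30 FPS give a shooting time of 1.5 s.
```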
To evaluate the shooting time measured using the method proposed in this study, a sports video analysis expert manually measured the actual shooting time using Dartfish software v2022 [19], a specialized tool for sports analysis. These manual measurements were used as the ground truth for the evaluation.

3.2. Pose Estimation for Detecting the Start of Anchoring in Archery

In this study, the Mediapipe framework [20] was used to automatically detect the start of the anchoring phase in order to measure the shooting time of archers during competitions. Mediapipe is an open-source framework developed by Google, which, as shown in Figure 2, estimates the coordinates of 33 key body landmarks in real time. For this study, three Mediapipe landmarks were used to calculate the angle of the athlete’s right arm. The landmarks corresponding to the right shoulder, right elbow, and right wrist, with indices 12, 14, and 16, respectively, were utilized as shown in Figure 2.
Mediapipe landmarks can experience jitter due to noise from factors such as lighting conditions and occlusion. To accurately detect the athlete’s shooting state, a low-pass filter was applied. This method compares the landmark coordinates of the current frame with those of the previous frame; if the difference between the two is small, the coordinates of the previous frame are retained. However, if the difference exceeds a certain threshold, the current frame’s coordinates are updated, assuming actual movement has occurred. In this study, the threshold was empirically set to 10. By applying this approach, noise and jitter in the landmarks caused by various factors were reduced, allowing for more accurate detection of the anchoring start time through the stabilized landmarks.
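The landmark-stabilization rule above can be sketched in Python as follows; treating the threshold as a Euclidean pixel distance is an assumption, since the text does not name the exact comparison metric:

```python
import math

def stabilize_point(prev, curr, threshold=10.0):
    """Hold-previous filter: keep the previous landmark coordinate unless the
    new one has moved more than `threshold` (10 in this study), in which case
    real movement is assumed and the coordinate is updated."""
    if prev is None:  # first frame: nothing to compare against
        return curr
    dist = math.hypot(curr[0] - prev[0], curr[1] - prev[1])
    return curr if dist > threshold else prev
```

Small jitter (a move of a few pixels) is suppressed, while a genuine draw-arm movement passes through unchanged.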
To detect when the athlete has started the anchoring phase, the filtered landmarks at indices 12, 14, and 16 were used to calculate the angle (θ) of the arm through the dot product of the vectors between the right shoulder and elbow, and between the elbow and wrist, as shown in Figure 3. This method was applied to video clips containing the full sequence of drawing, anchoring, aiming, and release, with the arm angle calculated for each frame. To account for unexpected variance, this study computed the angle difference between consecutive frames and averaged the differences over the last 10 frames. The average angle difference was larger in the drawing phase but smaller in the anchoring and aiming phases. Based on this characteristic, the athlete was empirically determined to be in the anchoring phase when the average angle difference was less than 1° and the arm angle was less than 25°.
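The angle computation and the anchoring-start rule can be sketched as follows (an illustrative Python snippet; the exact averaging and bookkeeping details are assumptions):

```python
import math

def arm_angle(shoulder, elbow, wrist):
    """Angle (deg) between the shoulder->elbow and elbow->wrist vectors,
    computed from their dot product as described in Section 3.2."""
    v1 = (elbow[0] - shoulder[0], elbow[1] - shoulder[1])
    v2 = (wrist[0] - elbow[0], wrist[1] - elbow[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos_t = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

def is_anchoring(angle_history, window=10, var_thresh=1.0, angle_thresh=25.0):
    """Anchoring-start rule: the mean frame-to-frame angle difference over the
    last `window` frames is below 1 deg and the current angle is below 25 deg."""
    if len(angle_history) < window + 1:
        return False
    diffs = [abs(a - b) for a, b in
             zip(angle_history[-window:], angle_history[-window - 1:-1])]
    return sum(diffs) / window < var_thresh and angle_history[-1] < angle_thresh
```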

3.3. Shooting Time Measurement Method

In this study, to automatically detect and measure shooting time, the bowstring’s position within the facial region was tracked. Once the anchoring phase was initiated based on the athlete’s arm angle, the facial region’s bounding box was detected in real time. To ensure accuracy, we applied a low-pass filter that compares the coordinates of the current frame with those of the previous frame to reduce noise. If the difference between the coordinates is greater than the threshold D, we used the current coordinates; otherwise, we kept the previous coordinates. In this study, D was set to 10. Then, to accurately detect the bowstring’s position within the facial region, image processing techniques were applied. First, the facial region was converted to grayscale, and Gaussian blur was applied to reduce noise and smooth the background, making the bowstring more distinguishable [21]. The blurred image was then processed with Canny edge detection to create an edge image by detecting strong brightness changes. To detect the straight boundary of the bowstring, the probabilistic Hough transform was employed. Unlike the standard Hough transform, which considers all possible lines for every edge point, the probabilistic Hough transform randomly selects a subset of edge points, making the computation faster and reducing the number of detected lines, thus simplifying the identification of the bowstring. Using this method, we detected the bowstring within the face region and calculated the shooting time by counting the number of frames from the start of the anchoring phase to the release state.

3.4. Bowstring Detection Method Optimized for Archery

In the process of detecting the bowstring within the facial region, various lines may be detected, which could lead to inaccurate measurement of the shooting time if non-bowstring lines are mistakenly detected. For instance, lines such as the athlete’s glasses or jawline were occasionally detected in the video clips. To address this, the proposed method identifies the line that best matches the characteristics of the bowstring and uses only this line for measuring shooting time. The bowstring in the facial region typically appears near the center of the face and falls within a specific range of slopes. Considering this, the method selects the line closest to the center of the face among the candidates detected by the probabilistic Hough transform and evaluates its slope. Specifically, as shown in Figure 4, the Euclidean distance between the face center and each line was calculated to find the closest line, and the slope (θ) of this line was assessed to determine whether it matched the typical bowstring slope. The reference direction was defined as vertical, extending upward from the center of the face, and only lines with a slope between 0° and 30° relative to this direction were identified as the bowstring.
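The candidate-selection rule can be sketched as follows; measuring the distance from the face center to each segment's midpoint is a simplification of the Euclidean distance the paper describes:

```python
import math

def select_bowstring(lines, face_center, max_tilt=30.0):
    """Among Hough candidates (x1, y1, x2, y2), keep only lines tilted at most
    `max_tilt` degrees from the vertical and return the one whose midpoint is
    closest to the face center, or None if no candidate qualifies."""
    cx, cy = face_center
    best, best_dist = None, float("inf")
    for x1, y1, x2, y2 in lines:
        dx, dy = x2 - x1, y2 - y1
        tilt = math.degrees(math.atan2(abs(dx), abs(dy)))  # 0 deg = vertical
        if tilt > max_tilt:
            continue                                       # e.g. a jawline
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        dist = math.hypot(mx - cx, my - cy)
        if dist < best_dist:
            best, best_dist = (x1, y1, x2, y2), dist
    return best
```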
In this method, there were instances where the bowstring within the facial region was not consistently detected due to variables such as lighting. These interruptions could reduce the accuracy of shooting time measurement. To prevent this, the study considered the release action to have ended only if the bowstring was undetected for more than 10 consecutive frames. If fewer than 10 consecutive frames lacked bowstring detection, those frames were still considered part of the period between the anchoring and release phases to maintain shooting time accuracy. This approach minimized the impact of temporary detection errors on the accuracy of the measured shooting time.
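This gap-tolerant release rule can be sketched as a small state machine; exactly how the trailing undetected frames are discounted at the boundary is an assumption, as the paper does not spell it out:

```python
class ReleaseDetector:
    """Confirm the release only after `max_gap` consecutive frames (10 in the
    paper) without a bowstring detection; shorter dropouts still count toward
    the shooting time."""

    def __init__(self, max_gap=10):
        self.max_gap = max_gap
        self.missed = 0   # current run of undetected frames
        self.frames = 0   # frames counted toward the shooting time

    def update(self, string_detected):
        """Feed one frame's result; return True once the release is confirmed."""
        if string_detected:
            self.missed = 0
            self.frames += 1
            return False
        self.missed += 1
        if self.missed >= self.max_gap:
            # assumption: discard the already-counted trailing gap frames
            self.frames -= self.max_gap - 1
            return True
        self.frames += 1  # short dropout: still part of the shot
        return False
```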
Algorithm 1 presents the step-by-step process of the proposed method. Figure 5 is an example screen showing the proposed program applied to an actual video clip, where the red box represents the face bounding box and the blue line represents the detected bowstring. Panel (a) shows the step of measuring the shooting time in real time from the anchoring start point, and panel (b) shows the step of calculating the right arm angle in real time to find the anchoring start point. A video showing the entire process can be found at [https://www.youtube.com/watch?v=tvMkULD66yo] (accessed on 15 December 2024); it was recorded during an actual training session rather than taken from a prebuilt video clip.
Algorithm 1 Archery shooting time detection algorithm
  • Input: Video clip
  • Output: Shooting time with visualization

while video frame is available do
    Read a frame from the video
    Step 1: Pose Detection
    Detect pose landmarks using a pose detection module
    if pose landmarks are detected then
        Proceed to Step 2
    else
        Continue to the next frame
    end if
    Step 2: Calculation of Arm Angle
    Calculate arm angle:
        right_angle ← angle(right_shoulder, right_elbow, right_wrist)
    if mean angle variation < 1° for 10 frames and right_angle < 25° then
        Set anchoring_start_point ← True
        Proceed to Step 3
    else
        Continue to the next frame
    end if
    Step 3: Detect Face Bounding Box
    Detect face bounding box using a face detection module
    Apply low-pass filter to stabilize bounding box coordinates
    Crop the face region from the frame
    Step 4: Detect Bowstring Candidates
    Preprocess the face region:
        Convert to grayscale, normalize pixel values, and apply Gaussian blur
    Detect edges using Canny edge detection
    Detect lines using the Probabilistic Hough Transform
    Validate bowstring candidates based on slope:
    if 0° < slope < 30° then
        Among the candidates, select the line closest to the center of the face
        Select the valid bowstring
    end if
    Step 5: Release Detection
    Record shooting time
    Display results
end while
Close the video file and release resources

4. Results

In this section, the accuracy of detecting the start of the anchoring phase using pose estimation is evaluated, and the shooting time measured by the Dartfish software is compared with the shooting time measured using the method proposed in this study. Additionally, the results of measuring the shooting time using only the bowstring and detecting the anchoring start point based on the arm angle are also included. A total of 58 video clips, each containing the full sequence of drawing, anchoring, aiming, and release, were used for the experiments. The clips included archers from various nationalities participating in different competitions. The experiments were conducted using a program implemented based on Python v3.8 in an environment with an Intel Core i7-13700K processor and 32 GB RAM (Intel, Santa Clara, CA, USA). The program achieved an average performance of over 30 FPS, confirming that it can effectively meet real-time processing requirements.

4.1. Detection Results of the Anchoring Start Phase Based on Pose Estimation

This section presents the results of detecting the anchoring start time using the proposed pose estimation method. The accuracy of the detected anchoring start time was evaluated using the mean absolute error (MAE) and the R² score, with the MAE defined in Equation (2). Across the 58 video clips, the method demonstrated an average error of approximately 0.29 s, with an R² score of 0.91.
MAE = (1/N) Σ_{i=1}^{N} |y_i − ŷ_i|    (2)
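Both evaluation metrics are straightforward to compute; a minimal sketch (not the authors' evaluation code):

```python
def mae(y_true, y_pred):
    """Mean absolute error, Equation (2)."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2_score(y_true, y_pred):
    """Coefficient of determination (R^2) reported alongside the MAE."""
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```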
Figure 6 shows a scatter plot and regression line comparing the anchoring start time detected by the proposed method with the actual anchoring start time for each video clip. For most clips, the detected start time closely matches the actual start time.

4.2. Detection Results of the Shooting Time

To evaluate the accuracy of the shooting time measured using the method proposed in this paper, the difference from the actual shooting time was calculated using the MAE as defined in Equation (2). This represents the absolute time difference between the two, and an average error of approximately 0.34 s was observed across the 58 video clips. Figure 7 visualizes a scatter plot and regression line comparing the actual shooting time of the archers with the shooting time measured by the proposed method, showing that the relationship between the two closely follows the y = x line. Additionally, the R² value was 0.95, indicating very close agreement.
Next, to validate the effectiveness of the proposed pose estimation-based method for detecting the anchoring start point, shooting time was also measured using only the bowstring within the facial region as a baseline. The results showed an MAE of approximately 1.4 s across the 58 video clips. Figure 8 illustrates the scatter plot and regression line comparing the actual shooting time of the archers with the shooting time measured using only the bowstring. Compared to the proposed method, the regression line deviated further from the y = x line, and the R² value decreased to 0.89.

5. Discussion

5.1. Results Analysis

This section analyzes the experimental results to evaluate the proposed method. Figure 7 and Figure 8 demonstrate that the proposed approach significantly outperforms the conventional bowstring-only method in accurately measuring shooting time. The bowstring-only method overestimated the shooting time by an average of approximately 1.4 s across the 58 video clips. This discrepancy arises because the bowstring-only method begins measuring shooting time as soon as the bowstring enters the athlete’s facial bounding box during the drawing phase, before the anchoring phase begins. The issue is particularly pronounced for athletes with longer drawing phases, leading to substantial measurement errors. To address these limitations, the proposed method integrates pose estimation, using skeletal motion to accurately identify the start of the anchoring phase. The bowstring is then detected, and shooting time is measured precisely from this identified anchoring start point. As shown in Figure 6, the proposed method successfully detects the beginning of the anchoring phase, avoiding errors caused by prematurely measuring shooting time during the drawing phase. To quantitatively validate the results, a paired t-test was conducted to evaluate the statistical significance of the performance differences between the proposed method and the bowstring-only method. The test produced a p-value well below 0.05, confirming that the performance difference between the two methods is statistically significant.
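The paired t-test pairs the two methods' per-clip timing errors; a self-contained sketch of the test statistic follows (the study presumably used standard statistical software):

```python
import math

def paired_t_statistic(errors_a, errors_b):
    """t statistic for paired samples: the mean of the per-clip differences
    divided by its standard error. The p-value would then come from a t
    distribution with n - 1 degrees of freedom."""
    diffs = [a - b for a, b in zip(errors_a, errors_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)
```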

5.2. Challenging Cases and Limitations

In this study, several key challenges were identified in the process of detecting shooting time. First, external factors such as lighting conditions occasionally resulted in insufficient contrast between the bowstring and the background in certain test data. This issue led to reduced accuracy in bowstring detection and, consequently, a relatively high MAE when compared to the actual shooting time. Second, the proposed method for detecting the start of the anchoring phase relies on pose estimation, utilizing three key joint points of the athlete’s right arm—shoulder, elbow, and wrist. While this approach is effective when the right arm is fully visible within the video frame, its performance degrades when the arm is partially occluded or moves out of the frame, negatively affecting the accuracy of shooting time detection. Third, in most archery competition videos, the athletes’ facial regions are sufficiently large and of high resolution, enabling reliable bowstring detection. However, in some instances, where athletes appear farther from the camera and their faces are smaller, the lower resolution makes bowstring detection more challenging.
Despite these challenges, the proposed method demonstrated improved accuracy in shooting time detection compared to conventional bowstring-only approaches, even under suboptimal lighting conditions. This improvement is reflected in the results of this study. Additionally, the pose estimation algorithm exhibited robust performance, enabling the detection of the anchoring start point even when parts of the arm were occluded or moved out of the video frame. Consequently, only a minimal number of test cases where the arm was significantly obscured or out of frame were excluded from the analysis.
To address the issue of low facial resolution caused by increased shooting distance, an alternative shooting time detection method was analyzed, using only the three joint points of the athlete’s right arm (shoulder, elbow, and wrist). This method maintained the same criteria for detecting the anchoring start point as the proposed approach. However, for identifying the end point of shooting time, corresponding to the release phase, it relied on changes in the arm’s angle rather than on detecting the bowstring. Specifically, it determined the release point by identifying frames where the 10-frame moving average of the right arm’s angle exceeded a threshold value D. In this study, D was empirically set to 5°.
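A sketch of this fallback release detector follows. The paper's wording leaves the averaged quantity slightly ambiguous; taking the 10-frame moving average over the frame-to-frame angle changes, as done here, is an assumption (consistent with the sub-1° average change reported during anchoring):

```python
def detect_release_frame(angles, window=10, threshold=5.0):
    """Scan per-frame right-arm angles and return the first index at which the
    moving average over `window` frame-to-frame angle changes exceeds
    `threshold` (D = 5 deg in the study), or None if it never does."""
    for i in range(window, len(angles)):
        diffs = [abs(angles[j] - angles[j - 1])
                 for j in range(i - window + 1, i + 1)]
        if sum(diffs) / window > threshold:
            return i
    return None
```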
Figure 9 shows the results of applying the arm-angle-only method to the existing dataset. Compared with Figure 7 and Figure 8, the arm-angle-only method tended to detect longer shooting times than the proposed method but shorter shooting times than the bowstring-only method. It achieved an R² score of 0.93 and an MAE of approximately 0.512 s, slightly worse than the proposed method but better than the bowstring-only method. Additionally, for the 11 clips in which bowstring detection was difficult and shooting time could not be measured with the proposed method, the arm-angle-only method achieved an average MAE of 0.54 s. These results show that the proposed method, which detects the anchoring start point using the right arm angle and the release point using the bowstring, performs best, while the arm-angle-only method can serve as a fallback that achieves high accuracy in challenging situations where the bowstring is difficult to detect. This confirms high accuracy even in such challenging cases and suggests applicability to a wide range of data.

6. Conclusions

This study presented a novel method to automatically analyze archery shooting times by applying AI and computer vision techniques to the sport of archery. The focus is on measuring the shooting time from the anchoring phase to the release phase, an important factor in archery performance. The proposed method accurately measures shooting times by detecting the start of the anchoring phase with pose estimation and identifying the bowstring within the face region. The approach was validated against manual measurements obtained by experts using Dartfish software and showed superior accuracy compared to methods that rely only on bowstring detection. Specifically, while the bowstring-only method achieved an MAE of 1.4 s and an R² score of 0.89, the proposed method improved performance to an MAE of 0.34 s and an R² score of 0.95. In addition, based on previous findings that athletes with shorter shooting times tend to perform better, the proposed method provides real-time feedback so that athletes can immediately adjust their shooting technique during training to improve their performance. In future work, we plan to improve the method by using deep learning approaches to predict key points at both ends of the arrow, implementing a more robust solution that is less affected by external factors such as lighting conditions.

Author Contributions

Conceptualization, E.C.L. and S.L.; methodology, S.L.; software, S.L.; validation, S.L.; formal analysis, S.L.; investigation, S.L., J.-Y.M. and J.K.; resources, S.L. and J.-Y.M.; data curation, S.L., J.-Y.M. and J.K.; writing—original draft preparation, S.L. and J.-Y.M.; writing—review and editing, E.C.L. and S.L.; visualization, S.L.; supervision, E.C.L.; project administration, E.C.L.; funding acquisition, E.C.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was conducted with the support of the “Convergence Graduate School Support Program for the Sports Industry” funded by the Ministry of Culture, Sports and Tourism of the Republic of Korea and the Korea Sports Promotion Foundation (B0080319002396).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The video data used in this research were sourced from publicly available YouTube videos uploaded by broadcasting companies and the Korea Archery Association. The videos used in this study can be accessed at the following URLs (accessed on 15 December 2024): https://www.youtube.com/watch?v=5W758P_Wc7U, https://www.youtube.com/watch?v=r27etewKeB4, https://www.youtube.com/watch?v=-aF9tSyW_y4, https://www.youtube.com/watch?v=2Ys-E3YjLBE, https://www.youtube.com/watch?v=VT-eLoEogJA, https://www.youtube.com/watch?v=G_Cyh2af1HA, https://www.youtube.com/watch?v=7fb33bq0hUo, https://www.youtube.com/watch?v=ARnz-IfEHnU, https://www.youtube.com/live/EssXSuUPyRk?si=dUuxFHHSxjeM6pwj, https://www.youtube.com/watch?v=YYuK_VYwp10, https://www.youtube.com/watch?v=f7lZX1gFjzU, https://www.youtube.com/watch?v=zrMPgB0jxLU. The videos were cropped to capture the entire archery process, focusing on the athletes’ facial bounding boxes for pose estimation and bowstring detection throughout the sequence.

Conflicts of Interest

Author Jinman Kim was employed by the company Techars. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Nishizono, H.; Shibayama, H.; Izuta, T.; Saito, K. Analysis of archery shooting techniques by means of electromyography. In Proceedings of the 5th International Symposium on Biomechanics in Sports, Athens, Greece, 13–17 July 1987; ISBS-Conference Proceedings Archive: Marquette, MI, USA, 1987. [Google Scholar]
  2. Jang, Y.S.; Hong, K.D. Development of archery training management system for archer’s performance. J. Korea Contents Assoc. 2008, 8, 213–222. [Google Scholar] [CrossRef]
  3. Clemente, F.M.; Couceiro, M.; Rocha, R.; Mendes, R. Study of the heart rate and accuracy performance of archers. J. Phys. Educ. Sport 2011, 11, 434–437. [Google Scholar]
  4. Kim, K. Exploration of Tension in Archery. Korea Sport Res. 2007, 18, 337–348. [Google Scholar]
  5. An, H.S. Effects of balance and kinematic factors on archery score during archery shooting. J. Korea Converg. Soc. 2018, 9, 239–246. [Google Scholar]
  6. Kim, H.; Ki, B. Exploring Key Coaching Factors of Elite Archery Coaches. Korean J. Phys. Educ. 2021, 25, 105–121. [Google Scholar]
  7. Tinazci, C. Shooting dynamics in archery: A multidimensional analysis from drawing to releasing in male archers. Procedia Eng. 2011, 13, 290–296. [Google Scholar] [CrossRef]
  8. Takai, H.; Kubo, Y.; Araki, M. Characteristics of shooting time of the world’s top level male archery athletes. NSSU J. Sport Sci. 2012, 1, 8–12. [Google Scholar]
  9. Moon, J.Y.; Lee, U.C. Analysis of the Correlation between Shooting Time of Elite Archers and Athletic Performance: Case Study of the Korean National Team in Women’s Archery at the 2020 Tokyo Olympics. J. Next-Gener. Converg. Technol. Assoc. 2023, 7, 1399–1405. (In Korean) [Google Scholar]
  10. Kim, H.; Kim, J. The Consistency of an Elite Archer’s Shooting Movement for Improving His Performance. Korean J. Phys. Educ. 2006, 45, 473–483. [Google Scholar]
  11. Kim, J.T.; Lee, S.J.; Kim, S.S. Influence of Rating of Perceived Exertion on Kinematic Characteristics in Top Class Archery Athletes. J. Coach. Dev. 2014, 16, 99–106. [Google Scholar]
  12. Lau, J.S.; Ghafar, R.; Zulkifli, E.Z.; Hashim, H.A.; Sakim, H.A. Comparison of Shooting Time Characteristics and Shooting Posture Between High-and Low-Performance Archers. Ann. Appl. Sport Sci. 2023, 11. [Google Scholar] [CrossRef]
  13. Liu, Y.; Cheng, X.; Ikenaga, T. Motion-Aware and Data-Independent Model Based Multi-View 3D Pose Refinement for Volleyball Spike Analysis. Multimed. Tools Appl. 2023, 83, 22995–23018. [Google Scholar] [CrossRef]
  14. Zhu, K.; Wong, A.; McPhee, J. FenceNet: Fine-Grained Footwork Recognition in Fencing. In Proceedings of the 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), New Orleans, LA, USA, 19–20 June 2022; IEEE: Piscataway, NJ, USA, 2022. [Google Scholar]
  15. Ren, H. Sports Video Athlete Detection Based on Deep Learning. Neural Comput. Appl. 2023, 35, 4201–4210. [Google Scholar] [CrossRef]
  16. Qohar, A.; Akbar, R.; Hendriawan, A. Automatic Score in Archery Target Using Simple Image Processing Method. In Proceedings of the 5th International Conference on Applied Science and Technology on Engineering Science (iCAST-ES 2022), Mumbai, India, 2–3 December 2022. [Google Scholar]
  17. Phang, J.; Lim, K.; Lease, B.; Chiam, D. Computer Vision-Based Automated Archery Performance Logging System. In Proceedings of the 2nd International Conference on Innovation and Technology in Sports, ICITS 2023, Kuala Lumpur, Malaysia, 27–28 November 2023; Lecture Notes in Bioengineering; Springer Nature: Singapore, 2024; pp. 535–544. [Google Scholar]
  18. Callaway, J.A.; Broomfield, A.S. Inter-rater reliability and criterion validity of scatter diagrams as an input method for marksmanship analysis: Computerised notational analysis for archery. Int. J. Perform. Anal. Sport 2012, 12, 291–310. [Google Scholar] [CrossRef]
  19. Dartfish. MyDartfish Pro S: High-Performance Sports Video Solution. 2024. Available online: https://www.dartfish.com/pro_s (accessed on 2 October 2024).
  20. Bazarevsky, V.; Grishchenko, I.; Raveendran, K.; Zhu, T.; Zhang, F.; Grundmann, M. On-Device Real-Time Body Pose Tracking. arXiv 2020, arXiv:2006.10204. [Google Scholar]
  21. Deng, G.; Cahill, L. An Adaptive Gaussian Filter for Noise Reduction and Edge Detection. In Proceedings of the 1993 IEEE Conference Record Nuclear Science Symposium and Medical Imaging Conference, San Francisco, CA, USA, 31 October–6 November 1993; IEEE: Piscataway, NJ, USA, 1993. [Google Scholar]
Figure 1. Illustration of the archery shooting process, showcasing key stages from aiming to release, as captured comprehensively in each video clip of the dataset.
Figure 2. Mediapipe pose estimation landmarks and corresponding index numbers.
Figure 3. Example of calculating arm angles based on 3D pose estimation.
Figure 4. Example of a process to detect the bowstring within the archer’s face bounding box.
Figure 5. Example of applying the proposed method to measure archery shooting time using actual video clips: (a) illustrates the measurement of an archer’s shooting time; (b) visualizes real-time changes in the angle of the right arm as a graph to identify the anchoring start point.
Figure 6. Scatter plot with regression line showing the relationship between the anchoring start points detected using the proposed pose estimation-based method and the actual anchoring start points, along with the R 2 score indicating the model’s accuracy.
Figure 7. Scatter plot with regression line showing the relationship between measured shooting times using the proposed method and actual shooting times, along with the R 2 score indicating the model’s accuracy.
Figure 8. Scatter plot with a regression line showing the relationship between the measured shooting time using only the bowstring and the actual shooting time, and the R 2 score indicating the accuracy of the model.
Figure 9. Scatter plot with regression line showing the relationship between measured shooting time and actual shooting time using only arm angle and R 2 score showing the accuracy of the model.
Table 1. Summary of studies on archery shooting time and sports vision technology applications.
Category | Authors & Year | Titles | Research Content | Research Applicability
Archery Performance Analysis | Tinazci, C. (2011) | Shooting dynamics in archery: A multidimensional analysis from drawing to releasing in male archers [7] | This study provides a foundational understanding of the physiological and mechanical variables influencing archery performance, focusing on muscle activity and postural dynamics. | Its findings highlight key performance-related variables such as reduced muscle activity and postural sway, which can guide training regimens and performance enhancement strategies. The study’s insights are particularly relevant for optimizing biomechanics in high-pressure competitive scenarios.
| Moon, J. Y. and Lee, E. C. (2023) | Analysis of the relationship between shooting time and performance in elite archers: A case study of the 2020 Tokyo Olympic women’s national archery team [9] | A quantitative analysis of the shooting time of the Korean women’s national archery team, which achieved a 9th consecutive team victory in the 2020 Tokyo Olympics. | By examining the timing components of a globally successful team, this research offers a valuable reference for developing training protocols and performance optimization strategies in archery teams.
| Kim, J. T., Lee, S. J., and Kim, S. S. (2014) | Comparison of Shooting Time Characteristics and Shooting Posture Between High and Low Performance Archers [11] | A comparison of time characteristics and posture between high- and low-performance archers. The study found that archers with shorter release times showed better performance. | Coaches and athletes can leverage these findings to refine techniques, particularly by focusing on reducing release times and improving posture consistency for better results.
| Callaway, J. A., and Broomfield, A. S. (2012) | Inter-Rater Reliability and Criterion Validity of Scatter Diagrams as an Input Method for Marksmanship Analysis: Computerised Notational Analysis for Archery [18] | The use of computerized scatter plots for shot analysis in target sports has been shown to be both valid and reliable. By inputting each arrow’s position into specialized software, precise coordinates are generated, enabling coaches, athletes, and researchers to monitor changes in equipment settings, biomechanics, physiology, and psychology. This system facilitates the continuous development of athletes, sports, and equipment. | The system enables continuous monitoring and improvement by linking shot data to various factors such as biomechanics and psychology, thereby supporting data-driven coaching and equipment design.
| Lau, J. S., Ghafar, R., Zulkifli, E. Z., Hashim, H. A., and Mat Sakim, H. A. (2023) | Analysis of Kinematic Variables Based on Perception of Elite Archers [12] | This study compared the variables affecting the shooting performance of top female archers during ranking rounds. It found that the time spent on each phase was slower in “bad” performances compared to “good” ones. | The research provides detailed insights into phase-specific time management, helping archers and coaches to focus on optimizing critical moments for consistent performance.
Sports Vision Technology | Liu, Y., Cheng, X., and Ikenaga, T. (2024) | Motion aware and data independent model based multi view 3D pose refinement for volleyball spike analysis [13] | Proposed a method for estimating and refining 3D poses in volleyball spike analysis using computer vision technology. | The approach enhances the analysis of volleyball techniques by providing detailed 3D motion insights without relying on traditional markers. This makes it particularly useful for coaching, performance improvement, and injury prevention in volleyball.
| Zhu, K., Wong, A., and McPhee, J. (2022) | FenceNet: Fine grained footwork recognition in fencing [14] | Introduced a new architecture called FenceNet, which automates the classification of fine-grained footwork techniques in fencing using 2D pose data without wearable sensors. | By leveraging 2D pose estimation, this system simplifies data collection while maintaining high precision. It can be used to improve training efficiency and provide detailed feedback for athletes and coaches in fencing.
| Ren, H. (2023) | Sports video athlete detection based on deep learning [15] | Proposed a system for automatically detecting and evaluating athletes’ postures in sports videos using deep learning and sports vision technologies. The study demonstrated high accuracy in capturing key movements based on skeletal motion. | The system has significant potential in various sports for automated performance analysis and feedback. Its accuracy in detecting key movements makes it a valuable tool for refining techniques and assessing biomechanical efficiency.
| Qohar, A., Akbar, R., & Hendriawan, A. (2023) | Automatic Score in Archery Target Using Simple Image Processing Method [16] | Developed an automatic scoring system for indoor and outdoor archery competitions using image processing techniques with high accuracy. | The system reduces human error in scoring, enhances objectivity in competitions, and streamlines the process for both athletes and judges, making it a vital tool for modernizing archery competitions.
| Phang, J. T. S., Lim, K. H., Lease, B. A., & Chiam, D. H. (2023) | Computer Vision-Based Automated Archery Performance [17] | Proposed a deep learning-based markerless motion capture system for analyzing archers’ shooting postures. | The system allows for unobtrusive and efficient posture analysis, which can be used for training, technique refinement, and biomechanical studies in archery without the need for specialized equipment.
