**6. GIS Application Visualization**

We developed a system to process large orthophotographs ("\*.tif") and predict potential PWD-infected trees, as shown in Figure **??**. The system automatically detects PWD locations and saves the location information in standard ESRI format files. The system is integrated with QGIS (https://qgis.org, accessed on 20 December 2021) to visualize the input "\*.tif" images and the potential disease locations (Figure **??**). Potential disease regions with a high confidence score are represented by yellow bounding boxes, while "disease-like" objects are shown with blue bounding boxes. The output file preserves the class label, the predicted confidence score, and the left-top and right-bottom positions of each bounding box. As shown in Figure **??**b, the score column holds the confidence score within [0,1], indicating how closely a tree resembles a PWD-infected tree; an expert can lower the threshold to find more potential disease spots. The columns lt\_i and rb\_i contain the coordinates of a bounding box in the image coordinate system, while the columns lt and rb give the GPS coordinates in a coordinate reference system [**?** ] (EPSG:5186, Korean 2000/Central Belt 2010). Using the lt and rb values, experts can locate PWD-infected trees during field investigation.
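The conversion from image-pixel bounding boxes (lt\_i, rb\_i) to map coordinates (lt, rb) can be sketched as follows. This is a minimal illustration assuming a GDAL-style affine geotransform; the transform values, pixel size, and coordinates below are hypothetical placeholders, not taken from the paper's data:

```python
# Sketch: map pixel bounding-box corners to map coordinates (e.g., EPSG:5186)
# using the orthophoto's affine geotransform. All numbers are hypothetical.

def pixel_to_map(col, row, transform):
    """Apply a GDAL-style geotransform (c, a, b, f, d, e):
    x = c + a*col + b*row, y = f + d*col + e*row."""
    c, a, b, f, d, e = transform
    x = c + a * col + b * row
    y = f + d * col + e * row
    return x, y

# Hypothetical geotransform: origin (200000, 550000), 0.1 m pixels,
# north-up image (no rotation terms, negative y pixel size).
gt = (200000.0, 0.1, 0.0, 550000.0, 0.0, -0.1)

lt_i = (1200, 800)   # left-top pixel (col, row) of a detected box
rb_i = (1260, 860)   # right-bottom pixel (col, row)

lt = pixel_to_map(*lt_i, gt)  # map coordinates of the left-top corner
rb = pixel_to_map(*rb_i, gt)  # map coordinates of the right-bottom corner
print(lt, rb)
```

In practice the geotransform would be read from the "\*.tif" metadata (e.g., via GDAL or rasterio) and the corner coordinates written into the ESRI output alongside the class label and score.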


**Figure 9.** Visualization of the inference result in the QGIS program.

### **7. Discussion and Future Work**

In this paper, we proposed a system for improving the performance of an object detection model that detects PWD-infected trees in RGB-based UAV images. To train a robust network, we created a large dataset containing a total of 6121 disease spots from various infection stages and areas. The comparison results show that our proposed system has great consistency across different backbone structures. HNM can select "disease-like" objects from six categories, and the trained and fine-tuned networks successfully built better decision boundaries with which to distinguish true PWD objects from the six "disease-like" ones. EDA and TTA achieved significant gains by alleviating the data bias problem. In addition, 711 out of 730 PWD-infected trees were identified in 10 large orthophotographs, indicating that this method shows great potential for locating PWD across various pine forest resources. Finally, the integrated software can automatically locate potential PWD-infected locations and save them in ESRI format, making it convenient to visualize the results in a GIS application for field investigation.
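To illustrate the TTA idea mentioned above: a common scheme runs the detector on both the original image and a flipped copy, then maps the flipped detections back before pooling. The sketch below shows only the coordinate remapping for a horizontal flip; the detector itself and the toy boxes are hypothetical stand-ins, not the paper's model:

```python
# Minimal TTA sketch: detections from a horizontally flipped image are
# mapped back into the original frame and pooled with the direct pass.

def flip_box_h(box, width):
    """Map a (x1, y1, x2, y2) box found on a horizontally flipped
    image back to original-image coordinates."""
    x1, y1, x2, y2 = box
    return (width - x2, y1, width - x1, y2)

def merge_detections(orig_boxes, flip_boxes, width):
    """Pool detections from the original and flipped passes.
    (A real pipeline would follow this with NMS or box averaging.)"""
    return orig_boxes + [flip_box_h(b, width) for b in flip_boxes]

# Toy example on a 100-px-wide image: the flipped-pass detection
# (70, 10, 90, 30) maps back onto the same region as the direct one.
pooled = merge_detections([(10, 10, 30, 30)], [(70, 10, 90, 30)], 100)
print(pooled)
```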

However, there is still work to be done. For example, the best way to utilize context information during training remains unclear. PWD only infects the pine family, so tree species classification would help reduce inference time and make it easier to precisely locate infected regions. Another problem to overcome is filtering low-quality images. In UAV imagery, it is difficult to ensure a consistent resolution, and poor-resolution images with ambiguous features degrade performance. Further, the RGB-based PWD detection method still has limitations, as it confuses PWD-infected trees with "disease-like" objects in the early and late stages. Reference [**?** ] demonstrated that PWD-infected trees exhibit a reduction in the normalized difference vegetation index (NDVI), and [**?** ] showed the effectiveness of conifer-broadleaf classification with multi-spectral images. These studies provide insight into the use of multi-spectral information to aid recognition. However, multi-spectral images typically have a lower resolution than RGB images, so how best to collect and efficiently use them remains a question of interest. We believe our proposed method can serve as a preprocessing stage to filter out irrelevant regions and to find fuzzy PWD hotspots; only suspected images then need to be re-analyzed with multi-spectral imagery, which greatly reduces the time for data collection.
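As a minimal illustration of the NDVI cue discussed above (the reflectance values here are hypothetical, not measurements from the cited studies):

```python
def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index, NDVI = (NIR - Red)/(NIR + Red).
    eps guards against division by zero on dark pixels."""
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectances: a healthy pine crown (strong NIR response)
# versus a PWD-infected crown (reduced NIR, raised red) should yield a
# clearly lower NDVI for the infected tree.
healthy = ndvi(0.50, 0.10)
infected = ndvi(0.30, 0.20)
print(round(healthy, 3), round(infected, 3))
```

In a combined pipeline, such a per-pixel index computed from multi-spectral bands could be used to re-check only the regions the RGB detector flags as suspected.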

**Author Contributions:** J.Y. designed the study, conducted the experiments and prepared the original manuscript. R.Z. prepared the dataset. J.L. supervised the research and edited the manuscript. All authors have read and agreed to the published version of the manuscript.

**Funding:** This work was supported by the Brain Korea 21 PLUS Project.

**Institutional Review Board Statement:** Not applicable.

**Informed Consent Statement:** Not applicable.

**Data Availability Statement:** Data sharing is not applicable to this article.

**Acknowledgments:** The authors express special thanks to the Korea Forestry Promotion Institute, which provided the data for the experiments in this paper. We would also like to give our sincere gratitude to Yagya Raj Pandeya and Bhuwan Bhattarai, who gave great help with grammar correction.

**Conflicts of Interest:** The authors declare no conflict of interest.
