**6. Conclusions**

The main goal of this study was to propose a new method that improves vine disease detection in UAV images. A new deep learning architecture for vine disease detection (VddNet) and an automatic multispectral orthophoto registration procedure have been proposed. UAV images in the visible and near-infrared spectra are the input data of the detection system, which generates a disease map. The input images were first aligned using an optimized multispectral registration algorithm, and the aligned images were then used to build registered orthophotos. During this process, a digital surface model (DSM) was generated to build a depth map. Finally, VddNet generated the disease map from the visible, near-infrared and depth data. The proposed method brings several benefits to the whole process. The automatic multispectral orthophoto registration offers higher precision and faster processing than conventional procedures. The 3D processing enables the construction of the depth map, which is relevant to VddNet training and segmentation: the depth data reduce misclassification and confusion between classes with similar colors. VddNet improves disease detection and overall segmentation compared with state-of-the-art architectures. Moreover, the orthophotos are georeferenced with GNSS coordinates, making it easier to locate diseased vines for treatment. In future work, it would be interesting to acquire additional multispectral channels to enhance disease detection and to improve the VddNet architecture.
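To make the data flow of the summarized pipeline concrete, the following is a minimal sketch (not the authors' implementation) of how the registered visible, near-infrared and depth orthophotos can be stacked per pixel before segmentation; the `segment` function, array shapes and class labels are illustrative placeholders standing in for VddNet inference.

```python
import numpy as np

# Hypothetical H x W aligned orthophotos, as produced by the registration
# and 3D-reconstruction steps described above (random data for illustration).
H, W = 512, 512
visible = np.random.rand(H, W, 3).astype(np.float32)  # RGB orthophoto
nir     = np.random.rand(H, W, 1).astype(np.float32)  # near-infrared orthophoto
depth   = np.random.rand(H, W, 1).astype(np.float32)  # depth map derived from the DSM

# Because the orthophotos are registered, per-pixel stacking is valid:
# every pixel carries visible, NIR and depth values for the same ground point.
x = np.concatenate([visible, nir, depth], axis=-1)    # H x W x 5 input tensor

def segment(inputs: np.ndarray) -> np.ndarray:
    """Placeholder for VddNet inference: returns a per-pixel class map
    (e.g., 0 = ground, 1 = healthy vine, 2 = diseased vine)."""
    # A trained model would be applied here; we return a dummy map.
    return np.zeros(inputs.shape[:2], dtype=np.int64)

disease_map = segment(x)
print(disease_map.shape)  # (512, 512): one class label per pixel
```

Since the orthophotos are georeferenced, each pixel of `disease_map` can be mapped back to GNSS coordinates to locate diseased vines in the field.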

**Author Contributions:** M.K. and A.H. conceived and designed the method; M.K. implemented the method and performed the experiments; M.K., A.H. and R.C. discussed the results and revised the manuscript. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research received no external funding.

**Acknowledgments:** This work is part of the VINODRONE project supported by the Region Centre-Val de Loire (France). We gratefully acknowledge Region Centre-Val de Loire for its support.

**Conflicts of Interest:** The authors declare no conflict of interest.
