**5. Conclusions**

We have introduced a workflow that simplifies the image-based delineation of visible boundaries to support the automated mapping of land tenure from remote sensing imagery of various sources. Compared to our previous work [21], the approach is now more automated and more accurate, owing to the replacement of random forest (RF) machine learning with convolutional neural network (CNN) deep learning. For RF-derived boundary likelihoods, we obtained an accuracy of 41% and a precision of 49%; for CNN-derived boundary likelihoods, an accuracy of 52% and a precision of 76%. CNNs also eliminate the need for the hand-crafted features that RF requires. Furthermore, our approach has proven less laborious and more effective than manual delineation, owing to decreased over-segmentation and our new delineation functionalities: filtering reduces the number of segment lines, and thus over-segmentation, by 80%, while the new delineation functionalities reduce the delineation effort per parcel by 38% in time and 80% in clicks compared to manual delineation. The approach works on data from different sensors (aerial and UAV) at different resolutions (0.02–0.25 m). Its advantages are strongest when delineating in rural areas, where monotonic boundaries are continuously visible. Manual delineation remains superior where the boundary is not fully visible, e.g., when covered by shadow or vegetation. While our approach was developed for cadastral mapping, it can also be used to delineate objects in other application fields, such as land use mapping, agricultural monitoring, topographical mapping, road tracking, or building extraction.

**Author Contributions:** Conceptualization, S.C., M.K., M.Y.Y. and G.V.; methodology, S.C., M.K., M.Y.Y. and G.V.; software, S.C.; validation, S.C., M.K., M.Y.Y. and G.V.; formal analysis, S.C.; investigation, S.C.; resources, S.C.; data curation, S.C.; writing—original draft preparation, S.C.; writing—review and editing, S.C., M.K., M.Y.Y. and G.V.; visualization, S.C.; supervision, M.K., M.Y.Y. and G.V.; project administration, M.K.; funding acquisition, G.V.

**Funding:** This research was funded by the Horizon 2020 program of the European Union (project number 687828).

**Acknowledgments:** We are grateful to our African project partners INES Ruhengeri (Rwanda), Bahir Dar University (Ethiopia), the Technical University of Kenya, and Esri Rwanda for supporting the data capture, which was guided and further processed by Claudia Stöcker from the University of Twente. Berhanu Kefale Alemie from Bahir Dar University provided aerial data and reference data, as well as knowledge about the local land administration situation.

**Conflicts of Interest:** The authors declare no conflict of interest.
