**3. Materials and Methods**

The workflow of this research comprises three major parts: data preparation, boundary detection and accuracy evaluation. In the first part, we selected three training tiles and one testing tile from each study area and prepared the RGB layer and boundary reference for each tile. In the second part, we applied FCNs, MRS and gPb for cadastral boundary detection and validated their performance on the testing tiles. In the last part, we assessed accuracy with precision-recall measures, allowing a 0.4 m tolerance from the reference data.
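Tolerance-based precision and recall can be computed on binary boundary rasters by counting a detected pixel as a true positive if it lies within the tolerance distance of any reference pixel, and vice versa. The sketch below, based on a Euclidean distance transform, is one possible implementation, not the authors' actual code; the function name is ours, and the tolerance is given in pixels (0.4 m corresponds to 4 pixels at the 0.1 m resolution used in this study).

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def buffer_precision_recall(detected, reference, tolerance_px):
    """Precision/recall for binary boundary maps with a spatial tolerance.

    A detected pixel counts as correct if it lies within `tolerance_px`
    pixels of a reference pixel; a reference pixel counts as found if it
    lies within `tolerance_px` pixels of a detected pixel.
    """
    # Distance (in pixels) from every cell to the nearest reference pixel.
    dist_to_ref = distance_transform_edt(~reference)
    # Distance from every cell to the nearest detected pixel.
    dist_to_det = distance_transform_edt(~detected)

    tp_detected = np.sum(detected & (dist_to_ref <= tolerance_px))
    tp_reference = np.sum(reference & (dist_to_det <= tolerance_px))

    precision = tp_detected / max(detected.sum(), 1)
    recall = tp_reference / max(reference.sum(), 1)
    return precision, recall
```

The two distance transforms make the measure symmetric: a detection shifted slightly off the reference line is not penalized as long as the shift stays within the tolerance buffer.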

#### *3.1. Data Preparation*

The UAV images used in this research were acquired for the its4land (https://its4land.com/) project in Rwanda in 2018. All data collection flights were carried out by Charis UAS Ltd. The drone used in Busogo was a DJI Inspire 2 equipped with a Zenmuse X5S sensor; the drone used in Muhoza was a FireFLY6 from BIRDSEYEVIEW with a SONY A6000 sensor. Both sensors acquire three bands (RGB) and capture nadir images. The flight height above ground was 100 m in Busogo and 90 m in Muhoza, resulting in a final Ground Sampling Distance (GSD) of 2.18 cm in Busogo and 2.15 cm in Muhoza. For more detailed information about flight planning and image acquisition, refer to [17].

In this research, the spatial resolution of the UAV images was resampled from 0.02 m to 0.1 m to balance accuracy against computational time. Four tiles of 2000 × 2000 pixels were selected from each study site for the experimental analysis. Among them, three tiles were used for training and one for testing the algorithms. The training and testing tiles in Busogo are named TR1, TR2, TR3 and TS1, and those in Muhoza are named TR4, TR5, TR6 and TS2 (Figure 2).
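Resampling from 0.02 m to 0.1 m corresponds to a factor-5 downsampling. The text does not specify the resampling method, so the sketch below assumes simple block averaging as one plausible choice and also illustrates tile extraction; both function names are illustrative, not from the original workflow.

```python
import numpy as np

def block_average(img, factor):
    """Downsample an (H, W, C) image by averaging non-overlapping
    factor x factor blocks (e.g. factor=5 for 0.02 m -> 0.1 m)."""
    h = img.shape[0] // factor * factor
    w = img.shape[1] // factor * factor
    img = img[:h, :w]  # crop so both dimensions divide evenly
    return img.reshape(h // factor, factor, w // factor, factor, -1).mean(axis=(1, 3))

def extract_tile(img, row, col, size=2000):
    """Cut a size x size tile with its top-left corner at (row, col)."""
    return img[row:row + size, col:col + size]
```

In practice this step would typically be done with GIS tooling that preserves georeferencing; the numpy version only shows the pixel arithmetic.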

**Figure 2.** The Unmanned Aerial Vehicle (UAV) images and boundary reference for the selected tiles. TR1, TR2, TR3 and TS1 are tiles from Busogo; TR4, TR5, TR6 and TS2 are tiles from Muhoza. For each area, the first three tiles were used for training and the last one for testing the algorithms. The boundary references in TR1–TR6 are shown as yellow lines. In TS1 and TS2, the boundary references are separated into visible (green lines) and invisible (red lines).

For each tile, the RGB image and the boundary reference were prepared as input for the classification task. The reference data were obtained by merging the 2008 national cadaster with digitization by Rwandan experts; because the 2008 national cadaster is outdated, the experts' digitization serves as a supplement. This reference was provided as polygons in shapefile format representing the land parcels. However, to feed the FCN, the boundary reference has to be a raster with the same spatial resolution as the RGB images. Therefore, we first converted the polygons into borderlines and then rasterized the boundary lines at a spatial resolution of 0.1 m. Figure 2 visualizes the RGB images and boundary references for the selected tiles in Busogo and Muhoza. To better understand the detection results on the testing tiles, we separated the boundary reference in TS1 and TS2 into visible and invisible boundaries, marked green and red in the maps, respectively. We extracted visible cadastral boundaries by following clearly visible features, including strips of stones, fences, rooftop edges, road ridges, changes in textural pattern and water drainage. Table 1 lists the rules we followed as an extraction guide for visible boundaries; the remaining boundaries are considered invisible.
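The polygon-to-borderline conversion and rasterization described above are vector operations, typically done with GIS tooling. As an illustration of the equivalent raster-side result, the sketch below derives a binary boundary mask from a hypothetical grid of parcel IDs by marking every pixel that touches a different parcel; the function name and input format are our own assumptions.

```python
import numpy as np

def parcel_edges(labels):
    """Binary boundary mask from a 2-D array of integer parcel IDs.

    A pixel is marked as boundary if its right or lower 4-neighbour
    belongs to a different parcel; both sides of the border are marked,
    giving a thin two-pixel-wide boundary line.
    """
    edges = np.zeros(labels.shape, dtype=bool)
    diff_right = labels[:, :-1] != labels[:, 1:]  # vertical borders
    diff_down = labels[:-1, :] != labels[1:, :]   # horizontal borders
    edges[:, :-1] |= diff_right
    edges[:, 1:] |= diff_right
    edges[:-1, :] |= diff_down
    edges[1:, :] |= diff_down
    return edges
```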

**Table 1.** Extraction guide for visible cadastral boundaries.

#### *3.2. Boundary Detection*
