Article

Photogrammetric Precise Surveying Based on the Adjusted 3D Control Linear Network Deployed on a Measured Object

Krzysztof Karsznia and Edward Osada
1 Faculty of Geodesy and Cartography, Warsaw University of Technology, Pl. Politechniki 1, 00-661 Warsaw, Poland
2 Faculty of Technical Sciences, University of Lower Silesia, Wagonowa St. 9, 53-609 Wrocław, Poland
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(9), 4571; https://doi.org/10.3390/app12094571
Submission received: 3 March 2022 / Revised: 20 April 2022 / Accepted: 29 April 2022 / Published: 30 April 2022
(This article belongs to the Section Civil Engineering)

Featured Application

The elaborated method can be applied to object dimensioning, volume calculation, or displacement measurement. The approach uses popular, commonly available instruments, making the solution universal.

Abstract

In surveying engineering tasks, close-range photogrammetry is among the leading technologies considering aspects such as achievable accuracy, availability of hardware and software, access to measured objects, and economy. Hence, continued studies on photogrammetric data processing are desirable. In industrial applications especially, the control points for close-range photogrammetry are usually measured using total stations. In the case of smaller items, more precise positions of control points can be obtained by deploying and adjusting a three-dimensional linear network located on the object. This article analyzes the accuracy of the proposed method based on the measurement of the linear network using a professional tape with a precision of ±1 mm. It is shown what accuracy of object feature dimensioning can be obtained with the proposed innovative network method for photo-point measurement, using only the minimum required number of two stereo images. The photogrammetric 3D model derived from them, captured with a non-metric camera, is characterized by the highest possible precision, which qualifies the presented approach for accurate measurements in surveying engineering. The authors prove that the distance between two arbitrarily selected points derived from the 3D model of a dimensioned object is equal to the actual distance measured directly on it with one-millimeter accuracy.

1. Introduction

Close-range photogrammetry (CRP) is a versatile surveying technology able to create models of objects of different sizes, from small industrial elements to large buildings [1,2]. The accuracy of CRP depends mostly on the pixel size, the distance between subsequent camera positions, and the respective distances to the photographed item. Moreover, an essential factor is the distribution, density, and accuracy of the control points. Such photo-points deployed on engineering objects and captured with CRP or terrestrial laser scanning (TLS) are usually measured using classical methods, for example with total stations [3,4,5,6,7]. CRP techniques are used in various engineering tasks related to studying the geometry of structures [8], both on the macro and the micro scale [9,10]. Given the current state of the art in land surveying and the capacities offered by modern computers, it is possible to apply CRP to capture both static objects and objects in motion. For example, in publication [11], the authors focused on a self-developed photogrammetric system consisting of dedicated hardware and employing a unique adjustment strategy to determine the accurate position of the mapped elements. They proved that such an integrated approach significantly increases positioning accuracy, allowing for fast registration of existing scenes. A GNSS satellite receiver and an IMU inertial unit were used as additional devices supporting the integrated surveying. Another work [12] discusses precisely capturing geometric changes in 2D using CRP. The proposed research method achieves high accuracy in determining displacements of the object's control points, even at sub-millimeter values. The presented method consists of a geometric system utilizing different camera positions and is mainly dedicated to indoor applications. The high accuracy was confirmed by a comparative study evaluating the displacements of the same points placed on a test stand using micrometers with a resolution of 0.01 mm. The root-mean-square (RMS) values of the differences between the predefined and evaluated displacements in the two directions (local X and Y) are 0.11 and 0.02 mm, which predisposes this approach to metrological tasks. However, relatively small entities placed in a limited space were assessed in this case. A comparative study on the influence of the number of control points on CRP positioning is presented in [13]. Nevertheless, the studied solution concerns aerial photogrammetry using unmanned aerial vehicles (UAVs). The obtained accuracies vary within a few centimeters, which suits the solution to mapping displacements of natural objects of considerable size (for example, open-pit mines, geotechnical structures, slopes, or embankments) [14]. All previously cited works, together with [15], confirm the usefulness of photogrammetric methods in industrial solutions that require the exact determination of the examined geometric elements. Recently, low-cost solutions have also become very popular in the precise dimensioning of objects [16]. This approach uses inexpensive devices and technologies, competing with well-known, classic solutions provided by recognized manufacturers. The study [17] presented a low-cost CRP system for imaging building façades. Such a system uses a specially designed measuring set combining a camera with a laser distance meter, making it possible to conduct measurements with millimeter-level accuracy.
The authors used self-developed software to assess the impact of various uncertainties (including camera calibration errors) on the surveying results. In this case, a crucial factor was determining the measurement set's geometric parameters, which governs the absolute accuracy of the derived object model. CRP is also successfully used together with other techniques for remote detection of object geometry [18], such as TLS. An example of such integration in archaeological applications is shown in [19]. The presented approach allows for accurately mapping objects in 3D, treating precisely determined geometric data (distances between points) as a significant enhancement of a point cloud. A similar integration of remote sensing imagery with numerical terrain models is presented in [20]. In the accurate and reliable determination of the geometric parameters of engineering objects using photogrammetry, a crucial role is played not only by the instrumental aspect but, perhaps above all, by the methodological approach. Developing an optimal strategy for acquiring, processing, and analyzing measurement results is a pivotal problem of modern geomatics. The issues mentioned above motivated the authors to develop and test a CRP method based on adjusting a spatial control network located either on a test object or in its proximity. The article analyzes the accuracy of the proposed method based on prior surveying of a linear control network using a measuring tape with a precision of ±1 mm. Such accuracy can hardly be obtained with a common total station, since targeting and measuring distances to object edges, corners, and other characteristic points are always problematic, especially in reflectorless surveys. We demonstrate the accuracy of object dimensioning achievable with the proposed innovative photo-point network deployment using only the minimum required number of two stereo images.

2. Materials and Methods

The developed procedure is based on a classic stereo pair of pictures taken of a test object using a popular camera. The control network, consisting of measured linear values, is marked directly on the object; its points represent the object's characteristic elements. The control points obtain Cartesian coordinates expressed in a local system encompassing the pictured entity, as well as pixel coordinates determined on the left and right photographs. Then, the measured linear elements of the control network, related to the local coordinates of the selected points, are adjusted by the least-squares method employing the Gauss-Markov model [21]. The adjusted positions of the control points are then used to determine the orientation and distortion parameters of the left and right images, respectively. For the resulting central projections, an accuracy assessment is carried out considering the estimated errors of their parameters. Finally, the photogrammetrically derived linear elements of the control network are compared with the actual measures obtained from the direct object survey.
To verify the developed method, we took control measurements in a university classroom, using an office desk with some auxiliary elements as the test object. The control network, oriented in the local XYZ coordinate system, encompassed all its characteristic points (object vertices). All linear components were measured three times with a surveying tape, thus determining their values with ±1 mm precision. The 3D linear network established on the investigated object consisted of 12 points joined by 40 distances. Then, using a professional single-lens reflex camera, a Canon EOS 500D [22], two photos were taken covering the item from its left and right sides (Figure 1 and Figure 2).
The key technical parameters of the camera used are presented in Table 1 [23].
It should be added that the camera is commonly available on the market, emphasizing the low-cost character of the presented method. For the experiment, the photos were taken using the classic indoor shooting program at short distances to the subject (portrait function at a fixed focal length, f/2 aperture, shutter speed 1/200 s, ISO 400, no flash).
All mathematical operations were performed using Mathcad v. 15.0 software (PTC, https://www.mathcad.com, accessed on 28 April 2022; previously developed by Mathsoft) by programming appropriate calculation worksheets.

3. Results

As shown in Figure 1 and Figure 2, the reference frame X, Y, Z is connected with the network, assuming the Z-axis passes through point 1 perpendicularly to the plane containing points 1, 2, and 4. The Y-axis is parallel to the section |1–2|, and the third orthogonal axis X is parallel to |1–4|. Hence, the three reference points 1, 2, and 4 of the network obtained the following coordinates: 1 (0, 0, h), 2 (0, |1–2|, h), 4 (|1–4|, 0, h). The sections |1–2| and |1–4| represent the measured distances between points 1 and 2 as well as between 1 and 4, respectively. The value h is the approximate height of the object. The approximate coordinates X, Y, Z of all remaining points 3, 5, …, 12 are computed by solving the network consisting of 40 measured distances between all points.
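One plausible way to obtain these approximate coordinates is successive trilateration from points whose positions are already known. The following minimal Python sketch is our illustration of that initialization, not the authors' Mathcad worksheet; the function name and the numeric values are hypothetical.

```python
import numpy as np

def trilaterate(anchors, dists, iters=20):
    """Approximate XYZ of a point from measured distances to >= 3 points
    with known coordinates, via Gauss-Newton iteration on s0 = |x - anchor|."""
    anchors = np.asarray(anchors, float)        # (k, 3) known network points
    dists = np.asarray(dists, float)            # (k,) tape-measured distances
    x = anchors.mean(axis=0) + 0.1              # offset the start to steer away
                                                # from the mirror solution
    for _ in range(iters):
        diff = x - anchors
        s0 = np.linalg.norm(diff, axis=1)       # distances at current estimate
        J = diff / s0[:, None]                  # Jacobian of s0 w.r.t. x
        dx, *_ = np.linalg.lstsq(J, dists - s0, rcond=None)
        x += dx
        if np.linalg.norm(dx) < 1e-10:
            break
    return x

# Hypothetical example: datum points 1, 2, 4 of the frame definition with
# h = 0.75 m, |1-2| = 1.20 m, |1-4| = 0.60 m, and illustrative tape readings.
p = trilaterate([[0.0, 0.0, 0.75], [0.0, 1.20, 0.75], [0.60, 0.0, 0.75]],
                [1.167, 0.802, 1.060])          # -> approx. (0.5, 0.9, 1.30)
```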
To evaluate the proposed approach, we consider the example 3D network presented in Figure 3, which exemplifies a standard spatial network used in land surveying. The linear observation equation of a measured spatial distance between any two network points P (XP, YP, ZP) and Q (XQ, YQ, ZQ) is based on the well-known formula used in three-dimensional surveying [24]:
v = -\frac{X_Q - X_P}{s_0}\, dX_P - \frac{Y_Q - Y_P}{s_0}\, dY_P - \frac{Z_Q - Z_P}{s_0}\, dZ_P + \frac{X_Q - X_P}{s_0}\, dX_Q + \frac{Y_Q - Y_P}{s_0}\, dY_Q + \frac{Z_Q - Z_P}{s_0}\, dZ_Q - (s - s_0), \quad \sigma_s \qquad (1)
where dXP, dYP, dZP and dXQ, dYQ, dZQ are corrections to the approximate coordinates XP, YP, ZP and XQ, YQ, ZQ of the points P and Q, respectively; v is the random error of the distance measurement s, with zero expectation and known standard deviation σs = 1 mm; and s0 is the approximate value of the measured distance s:
s_0 = \sqrt{(X_Q - X_P)^2 + (Y_Q - Y_P)^2 + (Z_Q - Z_P)^2} \qquad (2)
The Gauss-Markov observational model [21,25] for all measured spatial distances s1, s2, …, sn, n = 40 (Figure 1 and Figure 2), takes the form of Equation (3):
v = A\,x - (s - s_0), \quad \Sigma_s \qquad (3)
where x is the vector of unknown corrections dY2, dX3, dY3, dZ3, dX4, dY4, dX5, dY5, dZ5, …, dX12, dY12, dZ12; v is the vector of unknown observational error residuals v1, v2, …, vn; A is the known design matrix; s is the vector of observations s1, s2, …, sn; s0 is the vector of approximate values of the observations s0,1, s0,2, …, s0,n; and Σs is the diagonal covariance matrix of the observations s, composed of the standard deviations σs,1, σs,2, …, σs,n.
In the adjustment, the network is connected with the reference frame by the three coordinates X1, Y1, Z1 of point 1, the two coordinates X2, Z2 of point 2, and the one coordinate Z4 of point 4. These coordinates are held constant, i.e., dX1 = dY1 = dZ1 = 0, dX2 = dZ2 = 0, and dZ4 = 0.
The least-squares solution of the Gauss-Markov model (3), minimizing $v^{T} \Sigma_s^{-1} v$, is given by the well-known formulas [21,26]:
x = \Sigma_x A^{T} \Sigma_s^{-1} (s - s_0), \qquad (4)
where the equations:
\Sigma_x = \left(A^{T} \Sigma_s^{-1} A\right)^{-1} \qquad (5)

\Sigma_v = \Sigma_s - A\, \Sigma_x A^{T} \qquad (6)
represent the covariance matrices of x and v, respectively.
The maximal value of the standard deviations σX, σY, σZ of the adjusted coordinates Y2 + dY2, X3 + dX3, Y3 + dY3, Z3 + dZ3, X4 + dX4, Y4 + dY4, X5 + dX5, Y5 + dY5, Z5 + dZ5, …, X12 + dX12, Y12 + dY12, Z12 + dZ12, taken from the diagonal elements of the covariance matrix (5), is equal to 1.2 mm.
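The adjustment itself reduces to a few matrix operations. A minimal NumPy sketch of Equations (1) and (3)-(6) follows; it is our reconstruction for illustration (function and variable names are ours), not the authors' Mathcad worksheet, and in practice the step is repeated with updated approximate coordinates until the corrections vanish.

```python
import numpy as np

def adjust_network(coords0, edges, s, sigma, free):
    """One Gauss-Newton step of the Gauss-Markov adjustment, Equations (3)-(6).
    coords0 : (m, 3) approximate XYZ of the network points
    edges   : n pairs (i, j) of point indices, one per measured distance
    s, sigma: (n,) measured distances and their standard deviations
    free    : ordered (point, axis) pairs treated as unknowns; the datum
              coordinates (X1, Y1, Z1, X2, Z2, Z4) are simply left out."""
    coords0, s = np.asarray(coords0, float), np.asarray(s, float)
    n, u = len(edges), len(free)
    col = {pa: k for k, pa in enumerate(free)}   # (point, axis) -> column of A
    A, s0 = np.zeros((n, u)), np.zeros(n)
    for r, (i, j) in enumerate(edges):
        d = coords0[j] - coords0[i]
        s0[r] = np.linalg.norm(d)
        for ax in range(3):                      # coefficients of Equation (1)
            if (i, ax) in col:
                A[r, col[(i, ax)]] = -d[ax] / s0[r]
            if (j, ax) in col:
                A[r, col[(j, ax)]] = d[ax] / s0[r]
    Ss = np.diag(np.asarray(sigma, float) ** 2)  # diagonal covariance of s
    W = np.linalg.inv(Ss)
    Sx = np.linalg.inv(A.T @ W @ A)              # Equation (5)
    x = Sx @ A.T @ W @ (s - s0)                  # Equation (4)
    Sv = Ss - A @ Sx @ A.T                       # Equation (6)
    return x, Sx, Sv                             # sqrt(diag(Sx)) -> sigmaX, Y, Z
```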
The adjusted positions of the network points 1, 2, …, 12 are used as so-called control points for determining the orientation and distortion parameters of the left and right stereo photos in the object reference frame (X, Y, Z).

3.1. The Photogrammetric Central Projection

The photogrammetric central projection maps the three-dimensional object space (X, Y, Z) onto the two-dimensional image plane (x, y) (Figure 1 and Figure 2). The central projection equations (X, Y, Z) → (x, y) are given by the following [26,27]:
x = x_0 - f_s\, \frac{r_{11}(X - X_0) + r_{12}(Y - Y_0) + r_{13}(Z - Z_0)}{r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)} - \left[k_1\left((x - x_0)^2 + (y - y_0)^2\right) + k_2\left((x - x_0)^2 + (y - y_0)^2\right)^2\right](x - x_0) - k_3\left[3(x - x_0)^2 + (y - y_0)^2\right] - 2 k_4 (x - x_0)(y - y_0) \qquad (7)

y = y_0 + f_s\, \frac{r_{21}(X - X_0) + r_{22}(Y - Y_0) + r_{23}(Z - Z_0)}{r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)} - \left[k_1\left((x - x_0)^2 + (y - y_0)^2\right) + k_2\left((x - x_0)^2 + (y - y_0)^2\right)^2\right](y - y_0) - 2 k_3 (x - x_0)(y - y_0) - k_4\left[(x - x_0)^2 + 3(y - y_0)^2\right] \qquad (8)
where X, Y, Z are the coordinates of an object point P; x, y are the pixel image coordinates of the projected point P; X0, Y0, Z0 are the coordinates of the central projection point O; x0, y0 are the image pixel coordinates of the central point O; k1, k2, k3, k4 are the camera distortion parameters; and fs = f/s, where f is the focal length of the camera and s is the CCD cell size of the camera, whilst:
\begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix} = \begin{bmatrix} \cos\varphi \cos\kappa & \cos\omega \sin\kappa + \sin\omega \sin\varphi \cos\kappa & \sin\omega \sin\kappa - \cos\omega \sin\varphi \cos\kappa \\ -\cos\varphi \sin\kappa & \cos\omega \cos\kappa - \sin\omega \sin\varphi \sin\kappa & \sin\omega \cos\kappa + \cos\omega \sin\varphi \sin\kappa \\ \sin\varphi & -\sin\omega \cos\varphi & \cos\omega \cos\varphi \end{bmatrix} \qquad (9)
is the rotation matrix (ω, φ, ϰ) of the object reference frame (X, Y, Z) relative to the camera reference frame (Xk, Yk, Zk). The origin of the frame (Xk, Yk, Zk) is the central projection point O, the axes Xk, Yk run parallel to the sides of the CCD frame, and Zk is perpendicular to the CCD frame. The six parameters X0, Y0, Z0, ω, φ, ϰ define the external orientation of the camera reference frame (Xk, Yk, Zk) with respect to the object reference frame (X, Y, Z). The three parameters x0, y0, f = fs·s define the internal orientation of the image with respect to the camera reference frame (Xk, Yk, Zk).
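For clarity, a compact NumPy sketch of Equations (7)-(9) is given below. Since Equations (7) and (8) are implicit in x and y, the sketch evaluates the distortion terms at the undistorted projection, which is our simplifying assumption; all names are illustrative.

```python
import numpy as np

def rotation(omega, phi, kappa):
    """Rotation matrix of Equation (9)."""
    so, co = np.sin(omega), np.cos(omega)
    sp, cp = np.sin(phi), np.cos(phi)
    sk, ck = np.sin(kappa), np.cos(kappa)
    return np.array([
        [cp * ck,  co * sk + so * sp * ck,  so * sk - co * sp * ck],
        [-cp * sk, co * ck - so * sp * sk,  so * ck + co * sp * sk],
        [sp,       -so * cp,                co * cp]])

def project(P, X0, angles, x0, y0, fs, k):
    """Central projection (X, Y, Z) -> (x, y) with radial (k1, k2) and
    tangential (k3, k4) distortion, after Equations (7) and (8)."""
    u = rotation(*angles) @ (np.asarray(P, float) - np.asarray(X0, float))
    xu = x0 - fs * u[0] / u[2]                   # undistorted pixel coordinates
    yu = y0 + fs * u[1] / u[2]
    dx, dy = xu - x0, yu - y0
    r2 = dx * dx + dy * dy
    k1, k2, k3, k4 = k
    x = xu - (k1 * r2 + k2 * r2 ** 2) * dx - k3 * (3 * dx * dx + dy * dy) \
        - 2 * k4 * dx * dy
    y = yu - (k1 * r2 + k2 * r2 ** 2) * dy - 2 * k3 * dx * dy \
        - k4 * (dx * dx + 3 * dy * dy)
    return np.array([x, y])
```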

3.2. The Photogrammetric Spatial Resection

The photogrammetric spatial resection method allows for determining the orientation (X0, Y0, Z0, ω, φ, ϰ, x0, y0, fs) and distortion (k1, k2, k3, k4) parameters of an image based on known control points deployed on the object [28]. In the case of the left image, the 13 orientation and distortion parameters were computed by the least-squares method, solving the set of 22 Equations (7) and (8) for the eleven control points (out of points 1, 2, …, 12) visible in Figure 1. The camera's determined position (X0, Y0, Z0) for the left image (L) is shown in Figure 4. The pixel coordinates xcomp, ycomp of the control points computed after adjustment according to Equations (7) and (8) are not exactly equal to the measured values x, y. The root-mean-square (RMS) values of the obtained pixel coordinate residuals εx = x − xcomp, εy = y − ycomp are equal to 1.8 and 2.0 pixels, respectively.
In the case of the right image, the 13 orientation and distortion parameters were computed analogously, solving the set of 22 Equations (7) and (8) for the eleven control points visible in Figure 2 by the least-squares method. The camera's determined position (X0, Y0, Z0) for the right image (R) is shown in Figure 4. The RMS values of the obtained pixel coordinate residuals εx = x − xcomp, εy = y − ycomp are equal to 1.3 and 1.6 pixels, respectively.
Since this is a nonlinear least-squares problem, the approximate values of the parameters were computed using the Direct Linear Transformation (DLT) method [29].
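A hedged sketch of this resection step, reusing project() from the sketch above and SciPy's Levenberg-Marquardt solver (the DLT initialization itself is omitted), could look as follows; the function name and parameter layout are our assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def resect(obj_pts, img_pts, p0):
    """Spatial resection: estimate the 13 parameters of one image,
    p = [X0, Y0, Z0, omega, phi, kappa, x0, y0, fs, k1, k2, k3, k4],
    from control points by nonlinear least squares on Equations (7)
    and (8); p0 is the DLT-derived initial guess."""
    def residuals(p):
        return np.concatenate(
            [project(P, p[0:3], p[3:6], p[6], p[7], p[8], p[9:13]) - q
             for P, q in zip(obj_pts, img_pts)])
    sol = least_squares(residuals, p0, method="lm")
    eps = residuals(sol.x).reshape(-1, 2)
    rms = np.sqrt((eps ** 2).mean(axis=0))   # RMS of eps_x, eps_y in pixels
    return sol.x, rms
```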

3.3. The Photogrammetric Triangulation

Photogrammetric triangulation is a method of simultaneously determining the orientation and distortion parameters of stereo or multiple images as well as the coordinates X, Y, Z of new so-called tie points located on the object [30]. In our experiment, we chose only one tie point, A (Figure 1 and Figure 2). In this case, the 26 orientation and distortion parameters of the left and right stereo images plus the three coordinates XA, YA, ZA of the tie point A were computed by solving the set of 48 Equations (7) and (8) by the least-squares method [31]: 24 equations for the twelve points visible on the left image (eleven control points plus A) and 24 equations for the twelve points visible on the right image. The RMS values of the obtained pixel coordinate residuals εx, εy are equal to 1.5 and 2.1 pixels, respectively. The determined positions of the camera in the left and right positions with respect to the 12 network points are shown in Figure 4.
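The triangulation extends the same residual stacking to both images and the tie point. A minimal sketch under the same assumptions (reusing project() and least_squares from above, with illustrative names):

```python
import numpy as np
from scipy.optimize import least_squares

def triangulate(ctrl_pts, img_L, img_R, pL0, pR0, A0):
    """Two-image triangulation: refine both 13-parameter image sets and the
    tie point A simultaneously (26 + 3 unknowns against 48 residuals)."""
    def residuals(q):
        pL, pR, A = q[:13], q[13:26], q[26:29]
        pts = np.vstack([ctrl_pts, A])       # control points plus tie point A
        res = [project(P, p[0:3], p[3:6], p[6], p[7], p[8], p[9:13]) - uv
               for p, img in ((pL, img_L), (pR, img_R))
               for P, uv in zip(pts, img)]
        return np.concatenate(res)
    q0 = np.concatenate([pL0, pR0, A0])
    return least_squares(residuals, q0, method="lm").x
```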

3.4. The Accuracy of the Photogrammetric Intersection

The photogrammetric intersection is a method of computing the coordinates of object points based on spatially oriented stereo or multiple images referenced in the object system X, Y, Z. For example, the coordinates XB, YB, ZB of the new object point B (Figure 1 and Figure 2) are computed by solving four Equations (7) and (8) by the least-squares method: two equations for the measured pixel coordinates of the projected point B on the left image (Figure 1) and two equations for the projected point B on the right image (Figure 2). The computed spatial distance between points A (XA, YA, ZA) and B (XB, YB, ZB) is [(XB − XA)^2 + (YB − YA)^2 + (ZB − ZA)^2]^{1/2} = 0.210 m. The distance between the same points measured by tape is also 0.210 m, so in this case the error of the photogrammetrically measured distance between points A and B on the object equals 0.000 m.
In the second example, the distance between points C and D is computed (Figure 1 and Figure 2). The coordinates of points C and D were computed in the same way as those of point B. The computed spatial distance between the photogrammetrically determined points C (XC, YC, ZC) and D (XD, YD, ZD) is [(XD − XC)^2 + (YD − YC)^2 + (ZD − ZC)^2]^{1/2} = 0.989 m. The distance between these points measured by tape equals 0.990 m, so in this experiment the error of the photogrammetrically measured distance between points C and D equals 1 mm. For clarity, the aforementioned results are summarized in Table 2.
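A final sketch under the same assumptions shows the intersection and the distance check summarized in Table 2; variable names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def intersect(uv_L, uv_R, pL, pR, P0):
    """Photogrammetric intersection: the object point whose projections onto
    the two oriented images match the measured pixel coordinates
    (four equations, three unknowns)."""
    def residuals(P):
        return np.concatenate([
            project(P, pL[0:3], pL[3:6], pL[6], pL[7], pL[8], pL[9:13]) - uv_L,
            project(P, pR[0:3], pR[3:6], pR[6], pR[7], pR[8], pR[9:13]) - uv_R])
    return least_squares(residuals, P0, method="lm").x

# e.g., A = intersect(uv_A_L, uv_A_R, pL, pR, P0_A); B likewise; then
# np.linalg.norm(B - A) is compared with the tape-measured distance.
```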

4. Discussion

Close-range photogrammetry is used in many engineering fields. Employing it for object dimensioning and assessing displacement values requires developing and testing an appropriate strategy. In the case of small objects, more precise control points can be established by deploying and adjusting a three-dimensional linear network on the object. Our experiment shows that the adjusted positions of the control points are determined with a high accuracy of about ±1 mm. Furthermore, such high accuracy extends to the entire 3D photogrammetric model derived from the control points deployed directly on the examined structure. Hence, it is possible to achieve the desired measurement uncertainty with only two images taken with the non-metric camera Canon EOS 500D.
The proposed approach is flexible and can be freely modified or integrated with other surveying technologies. For example, it can be utilized when dimensioning indoor spaces for area and volume calculations. Another possibility is to use it in displacement monitoring, control measurements, or finishing works on a construction site.
In contemporary geomatics, the deciding factors when choosing a particular surveying method include its availability and time consumption. In the discussed case, both conditions are successfully fulfilled. Using a non-metric, commonly available camera makes the solution low-cost. Data processing is also not time-consuming, and the algorithm can be run on a standard computer. The method is applicable to any object on which a three-dimensional linear network can be established. Such point deployment is possible in different ways: by selecting characteristic object points (for example, its vertices and edges) or by projecting light spots using a common laser emitter. The network can be measured using a tape or a hand-held laser distance meter or, in the case of limited object accessibility, a standard total station offering a reflectorless distance measurement option. In such a case, however, one should expect a final accuracy worse than 1 mm.
Limitations of the developed method may include poor lighting conditions on site, insufficient camera resolution, and the inability to cover the object with a control network. It should be noted that the network points may be located both at characteristic features of the measured object and at points marked using available laser projectors. These items are the subject of further research conducted by the authors.

Author Contributions

Conceptualization, E.O.; methodology, K.K. and E.O.; validation, K.K. and E.O.; writing—original draft preparation, K.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the co-author Edward Osada: [email protected].

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

1. Pozzoli, A.; Mussio, L. Quick solutions particularly in close range photogrammetry. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, XXXIV, 273–278.
2. Osada, E.; Sośnica, K.; Borkowski, A.; Owczarek-Wesołowska, M.; Gromczak, A. Direct Georeferencing Method for Terrestrial Laser Scanning Using GNSS Data and the Vertical Deflection from Global Earth Gravity Models. Sensors 2017, 17, 1489.
3. Remondino, F.; El-Hakim, S. Image-Based 3D Modelling: A Review. Photogramm. Rec. 2006, 21, 269–291.
4. Luhmann, T.; Robson, S.; Kyle, S.; Boehm, J. Close Range Photogrammetry and 3D Imaging, 2nd ed.; Walter de Gruyter GmbH: Berlin, Germany; Boston, MA, USA, 2013; pp. 221–253.
5. Cyganek, B.; Siebert, J.P. An Introduction to 3D Computer Vision, Techniques and Algorithms; John Wiley & Sons Ltd.: Hoboken, NJ, USA, 2009; pp. 323–342.
6. Mata, E.; Hernandez, M.A.; Cardenal, J.; Perez, J.L. Assisted control point measurement for close range photogrammetry. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, XXXIX-B5, 369–373.
7. Osada, E.; Owczarek-Wesołowska, M.; Ficner, M.; Kurpinski, G. Total Station/GNSS/EGM integrated geocentric positioning method. Surv. Rev. 2017, 49, 206–211.
8. Luhmann, T.; Robson, S.; Kyle, S.; Harley, I. Close Range Photogrammetry—Principles, Techniques and Applications; Whittles Publishing: Scotland, UK, 2011; pp. 135–181.
9. Revilla-León, M.; Att, W.; Özcan, M.; Rubenstein, J. Comparison of conventional, photogrammetry, and intraoral scanning accuracy of complete-arch implant impression procedures evaluated with a coordinate measuring machine. J. Prosthet. Dent. 2021, 125, 470–478.
10. Sapirstein, P. A high-precision photogrammetric recording system for small artifacts. J. Cult. Herit. 2018, 31, 33–45.
11. Yakar, M.; Yilmaz, H.M.; Mutluoglu, O. Close range photogrammetry and robotic total station in volume calculation. Int. J. Phys. Sci. 2010, 5, 86–96.
12. Navarro, S.; Lerma, J.L. Accuracy analysis of a mobile mapping system for close range photogrammetric projects. Measurement 2016, 93, 148–156.
13. Khalil, A.M. Two-dimensional displacement measurement using static close range photogrammetry and a single fixed camera. Alex. Eng. J. 2011, 50, 219–227.
14. Ding-bang, Z.; Yi, Z.; Tao, C.; Yuan, M.; Kun, F.; Ankit, G.; Akhil, G. Measurement of displacement for open pit to underground mining transition using digital photogrammetry. Measurement 2017, 109, 187–199.
15. Ye, N.; Zhu, H.; Wei, M.; Zhang, L. Accurate and dense point cloud generation for industrial measurement via target-free photogrammetry. Opt. Lasers Eng. 2021, 140, 106521.
16. Agüera-Vega, F.; Carvajal-Ramírez, F.; Martínez-Carricondo, P. Assessment of photogrammetric mapping accuracy based on variation ground control points number using unmanned aerial vehicle. Measurement 2017, 98, 221–227.
17. Ćmielewski, K.; Karsznia, K.; Kuchmister, J.; Gołuch, P.; Wilczyńska, I. Accuracy and functional assessment of an original low-cost fibre-based inclinometer designed for structural monitoring. Open Geosci. 2020, 12, 1052–1059.
18. Esmaeili, F.; Varshosaz, M.; Ebadi, H. Displacement measurement of the soil nail walls by using close range photogrammetry and introduction of CPDA method. Measurement 2013, 46, 3449–3459.
19. Ordóñez, C.; Martínez, J.; Arias, P.; Armesto, J. Measuring building façades with a low-cost close-range photogrammetry system. Autom. Constr. 2010, 19, 742–749.
20. Lerma, J.L.; Navarro, S.; Cabrelles, M.; Villaverde, V. Terrestrial laser scanning and close range photogrammetry for 3D archaeological documentation: The Upper Palaeolithic Cave of Parpalló as a case study. J. Archaeol. Sci. 2010, 37, 499–507.
21. Christensen, R. (Ed.) General Gauss–Markov models. In Plane Answers to Complex Questions: The Theory of Linear Models, 4th ed.; Springer: New York, NY, USA, 2011; pp. 237–266.
22. Canon EOS 500D Specifications—Technical Brochure; Canon Hongkong Company Limited: Hong Kong, China, 2008.
23. Esmaeili, F.; Ebadi, H.; Saadatseresht, M.; Kalantary, F. Application of UAV Photogrammetry in Displacement Measurement of the Soil Nail Walls Using Local Features and CPDA Method. ISPRS Int. J. Geo-Inf. 2019, 8, 25.
24. Forstner, W.; Wrobel, B.P. Photogrammetric Computer Vision, Statistics, Geometry, Orientation and Reconstruction; Springer International: Cham, Switzerland, 2016; pp. 195–246.
25. Isotalo, J. Linear Estimation and Prediction in the General Gauss–Markov Model. Ph.D. Dissertation, University of Tampere, Tampere, Finland, 2007. Available online: https://trepo.tuni.fi/bitstream/handle/10024/67741/978-951-44-7018-9.pdf?sequence=1 (accessed on 24 February 2022).
26. Jiang, R.; Jauregui, D.V.; White, K.R. Close-range photogrammetry applications in bridge measurement: Literature review. Measurement 2008, 41, 823–834.
27. Luhmann, T. Close range photogrammetry for industrial applications. J. Photogramm. Remote Sens. 2010, 65, 558–569.
28. Wang, X.; Clarke, T.A. Separate adjustment of close-range photogrammetric measurements. J. Photogramm. Remote Sens. 1998, XXXII, 177–184.
29. El-Ashmawy, K.L.A. Using direct linear transformation (DLT) method for aerial photogrammetry applications. Geod. Cartogr. 2018, 44, 71–79.
30. Schenk, K. From point-based to feature-based aerial triangulation. J. Photogramm. Remote Sens. 2004, 58, 315–329.
31. Bellamy, C.; Watterson, G. Least Squares Adjustment of networks. Surv. Rev. 1970, 20, 250–258.
Figure 1. The left stereo picture with the 3D precise linear network on the object (the points 1, …, 12, A, B, C, D represent characteristic vertices of the object; X, Y, Z define a local 3D coordinate system; x, y represent pixel image coordinates, while h is the object’s approximate height).
Figure 2. The right stereo picture with 3D precise linear network on the object (the points 1, …, 12 and A, B, C, D represent characteristic vertices of the object; X, Y, Z define a local 3D coordinate system; x, y represent pixel image coordinates).
Figure 3. Exemplary spatial network.
Figure 4. Determined camera locations L-left and R-right with 12 control points deployed on the test object.
Table 1. Key technical parameters of the camera Canon EOS 500D.

Parameter | Description
Type | Digital AF/AE SLR camera with built-in flash
Recording media | SD memory card, SDHC memory card
Image type | JPEG, RAW (14-bit, Canon original)
Image sensor type | CMOS sensor
Image sensor size | 22.3 × 14.9 mm
Resolution | Effective pixels: approx. 15.10 MPx; total pixels: approx. 15.50 MPx
Color filter system | RGB primary color filter
Low-pass filter | Fixed position in front of the image sensor
Focus modes | Servo mode; auto-focus; manual focus
Exposure control | Full Auto, Portrait, Landscape, Close-up, Sports, Night Portrait, Flash Off, Creative Auto, Program, shutter-priority AE, aperture-priority AE, auto depth-of-field AE, manual exposure, E-TTL II autoflash, movie shooting
Table 2. The accuracy of the photogrammetric intersection expressed as a comparison of distances between selected points measured by different methods.

From Point | To Point | Tape-Measured Distance [m] | Photogrammetrically Measured Distance [m] | Residual Value σ [m]
A | B | 0.210 | 0.210 | 0.000
C | D | 0.990 | 0.989 | 0.001
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
