Article

3D Displacement Measurement of Railway Bridge According to Cyclic Loads of Different Types of Railcars with Sequential Photogrammetry

1 Department of Civil Engineering, Sunchon National University, Sunchon 57922, Republic of Korea
2 Department of Civil Engineering, Interdisciplinary Major of Ocean Renewable Energy Engineering, Korea Maritime and Ocean University, Busan 49112, Republic of Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(3), 1359; https://doi.org/10.3390/app13031359
Submission received: 18 November 2022 / Revised: 11 January 2023 / Accepted: 17 January 2023 / Published: 19 January 2023

Featured Application

The proposed method can be used for 3D displacement measurement of a large-scale civil structure using automated sequential photogrammetry.

Abstract

In the early days of railroads in Korea, railway bridges were constructed as steel plate-girder structures, which are vulnerable to vibration and torsion. Many of these bridges have since been replaced with concrete-slab structures, which offer high stability. Nevertheless, steel railway bridges remain in service throughout the country, and considerable manpower and cost are invested in their maintenance and repair. Moreover, few experimental analyses have measured the response to the cyclic loads that occur when a train enters such a bridge. To ensure bridge safety, deformations must be inspected periodically. To this end, the present study proposes a sequential photogrammetric technique for measuring the deformation of a steel railway bridge under three types of railcars. Sequential stereo images of the bridge with multiple feature points are obtained using synchronized cameras, and the ground coordinates of each point are determined as a function of time by space intersection, based on relative orientation with the coplanarity condition and a scale adjustment. All of these processes are automated and require only the cameras and the targets. With this setup, the three-dimensional dynamic motion of the bridge due to the cyclic loading of trains could be measured. In addition, the displacements obtained with the proposed method were compared to those obtained with a 3D laser tracker. At camera-to-object distances of about 9 m, the horizontal displacement errors did not exceed 0.5 mm and the vertical error was within 2.3 mm in terms of root mean square error (RMSE).

1. Introduction

Railway bridges are constantly exposed to the cyclic loads of trains. These loads cause structural damage that can eventually lead to collapse, so it is crucial to check deformation periodically to ensure bridge safety. In general, bridge deformation is measured either with contact sensors attached directly to the bridge, such as strain gauges, or with non-contact instruments, such as a laser tracker. On-site measurement with either type requires skilled technicians, considerable cost, and a complicated work process. Moreover, both conventional methods generally measure deformation in only a single direction at a limited number of points on the bridge, so many instruments are required to measure many points [1].
Recently, photogrammetry—a non-contact technique based on camera images—has been widely used for the 3D modeling of large-scale structures. In particular, this technique has been applied in many fields to measure the 3D coordinates of target objects from 2D images [2,3,4]. Altuntas et al. [5] used low-cost cameras to document a historical masonry arch bridge. Gardin and Jimenez [6] compared optical (including infrared) cameras with laser scanning for 3D model generation and inspection of a railway bridge. Pučko et al. [7] applied time-series 3D scanning to monitor construction progress. Brusak [8] compared laser tracker measurements with laser, photo-based, and handheld scanning for 3D modeling in shipbuilding. The most popular hardware for these purposes is optical cameras and laser scanners mounted on a tripod or a UAV. Subramanian and Gheisari [9] compared 360-degree panoramic photogrammetry with a laser scanner. Sahebdivani et al. [10] applied a UAV to rail track detection with 3D modeling. There are also studies on photogrammetric applications in soil improvement [11,12].
This technique has the potential to replace the costly measurement equipment currently needed to investigate structures, because the processing can be automated and the observations are safer [13]. The technique is also known as digital image correlation (DIC), which is used to measure displacements and strains in many fields of science and engineering [14,15,16,17,18,19]. Photogrammetric methods have been investigated for displacement measurement as a non-contact, camera-based technique since the 1970s [20,21]. In the field of civil engineering, displacement measurement is very important for verifying that the displacements that occur remain within safety limits [22]. Detchev et al. [23] measured the deformation of a concrete beam using a low-cost camera and found a vertical RMSE of 0.06 mm. Fujita et al. [24] used image processing for the 3D measurement of structural deformation on a shake table. Sánchez-Aparicio et al. [25] applied bundle adjustment to a full-scale two-story unreinforced masonry structure and reported displacement detection of 0.6 pixels for circular targets. Markiewicz et al. [26] used terrestrial laser scanning (TLS) with structure-from-motion (SfM) methods to monitor cultural heritage objects. Other studies have combined photogrammetry for bridge measurement with traditional surveying methods such as a total station [27,28]. Basharat et al. [29] proposed a sensor network with a video camera for bridge health monitoring. Valença et al. [30] applied photogrammetry to the on-site monitoring of a two-span RC beam pedestrian bridge under a load test and reported an accuracy of 0.1 mm for the three axes. Jiang and Jauregui [31] applied digital close-range photogrammetry to bridge deformation measurement and showed a 1–2 mm level of accuracy compared to gauge measurements. Leitch [32] conducted laboratory photogrammetric testing of a steel beam and a non-composite steel girder bridge and achieved sub-millimeter accuracy. Hosseini et al. [33] used particle image velocimetry to measure displacement in steel and RC beams. Erdenebat et al. [34] used close-range photogrammetry, engineering leveling, and displacement sensors to determine the deflection curve of a reinforced concrete beam. Ahn et al. [35] used photogrammetric measurement of traffic-load vibration on three bridges and found the vibration magnitudes to be well within the design standards of the American Association of State Highway and Transportation Officials (AASHTO). Cunha et al. [36] used image-based 3D measurement equipment for the structural dynamics of a suspension bridge. Handayani and Taufik [37] utilized the photogrammetric technique to measure the 3D displacement of a bridge span. Lee and Han [38] proposed a technique for computing the sensor orientation parameters of the camera using only a photogrammetric board and measured the instantaneous deformation of a railroad bridge within an error of 2 mm in three dimensions.
The present study measured the deformation of a steel railway bridge under the cyclic loads of three types of railcars using a sequential photogrammetric technique. Conventionally, the interior and exterior orientation parameters (IOPs and EOPs) of the cameras are required, for example through bundle adjustment. These are obtained from control points (CPs) with precise 3D coordinates surveyed with a terrestrial instrument such as a total station. In this implementation, however, we propose a relative orientation method that uses the coplanarity condition and a scale adjustment based on feature points on the bridge in order to automate the photogrammetric process.
The objective of this study is the application of efficient, relative-orientation-based sequential image processing to the 3D displacement measurement of a plate girder. The method is proposed to compensate for some limitations of a conventional laser displacement sensor. A laser sensor is often installed at only a limited number of points of interest because it is accurate but costly. The proposed method is designed to be applicable to any point where an inexpensive target is placed. The photogrammetric targets are very easy to install, and many targets can be used for denser displacement monitoring. In addition, the proposed method can measure 3D displacement, whereas the laser sensor measures displacement in one direction only. However, it still requires a target, which can be a limitation for points in inaccessible areas.
The paper is organized as follows. Section 2 introduces the proposed photogrammetric method, with relative orientation and sequential image processing for 3D displacement measurement. Section 3 presents the experimental results, Section 4 provides a discussion, and Section 5 concludes the paper.

2. Methodology

Cyclic loads are transferred to the plate girders as a train passes over a steel railway bridge. These impacts may lead to torsional stresses in the plate girder in three dimensions. To detect 3D deformation, sequential images must be obtained with at least two cameras, and the 3D coordinates of target points placed on the bridge must be computed over time. Figure 1 depicts the whole process proposed in this research for measuring 3D positions with the sequential photogrammetric technique. Once targets are installed, this method can concurrently measure the 3D displacements of multiple spots on the bridge in a non-contact manner. The 3D displacement of the bridge was determined from the stereo images of the two cameras as follows.
First, two unique points or two attached targets are selected within the object area, and the distance between them is measured precisely to adjust the scale of the relative orientation parameters (ROPs). Using ROPs requires fewer control points and simplifies the processing because the number of adjusted parameters is reduced. In this experiment, we attached two targets and used them for this purpose, since the test area was accessible. We also attached additional targets to the bridge span. The dynamic movements of these targets were traced from their 3D locations at successive moments, computed by space intersection. The image sequence of the deforming railway bridge was then taken with two digital cameras as each train entered the bridge. The stereo images were synchronized using a remote controller and two camera receivers, which also prevented camera shake due to manual triggering.
Pre-calibrated data were used for the IOPs of the cameras [30], and feature points located on the bridge were extracted as CPs to determine the ROPs ($X_{O_i}$, $Y_{O_i}$, $Z_{O_i}$, $\omega_i$, $\varphi_i$, $\kappa_i$) from the coplanarity condition (i = 1 and i = 2 for the left and right cameras). In this study, the IOPs comprise the focal length ($f_i$), the position of the principal point ($x_{O_i}$, $y_{O_i}$), and the radial distortion parameter ($k_{i1}$) in the CCD image plane. Pre-calibrated means that the IOPs were determined in a laboratory environment. The ROPs comprise three spatial locations, which are the components of the base vector between the two cameras, and the rotation angles of the cameras in the ground space. The coplanarity condition can be expressed as:
$b_x (v_1 w_2 - w_1 v_2) + b_y (w_1 u_2 - u_1 w_2) + b_z (u_1 v_2 - v_1 u_2) = 0 \qquad (1)$
with
$b_x = X_{O_2} - X_{O_1}, \qquad b_y = Y_{O_2} - Y_{O_1}, \qquad b_z = Z_{O_2} - Z_{O_1} \qquad (2)$
$[\,u \;\; v \;\; w\,]_i^T = M_i^T \cdot [\,\bar{x}_k \;\; \bar{y}_k \;\; -f\,]_i^T \qquad (3)$
where subscripts 1 and 2 denote the left and right cameras (i = 1 or 2); k is the ID of the CPs; $X_{O_i}$, $Y_{O_i}$, and $Z_{O_i}$ are the 3D ground coordinates of the camera perspective center in the user-defined coordinate system; $M_i$ is a rotation matrix with components $m_{i,11}, m_{i,12}, \ldots, m_{i,33}$ built from the rotation angles ($\omega_i$, $\varphi_i$, $\kappa_i$) with respect to the ground coordinate system; $\bar{x}_i = x_i - x_{O_i} + k_{i1}(x_i - x_{O_i})$; $\bar{y}_i = y_i - y_{O_i} + k_{i1}(y_i - y_{O_i})$; and $x_i$ and $y_i$ are the observed image coordinates of the CPs.
Only the image coordinates of the CPs are measured to determine the ROPs from the coplanarity condition in Equation (1), which states that the two camera perspective centers, any target point, and its corresponding image points on the stereo pair lie on a common plane, as shown in Figure 2. One coplanarity equation is written for each object point that appears in the stereo images. The equation contains the image coordinates of the corresponding points without any object space coordinates, as shown in Equations (1)–(3). The ROPs are determined using the conjugate image coordinates of several CPs in the first stereo frames. These parameters are calculated by the iterative least-squares method using Equation (1).
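As a concrete illustration of this step, the following is a minimal sketch of relative orientation from the coplanarity condition, assuming NumPy and SciPy, an omega-phi-kappa rotation convention, the left camera fixed at the origin with zero rotation, and b_x fixed to 1 to remove the scale ambiguity. The function names and parameterization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def rotation_matrix(omega, phi, kappa):
    # Rotation built from omega-phi-kappa angles in radians
    # (the exact angle convention is an assumption).
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def image_rays(xy, f, M):
    # Equation (3): rotate image vectors [x, y, -f] into the ground frame.
    # xy is an (N, 2) array of image coordinates already reduced to the
    # principal point and corrected for radial distortion.
    vec = np.column_stack([xy[:, 0], xy[:, 1], -f * np.ones(len(xy))])
    return vec @ M  # each row equals (M^T @ [x, y, -f]^T)^T

def coplanarity_residuals(params, xy1, xy2, f1, f2):
    # Equation (1) evaluated for every conjugate point; the left camera is
    # fixed at the origin with zero rotation and b_x is fixed to 1.
    b_y, b_z, om2, ph2, ka2 = params
    b = np.array([1.0, b_y, b_z])
    r1 = image_rays(xy1, f1, rotation_matrix(0.0, 0.0, 0.0))
    r2 = image_rays(xy2, f2, rotation_matrix(om2, ph2, ka2))
    return np.cross(r1, r2) @ b

def relative_orientation(xy1, xy2, f1, f2):
    # Iterative least-squares solution for the five free ROPs.
    sol = least_squares(coplanarity_residuals, np.zeros(5),
                        args=(xy1, xy2, f1, f2))
    return sol.x
```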
In the second step, a scale adjustment is performed so that the object can be measured accurately using ROPs that have been determined solely from the image coordinates, without the spatial locations of the CPs. Strictly, the distance (b) between the centers of the left and right camera lenses should be measured directly for the coplanarity geometry of Figure 2, but this is not feasible in practice. Therefore, an accurate scale for the ROPs must be recovered despite the inaccurately known lens-to-lens distance. In this study, the scale is determined as the ratio of the precisely measured distance between two attached targets to the distance computed from the ROPs. The ROP components obtained under the coplanarity condition are then adjusted using this scale, as shown in Equations (4) and (5):
$[\,X'_O \;\; Y'_O \;\; Z'_O\,]_i^T = s \cdot [\,X_O \;\; Y_O \;\; Z_O\,]_i^T \qquad (4)$
$M'_i = s \cdot M_i \qquad (5)$
where i = 1, 2 (left and right camera); s is the scale factor; and $X'_O$, $Y'_O$, $Z'_O$ and $M'_i$ are the ground coordinates of the camera centers and the rotation matrices after the scale adjustment, respectively.
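A minimal sketch of this scale adjustment follows, assuming the same NumPy setup. For simplicity it rescales only the camera perspective centers per Equation (4) (the paper additionally rescales the rotation matrices per Equation (5)); variable names are illustrative.

```python
import numpy as np

def scale_factor(measured_dist, p_n1, p_n2):
    # Ratio of the precisely measured distance between the two scale targets
    # (e.g. N1 and N2) to the distance reconstructed in the relative model.
    return measured_dist / np.linalg.norm(np.asarray(p_n1) - np.asarray(p_n2))

def scale_camera_centers(s, cam_centers):
    # Equation (4): rescale the camera perspective centers so that the base
    # vector, and hence all intersected coordinates, take on metric units.
    return [s * np.asarray(c) for c in cam_centers]

# Usage sketch: p_n1 and p_n2 come from intersecting the two scale targets
# with the unscaled ROPs (see the space-intersection sketch after Eq. (6));
# Section 3 reports a resulting scale of 0.89 for this bridge.
# s = scale_factor(measured_dist_n1_n2, p_n1, p_n2)
# C1_scaled, C2_scaled = scale_camera_centers(s, [C1, C2])
```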
In the third step, to measure the instantaneous displacement of the bridge, the 3D coordinates of all targets in the stereo image sequence from the left and right cameras must be obtained. These are determined by space intersection using the pre-identified IOPs, the scale-adjusted ROPs, and the corresponding image coordinates $x_{ijk}$ and $y_{ijk}$ (i = camera index, j = frame index, k = target index). The image coordinates of the targets in the first stereo frames are obtained manually. From the second stereo frames onward, a target-matching technique automatically traces the image coordinates of the targets through all frames (Figure 3). In this study, target matching based on the normalized cross-correlation and the least-squares technique was carried out with the first frames as a reference, and the new target locations were obtained in each subsequent frame [39]. Combining the two matching methods yields better sub-pixel matching: least-squares matching can overcome the brightness differences and geometric image distortion between the target and reference images.
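The integer-pixel stage of such target matching can be sketched as below; the array layout, window sizes, and function names are assumptions, and the least-squares refinement used in the paper for sub-pixel accuracy is omitted here.

```python
import numpy as np

def ncc(patch, template):
    # Normalized cross-correlation coefficient of two equally sized patches.
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_target(image, template, approx_rc, search=20):
    # Scan a (2*search+1)^2 window around the previous target position and
    # return the integer-pixel location with the highest correlation.
    th, tw = template.shape
    r0, c0 = approx_rc
    best, best_rc = -1.0, approx_rc
    for r in range(r0 - search, r0 + search + 1):
        for c in range(c0 - search, c0 + search + 1):
            if r < 0 or c < 0:
                continue
            patch = image[r:r + th, c:c + tw]
            if patch.shape != template.shape:
                continue
            score = ncc(patch, template)
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc, best
```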
Finally, using the pre-determined IOPs and scaled ROPs together with the corresponding image coordinates ($x_{ijk}$, $y_{ijk}$), the 3D spatial coordinates of the targets attached to the bridge in each stereo frame are determined by space intersection, which is based on the collinearity condition (Equation (6)):
$[\,x_{ijk} \;\; y_{ijk} \;\; -f_i\,]^T = \lambda_i \cdot M_i \cdot [\,X_{jk} - X_{O_i} \;\; Y_{jk} - Y_{O_i} \;\; Z_{jk} - Z_{O_i}\,]^T \qquad (6)$
where i = 1, 2 (left and right camera); j is the frame index; k is the target index; $\lambda_i$ is a scale factor between image and object space; and $X_{jk}$, $Y_{jk}$, and $Z_{jk}$ are the ground coordinates of the targets on the bridge.
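The space intersection itself can be sketched as a linear two-ray triangulation (a midpoint formulation under the collinearity geometry of Equation (6)); this particular solver and its names are assumptions for illustration, not the authors' exact adjustment.

```python
import numpy as np

def ray_direction(xy, f, M):
    # Ground-space direction of the ray through image point (x, y).
    d = M.T @ np.array([xy[0], xy[1], -f])
    return d / np.linalg.norm(d)

def intersect(xy1, xy2, f1, f2, M1, M2, C1, C2):
    # Closest point to both rays: solve for the two ray parameters t1, t2
    # from C1 + t1*d1 ~= C2 + t2*d2, then take the midpoint.
    d1 = ray_direction(xy1, f1, M1)
    d2 = ray_direction(xy2, f2, M2)
    A = np.column_stack([d1, -d2])
    t, *_ = np.linalg.lstsq(A, C2 - C1, rcond=None)
    P1 = C1 + t[0] * d1
    P2 = C2 + t[1] * d2
    return 0.5 * (P1 + P2)
```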
This methodology makes it possible to measure the entire span of a bridge because there is no limit to the number of targets with which to detect the 3D displacement. Since each target is made from a thin piece of paper, it adds no mass loading, unlike contact sensors such as strain gauges. Additionally, if unique features on the surface, such as edges, are used, artificial targets may be unnecessary [39].

3. Experimental Results

An experiment was performed to verify the proposed method for 3D instantaneous displacement measurement using two cameras. The tested railway bridge crosses the Suncheon River with a length of 150 m and is quite long for an I-plate girder bridge. The opening of the span selected for displacement measurement is about 9 m. It is impossible to measure the displacement of the entire bridge, which crosses over a river, so a span was selected for the experiment where the cameras could be easily installed, as shown in Figure 4. Image coordinates of 26 feature points marked on the side of one span were obtained and used as CPs to compute the ROPs. To check the accuracy of the ROPs and to measure the three-dimensional displacement, 18 donut-shaped targets were attached to the bridge span. The 3D spatial coordinates of the targets were measured using a total station, a distance-and-angle measuring instrument used for 3D terrestrial surveying. The field test was conducted on 29 March 2016 from 4:00 PM to 5:00 PM, at which time the temperature was about 18 degrees Celsius and the humidity was 48 percent. Since the test was performed at a time of year when the temperature does not change dramatically during the day, and the measurement time for each train did not exceed 20 s, temperature is not expected to have had a significant effect on the deformation measurement of the steel railway bridge.
The technique uses two identical cameras to take consecutive images, together with a remote controller and two receivers for time synchronization of the stereo frames. The camera is a non-metric NIKON D200 that can take six frames per second (fps); this study does not use sophisticated, expensive devices. The resolution was 3872 × 2592 pixels (pixel size = 6 µm × 6 µm). The camera focal length, the baseline, and the camera-to-object distance were 55 mm, about 3 m, and about 9 m, respectively. A 3D laser tracker and a laser reflector were also installed to compare the displacement accuracy with the proposed method. Figure 4 shows the installed equipment and the points used in the test area.
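A quick pinhole-model check of the image scale implied by this setup (pixel size, focal length, and camera-to-object distance as stated above) reproduces the roughly 1 mm per pixel figure quoted later in the results; the snippet below is only a back-of-the-envelope sketch.

```python
# Ground sample distance under a simple pinhole approximation.
pixel_size_mm = 0.006     # 6 um
focal_mm = 55.0           # focal length
object_dist_mm = 9000.0   # ~9 m camera-to-object distance
gsd_mm = pixel_size_mm * object_dist_mm / focal_mm
print(f"approx. {gsd_mm:.2f} mm per pixel")  # ~0.98 mm, i.e. about 1 mm
```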
Using the two cameras with the remote controller and two receivers, photographs were taken as three types of trains entered the bridge. At intervals of several seconds, each of the two maintenance trains was captured with five photographs, and the passenger train was captured with six photographs. The first maintenance train entered the test area from left to right, and the other trains entered from right to left, as shown in Figure 5. In a previous experiment, we analyzed the vibration at the moment of photographing with the remote control and the camera shake due to ground vibration when a train passed over the bridge, and found almost no influence on the image coordinates.
The first stereo frames were taken before the trains entered the test span of the bridge. The ROPs of the two cameras were calculated by the relative orientation method using the coplanarity condition and scale adjustment with the 26 feature points and their image coordinates in the first stereo frames. A scale of 0.89 was determined from the ratio between the measured distance and the distance calculated from the ROPs for points N1 and N2, shown on the right in Figure 4. Table 1 shows the scale-adjusted ROPs and their standard deviations (σROP), which indicate the quality of the determined parameters. The precisions indicate that, overall, the parameters are well determined by the least-squares solution.
To verify the accuracy of the scale-adjusted parameters, the ground coordinates of the 18 checkpoints were computed by space intersection using Equation (6). Their 3D coordinates were then rotated and shifted to compare the results with the total-station measurements. Figure 6 shows the target locations obtained from the ROPs—after applying the rotation and shift—before and after the scale adjustment. It clearly reveals that the target locations after scaling the ROPs are almost identical to the measured locations, unlike those before scaling. Table 2 shows that the accuracy with the scale-adjusted ROPs is dramatically improved compared to the values before adjustment. The results show sufficiently accurate consistency, within 2 mm root mean square error and 4 mm absolute maximum error, respectively; it was therefore concluded that the scale-adjusted ROPs are accurate enough to measure the instantaneous displacement of the bridge.
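The "rotated and shifted" comparison can be realized, for example, with a best-fit rigid transform (Kabsch/Procrustes) between the photogrammetric and total-station checkpoint coordinates; using this particular algorithm is an assumption for illustration, not the authors' documented procedure.

```python
import numpy as np

def rigid_fit(src, dst):
    # Best-fit rotation R and translation t such that src @ R.T + t ~= dst,
    # for (N, 3) arrays of corresponding checkpoint coordinates.
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def checkpoint_errors(photo_xyz, total_station_xyz):
    # Per-axis RMSE and absolute maximum error after alignment (cf. Table 2).
    R, t = rigid_fit(photo_xyz, total_station_xyz)
    diff = photo_xyz @ R.T + t - total_station_xyz
    return np.sqrt((diff ** 2).mean(axis=0)), np.abs(diff).max(axis=0)
```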
In the next step, the target-matching technique was carried out with the first images as a reference, and the image coordinates of the 18 targets, including the laser reflector point attached to the bridge, were determined for all left and right frames. Figure 7 shows an example in which the reflector position was accurately recovered in the second and 16th images (bottom) from its positions in the first left/right images (top). Then, as a final step, while the trains crossed the test span of the bridge, the 3D ground coordinates of all targets and the reflector point were computed over time by space intersection based on Equation (6), using the scale-adjusted ROPs and the traced image coordinates. Meanwhile, for comparison with the proposed technique, the 3D displacements of the reflector point on the bridge were simultaneously measured at 0.1 s intervals by a laser tracker, as shown in Figure 8. The laser tracker measures the reflector position with better than 0.5 mm precision in the three axes [40].
Figure 8 compares the displacement of the reflector point caused by the cyclic load of the first to third trains, as measured by the proposed method and by the laser tracker. In this experiment, the photographs were acquired only at discrete moments, in contrast to the continuous laser tracker measurement, so each photogrammetric measurement is indicated as a dot in Figure 8. In this figure, negative values on the X, Y, and Z axes correspond to the left, downward, and camera directions with respect to the center of the reflector, respectively, and positive values to the opposite directions. Considering only the measurement time-steps of the proposed method in Figure 8, the laser tracker shows almost no displacement in the X and Z directions, while the proposed method shows displacements of about 3 mm. In the Y direction, however, the difference between the proposed method and the laser tracker is less than 1 mm. As shown in Figure 8, because the third (passenger) train is the heaviest and fastest of the three types of trains, its deflection (Y-axis displacement) measured by the laser tracker was the largest. As illustrated in Table 2, the ROP photogrammetry technique exhibits an error of approximately 2 mm; therefore, a correlation between displacement and train weight or speed could not be identified within this error range.
This study also extracted from the laser tracker measurements only the time-step points coinciding with the proposed method's measurements and compared them, as shown in Figure 9. In the Y direction, the deformation shapes from the proposed method and the laser tracker are substantially similar, while they differ slightly in the X direction, and a relatively large error appears in the Z direction. The likely causes are the limited performance of the camera, including the calibrated IOPs, matching errors of the feature points on the bridge surface used to determine the ROPs, and incomplete time synchronization, even though the stereo images from both cameras were taken simultaneously with a remote control. Nevertheless, when enlarged to a 0.1 mm scale, the displacement curves of the proposed method are similar in form to the laser tracker results in the X and Z axes. It therefore appears that the displacement values of the proposed method are exaggerated, because the real size of one pixel is about 1 mm and a residual ROP scale error arises in the photogrammetric process as mentioned above, but the displacement directions are relatively accurate.
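One way to perform this extraction is to pick, for each camera exposure time, the laser tracker sample nearest in time (the tracker runs at 0.1 s intervals) and then difference the two series per axis; the time arrays and names below are illustrative assumptions.

```python
import numpy as np

def match_time_steps(cam_times, trk_times, trk_xyz):
    # For each camera exposure time, take the tracker sample nearest in time.
    idx = np.array([np.argmin(np.abs(trk_times - t)) for t in cam_times])
    return trk_xyz[idx]

def per_axis_discrepancy(cam_xyz, trk_xyz_at_cam_times):
    # RMSE and absolute maximum error per axis, as summarized in Table 3.
    diff = cam_xyz - trk_xyz_at_cam_times
    return np.sqrt((diff ** 2).mean(axis=0)), np.abs(diff).max(axis=0)
```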
Table 3 lists the discrepancies between the proposed method and the laser tracker measurements at the same time-step points along the three axes. The RMSEs for all types of trains were approximately 0.3–2.3 mm. The absolute maximum error of the proposed method did not exceed 1.5 mm in either the X or Y direction, whereas an error of about 3.5 mm occurred in the Z direction. These results show a tendency similar to the accuracy of the scale-adjusted ROPs presented in Table 2. In this implementation, one pixel of the images corresponds to about 1 mm.
Figure 10 shows the 3D displacement time-steps measured using the proposed method at six target locations. Twelve other targets were also measured, but since their results did not differ significantly from those shown in Figure 10, they are omitted from this article. Both the top and bottom rows of Figure 10 show movements similar to the Y-axis sag and the X-Z-axis twist shown in Figure 9. The three-axis displacements of the six targets also behaved similarly to the laser point results, and the third (passenger) train exhibited the largest fluctuation range. Sagging of the bridge of up to approximately 4 mm occurred in the Y direction under the three types of trains, and at the same time a twist of up to about 4 mm occurred in the X-Z axes.

4. Discussion

The scale-adjusted ROPs showed accuracies below 2 mm in root mean square error and 4 mm in absolute maximum error, respectively, which is sufficient to measure the instantaneous displacement of a large-span bridge exhibiting large displacements.
The sequential measurement based on image matching produced results within approximately one pixel, whereas in the Z direction (the camera depth), the error is approximately 2–3 times larger due to the imperfect time synchronization between the stereo images and the uncertainty of the two cameras' parameters used in the calculation of the 3D spatial coordinates.
The laser tracker is highly precise, but it is costly and cannot represent the full-body twist of the span. Despite millimeter-level errors, the proposed method has the potential to simultaneously measure deflection and torsion over the entire span of a bridge. It can therefore be used to check the overall condition of the bridge before a detailed diagnosis with precise measuring instruments such as electrical strain gauges.
We believe the proposed method can serve as a complementary method to the laser tracker, especially for denser 3D displacement measurement over a girder or for areas where a laser tracker cannot be installed.

5. Conclusions

Railway bridges are constantly exposed to the cyclic loading of trains during their life cycle. Such loading can cause structural damage and potentially collapse, so regular observation is needed to ensure bridge safety. Conventionally, bridge deformation is measured by contact methods, such as strain gauges, or by non-contact methods, such as a laser tracker. However, surveying with both types is complicated, requiring skilled technicians and considerable cost. Moreover, both conventional methods can generally measure only a limited number of points on the bridge.
This study presented a sequential photogrammetric method to measure the 3D displacement of a railway bridge span. To almost fully automate the photogrammetric process, we proposed a relative orientation method that uses the coplanarity condition and a scale adjustment based on feature points on the bridge. We measured the 3D displacement of a bridge span with the proposed method using two cameras of the same type. Compared to a precise 3D laser tracker, the difference was about 1 mm in each of the three axis directions, and the 3D displacement forms were also similar. The laser tracker showed almost no displacement in either the X or Z direction; however, it cannot indicate the torsion of the entire span because it surveys only a single point. Through the proposed method, by contrast, we remotely and simultaneously confirmed the occurrence of sag and twist in this bridge due to the cyclic loading of the trains.
If a high-speed camera is used with the proposed technique, the dynamic deformation of the bridge can be measured continuously at intervals of 0.1 s or finer, or only the event period during which the loading is applied can be measured selectively. The proposed method is advantageous because it requires neither a control point survey nor the installation of a photogrammetric board; it only requires measuring the distance between two feature points on the bridge for the scale adjustment of the ROPs. This distance can be measured precisely with a portable laser rangefinder, so the whole photogrammetric process can be almost fully automated. Through the proposed method, it is possible to identify sources of risk in railroad bridges and to prevent safety accidents associated with manual inspection. This method may also replace other inconvenient and costly tests.

Author Contributions

Conceptualization, H.L.; data curation, H.L.; formal analysis, H.L. and J.O.; methodology, H.L. and J.O.; validation, H.L. and J.O.; writing—original draft, H.L.; writing—review and editing, H.L. and J.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2018R1D1A1B06049484).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Barazzetti, L.; Scaioni, M. Development and implementation of image-based algorithms for measurement of deformations in material testing. Sensors 2010, 10, 7469–7495. [Google Scholar]
  2. Dold, J. The role of a digital intelligent camera in automating industrial photogrammetry. Photogramm. Rec. 1998, 16, 199–212. [Google Scholar] [CrossRef]
  3. Luhmann, T.; Robson, S.; Kyle, S.; Harley, I. Close Range Photogrammetry: Principles, Methods and Applications; Whittles Publishing: Dunbeath, Caithness, UK, 2006; pp. 1–29. [Google Scholar]
  4. Glira, P.; Olsbock, K.; Kadiofsky, T.; Schorghuber, M.; Weichselbaum, J.; Zinner, G.; Fel, L. Photogrammetric 3D mobile mapping of rail tracks. ISPRS J. Photogramm. Remote Sens. 2022, 183, 352–363. [Google Scholar] [CrossRef]
  5. Altuntas, C.; Hezer, S.; Kirli, S. Image based methods for surveying heritage of masonry arch bridge with the example of Dokuzunhan in Konya, Turkey. Sci. Cult. 2017, 3, 13–20. [Google Scholar]
  6. Gardin, D.C.; Jimenez, A. Optical Methods for 3D-Reconstruction of Railway Bridges. Master’s Thesis, Department of Civil, Environmental and Natural Resources Engineering, Luleå University of Technology, Luleå, Sweden, 2018. [Google Scholar]
  7. Pučko, Z.; Šuman, N.; Rebolj, D. Automated continuous construction progress monitoring using multiple workplace real time 3D scans. Adv. Eng. Inform. 2018, 38, 27–40. [Google Scholar] [CrossRef]
  8. Brusak, I. Geometric Inspection of 3D Production Parts in Shipbuilding—Comparison and Assessment of Current Optical Measuring Methods. Master’s Thesis, University of Applied Sciences, Hochschule Neubrandenburg, Neubrandenburg, Germany, 2018. [Google Scholar]
  9. Subramanian, P.; Gheisari, M. Using 360-Degree Panoramic Photogrammetry and Laser Scanning Techniques to Create Point Cloud Data: A Comparative Pilot Study. In Proceedings of the 55th ASC Annual International Conference, the Associated Schools of Construction, Denver, CO, USA, 10–13 April 2019. [Google Scholar]
  10. Sahebdivani, S.; Arefi, H.; Maboudi, M. Rail track detection and projection-based 3D modeling from UAV point cloud. Sensors 2020, 20, 5220. [Google Scholar] [CrossRef]
  11. Johari, A.; Golkarfard, H.; Davoudi, F.; Fazeli, A. Experimental Investigation of Collapsible Soils Treatment Using Nano-silica in the Sivand Dam Region, Iran. Iran. J. Sci. Technol. Trans. Civ. Eng. 2022, 46, 1301–1310. [Google Scholar] [CrossRef]
  12. Johari, A.; Golkarfard, H.; Davoudi, F.; Fazeli, A. A predictive model based on the experimental investigation of collapsible soil treatment using nano-clay in the Sivand Dam region, Iran. Bull. Eng. Geol. Environ. 2021, 80, 6725–6748. [Google Scholar] [CrossRef]
  13. Maas, H.-G.; Hampel, U. Photogrammetric techniques in civil engineering material testing and structure monitoring. Photogramm. Eng. Remote Sens. 2006, 72, 39–45. [Google Scholar] [CrossRef]
  14. Belloni, V.; Ravanelli, R.; Nascetti, A.; Di Rita, M.; Mattei, D.; Crespi, M. py2DIC: A New Free and Open Source Software for Displacement and Strain Measurements in the Field of Experimental Mechanics. Sensors 2019, 19, 3832. [Google Scholar] [CrossRef] [Green Version]
  15. Pan, B.; Qian, K.; Xie, H.; Asundi, A. Two-dimensional digital image correlation for in-plane displacement and strain measurement: A review. Meas. Sci. Technol. 2009, 20, 062001. [Google Scholar] [CrossRef]
  16. Ngeljaratan, L.; Moustafa, M.A. System Identification of Large-Scale Bridges Using Target-Tracking Digital Image Correlation. Front. Built Environ. 2019, 5, 85. [Google Scholar] [CrossRef]
  17. Nonis, C.; Niezrecki, C.; Yu, T.Y.; Ahmed, S.; Su, C.F.; Schmidt, T. Structural Health Monitoring of Bridges using Digital Image Correlation. In Proceedings of the SPIE—The International Society for Optical Engineering, San Diego, CA, USA, 25–29 August 2013; Volume 8695, p. 869507. [Google Scholar] [CrossRef]
  18. Barros, F.; Aguiar, S.; Sousa, P.J.; Cachaço, A.; Tavares, P.J.; Moreira, P.M.; Ranzal, D.; Cardoso, N.; Fernandes, N.; Fernandes, R.; et al. Displacement monitoring of a pedestrian bridge using 3D digital image correlation. Procedia Struct. Integr. 2022, 37, 880–887. [Google Scholar] [CrossRef]
  19. Sousa, P.J.; Barros, F.; Lobo, P.; Tavares, P.J.; Moreira, P.M. Experimental measurement of bridge deflection using Digital Image Correlation. Procedia Struct. Integr. 2019, 17, 806–811. [Google Scholar] [CrossRef]
  20. Torlegard, A.K.I. State-of-the-art of close-range photogrammetry. Photogramm. Eng. Remote Sens. 1976, 42, 71–79. [Google Scholar]
  21. Bales, F.B. Close-range photogrammetry for bridge measurement. Transp. Res. Rec. 1984, 950, 39–44. [Google Scholar]
  22. Brownjohn, J.M.W. Structural health monitoring of civil infrastructure. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2007, 365, 589–622. [Google Scholar]
  23. Detchev, I.; Habib, A.; He, F.; El-Badry, M. Deformation Monitoring with Off-the-Shelf Digital Cameras for Civil Engineering Fatigue Testing. ISPRS—Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 45, 195–202. [Google Scholar] [CrossRef] [Green Version]
  24. Fujita, S.; Furuya, O.; Niitsu, Y.; Mikoshiba, T.; Yamazaki, H. Study on 3-D measurement method for structural dynamic displacement in shake table tests using image processing. Seism. Eng. 2002, 2, 35–41. [Google Scholar]
  25. Sánchez-Aparicio, L.J.; Herrero-Huerta, M.; Esposito, R.; Schipper, H.; González-Aguilera, D. Photogrammetric solution for analysis of out-of-plane movements of a masonry structure in a large-scale laboratory experiment. Remote Sens. 2019, 11, 1871. [Google Scholar] [CrossRef] [Green Version]
  26. Markiewicz, J.; Łapiński, S.; Kot, P.; Tobiasz, A.; Muradov, M.; Nikel, J.; Shaw, A.; Al-Shamma’a, A. The quality assessment of different geolocalisation methods for a sensor system to monitor structural health of monumental objects. Sensors 2020, 20, 2915. [Google Scholar] [CrossRef] [PubMed]
  27. Jiang, R.; Jauregui, D.V.; White, K.R. Close-range photogrammetry applications in bridge measurement: Literature review. Measurement 2008, 41, 823–834. [Google Scholar] [CrossRef]
  28. Psimoulis, P.; Stiros, S. Measuring deflections of a short-span railway bridge using a robotic total station. J. Bridge Eng. 2013, 18, 182–185. [Google Scholar] [CrossRef]
  29. Basharat, A.; Catbas, N.; Shah, M. A framework for intelligent sensor network with video camera for structural health monitoring of bridges. In Proceedings of the Third IEEE International Conference on Pervasive Computing and Communications Workshops, Washington, DC, USA, 8–12 March 2005. [Google Scholar]
  30. Valença, J.; Júlio, E.; Araujo, H. Application of photogrammetry to bridge monitoring. In Proceedings of the 12th Conference Structural Faults and Repair, Edinburgh, UK, 12 June 2008. [Google Scholar]
  31. Jiang, R.; Jauregui, D.V. Development of a digital close-range photogrammetric bridge deflection measurement system. Measurement 2010, 43, 1431–1438. [Google Scholar] [CrossRef]
  32. Leitch, K.R. Close-Range Photogrammetric Measurement of Bridge Deformations: A Non-Contact Analysis Method, 1st ed.; Lambert Academic Publishing: London, UK, 2010; pp. 1–208. [Google Scholar]
  33. Hosseini, A.; Mostofinejad, D.; Hajialilue-Bonab, M. Displacement and strain field measurement in steel and RC beams using particle image velocimetry. J. Eng. Mech. 2014, 140, 04014086. [Google Scholar] [CrossRef]
  34. Erdenebat, D.; Waldmann, D.; Scherbaum, F.; Teferle, N. The deformation area difference (DAD) method for condition assessment of reinforced structures. Eng. Struct. 2018, 155, 315–329. [Google Scholar]
  35. Ahn, Y.; Peterson, S.; Nazari, M. Bridge Monitoring Using a Digital Camera: Photogrammetry-Based Bridge Dynamics Monitoring; Report; Mineta Transportation Institute: San José, CA, USA, 2019. [Google Scholar]
  36. Cunha, A.; Caetano, E.; Ribeiro, P. Optical metrology applied to 3D displacement measurement of long-span suspension bridge dynamics. In Proceedings of the 9th International Conference on Structural Dynamics, Porto, Portugal, 30 June–2 July 2014. [Google Scholar]
  37. Handayani, H.H.; Taufik, M. Preliminary study of bridge deformation monitoring using GPS and CRP (case study: Suramadu Bridge). Procedia Environ. Sci. 2015, 24, 266–276. [Google Scholar] [CrossRef] [Green Version]
  38. Lee, H.; Han, D. Deformation measurement of a railroad bridge using a photogrammetric board without control point survey. J. Sens. 2018, 4, 6851252. [Google Scholar] [CrossRef]
  39. Lee, H.S.; Rhee, H.N.; Oh, J.H.; Park, J.H. Measurement of 3-D vibrational motion by dynamic photogrammetry using least-square image matching for sub-pixel targeting to improve accuracy. Sensors 2016, 16, 359–374. [Google Scholar] [CrossRef]
  40. API Sensor. Available online: http://apisensor.com (accessed on 1 November 2022).
Figure 1. (a) Illustration to obtain consecutive stereo-image by the two cameras on the railway bridge at the moment of the train loading; (b) flow-chart for displacement measurement with sequential photogrammetric technique.
Figure 2. Geometry of the coplanarity for relative orientation.
Figure 3. Matching to obtain image coordinates of the targets in sequence images.
Figure 4. (a) Test span of the curviform railway bridge in Suncheon of South Korea, two-camera setup with a remote controller and two receivers, laser tracker; (b) layout of 26 feature points and 18 target checkpoints on the test span bridge, N1 and N2 points used to adjust the scale of the ROPs, the zoomed-in view of a laser reflector in the lower left part of the bridge.
Figure 5. Train entry on the test span of the bridge. (a) The first maintenance train captured with five photos in seven seconds; (b) the second maintenance train captured with five photos in four seconds; (c) the third passenger train captured with six photos in 16 s.
Figure 6. Eighteen target checkpoint locations before and after scale adjustment of the relative orientation parameter. (a) Before scale adjustment; (b) after scale adjustment.
Figure 7. Target matching results using the correlation-coefficient and the least-squares techniques. (a) Matched points of the second images from the first images; (b) matched points of the 16th images from the first images.
Figure 8. Displacement results of a reflector position in 3D directions by the laser tracker and the proposed method. (a) First train; (b) second train; (c) third train.
Figure 9. Enlarged deformation curve in the 3D axes’ directions. (a) First train; (b) second train; (c) third train.
Figure 10. Consecutive deformation patterns during the loading of the train measured by the proposed method. The 3D space (top) and X-Z plane (bottom) displacements are magnified by 10 and 100 times, respectively, and color represents time. (a) First train; (b) second train; (c) third train.
Table 1. Scale-adjusted ROPs of the cameras in two cases.
          X0 (mm)   Y0 (mm)   Z0 (mm)   ω (degree)   φ (degree)   κ (degree)
Left       894.1    8940.6    1788.1      0.000        0.000       −0.000
Right     3451.1    8983.0    2454.0     −0.586       14.594        1.921
σROP          -       0.201     0.201      0.001        0.004        0.001
Table 2. Location error of the checkpoints by the determined ROPs (mm).
Scale Adjustment          X                        Y                        Z
                   RMSE   Abs. Max. Error   RMSE   Abs. Max. Error   RMSE   Abs. Max. Error
Before            100.6        188.5        39.6        71.6         35.3        98.3
After               1.0          2.1         0.5         1.1          1.4         2.9
Table 3. Difference in displacement between the photogrammetry and the laser tracker (unit: mm).
Train        X Direction                 Y Direction                 Z Direction
         RMSE   Abs. Max. Error      RMSE   Abs. Max. Error      RMSE   Abs. Max. Error
1st       0.5        1.4              0.4        0.9              1.1        2.7
2nd       0.5        0.8              0.4        0.7              2.3        3.5
3rd       0.3        0.5              0.3        0.7              1.7        3.0
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
