Article

A Benchmark Dataset for the Validation of Phase-Based Motion Magnification-Based Experimental Modal Analysis

Department of Structural, Geotechnical and Building Engineering, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Torino, Italy
*
Author to whom correspondence should be addressed.
Data 2025, 10(4), 45; https://doi.org/10.3390/data10040045
Submission received: 13 February 2025 / Revised: 15 March 2025 / Accepted: 18 March 2025 / Published: 27 March 2025

Abstract
In recent years, the development of computer vision technology has led to significant implementations of non-contact structural identification. This study investigates the performance offered by the Phase-Based Motion Magnification (PBMM) algorithm, which employs video acquisitions to estimate the displacements of target pixels and amplify vibrations occurring within a desired frequency band. Using low-cost acquisition setups, this technique can potentially replace the pointwise measurements provided by traditional contact sensors. The main novelty of this experimental research is the validation of PBMM-based experimental modal analyses on multi-storey frame structures with different stiffnesses, considering six structural layouts with different configurations of diagonal bracings. The PBMM results, both in terms of time series and identified modal parameters, are validated against benchmarks provided by an array of physically attached accelerometers. In addition, the influence of pixel intensity on the accuracy of the estimates is investigated. Although the PBMM method shows limitations due to the low frame rates of the commercial cameras employed, along with a reduction in the signal-to-noise ratio near bracing nodes, it proved effective in modal identification for structures with modest variations in stiffness along the height. Moreover, the algorithm exhibits modest sensitivity to pixel intensity. An open-access dataset containing the video and sensor data recorded during the experiments is available to support further research at https://doi.org/10.5281/zenodo.10412857.

1. Introduction

Structural identification represents a fundamental issue in structural health monitoring (SHM). In fact, determining a target structure’s natural frequencies and respective mode shapes allows for not only the evaluation of the system response to dynamic excitations but also the potential assessment of the health status of structures since variations in modal parameters are correlated to the onset and development of damage [1,2]. This dynamic characterisation is conducted through the so-called indirect approach, i.e., running algorithms capable of estimating mode shapes from vibration recordings, typically fulfilled through physically attached sensors, providing pointwise measurements. Therefore, it is necessary to discretise continuous structures, which raises some issues, e.g., measurement accuracy and robustness, cost containment, data storage and analysis, ease of installation and access, etc.
These considerations have favoured the development of new non-contact image-based techniques, since they remedy the above-mentioned issues [3,4,5,6,7,8]. Indeed, the required instrumentation is easier to implement and, above all, allows for denser measurements by tracking several points simultaneously after video acquisition. Furthermore, these new approaches enable data to be obtained on hard-to-reach points using zoom lenses [9]. Studies have also been conducted on the application of vision-based techniques to videos and images captured by unmanned aerial vehicles (UAVs) [10,11].
Among the computer vision techniques proposed in recent years, the Phase-Based Motion Magnification (PBMM) algorithm [12] stands out. In addition to offering an estimate of the displacements undergone by target pixels across frames, the PBMM technique can reconstruct videos, amplifying the vibration field and thus making visible oscillations that are imperceptible to the naked eye. Compared to previous approaches to video magnification, e.g., [13], the PBMM method can support higher amplification factors since it amplifies phases rather than amplitudes, and thus it merely translates noise instead of amplifying it.
This algorithm has been implemented on laboratory models [14] but also on real-size structures, such as steel bridges [15,16] and historical masonry structures [17]. Other studies have tested its use in detecting and localising damage, as well as in the direct reproduction of damaged mode shapes [18,19,20]. Additionally, other researchers have evaluated its performance in combination with other computer vision techniques [21,22,23].
Compared to the studies conducted so far, this experimental research aims to assess the performance offered by PBMM for structural setups with stiffness varying along the height, using commercial-grade cameras. Indeed, the capabilities of PBMM are more significant with the use of high-speed and high-resolution cameras, as the range of frequencies that can be investigated increases, as does the level of detail attainable. However, such devices are more expensive and produce videos of remarkable size, which further burden the post-processing of data, undermining the convenience of this method. The PBMM method’s sensitivity to different lighting conditions is also investigated in this study.
The PBMM method’s performance is assessed in terms of the accuracy of the frequency and mode shape results obtained from video analysis, benchmarked against measurements from directly attached accelerometers. This comparative analysis involves six configurations of a metal frame, each stiffened with diagonal bracings installed in different positions and excited through hammer hits.
Contextually, the authors intend to provide a comprehensive dataset acquired during the numerous laboratory tests, including measurements from accelerometers, the impact hammer, and the two commercial-grade cameras involved. This dataset provides researchers with a valuable reference point for testing computer vision techniques for SHM, allowing for a direct comparison of video-based results with accelerometer data.
In contrast to existing datasets, such as those by Anjneya and Roy [24] and Hoda, Kuncham, and Sen [25], which mainly contain vibration measurements from physically attached sensors, the dataset presented in this study also includes video recordings of a structure under impulsive excitations. This combination enables the application and validation of computer vision algorithms, offering a benchmark for comparing video-based techniques against sensor-based measurements. Specifically, the dataset is suitable for validating algorithms for displacement estimation, such as motion tracking [26], optical flow estimation [27,28], and digital image correlation [29]. As such, this dataset represents a valuable contribution to the research community, providing material for analysing a structure’s oscillatory motion and validating vision-based approaches for structural health monitoring.
Section 2 briefly recalls the basics of PBMM, while Section 3 details the six structural configurations investigated in this research work. Section 4 describes the testing setup, reporting on both the hardware for video acquisitions and the physically attached accelerometers used as the benchmark. Section 5 presents the results obtained from the two typologies of impulsive tests conducted in this study: fixed input location tests, where the structure was repeatedly struck at the same points along both the X and Y directions, and roving hammer tests, where each test involved exciting the considered structural configurations at a different location. Section 6 deepens the discussion, while Section 7 summarises the main findings.

2. Phase-Based Motion Magnification

Phase-Based Motion Magnification is an innovative algorithm that magnifies tiny motions invisible to the naked eye, developed by MIT Computer Science and Artificial Intelligence Laboratory researchers [12,30]. It is based on the pixel intensity profile variations in the subsequent frames in a video with respect to a reference frame. Differing from other computer vision algorithms [13], rooted in the amplitude information of pixel intensities, the PBMM method tracks phase differences, allowing subtle motions to be enhanced without introducing significant artefacts or distortions. Hence, this method resorts to the Eulerian approach for motion field study, in analogy with fluid dynamics. Besides highlighting subtle motions in videos, PBMM can provide a robust and accurate quantitative estimation of their entity.
The PBMM algorithm starts with the projection of the brightness profile of each frame onto a complex steerable pyramid (CSP) [31,32], an overcomplete linear wavelet transform, similar to a spatially localised Fourier transform. The image intensity profile is thus decomposed on levels of different scale, r, and orientation, θ, resulting in complex coefficients characterised by amplitude, A_{r,θ}, and phase, ϕ_{r,θ}. Phase differences between subsequent frames contain motion information [33,34] at different spatial scales and orientations, allowing for displacement time history reconstruction.
To isolate the motions of interest and eliminate contributions due to noise, the phase shifts associated with each CSP base function are spatially and temporally filtered. Lastly, the magnification coefficient, α, is applied to the selected phase differences, achieving video motion magnification, while the amplitudes are left unchanged.
Noise in phase differences is reduced through amplitude-weighted spatial smoothing with a Gaussian kernel, K_ρ. Finally, by collapsing the CSP, each image is reconstructed with enhanced displacements in the desired temporal frequency range [12,15]. For clarity, Table 1 provides a succinct description of the main algorithm parameters referred to in this section.
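As an illustration of the core idea only (not the authors' implementation), the following one-dimensional sketch shows how the phase of a complex, Gabor-like filter response encodes sub-pixel motion between two frames; all signal parameters here are hypothetical.

```python
import numpy as np

def local_phase(signal, x, omega, x0, sigma):
    """Phase of a complex Gabor-like coefficient centred at x0."""
    window = np.exp(-0.5 * ((x - x0) / sigma) ** 2)
    coeff = np.sum(signal * window * np.exp(-1j * omega * x))
    return np.angle(coeff)

x = np.linspace(0.0, 10.0, 2000)
omega = 2.0 * np.pi      # spatial frequency of the synthetic intensity pattern
delta = 0.03             # true sub-pixel shift between the two "frames"

frame_0 = np.cos(omega * x)            # reference frame
frame_1 = np.cos(omega * (x - delta))  # shifted frame

phi_0 = local_phase(frame_0, x, omega, 5.0, 1.0)
phi_1 = local_phase(frame_1, x, omega, 5.0, 1.0)

# A rightward shift lowers the local phase, so delta ≈ -Δφ/ω.
# PBMM multiplies such phase differences by α before reconstruction,
# magnifying the motion without amplifying amplitude noise.
dphi = np.angle(np.exp(1j * (phi_1 - phi_0)))  # wrapped to (-π, π]
delta_est = -dphi / omega
```

In the full algorithm, this phase comparison is carried out for every CSP level and orientation rather than for a single filter.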

3. Structural Configurations

PBMM-based experimental modal analysis (EMA) was validated by studying six structural frames, obtained by assembling bracings in different positions on a three-storey square-based frame, portrayed in Figure 1.
The structural elements were made of aluminium EN AW 6060 (density of 2700 kg/m³, Young's modulus of 69 × 10⁹ Pa, and shear modulus of 26 × 10⁹ Pa), joined through steel bolted connections (density of 7850 kg/m³, Young's modulus of 200 × 10⁹ Pa, and shear modulus of 80 × 10⁹ Pa). The frame comprised four columns with a rectangular cross-section of 20 mm × 3 mm and three square plates with 400 mm sides and a thickness of 5 mm, spaced at an inter-storey distance of 300 mm. Connections were realised through steel angles with equal flanges, 20 mm wide and 2 mm thick. The frame was bound to a raft foundation made of a 500 mm × 500 mm × 5 mm steel slab, leaning on four supports.
The reference system was defined by orienting the Z axis upward and aligning the X and Y axes, respectively, with the columns' longer and shorter sides, so that anticlockwise rotations had a positive sign. Its origin was placed in correspondence with an edge of the steel plates, projected onto the base slab plane. An overview of the starting model, enriched with the depiction of the six attached accelerometers, is given in Figure 2.
The initial structure was subsequently enriched with one or more diagonal bracings along the Y direction, named depending on their position, as can be seen in Table 2. This way, six configurations were obtained, named according to the installed bracings, and summarised in Table 3.
These structural layouts were excited through impulsive tests, delivered with an impact hammer (model 086C03, PCB Piezotronics). In total, thirty-six tests were performed:
  • Twelve tests were carried out hitting the six structural layouts on the third floor, near the edges where the accelerometers were located, along the X and Y axes, according to channels 3Y and 6X in Figure 2. In the present paper, these tests are named “fixed input location tests”.
  • Twenty-four tests were conducted by systematically striking the configurations WO and B10B11 on the various floors near the installed sensors, always proceeding with one strike per test. In the present paper, such tests are called “Roving Hammer tests”.

4. Testing Setup

4.1. Physically Attached Sensor Setup

Signal acquisition relied on twelve mono-axial capacitive accelerometers, Model 3701G2FA3G by PCB Piezotronics (Depew, NY, USA). Four sensors were installed on each floor, oriented along the X and Y axes, as seen in Figure 2. Each channel was labelled according to the edge name and the direction of the signal recording. Since the accelerometers' sampling frequency amounted to 200 Hz, the highest frequency that could theoretically be investigated corresponded to 100 Hz, according to the Nyquist theorem. Accelerometer acquisitions lasted 61 s.
Under the hypothesis of rigid floor diaphragms, nine degrees of freedom (DOFs) were associated with the three-storey frame structure, as the generic i-th floor was only supposed to undergo rigid floor displacements along the X and Y axes, u_i and v_i, respectively, and one torsional rotation about the Z axis, γ_i. Therefore, at least nine vibrational modes for each frame structure were expected from each test.
During signal post-processing, a fifth-order bandpass Butterworth filter [35] with lower and upper limits of 1 Hz and 80 Hz was applied to remove the noise. This type of filter was selected because of its maximally flat frequency response in the passband, which minimises distortion in the extracted signals while effectively attenuating high-frequency noise. It is important to point out that this filtering choice reduced the range of investigable frequencies, establishing 80 Hz as the upper boundary to reduce noise in data. This approach aligns with established practices in experimental modal analysis, as demonstrated by Ewins [36], who recommends setting the frequency range of interest below the Nyquist limit to enhance the accuracy and reliability of the identified modal parameters.
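The filtering step described above can be sketched as follows; the exact implementation used by the authors is not specified beyond the filter type, order, and band, and the synthetic record below is hypothetical.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 200.0          # accelerometer sampling rate (Hz)
band = (1.0, 80.0)  # passband limits used in the study (Hz)

# Fifth-order bandpass Butterworth, applied forward-backward (zero phase)
b, a = butter(5, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")

t = np.arange(0.0, 4.0, 1.0 / fs)
# Synthetic record: a 10 Hz in-band tone, a slow drift, and an out-of-band tone
raw = (np.sin(2 * np.pi * 10 * t)
       + 0.3 * np.sin(2 * np.pi * 0.2 * t)   # drift, below 1 Hz
       + 0.5 * np.sin(2 * np.pi * 95 * t))   # above 80 Hz
clean = filtfilt(b, a, raw)
```

Forward-backward filtering (`filtfilt`) preserves the maximally flat passband response while cancelling the phase distortion a single pass would introduce.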
To accomplish modal identification through acceleration time histories (THs), the Eigensystem Realisation Algorithm (ERA) [37] was applied to free decays of the acceleration time series. Stabilisation and cluster diagrams were generated for each test to discern real modes from numerical or spurious ones. The ERA was run iteratively, and the stability of each mode was tested with respect to different parameters related to modes detected in the previous cycle: frequency percentual variation, damping ratio percentual variation, and the Modal Assurance Criterion (MAC) value, computed between mode shapes normalised to the mass matrix.
The tolerances of these parameters and the ranges of system order considered were changed for the different configurations of the frame structure and are summarised in Table 4.
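The MAC used in the stabilisation checks is the standard criterion; a minimal sketch, with hypothetical mode-shape vectors, is given below.

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors."""
    num = np.abs(np.vdot(phi_a, phi_b)) ** 2
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
    return num / den

# A mode shape compared with a rescaled copy of itself gives MAC = 1;
# two structurally unrelated shapes give values close to 0.
phi_1 = np.array([0.31, 0.68, 1.00])   # hypothetical first flexural shape
phi_2 = np.array([0.74, 0.41, -1.00])  # hypothetical second flexural shape

mac_same = mac(phi_1, 2.5 * phi_1)
mac_diff = mac(phi_1, phi_2)
```

Because the MAC is insensitive to scaling, it can compare modes identified at different system orders, which is what makes it suitable for the iterative stability test described above.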

4.2. Video Hardware Setup

Two different commercial-grade cameras were used for video recording: tests along the Y direction were captured with a reflex camera, model Canon EOS 1100D (Tokyo, Japan), while those along the X direction were filmed with an iPhone 11 (Apple Inc., Cupertino, CA, USA). In both cases, the camera was mounted on a tripod placed on a rigid surface, and the image stabiliser was activated, providing greater stability and reducing the impact of small, unintended movements during video recording. For all acquisitions in both configurations, a video stabilisation routine from the MATLAB® R2020b Computer Vision System Toolbox was applied to mitigate the potential impact of random environmental vibrations.
Before starting the tests, the position of each camera was carefully adjusted to verify the orthogonality of its axis with respect to the metal frame to minimise perspective distortions and possible errors related to a lack of orthogonality. The devices were set up to frame the structure in the central part of the image, aiming to minimise the impact of optical distortions, which tend to be more pronounced towards the edges. Future studies may further investigate the influence of residual distortions on the accuracy of displacement estimation through computer vision techniques.
Both devices featured a frame rate of 25 frames per second (fps), a display resolution of 1280 × 720 pixels, and an RGB24 pixel format. Further technical features of the cameras are available in the repository. When the image resolution differs from the physical resolution, residual position estimation errors may arise, potentially propagating to displacement estimates and affecting the results. However, higher resolutions, with a finer image discretisation in pixels, tend to reduce these errors, mitigating their impact on the final measurements.
Videos were encoded in H.264 (MPEG-4 AVC) format, with lossy compression applied by default. No post-processing modifications were made to compression or format settings, and the footage was analysed as recorded, with the chosen acquisition parameters aimed at minimising compression-related uncertainties.
The choice to use two different cameras was driven by the intention to explore potential performance differences between devices. The iPhone 11 features a 12 MP camera with an f/1.8 aperture for the main lens, while the Canon EOS 1100D has a 12.2 MP APS-C CMOS sensor and an aperture depending on the lens used. For this study, an EF-S 18–55 mm f/3.5–5.6 IS II lens was used, given the short distance between the camera and the frame structure. Despite these differences, the video performance of the two devices proved similar in terms of image quality, with both cameras capable of recording at resolutions sufficient for structural displacement measurements. Future studies could assess the impact of cameras with different technical characteristics on the estimation of the displacements of the frame.
The frame was lit by neon lights on the room ceiling and by a halogen twin-head tripod work light placed on one side of the structure, as visible in Figure 3. This last element determined a difference in metallic frame lighting conditions, whose effects on PBMM outcomes were investigated. To improve video post-processing, white uniform background panels were installed parallel to the two sides opposite the cameras.
Although video recordings lasted between sixty and seventy seconds, the number of processed frames was calibrated to focus on the free decays, pursuing noise minimisation.
The video estimation of displacement THs was carried out by running the 2017 version of the PBMM software code for MATLAB, available at [38]. The algorithm estimated vibration time series at three points aligned along the same column, treated as virtual sensors and assumed to represent the rigid floor translations caused by external excitation in the parallel direction. To investigate the PBMM method's sensitivity to different pixel lighting conditions, virtual sensors were defined during video post-processing on two foreground columns in each video. Examples of virtual sensor settings are visible in Figure 4.
However, since only three rigid floor translations along the hammer hit could be estimated from each test, only the three corresponding flexural modes could be investigated, and thus, the number of detectable modes was reduced to three.
In this regard, it is anticipated that a systematic preliminary analysis of the Power Spectral Density (PSD) of the displacement time histories estimated by videos, after the application of the PBMM method, led to focusing only on the first mode of vibration for each direction of excitation of the structures, further reducing the number of target modes from three to one per video. Indeed, the PSDs showed prominent peaks primarily at the expected frequency of the first mode, while the second mode, if not already cut off by the Nyquist limit, was considerably less pronounced, suggesting an insufficiently reliable estimate of the latter.
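The preliminary PSD screening can be sketched as below; the frame rate is the one used in the study, while the modal frequency, damping, and noise level are hypothetical.

```python
import numpy as np
from scipy.signal import welch

fps = 25.0             # camera frame rate (Hz)
nyquist = fps / 2.0    # only modes below 12.5 Hz are observable in the videos
f1 = 3.2               # hypothetical first flexural frequency (Hz)

t = np.arange(0.0, 40.0, 1.0 / fps)
rng = np.random.default_rng(0)
# Free-decay displacement of a virtual sensor, plus measurement noise
disp = (np.exp(-0.05 * t) * np.sin(2 * np.pi * f1 * t)
        + 0.01 * rng.standard_normal(t.size))

freqs, psd = welch(disp, fs=fps, nperseg=256)
f_peak = freqs[np.argmax(psd)]  # prominent peak at the first mode
```

When the second-mode frequency approaches or exceeds `nyquist`, its peak is either aliased or strongly attenuated, which is why only the first mode per direction was retained.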
For video motion magnification, the amplification factor, α, was set depending on configurations, as it needed to be raised for stiffer ones. Phase filtering on the different levels was performed by applying a second-order Butterworth filter with a bandwidth of 4 Hz, whose cutoff frequencies were varied in correlation to the configuration target frequency, previously estimated through accelerometer data. However, for a comparison of kinematic entities and ERA implementation, a fifth-order bandpass Butterworth filter, with a bandwidth of 4 Hz, was successively implemented.
For the definition of the cutoff frequencies of the applied bandpass filters, we proceeded by performing a sensitivity analysis, i.e., testing different amplitudes of intervals centred on the frequency of the first mode estimated by accelerometer data and assessing the accuracy of the video motion estimates against the time series from attached sensors. Table 5, Table 6 and Table 7 detail the values used for all the analysed tests.
Once displacements were estimated, a double gradient was performed to obtain acceleration THs, which were then processed by the ERA. Additionally, the double integration of the data from accelerometers also allowed for a qualitative comparison between PBMM and the attached sensors, both in terms of displacements and accelerations, in both the time and frequency domains, through a comparison of THs and Power Spectral Densities (PSDs). In this way, a preliminary assessment of result accuracy was achieved.
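The double numerical differentiation can be sketched as follows; the frame rate is from the study, while the displacement signal is a hypothetical pure sinusoid, for which the exact result acc = −(2πf)²·disp provides a check.

```python
import numpy as np

fps = 25.0                           # camera frame rate (Hz)
t = np.arange(0.0, 4.0, 1.0 / fps)
f1 = 2.0                             # hypothetical modal frequency (Hz)
disp = np.sin(2 * np.pi * f1 * t)    # PBMM-style displacement time history

# Differentiate twice (central differences) to feed accelerations to the ERA
vel = np.gradient(disp, 1.0 / fps)
acc = np.gradient(vel, 1.0 / fps)

# The finite sampling rate slightly underestimates the amplitude, an effect
# that grows with frequency, consistent with the frame-rate limitation
# discussed above; interior points are compared to avoid edge formulas.
ratio = np.max(np.abs(acc[2:-2])) / (2 * np.pi * f1) ** 2
```

The same `np.gradient` call applied in reverse (cumulative trapezoidal integration, twice) mirrors the double integration of the accelerometer data used for the qualitative comparison.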
Physical modes were detected by plotting stabilisation and cluster diagrams, as conducted previously for the accelerometer data, but with a range of investigated system orders heavily shortened from four up to fifteen. The parameter settings are shown in Table 8.

5. Results

This section describes the results of the impulsive tests performed on the three-storey frame with rigid floor behaviour. First, the identifications obtained with the classic acquisition setup, i.e., the physically attached accelerometers, will be presented for both the fixed input location (Section 5.1) and Roving Hammer (Section 5.2) procedures. These results will serve as the benchmark for the PBMM-based identifications (Section 5.3 and Section 5.4).

5.1. Fixed Input Location Tests—Accelerometers

The modal identifications of each examined configuration were conducted by post-processing the free decay responses triggered by the two tests performed by hitting the frame along the 3Y and 6X channels and averaging the identified modal parameters. For the WO and B10B11 configurations, the estimates obtained through the Roving Hammer tests, presented in the next subsection, were also included.
Natural frequency and corresponding damping ratio outcomes are shown in Table 9 and Table 10. Note that, for this subsection, the sorting of modes is conventionally fixed and does not necessarily follow frequency magnitudes. The corresponding nine mode shapes for the six structural layouts were also estimated, but they are not included, to keep this paper concise.
Several observations regarding the overview of the results can be made:
  • The addition of diagonal bracing elements leads to higher natural frequencies of flexural modes along the Y direction, as noticeable in Figure 5, representing the trends in the five modes of greater interest for this study, estimated for the six configurations analysed.
  • Comparing the frequencies of these modes for the configurations B10, B10B20, and B10B20B30, it can be seen that the addition of a diagonal bracing does not cause a linear increase in frequencies: in particular, the most significant growths take place with the introduction of the first diagonal, by which the bracing system appears, and the third one, by which continuity is achieved along the height of the frame structure.
  • Torsional modes’ natural frequencies increase as well, since torsional rigidities also rise. Due to this effect, in configurations B10B20B30 and B11B21B20B30, torsional modes are no longer identified within the 80 Hz limit.
  • X flexural modes are not influenced by rigidity increments in the Y direction.
  • For configurations with three and four bracings, the rise in both torsional and flexural stiffness is so remarkable that it induces a variation in mode shape arrangement. In configuration B10B20B30, the greatest increase turns out to be related to torsional modes, while in B11B21B20B30, Y flexural frequencies see larger increases. Hence, the two configurations allow for the definition of the different effects triggered by an eccentric bracing system, which is respectively continuous or discontinuous in height.
  • Flexural modes along the Y axis become less pure and more twisted, so it is not always easy to discern them from purely torsional modes. The introduction of eccentric bracings generates less symmetric systems, tending to vibrate with modes affected by greater torsional rotations. In these cases, stabilisation and cluster diagrams, as well as environmental PSDs, represent a valid support for detecting real natural frequencies.
  • Finally, looking at damping ratios, no clear correlation emerges between their growth and the increase in the number of diagonal bracings. This parameter is affected by much more significant uncertainty, for well-known reasons.

5.2. Roving Hammer Tests—Accelerometers

For the Roving Hammer tests, accelerometer recordings were effectively processed in only ten tests of the WO configuration and eight tests of the B10B11 frame structure: tests on channels 1X and 4X were neglected, as the video quality was not adequate; furthermore, for the B10B11 layout, the presence of bracings along the Y axis compromised the reliability of the parameters from tests on channels 1Y and 4Y. The Roving Hammer test results are gathered in Table 11, Table 12, Table 13 and Table 14. They were derived by setting constraints on the frequency, damping ratio, and MAC variations reported in Table 4, and were merged with the outcomes of the fixed input location tests, obtaining the modal parameter estimates shown in Table 9 and Table 10 for the WO and B10B11 configurations.

5.3. Fixed Input Location Tests—PBMM

As previously outlined, despite the relative ease of data acquisition, the PBMM-based approach potentially allowed a smaller number of mode shapes to be identified. In fact, for each test, only a single video was recorded, providing a front view of the frame, with the camera oriented perpendicularly to the excitation direction. Consequently, torsional modes could not be grasped, as corresponding rotations could not be estimated. The camera orientation only allowed for the estimation of floor displacements along the excitation direction, which ideally enabled the identification of the three flexural modes in that orientation.
However, the modest frame rate of the commercial-grade cameras used for video recordings constrained the analysis to the first two flexural modes for all structural configurations. Consequently, each video was analysed focusing on the first flexural mode shape along the excitation direction.
The sensitivity of the PBMM algorithm to different pixel lighting conditions was extensively examined in controlled laboratory environments by comparing the results obtained by extracting horizontal displacements from the left and right columns in the foreground, which were, respectively, brighter and darker. In addition, since some videos of the fixed input location tests along the X direction exhibited lower quality, it was possible to investigate the algorithm’s behaviour under suboptimal data conditions. The flexural modes in the X direction were investigated solely for the WO and B11B21B20B30 configurations, corresponding to the least and most rigid setups. Thus, the stability of X-direction flexural modes concerning the introduction of diagonal bracings could still be verified. The identified natural frequencies are collected in Table 15. While the modal shape components are omitted for brevity, their consistency is analysed in Section 6.3.

5.4. Roving Hammer Tests—PBMM

The present section gathers the modal frequencies identified from videos of the tests performed hitting the setups WO and B10B11 along the X and Y directions, in proximity to the accelerometers’ positions, in coherence with Figure 2.
To maintain conciseness, the results of the tests in the X direction are reported only for the B10B11 layout, as it was previously noticed that flexural modes along this direction proved to be unaffected by bracings in the Y one. For tests conducted by striking the B10B11 configuration on the first floor and in the Y direction, the displacement estimates yielded poor results due to the significant stiffening provided by the two installed braces. In line with expectations, the outputs of these two videos were consequently discarded for the EMA.
Table 16 presents the estimates for the WO layout, while Table 17 shows the results obtained for the B10B11 configuration.
Table 18 provides an analysis of the uncertainty affecting the estimates of natural frequency values.
The maximum value is found for the B10B11 configuration, for the tests along the X direction, with a higher stiffness of the frame pillars: a deviation of 0.055 Hz is recorded for the right column and 0.039 Hz for the left column. Both figures are higher than the respective ones for the Y-direction modes of the WO and B10B11 configurations. This feature could be related to the higher value of the frequency of the first mode along X, which suggests greater variability in its estimation.
In contrast, the lowest values are recorded for the tests on the B10B11 layout in the Y direction. This can be attributed to the neglect of experiments implying lower precision in video motion estimation, i.e., those performed by hitting the structure on the first floor.

6. Discussion

6.1. Comparison of Time Histories—Fixed Input Location Tests

First, a qualitative comparison between the attached sensors and PBMM approaches was made by juxtaposing accelerometer and displacement time histories from both methods. For each video, the time series from the PBMM method were synchronised with those from the installed sensors by aligning the former to the peak values of the latter. This was the first step before proceeding to a quantitative comparison.
The video-reconstructed displacement THs are very accurate on every floor of configuration WO. At the same time, they lose quality as the structural rigidity increases and the magnitude of vibrations consequently decreases: the installation of bracing on a floor systematically triggers a drop in PBMM displacement accuracy. This aspect becomes critical for configurations with three or four bracings. In contrast, acceleration time series are consistently less accurate than displacement ones, with deviations rising as structural stiffness increases. Comparative graphs for three configurations are shown in Figure 6. In some cases, i.e., at the level of certain floors, for displacements or accelerations, the outputs of the better-lit column are more accurate than those of the darker one. However, this improvement does not occur systematically and does not prove to be crucial.

6.2. Comparison of Time Histories—Roving Hammer Tests

Analysing video-estimated time history plots, it can be noted that tests along the X axis seem to be penalised by the camera’s modest frame rate: peaks and troughs do not appear as sharp edges but rather like flat or slanted segments, seemingly cut off. In contrast, tests along the Y direction produce higher-quality displacement estimates, thanks to the low first Y flexural modal frequency of layouts WO and B10B11. As anticipated in Section 5.4, only the tests along channels 1Y and 4Y for the second configuration produced extremely inaccurate reconstructions, given the proximity of the bracing node to the excitation point. In line with expectations, the outputs of these two videos were consequently discarded to implement the EMA.
It is particularly interesting to note that, in the test hitting channel 4Y in configuration WO, the sensitivity of PBMM to changes in pixel intensity can be observed on the first floor: an anomalous displacement peak is detected, induced by a momentary obscuration of the virtual sensor caused by the passage of the operator's arm through that region. This is visible in Figure 7. Apart from this, the different lighting conditions of the two foreground columns do not seem to produce systematic differences in the PBMM displacement computations.
To mitigate the effects of transient occlusions, such as the one neatly identified in the present study, robust background subtraction and adaptive shadow segmentation methods (e.g., adaptive mixture models and thresholding techniques) can be implemented [39,40,41]. These techniques dynamically detect and remove sudden occlusions from video sequences, potentially allowing accurate displacement estimates despite such disturbances. Where possible, optimising the illumination setup could also help to reduce a priori the possibility of sudden shadow formation. Future work will incorporate these approaches into PBMM-based displacement estimation to improve the overall reliability of the measurement.
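As a minimal illustration of the adaptive background modelling cited above [39,40,41], the sketch below flags frames in which a transient occlusion darkens a monitored region, using a simple running-average background model. The thresholds, update rate, and synthetic frames are assumptions chosen for illustration only, not values from the study.

```python
import numpy as np

def detect_occlusion(frames, alpha=0.05, diff_thresh=30.0, max_fg_ratio=0.2):
    """Return a boolean flag per frame, True when the fraction of pixels
    deviating from the running-average background exceeds `max_fg_ratio`
    (e.g., an operator's arm or shadow crossing the region of interest)."""
    bg = frames[0].astype(float)
    flags = []
    for frame in frames:
        fg = np.abs(frame.astype(float) - bg) > diff_thresh  # foreground mask
        flags.append(fg.mean() > max_fg_ratio)               # occlusion flag
        bg = (1.0 - alpha) * bg + alpha * frame              # adapt background
    return np.array(flags)

# Synthetic region of interest: a steady grey background with sensor noise,
# darkened by a simulated shadow in frame 5 only.
rng = np.random.default_rng(0)
frames = [np.full((20, 20), 120.0) + rng.normal(0.0, 2.0, (20, 20))
          for _ in range(10)]
frames[5] = frames[5] - 80.0   # transient occlusion
flags = detect_occlusion(np.array(frames))
```

Frames flagged this way could be excluded from the PBMM displacement series, or the series interpolated across them, before modal identification.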

6.3. Comparison of EMA Results—Fixed Input Location Tests

A comparison of the modal parameter estimates, derived from attached-sensor recordings and PBMM computations, was carried out both qualitatively, assessing the coherence between mode shape plots, and quantitatively, through the computation of absolute and percentage variations in frequencies and damping ratios. In addition, MAC values were calculated to quantify the compatibility between the mode shapes estimated through the two approaches.
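The MAC between an accelerometer-derived shape and a PBMM-derived one is the normalised squared inner product of the two mode-shape vectors, equal to 1 for identical shapes (up to scale) and 0 for orthogonal ones. A minimal sketch follows; the three-DOF shape vectors are hypothetical, not values from the study.

```python
import numpy as np

def mac(phi_a: np.ndarray, phi_b: np.ndarray) -> float:
    """Modal Assurance Criterion: |phi_a^H phi_b|^2 / ((phi_a^H phi_a)(phi_b^H phi_b))."""
    num = abs(np.vdot(phi_a, phi_b)) ** 2
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
    return float(num / den)

phi_acc = np.array([0.31, 0.62, 0.95])   # hypothetical accelerometer mode shape
phi_pbmm = 2.0 * phi_acc                 # PBMM estimate differing only in scale
mac_value = mac(phi_acc, phi_pbmm)       # scaling does not affect the MAC
```

Because the MAC is scale-invariant, it compares only the spatial pattern of the mode shapes, which is why it complements the frequency-deviation metric rather than duplicating it.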
Figure 8a portrays a bar chart of the percentage frequency variations of the PBMM estimates (fPBMM) with respect to the accelerometers' target values (ftarget), for the two sets of virtual sensors identified in pixels on the foreground columns: the left, better-lit one and the right, darker one. Figure 8b shows the trend in MAC values between the PBMM estimates and the accelerometer ones. In this last figure, the results relative to the test on configuration WO along the X axis were omitted due to the video's low quality. Trends in damping ratios are not represented, since they are characterised by high uncertainty and no reliable conclusions could be drawn from them.
Figure 8a clearly shows that the accuracy of the frequency estimates remains very high only for configurations with bracings installed up to the first floor: for structure B10B20, a sharp increment in the percentage frequency variation is noticed, suddenly breaking the relatively stable trend of minor deviations kept before. This indicates that the PBMM method struggles when the stiffness irregularity no longer affects a localised part of the structure, i.e., the lower part, but extends over a wider region. Furthermore, it should be noted that, for the stiffest configurations, frequency detection would have been strongly compromised had the accelerometer estimates not been available.
It is important to note that, even for the blurred video, i.e., the fixed input location test WO X, the modal frequencies seem consistent, as the percentage frequency variations settle around 4%.
The bar chart of the MAC values in Figure 8b reveals poor accuracy only for the mode shapes computed from hammer tests along the Y direction on the stiffest configurations (B10B20B30 and B11B21B20B30). The test B11B21B20B30 X confirms the low influence of bracings inserted along the Y axis on flexural modes in the perpendicular direction.
In line with the TH comparisons, the different luminosity conditions of the two foreground columns do not appear to yield great dissimilarities among the estimates; therefore, the different pixel intensities do not significantly influence the identification of the first flexural mode.

6.4. Comparison of EMA Results—Roving Hammer Tests

This subsection synthesises the comparison between the modal parameters derived from accelerometer recordings and those from camera videos: Figure 9 refers to the Roving Hammer tests on configuration WO along the Y axis, while Figure 10 and Figure 11 refer, respectively, to the Roving Hammer tests on frame structure B10B11 along the X and Y directions.
The graphs show that the PBMM-estimated parameters exhibit good consistency with the accelerometer results: in the great majority of the tests, the natural frequency deviations are lower than 2.5%, while the MAC values consistently exceed 0.98, the only exception being the hammer test performed on channel 1X of configuration B10B11, owing to the greater rigidity of the pillar along that direction and the proximity of the hit point to the foundations. In general, as expected, the PBMM accuracy grows for tests on the second and third floors, as the magnitude of the vibrations increases as well.

7. Conclusions

From the analyses performed, the PBMM method turns out to be well suited to the purpose of this study, being particularly successful in identifying the first two flexural modes for configurations with bracings installed up to the first floor.
Nevertheless, the PBMM video-based approach faces two main challenges: (i) The small displacements at bracing connections significantly reduce the signal-to-noise ratio, leading to a phenomenon akin to using a lens with insufficient magnification power. This reduces the ability to capture precise motion in these regions. (ii) The algorithm struggles with higher modes, where natural frequencies approach the limits imposed by the Nyquist theorem, resulting in compromised signal reconstruction. This becomes a particularly critical issue for commercial cameras with relatively low frame rates, as they cannot capture high-frequency motions with the required accuracy.
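Issue (ii) can be made concrete with a quick Nyquist check. The 60 fps frame rate below is an assumption for illustration, not necessarily the rate used in the study; the modal frequencies are the WO values reported in Table 9.

```python
# A mode can be reconstructed from video only if its natural frequency
# lies below the Nyquist limit, i.e. half the acquisition frame rate.
frame_rate_fps = 60.0              # assumed commercial-camera frame rate
nyquist_hz = frame_rate_fps / 2.0  # Nyquist limit: 30 Hz in this example

modes_hz = {                       # WO configuration, from Table 9
    "1st Y flexural": 2.94,
    "2nd X flexural": 26.43,
    "2nd torsional": 30.52,
    "3rd X flexural": 64.04,
}
recoverable = {name: f < nyquist_hz for name, f in modes_hz.items()}
# Modes above the Nyquist limit alias and cannot be reconstructed.
```

Under this assumed frame rate, the second torsional and third flexural modes fall at or beyond the Nyquist limit, which is precisely the regime in which the signal reconstruction described above becomes compromised.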
Regarding point (i), possible improvements may be achieved with minor practical adjustments, such as recording videos at a shorter distance from the target object, consequently increasing the number of pixels covering the structure and making the PBMM method able to capture ever smaller oscillations. The second issue (ii) is solely related to camera features and can only be bypassed by improving some of them. In addition to the acquisition frame rate, the camera shutter speed may also influence the accuracy with which higher modes are identified. Indeed, if the exposure time is not sufficiently short, rapid structural vibrations may cause motion blur, compromising the sharpness of each frame and effectively masking high-frequency displacements. This trade-off is particularly critical in low-light conditions, where increasing the frame rate may require either a longer exposure time, risking motion blur, or a higher sensor gain (ISO), which amplifies the signal but also introduces noise, potentially affecting the accuracy of the captured displacements. In addition to these factors, the sensor's intrinsic noise level can further influence accuracy, especially in low-light or high-gain conditions: higher noise levels may obscure subtle high-frequency vibrations, blending them with random signal fluctuations and complicating the identification of higher modes.
Furthermore, it should also be noted that the calibration process may affect video motion computation, as it is essential to minimise the measurement error introduced when using conventional cameras. Indeed, calibration corrects for lens distortions and spatial scaling errors, which are critical for accurate video-based displacement estimation using PBMM. Future studies will investigate how different calibration methodologies affect PBMM outcomes, to further reduce measurement uncertainties.
In any case, these improvements usually entail heavier computational costs and should, therefore, be designed based on a cost–benefit analysis.
The PBMM algorithm proved to have modest sensitivity to different pixel luminosity conditions: analysing the results obtained from the pixels extracted from the two foreground columns in each video, no systematic and significant differences in accuracy emerged with respect to the benchmark represented by the post-processed accelerometer recordings. The successful validation of the algorithm's robustness under varying lighting conditions highlights its potential for reliable outdoor applications, paving the way for broader implementation in real-world structural health monitoring.

Author Contributions

Conceptualisation, M.C. and G.M.; methodology, M.C., G.M. and P.D.; investigation, M.C., G.M. and P.D.; data curation, M.C., G.M. and P.D.; writing—original draft preparation, P.D.; writing—review and editing, M.C., G.M. and R.C.; supervision, M.C., G.M. and R.C.; project administration, R.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data, models, and code generated or used during this study are available online in the repository at https://doi.org/10.5281/zenodo.10412857, in accordance with funder data retention policies.

Acknowledgments

The research conducted in this work was performed within the activity of the POC instrument and POC transition “CAMELOT” project. The authors would like to thank Muhammad Abraiz for his contribution to editing the present article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ren, W.-X.; De Roeck, G. Structural damage identification using Modal Data. I: Simulation verification. J. Struct. Eng. 2002, 128, 87–95. [Google Scholar]
  2. Farrar, C.; Baker, W.E.; Bell, T.M.; Cone, K.M.; Darling, T.W.; Duffey, T.A.; Eklund, A.; Migliori, A. Dynamic Characterization and Damage Detection in the I-40 Bridge over the Rio Grande; Los Alamos National Lab.: Los Alamos, NM, USA, 1994. [Google Scholar]
  3. Sabato, A.; Sarrafi, A.; Mao, Z.; Niezrecki, C. Advancements in Structural Health Monitoring Using Vision-Based and Optical Techniques. In Proceedings of the 7th Asia-Pacific Workshop on Structural Health Monitoring; Non Destructive Testing: Bad Breisig, Germany, 2018. [Google Scholar]
  4. Niezrecki, C.; Baqersad, J.; Sabato, A. Digital image correlation techniques for non-destructive evaluation and structural health monitoring. In Handbook of Advanced Non-Destructive Evaluation; Springer: Cham, Switzerland, 2018. [Google Scholar]
  5. Poozesh, P.; Sabato, A.; Sarrafi, A.; Niezrecki, C.; Avitabile, P.; Yarala, R. Multicamera measurement system to evaluate the dynamic response of utility-scale wind turbine blades. Wind Energy 2020, 23, 1619–1639. [Google Scholar]
  6. Feng, D.; Feng, M.Q. Experimental validation of cost-effective vision-based structural health monitoring. Mech. Syst. Signal Process. 2017, 88, 199–211. [Google Scholar]
  7. Chen, J.G.; Davis, A.; Wadhwa, N.; Durand, F.; Freeman, W.T.; Büyüköztürk, O. Video camera–based vibration measurement for civil infrastructure applications. J. Infrastruct. Syst. 2017, 23, B4016013. [Google Scholar]
  8. Lou, K.; Kong, X.; Li, J.; Hu, J.; Lu, D. Motion magnification for video-based vibration measurement of civil structures: A review. Mech. Syst. Signal Process. 2024, 220, 111681. [Google Scholar]
  9. Feng, D.; Feng, M.Q. Computer vision for SHM of civil infrastructure: From dynamic response measurement to damage detection—A review. Eng. Struct. 2018, 156, 105–117. [Google Scholar]
  10. Ellenberg, A.; Branco, L.; Krick, A.; Bartoli, I.; Kontsos, A. Use of unmanned aerial vehicle for quantitative infrastructure evaluation. J. Infrastruct. Syst. 2015, 21, 04014054. [Google Scholar]
  11. Reagan, D.; Sabato, A.; Niezrecki, C.; Yu, T.; Wilson, R. An autonomous unmanned aerial vehicle sensing system for structural health monitoring of bridges. In Nondestructive Characterization and Monitoring of Advanced Materials, Aerospace, and Civil Infrastructure; SPIE: Las Vegas, NV, USA, 2016; Volume 9804, pp. 244–252. [Google Scholar]
  12. Wadhwa, N.; Rubinstein, M.; Durand, F.; Freeman, W.T. Phase-based video motion processing. ACM Trans. Graph. 2013, 32, 1–10. [Google Scholar]
  13. Wu, H.-Y.; Rubinstein, M.; Shih, E.; Guttag, J.; Durand, F.; Freeman, W.T. Eulerian video magnification for revealing subtle changes in the world. ACM Trans. Graph. 2012, 31, 1–8. [Google Scholar]
  14. Chen, J.G.; Wadhwa, N.; Cha, Y.-J.; Durand, F.; Freeman, W.T. Modal identification of simple structures with high-speed video using motion magnification. J. Sound Vib. 2015, 345, 58–71. [Google Scholar]
  15. Wadhwa, N.; Chen, J.G.; Sellon, J.B.; Wei, D.; Rubinstein, M.; Ghaffari, R.; Freeman, D.M.; Büyüköztürk, O.; Wang, P.; Sun, S.; et al. Motion microscopy for visualizing and quantifying small motions. Proc. Natl. Acad. Sci. USA 2017, 114, 11639–11644. [Google Scholar] [CrossRef] [PubMed]
  16. Shang, Z.; Shen, Z. Multi-point vibration measurement and mode magnification of civil structures using video-based motion processing. Autom. Constr. 2018, 93, 231–240. [Google Scholar] [CrossRef]
  17. Fioriti, V.; Roselli, I.; Tati, A.; Romano, R.; De Canio, G. Motion Magnification Analysis for structural monitoring of ancient constructions. Measurement 2018, 129, 375–380. [Google Scholar] [CrossRef]
  18. Civera, M.; Surace, C.; Fragonara, L.Z. An experimental study of the feasibility of phase-based video magnification for damage detection and localisation in operational deflection shapes. Strain 2020, 56, e12336. [Google Scholar] [CrossRef]
  19. Civera, M.; Fragonara, L.Z.; Antonaci, P.; Anglani, G.; Surace, C. An Experimental Validation of Phase-Based Motion Magnification for Structures with Developing Cracks and Time-Varying Configurations. Shock. Vib. 2021, 2021, 5518163. [Google Scholar] [CrossRef]
  20. Sarrafi, A.; Mao, Z.; Niezrecki, C.; Poozesh, P. Vibration-based damage detection in wind turbine blades using Phase-based Motion Estimation and motion magnification. J. Sound Vib. 2018, 421, 300–318. [Google Scholar] [CrossRef]
  21. Molina-Viedma, A.; Felipe-Sesé, L.; López-Alba, E.; Díaz, F. 3D mode shapes characterization using phase-based motion magnification in large structures using stereoscopic DIC. Mech. Syst. Signal Process. 2018, 108, 140–155. [Google Scholar] [CrossRef]
  22. Poozesh, P.; Sarrafi, A.; Mao, Z.; Avitabile, P.; Niezrecki, C. Feasibility of extracting operating shapes using phase-based motion magnification technique and stereo-photogrammetry. J. Sound Vib. 2017, 407, 350–366. [Google Scholar] [CrossRef]
  23. Yunus, E.H.; Gulan, U.; Holzner, M.; Chatzi, E. A novel approach for 3D-structural identification through video recording: Magnified tracking. Sensors 2019, 19, 1229. [Google Scholar] [CrossRef]
  24. Anjneya, K.; Roy, K. Acceleration time history dataset for a 3D miniature model of a shear building with structural damage. Data Brief 2021, 38, 107377. [Google Scholar] [CrossRef]
  25. Hoda, M.A.; Kuncham, E.; Sen, S. Response and input time history dataset and numerical models for a miniaturized 3D shear frame under damaged and undamaged conditions. Data Brief 2022, 45, 108692. [Google Scholar] [CrossRef] [PubMed]
  26. Pan, B. Digital image correlation for surface deformation measurement: Historical developments, recent advances and future goals. Meas. Sci. Technol. 2018, 29, 082001. [Google Scholar] [CrossRef]
  27. Fortun, D.; Bouthemy, P.; Kervrann, C. Optical flow modeling and computation: A survey. Comput. Vis. Image Underst. 2015, 134, 1–21. [Google Scholar] [CrossRef]
  28. Alfarano, A.; Maiano, L.; Papa, L.; Amerini, I. Estimating optical flow: A comprehensive review of the state of the art. Comput. Vis. Image Underst. 2024, 249, 104160. [Google Scholar]
  29. Sutton, M.A. Digital Image Correlation for Shape and Deformation Measurements. In Springer Handbook of Experimental Solid Mechanics; Springer US: Boston, MA, USA, 2008; pp. 565–600. [Google Scholar]
  30. Wadhwa, N. Revealing and Analyzing Imperceptible Deviations in Images and Videos. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 2016. [Google Scholar]
  31. Portilla, J.; Simoncelli, E. A Parametric Texture Model Based on Joint Statistics of Complex Wavelet Coefficients. Int. J. Comput. Vis. 2000, 40, 49–70. [Google Scholar] [CrossRef]
  32. Simoncelli, E.P.; Freeman, W.T. The steerable pyramid: A flexible architecture. In Proceedings of the International Conference on Image Processing, Washington, DC, USA, 23–26 October 1995. [Google Scholar]
  33. Fleet, D.J.; Jepson, A.D. Computation of component image velocity from local phase information. Int. J. Comput. Vis. 1990, 5, 77–104. [Google Scholar] [CrossRef]
  34. Gautama, T.; Van Hulle, M. A phase-based approach to the estimation of the optical flow using spatial filtering. IEEE Trans. Neural Netw. 2002, 13, 1127–1136. [Google Scholar]
  35. Butterworth, S. On the Theory of Filter Amplifiers. Exp. Wirel. Wirel. Eng. 1930, 7, 536–541. [Google Scholar]
  36. Ewins, D.J. Modal Testing: Theory, Practice and Application, 2nd ed.; Research Studies Press Ltd.: Hertfordshire, UK, 2000. [Google Scholar]
  37. Juang, J.N.; Pappa, R.S. An eigensystem realization algorithm for modal parameter identification and model reduction. J. Guid. Control Dyn. 1985, 8, 620–627. [Google Scholar] [CrossRef]
  38. Motion Microscopy for Visualizing and Quantifying Small Motions. Available online: https://people.csail.mit.edu/nwadhwa/motion-microscope/ (accessed on 17 March 2025).
  39. Stauffer, C.; Grimson, W. Adaptive background mixture models for real-time tracking. In Proceedings of the 1999 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (Cat. No PR00149), Fort Collins, CO, USA, 23–25 June 1999. [Google Scholar]
  40. Finlayson, G.D.; Hordley, S.D.; Lu, C.; Drew, M.S. On the removal of shadows from images. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 59–68. [Google Scholar] [CrossRef]
  41. Bouwmans, T. Traditional and recent approaches in background modeling for foreground detection: An overview. Comput. Sci. Rev. 2014, 11–12, 31–66. [Google Scholar]
Figure 1. Photos of the investigated case study: (a) structural overview, (b) details of the bolted connection between the pillar and the top plate, (c) details of the bolted connection with the base plate.
Figure 2. A representation of the frame structure, spatially defined by the global reference system and acquisition channels.
Figure 3. The top view of the video recording setup. The distances between the cameras, the frame, and the halogen lights are not in scale but purely indicative.
Figure 4. Visualisation of virtual sensors for configuration B11B21B20B30: (a) tests along the X direction; (b) tests along the Y direction. The intersections of the crosses represent the virtual sensors identified on the left foreground columns (in red) and right foreground columns (in blue).
Figure 5. Natural frequencies for five of the nine identified modes of the six frame structures.
Figure 6. Comparison of acceleration and displacement time histories obtained from accelerometers and PBMM algorithm: (a) data for configuration WO; (b) data for configuration B10B11; (c) data for configuration B10B20B30.
Figure 7. Time series from the test on configuration WO, along channel 4Y: (a) displacement THs; (b) acceleration THs. The ovals highlight the abnormal oscillation detected by the PBMM method due to the sudden and abrupt change in brightness in the monitored area of the left column caused by the shadow of the operator’s arm.
Figure 8. Overview of results of fixed input location tests. (a) Comparison of percentage modal frequency variations. (b) MAC trends, focusing on values greater than 0.8.
Figure 9. An overview of the results from the Roving Hammer tests on layout WO along the Y direction. (a) A comparison of percentage modal frequency variations, with the dashed line marking the 2.5% value. (b) MAC trends, focusing on values greater than 0.8 and with the dashed line marking the value of 0.98.
Figure 10. An overview of the results from the Roving Hammer tests on layout B10B11 along the X direction. (a) A comparison of percentage modal frequency variations, with the dashed line marking the 2.5% value. (b) MAC trends, focusing on values greater than 0.8 and with the dashed line marking the value of 0.98.
Figure 11. An overview of the results from the Roving Hammer tests on layout B10B11 along the Y direction. (a) A comparison of percentage modal frequency variations, with the dashed line marking the 2.5% value. (b) MAC trends, focusing on values greater than 0.8 and with the dashed line marking the value of 0.98.
Table 1. A description of input parameters for the implementation of the Phase-Based Motion Magnification algorithm.
Parameter | Meaning
Amplification factor (α) | Determines the magnitude of the amplification applied to the bandpass phase differences, controlling the level of video motion magnification.
Complex steerable pyramid | A multi-scale and multi-orientation image representation used to decompose the video into phase and amplitude components, enabling video motion analysis.
Gaussian kernel (Kρ) | A Gaussian filter applied to attenuate high-frequency components in the spatial domain, reducing noise and improving phase stability.
Frequency cutoff limits | Define the range of frequencies for phase filtering and motion estimation, selecting the motion components to be amplified.
Camera frame rate | The frequency at which video frames are acquired, which influences the temporal resolution and the ability to capture fast movements.
Target video frame interval | Specifies the frame interval of the video to be processed, affecting the duration and continuity of motion analysis.
Table 2. Overview of bracing positions (coordinates X, Y, Z in meters).
Bracing ID | Starting Floor | Starting Coordinates (X, Y, Z) (m) | Ending Floor | Ending Coordinates (X, Y, Z) (m)
B10 | Ground floor | (0, 0.4, 0) | 1st floor | (0, 0, 0.3)
B11 | Ground floor | (0.4, 0.4, 0) | 1st floor | (0.4, 0, 0.3)
B20 | 1st floor | (0, 0.4, 0.3) | 2nd floor | (0, 0, 0.6)
B21 | 1st floor | (0.4, 0.4, 0.3) | 2nd floor | (0.4, 0, 0.6)
B30 | 2nd floor | (0, 0.4, 0.6) | 3rd floor | (0, 0, 0.9)
Table 3. Summary of experimental structural layouts.
Layouts: WO | B10 | B10B20 | B10B11 | B10B20B30 | B11B21B20B30
Table 4. Parameter settings for stable mode detection from accelerometer recordings.
Configuration | Frequency Variation Tolerance (%) | Damping Ratio Variation Tolerance (%) | Minimum MAC Value (-) | System Order Range (-)
WO | 0.2 | 15 | 0.95 | 15–50
B10 | 0.2 | 15 | 0.95 | 15–50
B10B20 | 0.5 | 10 | 0.97 | 15–50
B10B11 | 0.5 | 10 | 0.97 | 15–50
B10B20B30 | 1.0 | 15 | 0.95 | 15–50
B11B21B20B30 | 1.0 | 15 | 0.95 | 15–50
Table 5. The cutoff frequencies, in Hz, of the Butterworth filters applied for video motion estimation, for fixed input location tests.
Test | Low Cutoff Frequency | High Cutoff Frequency
WO X | 4.00 | 8.00
WO Y | 1.00 | 5.00
B10 Y | 1.50 | 5.50
B10B20 Y | 2.50 | 6.50
B10B11 Y | 2.00 | 6.00
B10B20B30 Y | 3.00 | 7.00
B10B11B20B30 X | 4.00 | 8.00
B10B11B20B30 Y | 7.00 | 11.00
Table 6. Cutoff frequencies of Butterworth filters applied for video motion estimation, for Roving Hammer tests, for WO configuration.
Test | Lower Cutoff Frequency [Hz] | Higher Cutoff Frequency [Hz]
1Y | 1 | 5
4Y | 1 | 5
2Y | 1 | 5
5Y | 1 | 5
3Y | 1 | 5
6Y | 1 | 5
Table 7. Cutoff frequencies of Butterworth filters applied for video motion estimation, for Roving Hammer tests, for B10B11 configuration.
Test | Lower Cutoff Frequency [Hz] | Higher Cutoff Frequency [Hz]
1X | 4 | 8
4X | 4 | 8
2X | 4 | 8
5X | 4 | 8
3X | 4 | 8
6X | 4 | 8
2Y | 2 | 6
5Y | 2 | 6
3Y | 2 | 6
6Y | 2 | 6
Table 8. Parameter settings for stable mode detection from PBMM-extracted time series.
Configuration | Frequency Variation Tolerance (%) | Damping Ratio Variation Tolerance (%) | Minimum MAC Value (-) | System Order Range (-)
WO | 5 | 25 | 0.95 | 4–15
B10 | 5 | 25 | 0.95 | 4–15
B10B20 | 5 | 30 | 0.90 | 4–15
B10B11 | 5 | 25 | 0.90 | 4–15
B10B20B30 | 5 | 30 | 0.90 | 4–15
B11B21B20B30 | 5 | 30 | 0.90 | 4–15
Table 9. Trends in natural frequencies, in Hz.
Vibrational Mode | WO | B10 | B10B20 | B10B11 | B10B20B30 | B11B21B20B30
1st Y flexural | 2.94 ± 0.03 | 3.73 ± 0.06 | 4.59 ± 0.00 | 4.05 ± 0.07 | 5.16 ± 0.28 | 9.11
1st X flexural | 5.96 ± 0.10 | 5.86 ± 0.02 | 6.04 ± 0.02 | 6.13 ± 0.09 | 6.24 ± 0.01 | 6.07 ± 0.23
1st torsional | 7.70 ± 0.05 | 7.78 ± 0.01 | 8.78 ± 0.01 | 11.52 ± 0.06 | 20.69 | 12.35 ± 0.26
2nd Y flexural | 8.89 ± 0.03 | 11.79 ± 0.00 | 13.94 | 11.84 ± 0.03 | 19.64 ± 0.72 | 32.32 ± 0.00
3rd Y flexural | 13.65 ± 0.03 | 23.06 ± 0.53 | 23.99 | 23.25 ± 0.11 | 38.00 ± 0.11 | 57.38
2nd X flexural | 26.43 ± 0.26 | 26.21 ± 0.12 | 25.94 ± 0.15 | 26.11 ± 0.16 | 26.32 ± 0.25 | 25.86 ± 0.52
2nd torsional | 30.52 ± 0.12 | 39.49 | 39.57 | 45.30 ± 0.70 | 53.85 ± 0.17 | 44.08 ± 1.36
3rd X flexural | 64.04 ± 0.11 | 63.86 ± 0.05 | 63.89 ± 0.11 | 63.91 ± 0.07 | 64.07 ± 0.20 | 63.78 ± 0.31
3rd torsional | 73.04 ± 0.17 | 75.10 | 77.71 | 75.91 ± 0.32 | - | -
Table 10. Trends in damping ratios, in %.
Vibrational Mode | WO | B10 | B10B20 | B10B11 | B10B20B30 | B11B21B20B30
1st Y flexural | 1.26 ± 0.49 | 1.73 ± 0.83 | 4.19 ± 1.01 | 2.11 ± 1.14 | 2.55 ± 1.44 | 3.14
1st X flexural | 2.39 ± 0.26 | 2.05 ± 0.36 | 2.79 ± 0.54 | 2.02 ± 0.48 | 1.46 ± 0.52 | 2.31 ± 0.35
1st torsional | 1.97 ± 0.16 | 2.07 ± 0.30 | 4.50 ± 2.81 | 2.38 ± 0.72 | 2.88 | 6.19 ± 1.83
2nd Y flexural | 1.14 ± 0.32 | 0.73 ± 0.03 | 1.55 | 0.42 ± 0.18 | 1.89 ± 0.65 | 1.03 ± 0.22
3rd Y flexural | 0.59 ± 0.10 | 1.58 ± 0.24 | 5.06 | 2.23 ± 0.50 | 2.61 ± 1.65 | 3.53
2nd X flexural | 1.44 ± 0.52 | 1.35 ± 0.23 | 1.50 ± 0.23 | 1.32 ± 0.22 | 2.02 ± 0.86 | 1.73 ± 0.31
2nd torsional | 1.31 ± 0.13 | 0.68 | 1.64 | 2.08 ± 0.37 | 3.33 ± 0.52 | 2.29 ± 1.88
3rd X flexural | 0.60 ± 0.29 | 0.53 ± 0.08 | 0.35 ± 0.06 | 0.61 ± 0.06 | 0.61 ± 0.11 | 0.67 ± 0.11
3rd torsional | 1.18 ± 0.29 | 1.02 | 0.88 | 0.86 ± 0.26 | - | -
Table 11. Natural frequency estimates from Roving Hammer tests on configuration WO, in Hz.
Vibrational Mode (Layout WO) | Hit channel: 2X | 3X | 5X | 6X | 1Y | 2Y | 3Y | 4Y | 5Y | 6Y
1st Y flexural | 2.93 | 2.92 | 3.01 | 2.96 | 2.96 | 2.91 | 2.91 | 2.94 | 2.93 | 2.92
1st X flexural | 5.88 | 5.89 | 5.97 | 5.85 | 6.19 | 5.93 | 5.87 | 6.11 | 5.97 | 5.93
1st torsional | 7.68 | 7.68 | 7.73 | 7.65 | 7.82 | 7.75 | 7.68 | 7.81 | 7.73 | 7.71
2nd Y flexural | 8.93 | 8.90 | 8.93 | 8.87 | 8.92 | 8.86 | 8.87 | 8.90 | 8.86 | 8.87
3rd Y flexural | 13.66 | 13.66 | 13.70 | - | 13.66 | 13.63 | 13.62 | 13.64 | 13.62 | 13.62
2nd X flexural | 26.30 | 26.32 | 26.48 | 26.33 | 26.56 | - | 26.81 | 26.53 | 26.59 | 26.49
2nd torsional | 30.36 | 30.44 | 30.49 | 30.32 | 30.67 | 30.52 | 30.51 | 30.73 | 30.58 | 30.54
3rd X flexural | 63.95 | 64.00 | 64.13 | 64.11 | 64.07 | 63.78 | 64.20 | 64.13 | 63.99 | 63.97
3rd torsional | 73.32 | 72.88 | 73.00 | 72.83 | 73.35 | 72.91 | 72.99 | 72.90 | 73.04 | 73.08
Table 12. Damping ratio estimates from Roving Hammer tests on configuration WO, in %.
Vibrational Mode (Layout WO) | Hit channel: 2X | 3X | 5X | 6X | 1Y | 2Y | 3Y | 4Y | 5Y | 6Y
1st Y flexural | 0.60 | 0.88 | 0.61 | 1.98 | 1.06 | 1.51 | 1.51 | 1.32 | 1.41 | 1.58
1st X flexural | 2.63 | 2.51 | 2.16 | 2.44 | 2.34 | 2.11 | 2.01 | 2.64 | 2.55 | 2.57
1st torsional | 1.97 | 1.87 | 1.70 | 1.94 | 1.95 | 2.26 | 2.24 | 1.89 | 1.98 | 2.07
2nd Y flexural | 0.99 | 1.00 | 0.98 | 1.57 | 0.83 | 1.04 | 1.27 | 0.90 | 1.19 | 1.16
3rd Y flexural | 0.51 | 0.60 | 0.38 | - | 0.56 | 0.69 | 0.65 | 0.60 | 0.70 | 0.61
2nd X flexural | 1.29 | 1.44 | 1.51 | 1.24 | 1.10 | - | 1.25 | 1.09 | 1.43 | 1.18
2nd torsional | 1.44 | 1.44 | 1.31 | 1.45 | 1.32 | 1.21 | 1.18 | 0.99 | 1.34 | 1.37
3rd X flexural | 0.50 | 0.59 | 0.29 | 0.44 | 1.19 | 0.92 | 0.35 | 0.74 | 0.96 | 0.40
3rd torsional | 1.67 | 0.98 | 1.40 | 1.54 | 0.84 | 1.34 | 1.21 | 1.32 | 0.69 | 1.00
Table 13. Natural frequency estimates from Roving Hammer tests on configuration B10B11, in Hz.
Vibrational Mode (Layout B10B11) | Hit channel: 2X | 3X | 5X | 6X | 3Y | 4Y | 5Y | 6Y
1st Y flexural | 4.12 | 4.11 | 4.09 | - | 4.06 | 3.99 | 3.99 | 4.00
1st X flexural | 6.08 | 6.04 | 6.13 | 6.08 | 6.31 | 6.15 | 6.22 | 6.17
1st torsional | 11.54 | 11.63 | 11.53 | 11.46 | 11.59 | 11.49 | 11.42 | 11.53
2nd Y flexural | - | - | - | - | 11.86 | 11.82 | 11.83 | 11.88
3rd Y flexural | - | - | - | - | - | 23.17 | - | -
2nd X flexural | 26.00 | 25.94 | 26.06 | 26.01 | 26.41 | 26.26 | 26.26 | 26.12
2nd torsional | 46.37 | 46.18 | 45.57 | 45.84 | 44.66 | 44.68 | 44.72 | 44.62
3rd X flexural | 63.89 | 63.84 | 63.99 | 63.96 | - | 63.91 | 63.91 | 63.92
3rd torsional | - | 76.64 | 76.11 | 75.97 | 75.72 | 76.00 | 75.82 | 75.72
Table 14. Damping ratio estimates from Roving Hammer tests on configuration B10B11, in %.
Vibrational Mode (Layout B10B11) | Hit channel: 2X | 3X | 5X | 6X | 3Y | 4Y | 5Y | 6Y
1st Y flexural | 0.49 | 0.80 | 2.25 | - | 2.25 | 3.01 | 2.67 | 2.59
1st X flexural | 1.98 | 2.05 | 1.79 | 3.57 | 1.04 | 2.83 | 1.86 | 1.99
1st torsional | 2.09 | 1.72 | 2.10 | 1.86 | 1.48 | 2.20 | 2.58 | 1.94
2nd Y flexural | - | - | - | - | 0.44 | 0.19 | 0.40 | 0.61
3rd Y flexural | - | - | - | - | - | 1.88 | - | -
2nd X flexural | 1.36 | 1.53 | 1.36 | 1.29 | 1.70 | 1.14 | 1.33 | 0.93
2nd torsional | 2.51 | 2.02 | 2.64 | 2.49 | 1.74 | 1.89 | 1.84 | 2.00
3rd X flexural | 0.71 | 0.64 | 0.59 | 0.66 | - | 0.57 | 0.57 | 0.52
3rd torsional | - | 1.19 | 0.63 | 0.77 | 0.72 | 0.56 | 0.78 | 0.88
Table 15. Natural frequencies, in Hz, estimated through PBMM.
Configuration | Vibrational Mode | PBMM Left Column | PBMM Right Column
WO | 1st Y flexural | 2.99 | 2.91
WO | 1st X flexural | 6.20 | 6.22
B10 | 1st Y flexural | 3.88 | 3.88
B10B20 | 1st Y flexural | 5.15 | 5.29
B10B11 | 1st Y flexural | 3.94 | 4.18
B10B20B30 | 1st Y flexural | 5.61 | 5.54
B10B11B20B30 | 1st Y flexural | 9.26 | 9.32
B10B11B20B30 | 1st X flexural | 6.15 | 6.09
Table 16. Natural frequencies, in Hz, estimated through PBMM, for the configuration WO.

| Hit Channel—WO Layout | Hit Floor | Vibrational Mode | PBMM Left Column | PBMM Right Column |
|---|---|---|---|---|
| 1Y | First floor | 1st Y flexural | 3.01 | 3.01 |
| 4Y | First floor | 1st Y flexural | 3.00 | 3.01 |
| 2Y | Second floor | 1st Y flexural | 2.96 | 2.98 |
| 5Y | Second floor | 1st Y flexural | 2.97 | 2.98 |
| 3Y | Third floor | 1st Y flexural | 3.00 | 2.96 |
| 6Y | Third floor | 1st Y flexural | 2.97 | 2.96 |
Table 17. Natural frequencies, in Hz, estimated through PBMM, for the configuration B10B11.

| Hit Channel—B10B11 Layout | Hit Floor | Vibrational Mode | PBMM Left Column | PBMM Right Column |
|---|---|---|---|---|
| 1X | First floor | 1st X flexural | 6.27 | 6.24 |
| 4X | First floor | 1st X flexural | 6.26 | 6.23 |
| 2X | Second floor | 1st X flexural | 6.22 | 6.19 |
| 5X | Second floor | 1st X flexural | 6.23 | 6.19 |
| 3X | Third floor | 1st X flexural | 6.19 | 6.10 |
| 6X | Third floor | 1st X flexural | 6.17 | 6.13 |
| 2Y | Second floor | 1st Y flexural | 4.15 | 4.15 |
| 5Y | Second floor | 1st Y flexural | 4.14 | 4.14 |
| 3Y | Third floor | 1st Y flexural | 4.16 | 4.15 |
| 6Y | Third floor | 1st Y flexural | 4.14 | 4.14 |
Table 18. Standard deviations for PBMM-estimated frequencies, in Hz, relative to Roving Hammer tests.

| Structural Layout | Direction of Excitation | Left Column Standard Deviation | Right Column Standard Deviation | Overall Standard Deviation |
|---|---|---|---|---|
| WO | Y | 0.021 | 0.023 | 0.021 |
| B10B11 | X | 0.039 | 0.055 | 0.051 |
| B10B11 | Y | 0.010 | 0.006 | 0.007 |
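The values in Table 18 can be reproduced directly from the per-channel PBMM estimates in Tables 16 and 17: each entry is the sample standard deviation of the frequencies identified across the six hit channels. A minimal sketch in Python, using the WO-layout frequencies from Table 16 (column groupings are our reading of the table, not code from the dataset):

```python
import statistics

# 1st Y flexural frequencies (Hz) identified via PBMM for the WO layout,
# one value per hit channel (1Y, 4Y, 2Y, 5Y, 3Y, 6Y) — see Table 16.
left = [3.01, 3.00, 2.96, 2.97, 3.00, 2.97]
right = [3.01, 3.01, 2.98, 2.98, 2.96, 2.96]

# Sample (n-1) standard deviations, as reported in the WO row of Table 18.
sd_left = statistics.stdev(left)
sd_right = statistics.stdev(right)
sd_overall = statistics.stdev(left + right)

print(round(sd_left, 3), round(sd_right, 3), round(sd_overall, 3))
# → 0.021 0.023 0.021
```

The agreement with the WO row of Table 18 confirms that the reported dispersions are sample standard deviations taken over the hit channels.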

Citation: Dragonetti, P.; Civera, M.; Miraglia, G.; Ceravolo, R. A Benchmark Dataset for the Validation of Phase-Based Motion Magnification-Based Experimental Modal Analysis. Data 2025, 10, 45. https://doi.org/10.3390/data10040045