Article

Hybrid 3D Shape Measurement Using the MEMS Scanning Micromirror

1 School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049, Shaanxi, China
2 School of Food Equipment Engineering and Science, Xi’an Jiaotong University, Xi’an 710049, Shaanxi, China
* Author to whom correspondence should be addressed.
Micromachines 2019, 10(1), 47; https://doi.org/10.3390/mi10010047
Submission received: 10 December 2018 / Revised: 2 January 2019 / Accepted: 7 January 2019 / Published: 11 January 2019
(This article belongs to the Special Issue Optical MEMS)

Abstract

A surface with large reflection variations represents one of the biggest challenges for optical 3D shape measurement. In this work, we propose an alternative hybrid 3D shape measurement approach, which combines the high accuracy of fringe projection profilometry (FPP) with the robustness of laser stripe scanning (LSS). To integrate these two technologies into one system, we first developed a biaxial Microelectromechanical Systems (MEMS) scanning micromirror projection system. In this system, a shaped laser beam serves as the light source, and the MEMS micromirror projects the beam onto the object surface; different patterns are produced by jointly controlling the laser source and the micromirror. Second, a quality-guided algorithm is developed to form a hybrid measurement scheme: FPP is applied to obtain the main 3D information, then LSS reconstructs the missing depth guided by a quality map, and a data fusion algorithm merges the two parts to output complete measurement results. Finally, our experiments show a significant improvement in accuracy and robustness when measuring a surface with large reflection variations. In the experimental instance, the accuracy of the proposed method is improved by 0.0278 mm and the integrity is improved by 83.55%.

1. Introduction

Three-dimensional (3D) shape information can be widely used in human–computer interaction [1,2], biometric identification [3,4], robot vision [5,6], virtual/augmented reality [7,8], industry [9] and other fields. As a result, 3D shape measurement attracts a lot of attention in the community of computer science and instrument science.
Fringe projection profilometry (FPP) is considered one of the most popular approaches because of its advantages of non-contact operation, high accuracy and full-field acquisition [10,11]. In FPP, sinusoidal fringes are projected onto a measuring surface by a digital projector, while the observed pattern images are captured from another angle by a camera. The height of the surface can be decoded by analyzing the distortion of the observed fringe patterns [9,12,13]. However, FPP assumes that the measuring surface exhibits diffuse reflection and usually treats low-reflective (dark) and highlighted (specular reflection) areas as outliers. These regions can completely block any fringe patterns, which results in the loss of depth information [14,15,16]. Several solutions have been proposed to address this problem. In [17], Salahieh et al. propose a multi-polarization fringe projection (MPFP) imaging technique that handles high dynamic range (HDR) objects by selecting the properly polarized channel measurements. A similar polarization solution is also adopted in [18]. On the other hand, Liu et al. demonstrate the use of a dual-camera FPP system, which can also be considered as two camera-projector monocular systems; by viewing from different angles, the highly specular and dark pixels that are missing from binocular reconstruction can be filled in [19]. In addition, Jiang et al. [20] present the use of 180-degree phase-shifted (or inverted) fringe patterns to reduce the measurement error in high-contrast surface reconstruction. Some other researchers have attempted to adjust the parameters of the camera and projector to handle surfaces with large reflection variations. Lin et al. [21,22] suggest globally adjusting the maximum input gray level of the projected image, while Chen et al. [23,24] propose adjusting the projected images pixel by pixel. In [16,25], the authors propose projecting a set of fringe images captured with different exposures.
For [16,21,22,23,24,25], the reflection of the surface needs to be calibrated first. Then, by fusing the images captured with different parameters, a new fringe pattern with fewer saturated regions can be obtained. Although these methods [16,20,21,22,23,24,25] improve the performance without adding extra equipment, they need to project or capture many images when the measuring surface has very complex reflection variations. Moreover, these approaches still rely on the FPP principle, which can be ineffective in extreme reflection areas.
Laser stripe scanning (LSS) [26,27] is another kind of structured light approach, which shares the same triangulation measurement principle [28] with FPP. The difference is that LSS applies a scanning stripe pattern instead of a fringe pattern. As LSS only needs to extract the stripe in the observed images, it achieves very high robustness [29,30,31]. Therefore, LSS is widely used in the 3D shape measurement industry [32]. However, it is expensive to obtain very high accuracy in LSS, which is mainly determined by the width of the stripe pattern and the resolution of the camera. It is reasonable to ask whether the same hardware could be used to set up both an LSS and an FPP system to handle different surface reflections. Generally, because of the limited projection depth of field, the answer is no. In FPP, a Digital Light Processing (DLP) or Liquid Crystal on Silicon (LCOS) projector is used to project fringe patterns onto the measuring surface [33,34]. These projection techniques can only produce sharp images in the focal plane; if a stripe is projected, it will be severely blurred on a defocused plane. This means that a commercial projector cannot work for LSS. In LSS, a laser stripe projector is adopted, and the object surface is scanned by moving the object or the measurement system. A typical laser stripe projector employs a laser beam as the light source and a cylindrical lens to scatter the beam into a stripe; therefore, it generally cannot produce a fringe pattern. However, by using Microelectromechanical Systems (MEMS) projection technology, it is possible to generate both a stripe pattern and a fringe pattern with the same hardware. In MEMS projection [35], a biaxial (or single-axial) MEMS micromirror [36,37,38] is applied to scan a laser beam point by point (row by row for a single-axial MEMS scanner with a cylindrical lens) to produce the projection pattern. In Ref. [39], the authors set up a compact 3D shape measurement system with a single-axial MEMS micromirror. As only the FPP principle is used in that work, it still cannot measure a surface with large reflection variations.
In this paper, we propose a hybrid 3D shape measurement approach, which employs FPP and LSS in the same system by applying a biaxial MEMS scanning micromirror to generate the fringe pattern and scanning laser stripe with the same setup. By doing so, the proposed method can handle the surface that has large reflection variations with high accuracy and high robustness.

2. Principles

2.1. Principle of 3D Shape Measurement with FPP and LSS

2.1.1. Principle of Fringe Projection Profilometry

As shown in Figure 1a, a typical FPP measurement system consists of a digital projector and a digital camera [10]. The light axis (EpO) of the projector intersects the light axis (EcO) of the Charge-Coupled Device (CCD) camera at Point O on the reference plane (along the x-axis). The distance between the two optical centers is d, and the distance between the camera and the reference plane is l. Point D is an arbitrary point on the object’s surface with a height of h. Points A and C are the intersections of the light paths of the projector and the camera, respectively, with the reference plane. Compared with the sinusoidal fringe pattern projected onto the bare reference plane, when an object is placed on the reference plane, the fringe pattern captured by the CCD is distorted by the object’s height. The modulated phase difference is related to the true height h by Equation (1), where the derivation can be found in [12]:
h_{FPP}(x, y) = \frac{l \, \Delta\phi(x, y)}{2 \pi f_0 d},   (1)
where f_0 is the frequency of the projected fringe pattern, and Δϕ represents the phase difference between Point D and Point A, which is equal to the phase difference between Point A and Point C in the reference plane.
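As an illustrative sketch (not part of the original system software), Equation (1) can be evaluated directly; the geometry values below (d = 130 mm baseline, fringe frequency f_0 = 0.1 lines/mm) are hypothetical and chosen only for the example:

```python
import math

def fpp_height(delta_phi, l, d, f0):
    """Height from the FPP phase difference, Equation (1): h = l*dphi / (2*pi*f0*d)."""
    return l * delta_phi / (2.0 * math.pi * f0 * d)

# Hypothetical geometry: l = 500 mm, baseline d = 130 mm, f0 = 0.1 lines/mm.
h = fpp_height(delta_phi=math.pi / 4, l=500.0, d=130.0, f0=0.1)
```

The linear form above holds for heights small compared with l; the full derivation in [12] covers the general case.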

2.1.2. Principle of Laser Stripe Scanning

Laser stripe scanning is based on active laser triangulation (Figure 1b). In LSS, a laser stripe, created by a dot laser scattered through a cylindrical lens, is projected onto the measuring object surface, and the reflected light is observed at the triangulation angle with a digital camera [26]. A change in the distance between the object and the measurement system along the z-direction results in a movement of the observed laser stripe position in the x-direction. This position is calculated by extracting the laser stripe center; LSS thus delivers a height distribution of the object. In most industrial applications, a full-field measurement is needed, so a highly accurate moving part is introduced. The moving part changes the relative position between the system and the object, so that the laser stripe sweeps across the surface of the object to obtain the full-field height distribution. Similar to FPP, when an object is placed on the reference plane, the laser stripe shifts by Δx, and Equation (2) gives the relationship between Δx and the height h_LSS(x, y):
h_{LSS}(x, y) = \frac{d \, \Delta x}{l},   (2)
where d is the distance between the laser and the camera, and l is the measurement distance.
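Equation (2) is a one-line mapping from stripe displacement to height; a minimal sketch (the baseline value in the example call is hypothetical):

```python
def lss_height(delta_x, d, l):
    """Height from the observed stripe displacement, Equation (2): h = d*dx / l."""
    return d * delta_x / l

# Hypothetical baseline d = 130 mm at l = 500 mm: a 1 mm stripe shift -> 0.26 mm height.
h = lss_height(delta_x=1.0, d=130.0, l=500.0)
```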

2.2. Hybrid 3D Shape Measurement System

2.2.1. Biaxial MEMS Micromirror-Based Pattern Projection

The biaxial MEMS micromirror-based pattern projection system is the foundation of our pipeline, as it can produce both a stripe pattern and a fringe pattern. Figure 2 shows the basic layout of our MEMS pattern projection system. A single-mode laser diode (LD) serves as the light source. Near the LD, an aspheric lens is placed to adjust the focus of the laser beam and shape the beam. Before the laser beam is relayed onto the biaxial MEMS micromirror, an aperture removes the stray light. A biaxial electromagnetically actuated MEMS micromirror working in raster scan mode reflects the light source onto the object surface to produce different 2D patterns. Both the fast and slow axes rotate reciprocally, driven by the current signal, which makes the micromirror scan the laser beam row by row. At the same time, the intensity of the laser is modulated under the synchronization of the sync signal. Figure 3a,b illustrate the control signals for fringe pattern and stripe pattern projection. H and V are the horizontal and vertical drive signals of the MEMS micromirror, respectively. Sync is the row sync signal, and Laser modulates the LD to produce different intensities. It should be noted that both the horizontal and vertical axes operate in resonant vibration mode, no matter whether the fringe pattern or the stripe pattern is projected. As shown in Figure 3, H and V are always sinusoidal waveforms. This kind of variable-speed scanning introduces distortion in the pattern projection, which can be considered a phase error resulting in a systematic error. To remove the distortion, pre-correction is generally performed on Laser; more detailed information can be found in [35].
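The idea of pre-correcting the laser modulation for the sinusoidal (variable-speed) scan can be sketched as follows. This is an illustrative model, not the actual drive firmware: the normalized mirror waveform, the fringe frequency and the stripe width are all assumed values, and real hardware would add the Sync timing and amplitude calibration of [35]:

```python
import math

def laser_waveform(t_samples, mode, fringe_freq=10.0):
    """Laser intensity over one horizontal scan half-period (t in [0, 1]).

    The mirror angle is modeled as H(t) = sin(pi*(t - 0.5)), so equal time
    steps are NOT equal spatial steps. Indexing the intensity by the spatial
    coordinate x(t) pre-corrects the variable-speed-scan distortion.
    """
    out = []
    for t in t_samples:
        x = 0.5 * (1.0 + math.sin(math.pi * (t - 0.5)))  # spatial position in [0, 1]
        if mode == "fringe":
            # Sinusoidal fringe in *space*, despite the nonuniform sweep in time.
            out.append(0.5 * (1.0 + math.cos(2.0 * math.pi * fringe_freq * x)))
        elif mode == "stripe":
            # Laser on only while the beam crosses a narrow column.
            out.append(1.0 if abs(x - 0.5) < 0.01 else 0.0)
        else:
            raise ValueError("mode must be 'fringe' or 'stripe'")
    return out
```

Sweeping the stripe column position across frames would emulate LSS, while cycling the fringe phase offset would give the phase-shifted FPP patterns, all from the same two control signals.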
Different from pixel-array-based projection techniques, a MEMS micromirror-based pattern projection system produces different patterns by scanning the laser beam two-dimensionally. This makes it possible to project a fringe pattern and a stripe pattern with the same hardware. Meanwhile, because a laser source has better linearity than a light-emitting diode (LED) source, no gamma correction [40] is required in FPP with the proposed pipeline, which brings an additional benefit.

2.2.2. Quality Index in the Proposed Approach

Implementing FPP and LSS in the same system significantly simplifies the hardware. Another benefit is that the data from FPP and LSS share the same coordinate system, which makes it much easier to align the two parts of the data. Before data fusion, we need to build an error model to evaluate the quality of the measured data, so that we can guide the fusion process. In this section, we define a quality index for our hybrid measurement approach.
In FPP, the phase of each point is calculated by the phase shift method. The fringe image obtained by the CCD can be described by
I_c^i(x, y) = a_0(x, y) + b_{mod}(x, y) \cos\!\left(\Phi + \frac{2 \pi n_i}{N_s}\right) + n(x, y).   (3)
In Equation (3), a_0(x, y) is the background, b_mod(x, y) is the modulation function, also called the contrast, N_s is the number of phase-shifting steps, n_i is an integer, and n(x, y) is the random noise.
In fact, we modulate the phase by changing the grayscale of the projection images. Here, we discuss how the grayscale affects measurement accuracy. In the N-step phase-shifting method, the phase error caused by random noise is determined by N_s and the distribution of n(x, y). We assume that this noise component n_s ∈ [−N, N], as shown in Figure 4, where the complex plane represents the phase calculated from an imaginary part and a real part. If Point P is a measurement point, its coordinates are P(b_mod cos Φ, b_mod sin Φ). Due to the noise n_s, P will move within the blue square (Figure 4), which has a side length of 2N. If n_s = (−N, N), then P will shift to Q with the maximum phase error φ_max. Assuming that the phase of P is Φ, the phase of Q is given by
\Phi + \varphi = \arctan\!\left(\frac{b_{mod}\sin\Phi + N}{b_{mod}\cos\Phi - N}\right).   (4)
Thus, the phase difference is φ. Supposing that k = N / b_mod ∈ (0, 1), Equation (4) can be simplified as
\varphi = \arctan\!\left(\frac{\sin\Phi + k}{\cos\Phi - k}\right) - \Phi.   (5)
From Equation (5), we know that the error of the FPP system is determined by k. This means that we can evaluate the FPP depth data by b_mod.
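The dependence on k can be checked numerically with a minimal sketch of Equation (5), assuming the symmetric noise bound [−N, N] described above (atan2 is used instead of arctan only for quadrant safety):

```python
import math

def phase_error(k, phi):
    """Worst-case phase error of Equation (5): arctan((sin(phi)+k)/(cos(phi)-k)) - phi."""
    return math.atan2(math.sin(phi) + k, math.cos(phi) - k) - phi

# Smaller k = N / b_mod (i.e., higher modulation) gives a smaller phase error.
e_low = abs(phase_error(0.05, math.pi / 6))   # high modulation
e_high = abs(phase_error(0.30, math.pi / 6))  # low modulation
```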
In a fixed experimental setup, the noise in Equation (3) is generally considered constant. Therefore, Equation (3) can be simplified as Equation (6):
I_c^i(x, y) = A + b_{mod}\cos(\Phi + 2\pi n_i / N_s)
            = A + b_{mod}\cos(\Phi + \delta_i)
            = A + b_{mod}(\cos\Phi\cos\delta_i - \sin\Phi\sin\delta_i)
            = A + B_c\cos\delta_i + B_s\sin\delta_i,   (6)
where A combines the background a_0(x, y) and the noise n(x, y), and b_mod is the modulation in FPP:
A = a_0(x, y) + n(x, y), \quad B_c = b_{mod}\cos\Phi, \quad B_s = -b_{mod}\sin\Phi, \quad \delta_i = \frac{2\pi i}{N_s}, \quad i = 0, 1, \ldots, N_s - 1.   (7)
Based on the orthogonality of trigonometric functions,
B_c = \frac{2}{N_s}\sum_{i=0}^{N_s-1} I_c^i \cos\delta_i, \quad B_s = \frac{2}{N_s}\sum_{i=0}^{N_s-1} I_c^i \sin\delta_i.   (8)
Thus, we have
b_{mod} = \sqrt{B_c^2 + B_s^2}.   (9)
It can be seen from Equation (5) that the quality of the depth data is positively correlated with the modulation of the fringe images. Therefore, we take the modulation as the guiding quality index for data fusion in the proposed pipeline, where the quality index Q is defined in Equation (10):
Q = \sqrt{\left(\frac{2}{N_s}\sum_{i=0}^{N_s-1} I_c^i \cos\delta_i\right)^2 + \left(\frac{2}{N_s}\sum_{i=0}^{N_s-1} I_c^i \sin\delta_i\right)^2}.   (10)
Regions with high reliability, where Q is above a threshold, use the depth data from FPP. As a supplement, depth information from LSS fills in the remaining regions, where Q is not good enough. This preserves integrity while keeping the optimal accuracy.
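Equations (8)–(10) amount to a per-pixel computation over the N_s phase-shifted intensities; a minimal self-contained sketch (the synthetic 4-step samples are an assumed test signal, not measured data):

```python
import math

def quality_index(intensities):
    """Modulation-based quality index Q (Equations (8)-(10)) from N phase-shifted samples."""
    n = len(intensities)
    bc = 2.0 / n * sum(v * math.cos(2.0 * math.pi * i / n) for i, v in enumerate(intensities))
    bs = 2.0 / n * sum(v * math.sin(2.0 * math.pi * i / n) for i, v in enumerate(intensities))
    return math.hypot(bc, bs)  # sqrt(Bc^2 + Bs^2)

# Synthetic 4-step fringe samples with background 100 and modulation 40:
samples = [100.0 + 40.0 * math.cos(0.7 + 2.0 * math.pi * i / 4) for i in range(4)]
q = quality_index(samples)  # recovers the modulation, 40
```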

2.2.3. Hybrid 3D Shape Measurement Pipeline

Our hybrid 3D shape measurement pipeline is shown in Figure 5. Both FPP and LSS are implemented with one system. Before measuring, we first calibrate the camera and the projector. The calibration process provides the relationship between the height and the distortion of the structured pattern [41]. FPP is adopted to obtain the quality map and depth map 1, while LSS is employed to obtain depth map 2. Generally, map 1 has high accuracy but loses some depth information because of extreme reflection. On the other hand, map 2 has lower accuracy but loses little depth information due to its excellent robustness. Because map 1 and map 2 are naturally aligned, we can fill the final depth map by selecting the better part from map 1 and map 2 without any registration. The quality map shows where FPP works well; therefore, it is employed to guide the data fusion. When the quality index in the quality map is above a threshold, the depth coming from FPP is considered high quality and is used for the final depth map. Otherwise, the LSS data are adopted, even when both FPP and LSS data are available. Finally, the 3D surface is reconstructed from the refined depth map and the calibration parameters. A carefully selected threshold is very important for the proposed pipeline, and its selection is presented in the next section.
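Because the two depth maps are naturally aligned, the fusion step reduces to a pixelwise selection. A simplified sketch (plain nested lists stand in for the real depth images, and missing-data handling is omitted; the threshold of 32 is the value chosen experimentally in Section 3.2):

```python
def fuse_depth(depth_fpp, depth_lss, quality, threshold=32.0):
    """Pixelwise fusion: keep the FPP depth where the quality index passes the
    threshold, otherwise fall back to the more robust LSS depth."""
    return [
        [d1 if q >= threshold else d2
         for d1, d2, q in zip(row1, row2, rowq)]
        for row1, row2, rowq in zip(depth_fpp, depth_lss, quality)
    ]

# Tiny 1x2 example: the first pixel passes the quality test, the second does not.
fused = fuse_depth([[1.0, 2.0]], [[9.0, 8.0]], [[40.0, 10.0]])  # -> [[1.0, 8.0]]
```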

3. Experiments

To implement the proposed hybrid measurement approach, we built our system as in Figure 2. Figure 6 illustrates our experimental setup. A MEMS pattern projector, driven by the driver board, produces both the fringe pattern and the stripe pattern. In addition, a USB hub is adopted for communication and collection of the image data. A CCD camera with a 12 mm lens is used to capture the pattern images. The angle between the MEMS projector and the camera is set to 15 degrees, which balances measuring resolution and coincidence of the fields of view. The measuring distance is set to 500 mm, where the projector and CCD have the largest coincident field of view and the best projection pattern quality. All these modules are mounted on an aluminum alloy casing with an overall size of 187 mm × 90 mm × 45 mm. More detailed parameters and descriptions of these components are given in Table 1. The camera, USB hub and lenses are commercial products. The optical and mechanical parts were designed by our own team, while the driver board and MEMS scanner were designed and manufactured by the cooperating team. With this setup, the hybrid 3D shape measurement system achieves a 0.07 mm measuring accuracy with a 286 mm × 176 mm field of view at an optimum working distance of 500 mm. Additionally, laser beam scanning has a larger depth of field than pixel-array-based projection; hence, the depth of field of the proposed system is mainly determined by the limitation of the camera. It is about 100 mm in this experiment.

3.1. Linearity Test of Proposed Pattern Projection System

In FPP, nonlinearity, which makes ideal sinusoidal waveforms nonsinusoidal, can significantly influence the performance [40]. Due to the use of a laser source, the proposed system has better linearity than a conventional LED-based pattern projector. To verify the linearity of these two kinds of projectors, a commercial projector (Coolux S3), an LED-based DLP projector, is chosen for comparison. Both projectors project pure white images with different gray levels. Then, an illuminometer is used to measure the illuminance at a fixed distance. Figure 7a,b illustrate the relationship between the projected gray value and the illuminance for the proposed system and the commercial LED projector. As shown in Figure 7, the proposed system has better linearity than the commercial LED projector. In fact, these two curves give the luminescence characteristics of the laser source and the LED source. Therefore, gamma calibration, which is usually applied to eliminate the influence of nonlinearity in FPP [40], is no longer needed.

3.2. Experiments on the Quality Index

In the previous section, we demonstrated the use of the quality index to evaluate depth information and gave its theoretical basis. In this section, we verify this basis and find the best threshold for data fusion by experiment. Here, we use a standard gauge block (30 mm × 50 mm × 8 mm) as the measuring object. The flatness of the block is less than 5 µm. The distribution of the depth values reflects the quality of the depth data. In this work, we take the uncertainty of the depth data as the evaluation of the measurement accuracy. It is computed with Equation (11),
RMS_{error} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(e_i - \bar{e}\right)^2},   (11)
where e_i is the height of a real measured point and \bar{e} is the height of the same position on the plane of Equation (12), fitted with the least-squares method. N is the total number of measured points, and i is the index of a point:
A_{fit} x + B_{fit} y + C_{fit} z + D_{fit} = 0.   (12)
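The plane-fit-plus-RMS evaluation of Equations (11)–(12) can be sketched as follows. Note one simplification relative to the paper: the plane is written in the explicit form z = a·x + b·y + c with vertical residuals, rather than the implicit form of Equation (12); for the nearly flat, well-sampled gauge block the two give closely similar RMS values, but this sketch is an assumption, not the authors' exact procedure:

```python
import math

def fit_plane_rms(points):
    """Least-squares fit of z = a*x + b*y + c over (x, y, z) tuples, then the
    RMS of the vertical residuals (cf. Equations (11)-(12)). Requires at least
    three non-collinear points."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points); syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points); syz = sum(p[1] * p[2] for p in points)
    # Normal equations as a 3x4 augmented matrix, solved by Gauss-Jordan elimination.
    m = [[sxx, sxy, sx, sxz], [sxy, syy, sy, syz], [sx, sy, float(n), sz]]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    a, b, c = (m[i][3] / m[i][i] for i in range(3))
    return math.sqrt(sum((p[2] - (a * p[0] + b * p[1] + c)) ** 2 for p in points) / n)
```

For points lying exactly on a plane the returned RMS is (numerically) zero; measured gauge-block data would return the uncertainty used in Figure 8.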
To find the relationship between fringe modulation and measurement accuracy, we set the modulation of the projected fringe pattern to different values and obtain the related uncertainty. Figure 8a shows the standard gauge block, and Figure 8b shows the gauge block with the fringe pattern. Figure 8c illustrates the relationship between the modulation of the captured fringe pattern and the measurement error. Clearly, lower modulation results in higher measurement error.
In Equation (2), the uncertainty of h_LSS(x, y) is determined by the resolution of Δx, because Δx is observed with a CCD camera. The resolution of Δx is the spatial distance corresponding to one pixel in the observation frame. It can be calculated as
\Delta x_{min} = \frac{l \delta}{f},   (13)
where l is the measurement distance, and δ and f are the pixel size and the focal length of the observation camera lens, respectively. In our experimental settings, l = 500 mm, δ = 0.0064 mm, and f = 12 mm. Combined with Equation (2), we have Δh_LSS(x, y) = 0.0715 mm. From Figure 8, when the modulation is above 32, FPP presents better accuracy than 0.0715 mm, the resolution of LSS. Therefore, the threshold for data fusion is chosen to be 32.
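As a numeric check of Equation (13) with the stated parameters (one camera pixel back-projected to the measurement distance):

```python
# Equation (13): spatial resolution of the stripe displacement at the object.
l_mm, pixel_mm, f_mm = 500.0, 0.0064, 12.0
dx_min = l_mm * pixel_mm / f_mm  # ~0.267 mm of lateral displacement per pixel
```

The quoted Δh_LSS = 0.0715 mm then follows from Equation (2) for the system's baseline d, which is not stated explicitly in this section.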

3.3. Hybrid 3D Shape Measurement

In this section, a black porcelain doll is chosen as our measuring object—see Figure 9a. For a black porcelain surface, the normal direction has a strong influence on the reflection: weak reflection is observed in most regions, but when the normal direction points towards the camera, extremely strong reflection is captured. This makes it difficult for FPP to work well. As a comparison, we paint the body of the doll with developer, which makes the body an ideal diffuse surface. Figure 9b is the quality map, where the threshold is 25. Figure 9c illustrates the fusion mask. Guided by the mask, we fuse the FPP depth (Figure 10a) with the LSS depth (Figure 10b) and obtain the optimal 3D shape information (Figure 10c). From Figure 10, we find that FPP gives better accuracy (body of the doll) but cannot handle large reflectance variations (head of the doll), while LSS shows high robustness to different reflectances (head of the doll) but lower accuracy (body of the doll). The fused data combine the advantages of FPP and LSS. To evaluate the measurement data quantitatively, a 3D measurement instrument (see Figure 11a) with a resolution of ±5 µm is adopted to provide the ground truth data (see Figure 11b). Because of the limitations of measurement range and efficiency, only the head of the doll is scanned. Then, the FPP reconstruction result and the hybrid measurement result are each registered with the ground truth data to compute the 3D geometric error. As illustrated in Figure 12a, the root mean square (RMS) error of the FPP data is 0.0991 mm, versus 0.0713 mm for the hybrid measuring result in Figure 12b. In this case, integrity is computed to evaluate the robustness: for FPP it is 54.48%, taking the hybrid result as the reference (100%). The experiments show that the proposed hybrid approach achieves an improvement of 0.0278 mm in accuracy and 83.55% in integrity under the conditions described in this paper.
To verify the performance of the proposed approach, we chose several objects with large reflection variations that FPP alone cannot handle completely. Figure 13a shows a plastic car model with shiny, dark and specular reflection regions. Figure 13b shows a metal surface, which becomes hard to scan when the normal direction changes dramatically. Figure 13c,d show two further difficult cases: a stone material with very low reflection and a plastic surface with multiple reflectances and a large variation of normal directions. These results show the excellent performance of our approach.

4. Conclusions

In this paper, we have addressed 3D shape measurement for surfaces with large reflection variations using a hybrid approach. We proposed using a biaxial MEMS scanning micromirror and a laser source to produce the fringe pattern and the stripe pattern with the same hardware. Both FPP, which has the advantages of high accuracy and high efficiency, and LSS, which is one of the most robust methods, are employed to form a hybrid 3D shape measurement approach. Real experiments on different objects with large reflectance variations were carried out to verify the proposed method. Metal, plastic and stone materials with large reflection variations and large normal direction variations were reconstructed successfully, which shows the excellent performance of our method.

Author Contributions

Conceptualization, T.Y.; Data Curation, T.Y. and G.Z.; Formal Analysis, G.Z. and H.L.; Funding Acquisition, X.Z.; Methodology, T.Y. and G.Z.; Project Administration, T.Y.; Resources, H.L. and X.Z.; Software, G.Z.; Supervision, X.Z.; Visualization, T.Y.; Writing—Original Draft, T.Y., G.Z. and H.L.

Funding

This work was supported by the National Science and Technology Major project (No. 2015ZX04001003).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rautaray, S.S.; Agrawal, A. Vision based hand gesture recognition for human computer interaction: A survey. Artif. Intell. Rev. 2015, 43, 1–54. [Google Scholar] [CrossRef]
  2. Raheja, J.L.; Chandra, M.; Chaudhary, A. 3D gesture based real-time object selection and recognition. Pattern Recognit. Lett. 2018, 115, 14–19. [Google Scholar] [CrossRef]
  3. Soltanpour, S.; Boufama, B.; Wu, Q.J. A survey of local feature methods for 3D face recognition. Pattern Recognit. 2017, 72, 391–406. [Google Scholar] [CrossRef]
  4. Zulqarnain Gilani, S.; Mian, A. Learning From Millions of 3D Scans for Large-Scale 3D Face Recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 1896–1905. [Google Scholar]
  5. Faessler, M.; Fontana, F.; Forster, C.; Mueggler, E.; Pizzoli, M.; Scaramuzza, D. Autonomous, vision-based flight and live dense 3d mapping with a quadrotor micro aerial vehicle. J. Field Robot. 2016, 33, 431–450. [Google Scholar] [CrossRef]
  6. Kim, H.; Leutenegger, S.; Davison, A.J. Real-time 3D reconstruction and 6-DoF tracking with an event camera. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 8–16 October 2016; pp. 349–364. [Google Scholar]
  7. Sra, M.; Garrido-Jurado, S.; Schmandt, C.; Maes, P. Procedurally generated virtual reality from 3D reconstructed physical space. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, Munich, Germany, 2–4 November 2016; pp. 191–200. [Google Scholar]
  8. Sra, M.; Garrido-Jurado, S.; Maes, P. Oasis: Procedurally Generated Social Virtual Spaces from 3D Scanned Real Spaces. IEEE Trans. Vis. Comput. Graph. 2018, 24, 3174–3187. [Google Scholar] [CrossRef]
  9. Zhang, S. High-speed 3D shape measurement with structured light methods: A review. Opt. Lasers Eng. 2018, 106, 119–131. [Google Scholar] [CrossRef]
  10. Gorthi, S.S.; Rastogi, P. Fringe projection techniques: Whither we are? Opt. Lasers Eng. 2010, 48, 133–140. [Google Scholar] [CrossRef] [Green Version]
  11. Geng, J. Structured-light 3D surface imaging: A tutorial. Adv. Opt. Photonics 2011, 3, 128–160. [Google Scholar] [CrossRef]
  12. Zuo, C.; Feng, S.; Huang, L.; Tao, T.; Yin, W.; Chen, Q. Phase shifting algorithms for fringe projection profilometry: A review. Opt. Lasers Eng. 2018, 109, 23–59. [Google Scholar] [CrossRef]
  13. Zhang, Z. Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques. Opt. Lasers Eng. 2012, 50, 1097–1106. [Google Scholar] [CrossRef]
  14. Feng, S.; Zhang, L.; Zuo, C.; Tao, T.; Chen, Q.; Gu, G. High dynamic range 3-D measurements with fringe projection profilometry: A review. Meas. Sci. Technol. 2018, 29, 122001. [Google Scholar] [CrossRef]
  15. Jiang, H.; Zhao, H.; Li, X. High dynamic range fringe acquisition: A novel 3-D scanning technique for high-reflective surfaces. Opt. Lasers Eng. 2012, 50, 1484–1493. [Google Scholar] [CrossRef]
  16. Zhang, S.; Yau, S.T. High dynamic range scanning technique. Opt. Eng. 2009, 48, 033604. [Google Scholar] [Green Version]
  17. Salahieh, B.; Chen, Z.; Rodriguez, J.J.; Liang, R. Multi-polarization fringe projection imaging for high dynamic range objects. Opt. Express 2014, 22, 10064–10071. [Google Scholar] [CrossRef]
  18. Feng, S.; Zhang, Y.; Chen, Q.; Zuo, C.; Li, R.; Shen, G. General solution for high dynamic range three-dimensional shape measurement using the fringe projection technique. Opt. Lasers Eng. 2014, 59, 56–71. [Google Scholar] [CrossRef]
  19. Liu, G.H.; Liu, X.Y.; Feng, Q.Y. 3D shape measurement of objects with high dynamic range of surface reflectivity. Appl. Opt. 2011, 50, 4557–4565. [Google Scholar] [CrossRef]
Figure 1. Schematic drawings of fringe projection and laser stripe scanning. (a) basic light path of a fringe projection profilometry system; (b) schematic diagram of a laser stripe scanning system.
Figure 2. Schematic diagram of the biaxial Microelectromechanical Systems (MEMS)-based fringe pattern projection system.
Figure 3. Controlling signals of pattern projection. (a) controlling signal of fringe projection; (b) controlling signal of stripe projection. H and V are the horizontal and vertical driving signals of the MEMS micromirror. Sync is the row sync signal. Laser is the modulation signal of the laser diode (LD).
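The caption of Figure 3 describes how the two projection modes are produced by modulating the LD in step with the row sync: a sinusoidal intensity profile per sweep yields a fringe pattern, while gating the LD to a single row yields a stripe. A minimal NumPy sketch of this idea follows; the function names, the normalized-intensity convention, and the sample counts are illustrative assumptions, not the authors' controller firmware.

```python
import numpy as np

def fringe_modulation(n_samples, n_periods, phase=0.0):
    """Sinusoidal LD intensity profile for one horizontal sweep.

    During each row sync the LD power follows a sinusoid, so that
    successive rows build up a phase-shifted fringe pattern.
    Intensity is normalized to [0, 1].
    """
    t = np.arange(n_samples) / n_samples
    return 0.5 + 0.5 * np.cos(2 * np.pi * n_periods * t + phase)

def stripe_gate(row, active_row):
    """Binary LD gate for stripe projection: the LD is on only while
    the slow axis dwells on the single illuminated row."""
    return 1.0 if row == active_row else 0.0
```

Shifting `phase` between captures gives the phase-shifted fringe sequence used by FPP, while stepping `active_row` sweeps the stripe across the scene for LSS.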
Figure 4. Error analysis in fringe projection profilometry (FPP).
Figure 5. Flow chart of the proposed hybrid measurement approach.
Figure 6. Hybrid 3D shape measurement setup.
Figure 7. Linearity test. (a) proposed system; (b) commercial LED projector (model: coolux S3).
Figure 8. Depth reliability experiments. (a) standard gauge block; (b) standard gauge block with fringe pattern; and (c) the relationship between modulation and measurement error.
Figure 9. Measuring object and intermediate data. (a) measuring object; (b) quality map from FPP; (c) binary mask for data fusion.
Figure 10. 3D reconstruction results. (a) reconstruction from FPP; (b) reconstruction from laser stripe scanning (LSS); (c) fusion results.
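Figures 9 and 10 illustrate the fusion step: the FPP quality map is thresholded into a binary mask, FPP depth is kept where the fringe modulation is reliable, and LSS depth fills in the remaining regions. A minimal NumPy sketch of such quality-guided fusion follows; the function name and the threshold value are assumptions for illustration, not the paper's exact algorithm or parameters.

```python
import numpy as np

def fuse_depth(fpp_depth, lss_depth, quality, threshold=0.3):
    """Quality-guided depth fusion sketch.

    quality: per-pixel fringe modulation (quality map) from FPP.
    Where quality >= threshold, the FPP depth is trusted; elsewhere
    the LSS depth fills in the missing or unreliable pixels.
    Returns the fused depth map and the binary mask.
    """
    mask = quality >= threshold          # True -> trust FPP
    fused = np.where(mask, fpp_depth, lss_depth)
    return fused, mask
```

The mask here plays the role of the binary mask in Figure 9c, selecting between the two reconstructions pixel by pixel.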
Figure 11. Ground truth data. (a) high-accuracy 3D measurement equipment (resolution: ±5 µm); (b) high-accuracy ground truth measurement result.
Figure 12. Accuracy evaluation. (a) comparison between FPP data and ground truth; (b) comparison between the data measured with the proposed method and ground truth.
Figure 13. Experimental results of the proposed hybrid measurement approach. (a) plastic surface with shiny, dark and specular reflection regions; (b) metal surface, which results in shiny and dark regions from different perspectives of observation; (c) stone material with very low reflection; (d) plastic surface with multiple reflectance and large variation of normal directions.
Table 1. Parameters and description for experimental components.

Items           | Parameters and Description
Camera          | Model: charge-coupled device (CCD), mono, global shutter. Resolution: 1920 × 1200. Max frames per second: 163
Lens for camera | Focal length: 12 mm
MEMS scanner    | Model: 2D electromagnetic actuation. Fast axis: 18 kHz, ±16°. Slow axis: 0.5 kHz, ±10°
Laser diode     | Wavelength: 650 nm. Power consumption: 320 mW
Lens for LD     | Model: aspheric lens. Focal length: 4.51 mm. Aspheric coefficient: −0.925. Distance from LD: 4.55 mm. Distance from MEMS scanner: 33 mm
USB hub         | USB 3.0 × 5

Share and Cite

Yang, T.; Zhang, G.; Li, H.; Zhou, X. Hybrid 3D Shape Measurement Using the MEMS Scanning Micromirror. Micromachines 2019, 10, 47. https://doi.org/10.3390/mi10010047