Article

A Separation Method of Superimposed Gratings in Double-Projector Fringe Projection Profilometry Using a Color Camera

State Key Laboratory of Precision Measurement Technology and Instruments, Tianjin University, Tianjin 300072, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(3), 890; https://doi.org/10.3390/app11030890
Submission received: 21 December 2020 / Revised: 11 January 2021 / Accepted: 17 January 2021 / Published: 20 January 2021


Featured Application

Inspired by the characteristics of the red and blue channels of color cameras, this paper proposes a separation method for superimposed gratings in double-projector fringe projection profilometry. With this method, the superimposed grating can be separated effectively without complex projection coding or separation algorithms. No additional device is required, and there is no constraint on device placement. At the same time, the measurement efficiency is increased by 50%.

Abstract

Fringe projection profilometry has been intensively studied for several decades. However, due to the limited field range of a single projector, when measuring objects with complex surfaces there are always shadow areas in the captured images, resulting in missing measurement data. To solve this problem, systems with two projectors and a single camera have been employed: the shadow areas are reduced, and neither system recalibration nor multiple measurements are needed, improving measurement efficiency. Nevertheless, separating each projector's pattern from the superimposed fringe remains a difficult problem. A color camera has three color channels (red, green, and blue), so when it is applied to fringe projection profilometry, it acquires three times the information of a monochrome camera. Because the overlap between the red- and blue-light spectral responses of color cameras is small, the crosstalk between these channels can be ignored. This paper proposes a method that projects red and blue fringe patterns from two projectors and utilizes the characteristics of the red and blue channels of the color camera to separate the superimposed grating pattern. The original patterns can be recovered completely and easily. To demonstrate the effectiveness of the superimposed-fringe separation, a simulation and experiments were carried out; both showed that the superimposed fringe can be separated correctly, proving that our method is feasible.

1. Introduction

Three-dimensional shape measurement is widely applied in reverse engineering, product inspection, and physical imitation. Fringe projection profilometry has become one of the most important methods due to the advantages it offers: high speed and precision, non-contact and full-field measurement, and simple data processing. For a three-dimensional object with a complex surface, the measuring range of a single projector and a single camera is restricted: the information in the shadow area is lost, resulting in incomplete measurement data. To cope with this problem, double-projector structured-light three-dimensional measurement systems have been created [1,2,3,4,5]. Jin et al. [1] measured the dimensions of holes based on a double-projector system. In addition to minimizing occlusions, the double-projector structured-light three-dimensional measurement system has other advantages, such as increasing the projected light intensity, reducing the number of images needed for scanning, and removing the bimodal multi-path [2]. However, projecting from the two projectors sequentially lengthens the experiment time; if the two projectors can run at the same time, the measurement becomes more efficient. A method of separating each projector's pattern from the superimposed fringe is therefore urgently needed. Yu et al. [2] chose two groups of patterns that were temporally shifted at different rates and used the DFT to decouple them along the time axis. Griesser et al. [6] treated a projector and a camera as a module and set two modules in opposite directions; as a result, the pattern projected by the opposite projector did not reach the other camera, avoiding pattern superposition. Maimone et al. [7] utilized the fact that a fixed projector–camera unit sees a clear version of its own pattern and blurred versions of those from other units; the overlapping patterns can be distinguished by their blur. Wang et al. [8] recovered the depth information of overlapped and non-overlapped regions by considering the correlations between multiple projectors and the infrared images, as well as those among the infrared images. Tardif et al. [9] presented intensity-blending algorithms for correcting the overlap area. Yan et al. [10] proposed hierarchical patterns that can be separated from each other. Je et al. [11] separated the superimposed grating by taking partial derivatives of color fringes in different directions from different projections. Xiang et al. [12] categorized the interfered regions into flat regions and boundary regions under the guidance of texture segments, then applied different Markov random field (MRF) models to calculate the final depth results. Petkovic et al. [13] assigned each projector its own specifically selected group of temporal phase shifts, resulting in simple and efficient separation of the projected patterns. All of the aforementioned studies either relied on a special positional relationship between the projectors, or projected specific patterns and processed the superimposed images with complicated algorithms.
In our previous work, we designed a particular projection order to project the phase-shifting gratings, but six images had to be captured to acquire the wrapped phases from the two projectors when the four-step phase-shifting method was used [14]. In this study, we applied a color camera to fringe projection profilometry. The two projectors projected red and blue stripes, respectively; the color camera captured them simultaneously, and its optical characteristics were used to separate the superimposed grating. The proposed method did not require complex projection coding or separation algorithms, and there was no special requirement for device placement. Four images were sufficient to obtain the wrapped phases from the two projectors using the four-step phase-shifting method.
The rest of this paper is organized as follows. Section 2 analyzes optical characteristics of the color camera and presents the proposed method. Section 3 and Section 4 respectively describe the simulation and experiments. Section 5 discusses the advantages of the proposed method and Section 6 summarizes our chief conclusions.

2. Materials and Methods

2.1. Optical Characteristics of the Color Camera

Color cameras are divided into single-chip color cameras and three-chip color cameras. A color camera has red, green, and blue color channels. A single-chip color camera uses a Bayer filter so that each pixel records one of the three colors, and the camera's processing unit performs spatial color interpolation to obtain the other two. A three-chip color camera has three sensors, each corresponding to a primary color, with a prism separating the received light into its primary colors. The first type is significantly cheaper, while the second provides better-quality color images [15]. In color cameras, the spectra of the red, green, and blue channels are deliberately made to overlap so that there are no color-blind regions in the spectrum [16]. However, the overlaps between channels differ: the overlaps between the green channel and the other two channels are substantial, whereas the red and blue channels have very little color crosstalk. We used a single-chip color camera; its spectral response curve is shown in Figure 1. In our projectors, the wavelength of the blue LEDs was 459 nm and that of the red LEDs was 618 nm. Figure 1 shows that the quantum efficiency of the red channel at 459 nm and that of the blue channel at 618 nm were both very low, which means that red and blue superimposed fringe patterns can be separated well.

2.2. Double-Projector Fringe Projection Profilometry with Red and Blue Light

As mentioned above, the red and blue passbands of color cameras barely overlap. We therefore proposed to project red and blue fringe patterns from the two projectors and to utilize the characteristics of the color camera's red and blue channels to separate the superimposed grating pattern. Figure 2 shows the structure of the system.
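Because the separation reduces to reading out individual sensor channels, the core of the method is only a channel split. The sketch below is a minimal illustration, assuming the captured frame is an (H, W, 3) array in RGB order; the function name is ours, not from the paper.

```python
import numpy as np

def separate_channels(captured_rgb):
    """Split a captured color image into red- and blue-channel fringe images.

    Assumes an (H, W, 3) array in RGB channel order. The green channel is
    discarded because its spectrum overlaps both projectors' wavelengths.
    """
    red_fringe = captured_rgb[:, :, 0].astype(np.float64)   # left projector
    blue_fringe = captured_rgb[:, :, 2].astype(np.float64)  # right projector
    return red_fringe, blue_fringe
```

Each returned image can then be processed exactly as in a single-projector system.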
Phase-shifting profilometry is one of the most popular phase-extraction methods because it can eliminate interference from ambient light and surface reflectivity [17]. In our study, the four-step phase-shifting method was used, with a phase step of π/2. The fringe images of a four-step phase-shifting algorithm with equal phase steps can be described as:
I_i(x, y) = a(x, y) + b(x, y) cos[ϕ(x, y) + iπ/2],  i = 1, 2, 3, 4,   (1)
where i denotes the i-th phase shift, a(x, y) represents the average intensity contributed by the fringe brightness and background illumination, and b(x, y) represents the intensity modulation determined by the fringe contrast and surface reflectivity. ϕ(x, y) is the wrapped phase, which can be calculated by the following equation:
ϕ(x, y) = arctan[(I_1(x, y) − I_3(x, y)) / (I_4(x, y) − I_2(x, y))]   (2)
Since the arctangent function only ranges from −π to π, the phase value provided by Equation (2) has 2π phase discontinuities. Therefore, we used the dual-frequency method to obtain the unwrapped phase map:
Φ(x, y) = ϕ(x, y) + 2πk(x, y)   (3)
where Φ(x, y) represents the unwrapped phase and k(x, y) is the fringe order representing the phase jumps.
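The phase computation above can be sketched in a few lines of NumPy. `arctan2` implements the quadrant-aware arctangent of Equation (2); for the fringe order k(x, y) of Equation (3), the sketch uses the standard dual-frequency rule of scaling the unit-frequency phase by the frequency ratio, which we assume matches the authors' implementation.

```python
import numpy as np

def wrapped_phase(I1, I2, I3, I4):
    # Eq. (2): the quadrant-aware arctangent keeps the result in (-pi, pi].
    return np.arctan2(I1 - I3, I4 - I2)

def unwrap_dual_frequency(phi_low, phi_high, freq_ratio):
    # Eq. (3): the low-frequency phase (one fringe across the field) is
    # free of ambiguity, so scaling it by the frequency ratio predicts the
    # fringe order k(x, y) of the high-frequency phase.
    k = np.round((freq_ratio * phi_low - phi_high) / (2 * np.pi))
    return phi_high + 2 * np.pi * k
```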

3. Simulation

To verify the effectiveness of the proposed method, a simulation was carried out. Because only red and blue stripes were projected, the responses of the three channels in the color camera can be described by Equations (4)–(6):
I_r(x, y) = α_rr I_pr(x, y) + α_br I_pb(x, y)   (4)
I_g(x, y) = α_rg I_pr(x, y) + α_bg I_pb(x, y)   (5)
I_b(x, y) = α_rb I_pr(x, y) + α_bb I_pb(x, y)   (6)
where I_r(x, y), I_g(x, y), and I_b(x, y) are the images of the three channels, and I_pr(x, y) and I_pb(x, y) are the projected red and blue stripes. α_mn is the response coefficient of channel n to the projection of color m; for example, α_br is the response coefficient of the red channel to the blue projection. We set α_rr = α_bb = 0.8, α_rg = 0.2, and α_bg = α_br = α_rb = 0.03. The red fringe was vertical and the blue fringe was horizontal so that they could be easily distinguished. Figure 3 and Figure 4 show the results. The images measured 1140 × 912 pixels, consistent with the resolution of the projector we used.
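The channel-mixing model of Equations (4)–(6) can be sketched directly with the simulation's coefficients (function and variable names are ours):

```python
import numpy as np

# Response coefficients from the simulation: strong same-color response,
# moderate green-channel leakage from red, negligible red/blue crosstalk.
ALPHA = {"rr": 0.8, "bb": 0.8, "rg": 0.2, "bg": 0.03, "br": 0.03, "rb": 0.03}

def camera_response(I_pr, I_pb, alpha=ALPHA):
    """Eqs. (4)-(6): each channel is a weighted sum of the projected
    red (I_pr) and blue (I_pb) fringe patterns."""
    I_r = alpha["rr"] * I_pr + alpha["br"] * I_pb
    I_g = alpha["rg"] * I_pr + alpha["bg"] * I_pb
    I_b = alpha["rb"] * I_pr + alpha["bb"] * I_pb
    return I_r, I_g, I_b
```

With these coefficients the red channel is dominated by the red projection (0.8 versus 0.03), which is why the channel images can stand in for the separated fringes directly.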
As shown in Figure 3 and Figure 4, both low-frequency superimposed grating and high-frequency superimposed grating were separated well.
Using Equations (2) and (3), we unwrapped the wrapped phase. As shown in Figure 5 and Figure 6, the wrapped phase was successfully unwrapped into an absolute phase, and the cross profiles show good linearity. The simulation results reveal that the color crosstalk between the red and blue channels did not affect the phase. We then simulated a three-dimensional measurement. The measured object was a hemisphere with a radius of 300 mm, as shown in Figure 7, and the spatial scale of the image was set to 1 mm/pixel. Figure 8 shows the simulated phase images captured by the camera, where (a) is the low-frequency phase-shifting image and (b) is the high-frequency phase-shifting image; color crosstalk was added to both. Based on the proposed method, we acquired the measurement results from the left (red) and right (blue) projections and then fit a hemisphere to the 3D data.
Figure 9 and Table 1 show the fitting results. Table 1 shows that the measurement accuracy is better than 100 μm. Given the successful separation of the left and right projection results, these data confirm the 3D measurement accuracy of the proposed method. To account for ambient light and the surface reflectivity of the object, we carried out experiments to demonstrate the separation effect using actual measurements.
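The paper does not state its sphere-fitting procedure, so the following is one common choice: an ordinary linear least-squares sphere fit, shown as a sketch.

```python
import numpy as np

def fit_sphere(points):
    """Fit a sphere to (N, 3) points by linear least squares.

    Expanding (x-a)^2 + (y-b)^2 + (z-c)^2 = r^2 gives
    x^2 + y^2 + z^2 = 2ax + 2by + 2cz + d,  d = r^2 - a^2 - b^2 - c^2,
    which is linear in the unknowns (a, b, c, d).
    """
    X = np.c_[2.0 * points, np.ones(len(points))]
    y = (points ** 2).sum(axis=1)
    (a, b, c, d), *_ = np.linalg.lstsq(X, y, rcond=None)
    radius = np.sqrt(d + a * a + b * b + c * c)
    return np.array([a, b, c]), radius
```

Applied to the separated 3D data, the fitted center and radius can be compared against the ground truth as in Table 1.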

4. Results

We constructed an apparatus similar to the one simulated, and measured a complex 3D object in order to demonstrate the validity and applicability of our proposed method. The system included a color camera (Model: GS3-U3-41C6C-C, FLIR, Wilsonville, OR, USA) with a 25 mm focal length lens and two digital projectors (Model: LightCrafter 4500, Texas Instruments DLP® Technology, Dallas, TX, USA), as shown in Figure 10.
We used an Agrippa statue as the measured object due to its complex surface. The left projector was uploaded with red vertical stripes, and the right projector was uploaded with blue horizontal stripes. The color camera captured the superimposed gratings in the middle of the two projectors. We adopted the dual-frequency four-step phase-shifted method to obtain the unwrapped phase map. The frequency of the low-frequency gratings was 1, and that of the high-frequency gratings was 6. Figure 11 shows the pictures of the superimposed gratings.
In Figure 11, the upper pictures are low-frequency and the bottom ones are high-frequency. By separating the color channels, we obtained 16 pictures of high- and low-frequency fringes from the two projectors (Figure 12 and Figure 13). Using Equations (2) and (3), we unwrapped the wrapped phase; the unwrapped phases are shown in Figure 14 and Figure 15.
As shown in Figure 14a and Figure 15a, the red- and blue-channel pictures were used to unwrap the phase directly, with good-quality results. The blue lines in Figure 14a and Figure 15a mark the background, and we made a linear fit to their cross profiles, as shown in Figure 14b and Figure 15b, where R-square represents the goodness of fit. The R-square in Figure 14b was 0.9992 and in Figure 15b it was 1, demonstrating excellent linearity. The red lines are cross profiles of the measured object; their fluctuations follow the surface, and the disordered areas correspond to shadows. Both the background and the measured object yielded a fine unwrapped phase. In the double-projector structured-light three-dimensional measurement system, the two projectors projected red and blue stripes, respectively, and the superimposed grating was separated simply, without a complicated coding mode or separation algorithm. The separation results can be used directly in phase unwrapping and subsequent processing without additional filtering operations.

5. Discussion

In this paper, we adopted the double-projector structured-light three-dimensional measurement system. In [2], the authors noted the advantages of this system: (1) it reduces occlusions; (2) it increases the signal-to-noise ratio thanks to double the brightness of a single-projector system; (3) it decreases the number of images required for scanning; and (4) it removes the bimodal multi-path. Our method demonstrates the third advantage in particular: the number of images required is reduced, and the measurement speed is significantly improved. Taking the dual-frequency four-step phase-shifting method as an example, our method needs only eight superimposed gratings to obtain the unwrapped phases for both sides, whereas a traditional single-projector, single-camera system must capture 16 pictures to acquire the same results. The measuring efficiency was thus improved by 50%. Second, our method projects the most primitive sinusoidal grating without complex coding patterns, and the separation method is simple. Third, our method only replaces the monochrome camera in the system with a color camera, without adding other devices. Finally, our experimental process is not complicated, has no special requirements for the placement of the camera and projectors, and is easy to operate.

6. Conclusions

In this paper, we proposed a separation method for superimposed gratings in double-projector fringe projection profilometry using a color camera. According to the spectral-response characteristics of the color camera, the color crosstalk between the red and blue channels is small enough to be ignored. When the two projectors project red and blue fringes, respectively, the projection information from both sides can be simply separated from the color superimposed grating through the different channels. We validated our method through a simulation and experiments. Compared to the traditional single-projector, single-camera system, our method improved the measurement speed by 50%. It did not require complex projection coding or separation algorithms, no additional device was added, and there was no special requirement for device placement.

Author Contributions

Conceptualization, Y.Z., F.Z.; data curation, Y.L.; methodology, Y.Z., X.Q.; software, Y.Z.; writing—original draft, Y.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant number: 51775379) and the National Key Research and Development Program of China (grant number: 2018YFB2003501).

Data Availability Statement

The data presented in this study are available in this article.

Acknowledgments

The authors would like to thank the other members of the research team for their contributions to this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jin, Y.; Chang, Y.; Wang, J.; Li, M.; Ren, L.; Chen, Y. The measurement method for the size of the hole on the part surface based on grating image processing. IEEE Access 2020, 8, 29159–29168. [Google Scholar] [CrossRef]
  2. Yu, Y.; Lau, D.L.; Ruffner, M.P.; Liu, K. Dual-projector structured light 3D shape measurement. Appl. Opt. 2019, 59, 964–974. [Google Scholar] [CrossRef] [PubMed]
  3. Jiang, C.; Lim, B.; Zhang, S. Three-dimensional shape measurement using a structured light system with dual projectors. Appl. Opt. 2018, 57, 3983–3990. [Google Scholar]
  4. Yu, Y.; Lau, D.L.; Ruffner, M. 3D scanning by means of dual-projector structured light illumination. In Proceedings of the Emerging Digital Micromirror Device Based Systems and Applications XI, San Francisco, CA, USA, 5–6 February 2019. [Google Scholar]
  5. Zhang, S.; Hu, X.; Zhong, M.; Chen, F.; Duan, P. Autofocusing method for digital fringe projection system with dual projectors. Opt. Express 2020, 28, 12609–12620. [Google Scholar] [CrossRef] [PubMed]
  6. Griesser, A.; Koninckx, T.P.; van Gool, L. Adaptive real-time 3D acquisition and contour tracking within a multiple structured light system. In Proceedings of the 12th Pacific Conference on Computer Graphics and Applications, PG 2004, Seoul, Korea, 6–8 October 2004; pp. 361–370. [Google Scholar]
  7. Maimone, A.; Fuchs, H. Reducing interference between multiple structured light depth sensors using motion. In Proceedings of the IEEE Virtual Reality Short Papers & Posters, Costa Mesa, CA, USA, 4–8 March 2012. [Google Scholar]
  8. Wang, J.; Zhang, C.; Zhu, W.; Zhang, Z.; Chou, P.A. 3D scene reconstruction by multiple structured-light based commodity depth cameras. In Proceedings of the IEEE International Conference on Acoustics, New York, NY, USA, 26–29 October 2012. [Google Scholar]
  9. Tardif, J.P.; Roy, S.; Trudeau, M. Multi-projectors for arbitrary surfaces without explicit calibration nor reconstruction. In Proceedings of the IEEE International Conference on 3-d Digital Imaging & Modeling, Tübingen, Germany, 30 August–1 September 2004. [Google Scholar]
  10. Yan, Z.; Yu, L.; Yang, Y.; Liu, Q. Beyond the interference problem: Hierarchical patterns for multiple-projector structured light system. Appl. Opt. 2014, 53, 3621–3632. [Google Scholar] [CrossRef] [PubMed]
  11. Je, C.; Lee, K.H.; Lee, S.W. Multi-projector color structured-light vision. Signal Process. Image Commun. 2013, 28, 1046–1058. [Google Scholar] [CrossRef] [Green Version]
  12. Xiang, S.; Yu, L.; Yang, Y.; Liu, Q.; Zhou, J. Interfered depth map recovery with texture guidance for multiple structured light depth cameras. Image Commun. 2015, 31, 34–46. [Google Scholar] [CrossRef]
  13. Petkovic, T.; Pribanic, T.; Donlic, M.; Sturm, P. Efficient Separation Between Projected Patterns for Multiple Projector 3D People Scanning. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshop (ICCVW), Venice, Italy, 22–29 October 2017. [Google Scholar]
  14. Li, Y.; Qu, X.; Zhang, F.; Zhang, Y. Separation method of superimposed gratings in double-projector structured-light vision 3D measurement system. Opt. Commun. 2020, 456, 124676. [Google Scholar] [CrossRef]
  15. Wegiel, M.G.; Kujawinska, M. Fast 3D shape measurement based on color structure light projection. Proc. SPIE Int. Soc. Opt. Eng. 2003, 43, 437–446. [Google Scholar]
  16. Huang, P.S.; Hu, Q.; Jin, F.; Chiang, F.P. Color-encoded digital fringe projection technique for high-speed 3-d surface contouring. Opt. Eng. 1999, 38, 1065–1071. [Google Scholar] [CrossRef]
  17. Zuo, C.; Chen, Q.; Gu, G.; Feng, S.; Feng, F.; Li, R.; Shen, G. High-speed three-dimensional shape measurement for dynamic scenes using bi-frequency tripolar pulse-width-modulation fringe projection. Opt. Lasers Eng. 2013, 51, 953–960. [Google Scholar] [CrossRef]
Figure 1. The color camera’s spectral response curve.
Figure 2. The double-projector structured light system.
Figure 3. Low-frequency fringe superposition and separation: (a) superimposed fringe; (b) red-channel image; (c) blue-channel image; (d) cross profile of (b); (e) cross profile of (c).
Figure 4. High-frequency fringe superposition and separation: (a) superimposed fringe; (b) red-channel image; (c) blue-channel image; (d) cross profile of (b); (e) cross profile of (c).
Figure 5. Red-channel image phase unwrapping: (a) low-frequency wrapped phase; (b) high-frequency wrapped phase; (c) unwrapped phase; (d) cross profile of (a); (e) cross profile of (b); (f) cross profile of (c).
Figure 6. Blue-channel image phase unwrapping: (a) low-frequency wrapped phase; (b) high-frequency wrapped phase; (c) unwrapped phase; (d) cross profile of (a); (e) cross profile of (b); (f) cross profile of (c).
Figure 7. The measured object (simulation).
Figure 8. The captured images (simulation): (a) low-frequency phase; (b) high-frequency phase.
Figure 9. Fitting results: (a) left projection; (b) right projection.
Figure 10. The experimental device diagram.
Figure 11. The superimposed gratings.
Figure 12. Pictures of the red channel projected by the left projector.
Figure 13. Pictures of the blue channel projected by the right projector.
Figure 14. Red-channel image phase unwrapping: (a) unwrapped phase; (b) cross profile of the blue line in (a); (c) cross profile of the red line in (a).
Figure 15. Blue-channel image phase unwrapping: (a) unwrapped phase; (b) cross profile of the blue line in (a); (c) cross profile of the red line in (a).
Table 1. Fitting results for the hemisphere.
| Item | Sphere Center Coordinates (mm, mm, mm) | Radius (mm) | Error (mm) |
| Truth value | (456, 570, 0) | 300 | N/A |
| Left projection | (4,569,983, 570.0014, 0.0650) | 299.9507 | −0.0493 |
| Right projection | (456.0021, 569.9983, −0.0191) | 300.0199 | 0.0199 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Zhang, Y.; Qu, X.; Li, Y.; Zhang, F. A Separation Method of Superimposed Gratings in Double-Projector Fringe Projection Profilometry Using a Color Camera. Appl. Sci. 2021, 11, 890. https://doi.org/10.3390/app11030890
