Article

Structured Light Patterns Work Like a Hologram

by
Sergey Alexandrovich Shoydin
1,2,* and
Artem Levonovich Pazoev
1
1
Department of Photonics and Device Engineering, Siberian State University of Geosystems and Technologies, 10 Plakhotnogo St., 630108 Novosibirsk, Russia
2
Laboratory of Dispersed Systems, Voevodsky Institute of Chemical Kinetics and Combustion of Siberian Branch of the Russian Academy of Sciences, 3 Institutskaya St., 630090 Novosibirsk, Russia
*
Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(6), 4037; https://doi.org/10.3390/app13064037
Submission received: 14 February 2023 / Revised: 10 March 2023 / Accepted: 17 March 2023 / Published: 22 March 2023
(This article belongs to the Special Issue Digital Holography: Novel Techniques and Its Applications)

Featured Application

The results of this work can be used in the development of holographic television and 3D augmented reality systems, covering the entire available range of electromagnetic radiation, acousto-optics, and their multiplexing.

Abstract

The subject of this investigation is light diffraction from the structure formed by the lateral projection of structured light onto the surface of a 3D object. It is demonstrated that the pattern of vertically structured light fringes changes its structure under lateral illumination of a 3D object and takes on the properties of a hologram. Diffraction of light from this structure forms several diffraction orders, one of which restores the image of the 3D object. Results of a numerical experiment demonstrating the possibility of restoring 3D holographic images from these patterns at a wavelength corresponding to the Bragg condition are presented. The obtained result allows the holographic information about a 3D object to be compressed by an order of magnitude more strongly before transmission along the communication channel, while remaining sufficient for visual perception and for the observation of continuous parallax, both horizontal and vertical. Results of experiments on the transmission of this compressed information are presented, demonstrating that the TV frame rate of a 3D holographic video sequence is quite achievable.

1. Introduction

The transmission of 3D holographic video content along conventional communication channels requires substantial compression of the holographic information, as demonstrated in [1,2,3,4,5]. The analysis of the current state of this area in [6] shows that traditional entropy coding methods are currently far from the 10^6-fold compression of holographic information that is necessary for transmission. For this reason, an algorithm was proposed that transmits 3D holographic information in the form of two basic modalities of a 3D image, namely, the surface texture of the holographed object (the 3D scene) and its topographic depth map [7]; this provides the necessary compression of the holographic information and its transmission in a manner similar to the transmission of a radio signal by single-sideband (SSB) modulation. One of the methods of depth mapping is the technology of lateral projection of structured light [8,9,10,11,12,13]. The vertical fringes of structured light form a pattern of spatial bands similar to a hologram with a spatial carrier frequency, and their deformation is similar to the deviations that carry the information about the 3D object. Light diffraction from a structure of this kind forms several orders, one of which restores the image of the 3D depth map. Numerical superposition of the texture of the holographed object onto this map creates an image of the 3D object. Another possible way to restore the 3D image is to superpose the texture directly onto the structured light pattern of vertical fringes. Diffraction on this structure in the minus-first order then restores the 3D image, as happens for holograms of focused images when the number of Rayleigh zones is much greater than ten. This is similar to the restoration of the initial object from a hologram of focused images and simplifies the computer rendering of dynamically changing scenes, restoring a 3D holographic video stream that possesses continuous parallax, both horizontal and vertical, and high spatial resolution, not worse than the Full HD and even the 4K standard.

2. The Holographic Model

Modeling of the synthesis of the hologram and its corresponding structured light fringe pattern (SLFP) without loss of generality was carried out according to Leith–Upatnieks’ holographic scheme (Figure 1).
A photo-response proportional to τ(x1, y1) is formed in the hologram plane; it forms the hologram, whether phase or amplitude. For digital holograms, this response has the same form (1):
\tau(x_1, y_1) \sim I(x_1, y_1) = \left| U_o(x_1, y_1) + r(x_1, y_1) \right|^2
= \left\{ I_o(x_1, y_1) + I_r(x_1, y_1) \right\} \left\{ 1 + \frac{2\sqrt{I_o(x_1, y_1)\, I_r(x_1, y_1)}}{I_o(x_1, y_1) + I_r(x_1, y_1)} \cos\!\left[ \varphi_o(x_1, y_1) - \varphi_r(x_1, y_1) \right] \right\}
= I \left\{ 1 + V \cos\!\left[ \varphi_o(x_1, y_1) - \varphi_r(x_1, y_1) \right] \right\}, (1)
where Io(x1, y1) and Ir(x1, y1) are the intensities of the object and reference waves, and φo and φr are their phases, respectively. The factor multiplying the cosine is the visibility V of the interference structure of the hologram. In the simple case of plane waves, the carrier spatial frequency formed in (1) is constant over the (x1, y1) plane and is set by the coordinate derivative of the reference phase φr, while its deviation, which carries the information about the holographed object [14], is set by the coordinate derivative of the object phase φo; both depend on the wavelength λ. This is what allows the diffraction orders to be separated in the restoration plane (x1, y1); the corresponding local fringe periods and their extreme values dmax and dmin are given in (2):
d_{x_1} = \lambda / \sin\theta_{x_1}; \quad d_{y_1} = \lambda / \sin\theta_{y_1};
d_{x_1}^{\max} = \lambda / \sin\!\left(\theta_{x_1} - \Delta\theta^{\min}_{x_1}\right); \quad d_{y_1}^{\max} = \lambda / \sin\!\left(\theta_{y_1} - \Delta\theta^{\min}_{y_1}\right);
d_{x_1}^{\min} = \lambda / \sin\!\left(\theta_{x_1} - \Delta\theta^{\max}_{x_1}\right); \quad d_{y_1}^{\min} = \lambda / \sin\!\left(\theta_{y_1} - \Delta\theta^{\max}_{y_1}\right). (2)
In the general case, the interference structure of the hologram is more complicated, because the coordinate-dependent visibility V(x1, y1) contributes its own spatial harmonics both to the carrier spatial frequency and to its deviation, so it is better to study them by numerical modeling [14,15,16]. The classical approach to the digital synthesis of a hologram is computer modeling of the process shown in Figure 1. Diffraction from this holographic structure forms several orders, one of which restores the 3D image of the holographed object. A comparison between the diffraction from this synthesized hologram and that from the pattern of the lateral projection of vertical structured-light fringes is presented below.
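As an illustration, the following minimal sketch computes the hologram intensity of Eq. (1) for an off-axis (Leith–Upatnieks) geometry; the toy object phase, the unit-intensity waves, and the 3-pixel carrier period are illustrative assumptions rather than the parameters actually used by the authors.

```python
import numpy as np

# Minimal sketch of Eq. (1): off-axis hologram intensity for a toy phase object.
# Array size, object phase, and the 3-pixel carrier period are assumptions.
N = 512
y, x = np.mgrid[0:N, 0:N]

phi_o = 2 * np.pi * np.exp(-((x - N / 2) ** 2 + (y - N / 2) ** 2) / (2 * (N / 8) ** 2))  # toy object phase
phi_r = 2 * np.pi * x / 3          # tilted plane reference wave: carrier of 3 pixels per fringe

I_o = np.ones((N, N))              # object-wave intensity (unit, for simplicity)
I_r = np.ones((N, N))              # reference-wave intensity

V = 2 * np.sqrt(I_o * I_r) / (I_o + I_r)               # visibility of the interference structure
I = (I_o + I_r) * (1 + V * np.cos(phi_o - phi_r))      # Eq. (1)
tau = I / I.max()                                      # normalized photo-response of the hologram
```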

3. Numerical Experiment

The initial frames used to render 3D images of the holographic object are shown in Figure 2.
The propagation of monochromatic radiation in space, both for hologram synthesis and for image restoration from the hologram, is described by the Fresnel transform; this transform is very resource-intensive, so its analog based on the fast Fourier transform (FFT) [17,18,19,20,21,22] is usually used. We apply it by splitting the 3D holographed object of depth dz (Figure 1) into p layers. This accelerates the calculations but imposes an inconvenient limitation: the hologram, the holographed object, and the restored image must have the same pixel resolution. In addition, the FFT works correctly only with nearly flat objects, with a depth of no more than λ, so treating deeper objects leads to phase wrapping whenever the phase passes through π. To recover the phase, a complicated and noise-sensitive phase unwrapping algorithm must be applied [23,24,25,26], and for a large number of phase wraps through 2π the phase unwrapping methods do not work well. To avoid this, we applied another algorithm. The holographed object was compressed in depth by a factor of Ҟ to a size of λ/2 and, after recording the hologram and restoring the image, extended back to the original depth. This operation did not affect the accuracy of depth reproduction because all variables are represented in the program in floating-point format. It produces a data array in which, over the whole frame, the structured fringes shift by less than one fringe, and the phase-wrapping effect disappears. Then, to simplify the calculations without loss of generality, we depict the texture in one composite color.
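A minimal sketch of this depth-compression trick is given below; the symbol K stands in for the paper's Ҟ, and the toy depth map, wavelength, and normalization are our assumptions.

```python
import numpy as np

wavelength = 0.633e-6                          # metres (illustrative value)
depth_map = np.random.rand(512, 512) * 5e-3    # toy depth map, about 5 mm deep (assumed)

dz = depth_map.max() - depth_map.min()         # full depth of the 3D object
K = dz / (wavelength / 2)                      # compression factor (stands in for the paper's Ҟ)

# Compress the depth to lambda/2 so that the object phase never wraps through 2*pi
compressed = (depth_map - depth_map.min()) / K
phi_o = 2 * np.pi * compressed / wavelength    # object phase spans at most pi over the frame

# ... hologram recording and image restoration would take place here ...

# After restoration, the recovered phase is stretched back to the original depth
restored_depth = phi_o * wavelength / (2 * np.pi) * K
assert np.allclose(restored_depth, depth_map - depth_map.min())   # no loss in floating point
```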

4. The Synthesis of Holograms and Structured Fringe Patterns with the Same Period

To simplify the calculation procedures, the consideration is limited to the case of holograms of focused images, in which the plane (x0, y0) containing the holographed object (Figure 1) is transferred to the plane (x1, y1); the Fresnel transform then needs to be performed not at the stage of hologram synthesis but only later, when the image restored by the hologram is obtained. It is difficult to record material holograms in this way, since the material object blocks the reference beam with its shadow, but computer-generated holograms of this kind are entirely possible. A hologram with a period of 3 pixels per fringe, synthesized using the above-described method, is shown in Figure 3.
Similarly, by compressing the depth of the 3D object (Figure 2d) to λ/2 and increasing the fringe frequency (Figure 2c) by median multiplexing to 3 pixels per fringe, we produced a pattern of structured fringes with a period equal to the period of the carrier frequency of the hologram (Figure 4).
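To make the analogy concrete, the following sketch generates a vertical fringe pattern with a 3-pixel period whose fringes are laterally shifted in proportion to the local depth; this is a generic fringe-projection model, not the authors' median-multiplexing procedure, and the shift law and parameter values are assumptions.

```python
import numpy as np

def fringe_pattern(depth_map, period_px=3, theta_deg=30.0, pixel_pitch=1.0):
    """Vertical fringes of a given pixel period, laterally shifted in proportion
    to the local depth, as in oblique structured-light projection. The shift law
    dx = depth * tan(theta) and the parameter values are illustrative assumptions."""
    rows, cols = depth_map.shape
    x = np.arange(cols)[None, :] * pixel_pitch
    shift = depth_map * np.tan(np.radians(theta_deg))          # lateral fringe shift
    return 0.5 * (1 + np.cos(2 * np.pi * (x - shift) / (period_px * pixel_pitch)))

depth = np.random.rand(512, 512)     # toy depth map (assumed)
slfp = fringe_pattern(depth)         # structured light fringe pattern analogous to Figure 4a
```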
To compare the results of diffraction from the structures in Figure 3 and Figure 4, Fourier transforms describing Fraunhofer diffraction, as well as Fresnel transforms, were calculated for each structure. The results of our further investigation with the two transforms turned out to be the same. For this reason, only the images obtained with the Fresnel transform are shown below, because they provide a more obvious illustration of image restoration at a finite distance from the hologram and at the same distance from the structured light fringe pattern. Both cases, Fourier and Fresnel diffraction, correspond to image restoration according to the scheme in Figure 1 into the zero and plus/minus first diffraction orders for the corresponding hologram types.
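The propagation itself can be computed with the standard single-FFT Fresnel propagator sketched below; this is a textbook formulation standing in for the transform used in the paper, and the grid size, pixel pitch, wavelength, and distance in the commented call are illustrative assumptions.

```python
import numpy as np

def fresnel_transform(field, wavelength, pitch, z):
    """Single-FFT Fresnel propagation (standard formulation): multiply the input
    field by a quadratic phase, take the FFT, and apply the output-plane factor."""
    N = field.shape[0]
    k = 2 * np.pi / wavelength
    x = (np.arange(N) - N / 2) * pitch
    X, Y = np.meshgrid(x, x)
    q_in = np.exp(1j * k / (2 * z) * (X ** 2 + Y ** 2))
    U = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field * q_in)))
    fx = (np.arange(N) - N / 2) / (N * pitch)       # spatial frequencies of the FFT grid
    Xo, Yo = np.meshgrid(wavelength * z * fx, wavelength * z * fx)   # output-plane coordinates
    q_out = np.exp(1j * k * z) / (1j * wavelength * z) * np.exp(1j * k / (2 * z) * (Xo ** 2 + Yo ** 2))
    return q_out * U * pitch ** 2

# Amplitude and phase maps such as those in Figures 5 and 6 (parameters assumed):
# spectrum = fresnel_transform(hologram, 0.633e-6, 8e-6, 0.2)
# amplitude, phase = np.abs(spectrum), np.angle(spectrum)
```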
The results obtained in the calculation of the Fresnel transform from the hologram (Figure 3) are shown in Figure 5a,b. Here, Figure 5a corresponds to the amplitude component of the Fresnel transform, while Figure 5b corresponds to the phase component. The observation distance of Fresnel diffraction was chosen to provide reliable spatial separation between all three diffraction orders as shown in Figure 1.
The results of similar calculations from the structured light fringe pattern (Figure 4) are presented in Figure 6a,b where Figure 6a corresponds to the amplitude component of the Fresnel transform, while Figure 6b corresponds to the phase component.
One can see that the diffraction patterns at the array of structured fringes (Figure 6) and the hologram (Figure 5) are very similar to each other. A more detailed investigation of the structure of the obtained holograms under substantial magnification confirms this similarity.

5. Image Restoration

Calculation of the inverse Fresnel transform from the minus-first diffraction order (Figure 5, right) yields the amplitude and phase of the 3D object recorded previously in the hologram of focused images (Figure 7).
Collected together, after multiplying the depth by the previously fixed compression factor 1/Ҟ, they form the 3D object restored by the computer hologram (Figure 8).
Calculation of the inverse Fresnel transform from the similarly selected part of the SLFP (Figure 6, right) yields the amplitude and phase of the 3D object restored through diffraction at the SLFP (Figure 9).
Collected together, after multiplying the depth by the previously fixed compression factor 1/Ҟ, they likewise produce the 3D object shown in Figure 10.
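A sketch of this restoration step is given below; treating the inverse Fresnel transform as back-propagation over a negative distance, the choice of the crop region, and the parameter values are our assumptions, with any Fresnel propagator (such as fresnel_transform() from the sketch in Section 4) passed in as an argument.

```python
import numpy as np

def restore_from_minus_first(spectrum, propagate, wavelength, pitch, z, K, order_slice):
    """Isolate the minus-first diffraction order, back-propagate it, and rescale
    the restored phase to depth. 'propagate' is a Fresnel propagator such as
    fresnel_transform() from the earlier sketch; crop coordinates are assumed."""
    masked = np.zeros_like(spectrum)
    masked[order_slice] = spectrum[order_slice]          # keep only the -1 order region
    image = propagate(masked, wavelength, pitch, -z)     # approximate inverse Fresnel transform
    amplitude = np.abs(image)
    depth = np.angle(image) * wavelength / (2 * np.pi) * K   # undo the depth compression
    return amplitude, depth

# Example call with placeholder crop coordinates for the -1 order:
# amp, depth = restore_from_minus_first(spectrum, fresnel_transform, 0.633e-6, 8e-6,
#                                       0.2, K, (slice(0, 512), slice(0, 512)))
```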
Comparing the two 3D images obtained by diffraction at the SLFP (Figure 10) and by diffraction at the hologram of focused images (Figure 8), one can see their essential similarity, which allows us to state that the array of vertically structured fringes possesses all the major properties of a hologram: during irradiation with a coherent reference wave, it restores the 3D image of the object with continuous horizontal and vertical parallax.
The difference between the depth maps of the object obtained from the hologram (Figure 7b) and the structured fringe pattern (Figure 9b) is illustrated in Figure 11.
The differences between the restored depths turned out to be within ±5%, which is a good result. This is more obvious in the illustration showing the combined profiles (Figure 12).
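The quoted ±5% figure corresponds to a straightforward comparison of the two restored depth maps in percent of the full object depth, as in Figure 11; the sketch below uses our own variable names.

```python
import numpy as np

def depth_difference_percent(depth_hologram, depth_slfp):
    """Difference of two restored depth maps, in percent of the full object depth."""
    full_depth = depth_hologram.max() - depth_hologram.min()
    return 100.0 * (depth_hologram - depth_slfp) / full_depth

# diff = depth_difference_percent(depth_from_hologram, depth_from_slfp)
# print(np.abs(diff).max())   # the paper reports values within +/- 5%
```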

6. Discussion of Results

In Figure 12a, one can see almost completely coinciding profiles, shifted with respect to each other in the vertical direction artificially for convenient comparison, while in Figure 12b the profiles are combined. One can see that the profiles almost coincide. Small deviations may be caused by inaccurate registration of the distortions of the structured light patterns. Their manifestation as a distortion of the restored wavefront is close in nature to a pincushion-type field distortion, which distorts the geometric dimensions of the image but does not determine the resolution.
This is quite natural, because the oblique projection of the vertically structured fringes onto the 3D object (Figure 13a) shifts them in the plane perpendicular to the optical axis of the photo-recording device, similar to the shift of the interference fringes during the recording of a hologram (Figure 13b).
Comparing Figure 13a,b, one can see a principal similarity between the mechanisms of distortion of the interference and structured fringes, which indicates that diffraction at the structured fringes forms, in the minus-first order, 3D images of the initial object close in shape to the images restored by computer holograms. In their initial form, these patterns of structured fringes are usually close to holograms of the radio range, which is determined by the fringe frequency in the initial pattern projected onto the object. However, their frequency may be increased by simple linear transformations, without resource-intensive integration, up to a frequency equivalent to the carrier frequency in any wavelength range, including visible, infrared (IR), ultraviolet (UV), etc. The pattern of structured fringes in Figure 2c, before the transformation that increases the carrier frequency (Figure 4b), may thus be considered as a diffraction structure operating as a hologram synthesized in the radio range of the electromagnetic spectrum.
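The analogy of Figure 13 can be written out explicitly with the standard fringe-projection relation; the notation below is ours, since the paper does not give this formula.

```latex
% A surface of local height h(x,y), illuminated obliquely at angle \theta by
% fringes of period d, shifts the fringes laterally by approximately
\Delta x(x,y) \approx h(x,y)\,\tan\theta ,
% so the recorded pattern
I(x,y) \propto 1 + \cos\!\left[ \frac{2\pi x}{d} - \frac{2\pi\,h(x,y)\tan\theta}{d} \right]
% has the same structure as the hologram intensity (1), with the carrier phase
% \varphi_r = 2\pi x/d and the object term \varphi_o = 2\pi\,h(x,y)\tan\theta/d.
```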
Comparing the obtained 3D images in Figure 8 and Figure 10, one may see that the hologram and the system of structured fringes restore very similar, full-value 3D images. Their iso-projections demonstrating evident similarity are presented in Figure 14 and Figure 15.
The image restored by the hologram and the image restored by the system of structured fringes possess both horizontal and vertical quasi-continuous parallax, so they are full-value holograms. We use the term 'quasi-continuous' only in relation to the digital representation of the signal: the degree of parallax continuity is limited only by this digital representation and may be refined to the desired degree by enlarging the numerical arrays representing the holographed object and the hologram. A small difference between the two restored 3D images of the living object is seen in the top part of the depth map, in our case the region of the nose of the 3D image. This is likely a consequence of longitudinal aberrations, which will be the subject of a more detailed investigation.
Attention should be paid to the non-trivial fact that the system of lateral vertical fringes illuminating a 3D object forms a structure that has information not only about the horizontal but also about the vertical parallax, almost the same as a hologram calculated in parallel in the traditional way. Therefore, the method of formation of the 3D object hologram chosen by us previously may be simplified substantially, which is extremely important for the acceleration of transmission [27,28,29] and computer processing of the series of frames of 3D video and augmented reality.
We propose using the above result to reduce the amount of transmitted information even further than in [7], while keeping it sufficient for forming the holographic video sequence at the receiving end of the radio communication channel. For this purpose, it is proposed to transmit, instead of two 2D frames (depth map and texture), only one 2D frame enlarged by twenty-five columns. For instance, for the Full HD standard this will be not a 1920 × 1080 frame but a 1945 × 1080 frame, in which the additional 25 columns carry the information about the structured light pattern, in the form of the vertical fringes that illuminated the holographed object (Figure 2c). At the receiving end of the communication channel, these 25 columns are used to create a diffraction structure similar to that shown in Figure 4a, restoring the 3D image of the depth map (Figure 9b). Superposition of the texture frame on this diffraction structure then restores the 3D image of the object as a whole, as shown in Figure 10. It should be stressed that the latter works efficiently only for a rather short Rayleigh distance, while diffraction has not yet blurred the texture image. This condition is known to be fulfilled when the number of Fresnel zones fitting into the hologram field is much larger than unity, which is actually the case for holograms of focused images, when the 3D image is brought out in front of the hologram to a depth of no more than several diagonals of the holographic monitor or the hologram. A sketch of the proposed frame packing is given below.
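A minimal sketch of such frame packing and unpacking follows; the column counts mirror the Full HD example above, while the function names and toy data are assumptions.

```python
import numpy as np

def pack_frame(texture, fringe_strip):
    """Append the 25-column structured-fringe strip to the texture frame,
    e.g. a 1920 x 1080 texture plus a 25 x 1080 strip gives a 1945 x 1080 frame
    (arrays are stored as rows x columns)."""
    return np.hstack([texture, fringe_strip])

def unpack_frame(frame, strip_width=25):
    """Split the received frame back into the texture and the fringe strip."""
    return frame[:, :-strip_width], frame[:, -strip_width:]

texture = np.zeros((1080, 1920), dtype=np.uint8)   # Full HD texture (toy data)
strip = np.zeros((1080, 25), dtype=np.uint8)       # structured-fringe columns (toy data)
tx_frame = pack_frame(texture, strip)              # shape (1080, 1945)
rx_texture, rx_strip = unpack_frame(tx_frame)
assert rx_texture.shape == (1080, 1920) and rx_strip.shape == (1080, 25)
```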

7. Transmission of 3D Holographic Information along the Radio Channel

As mentioned in the Introduction, the transmission of holographic information along communication channels faces the problem of the high information capacity of holograms. The possibility of efficient compression of holographic information, similar to single-sideband transmission known in radio electronics, was demonstrated in patent RU 2707582 C1 [7]. As shown in [14], this compression is already sufficient for the transmission of 3D holographic information along a radio communication channel. This is even more the case if, as described above, the transmission of one of the two basic modalities of the 3D image (the depth map of the holographed 3D object) is replaced by the structured light pattern obtained by photographing the lateral projection of straight fringes onto the 3D object. In this case, the load on the communication channel and on the computational resources at the receiving end decreases even more substantially. Along with the texture of the 3D object surface, not two 2D frames but only one is transmitted, though somewhat wider than in the conventional standard; alternatively, two 2D frames may be transmitted, with the second one having an order of magnitude lower resolution than the first in the plane perpendicular to the initial direction of the structured light fringes. In fact, even the need to form a second frame disappears: all of its information fits into a small addition to the first frame with the texture, similar to analog TV frame encoding, where each scanned line ends with the service information about the line or frame change. In our case, however, the amount of service information is larger, which makes it more like a wide-screen frame format. As described above, diffraction at the structured fringe pattern forms a 3D image of the depth map of the holographed object. The experimental transmission of this compressed information along a wireless Wi-Fi communication channel, with a frequency bandwidth of 40 MHz and a frame rate of more than 25 frames per second, was described in the conference proceedings [27]. Developing this result, we repeated the experiment on the transmission of holographic 3D video information along a wireless Wi-Fi communication channel using the FTP protocol. In this case, each transmitted frame of the 3D image consisted of one 2D texture frame (2000 × 2000 pixels) and the pattern of structured fringes (25 × 2000 pixels) (Figure 16).
To imitate the transmission of a video sequence, packets containing 291 such frames were transmitted at a time. The packet transmission time, measured with the FileZilla software during the real-time transfer, shows that the transmission of the complete holographic information about a dynamic 3D object is quite feasible both for a frame standard similar to HD and for Full HD; a sketch of such a timing measurement over FTP is given below.
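The paper measured the transfer times with FileZilla; an equivalent measurement can be sketched with Python's standard ftplib as follows, where the server address, credentials, archive name, and frame count are placeholders.

```python
import time
from ftplib import FTP

def measure_frame_rate(host, user, password, packet_path, n_frames):
    """Upload one packet of frames over FTP and report the effective frame rate.
    host, user, password, and packet_path are placeholders for a real setup."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        start = time.monotonic()
        with open(packet_path, "rb") as f:
            ftp.storbinary("STOR " + packet_path, f)
        elapsed = time.monotonic() - start
    return n_frames / elapsed          # frames per second

# rate = measure_frame_rate("192.168.1.10", "user", "password", "packet_2000.zip", 291)
# print(f"{rate:.1f} frames/s")
```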
In the experiment on the transmission of holographic 3D video, the transmitted 2D frames composed of the texture and the operation information were formed with different compression degrees.
All frames were preliminarily prepared in the following formats: BMP (without compression), PNG (compression without losses), and JPEG (compression with losses, 70% quality). Experiments were carried out for each of the listed formats.
The sizes of frames with texture and mask were chosen in the following combinations:
  • 500 × 500 pix. texture + 25 × 500 pix. structured fringe pattern;
  • 1000 × 1000 pix. texture + 25 × 1000 pix. structured fringe pattern;
  • 2000 × 2000 pix. texture + 25 × 2000 pix. structured fringe pattern.
For convenient test transmission, all frame sets were packed into uncompressed ZIP archives.
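A sketch of this frame preparation and packing is given below; Pillow is assumed for the image encoding, and the file names and toy frame content are illustrative.

```python
import zipfile
import numpy as np
from PIL import Image   # Pillow is assumed to be available

# Toy combined frame: 2000 x 2000 texture plus 25 extra columns for the fringe strip
frame = (np.random.rand(2000, 2025) * 255).astype(np.uint8)
img = Image.fromarray(frame)

img.save("frame_0001.bmp")              # BMP: no compression
img.save("frame_0001.png")              # PNG: lossless compression
img.save("frame_0001.jpg", quality=70)  # JPEG: lossy compression, 70% quality as in the experiment

# Pack a frame set into an uncompressed ZIP archive for the test transmission
with zipfile.ZipFile("packet.zip", "w", compression=zipfile.ZIP_STORED) as z:
    for name in ("frame_0001.bmp", "frame_0001.png", "frame_0001.jpg"):
        z.write(name)
```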
Results of the measurement of packet transmission time were processed and are shown in Table 1.
The first column of the table gives the sequence number of the experiment on test transmission of the holographic 3D information. The column "Resolution, pix." shows the resolution of the frames with the texture image (T) and the mask (M), respectively. The column "Frame format" indicates the file formats into which the initial image arrays were compressed. The number of frames transmitted in a packet is given in the column "Number of frames". The column "Packet volume, MB" gives the volume, in MB, of the transmitted packet of frames, where the indices T and M mark the volumes of the texture frame and of the mask frame separately. The column "Communication channel capacity" shows the transmission capacity of the Wi-Fi communication channel used, in MB/s. The column "Transmission frame rate" lists the frame transmission rate, in frames/s, determined from the measured data transmission rate over the Wi-Fi communication channel.
One can see that the frame rate exceeded 25 frames per second in experiments Nos. 1–3, 5, 6, 8, and 9. This means that it is possible to transmit the 3D holographic information in PNG and JPEG formats both for a TV standard similar to SECAM and for HD and Full HD. The BMP format is unsuitable for frame transmission at resolutions higher than that of Experiment No. 1 because of the absence of any compression of the transmitted 2D frames and, as a consequence, the excessive size of the files to be transmitted. Nevertheless, the BMP format is quite suitable in the proposed technology for holographic phototelegraphy systems, as can be seen from the results in rows 4 and 7.
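The frame rates in Table 1 follow directly from the packet volume and the measured channel capacity; the short check below reproduces row 8 from the stated numbers.

```python
def frame_rate(total_mb, capacity_mb_per_s, n_frames=291):
    """Frames per second = frames in the packet / (packet volume / channel capacity)."""
    return n_frames / (total_mb / capacity_mb_per_s)

# Row 8 of Table 1: 2000 x 2000 PNG texture + 25 x 2000 mask, 116.78 MB packet, 11.2 MB/s channel
print(round(frame_rate(116.78, 11.2), 2))   # 27.91 frames/s, above the 25 fps threshold
```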
The versions of 3D data transmission shown in Table 1 may be applied in various 3D augmented-reality facilities, in particular, in telemedicine, in the systems of remote control of complicated objects, including safety systems, 3D phototelegraphy, etc.

8. Conclusions

The results reported herein on the transmission of a model 3D video, with each frame received at the receiving end of a Wi-Fi radio communication channel, allowed us to obtain virtual computer holograms and to restore 3D images with high spatial resolution, not worse than Full HD, possessing continuous parallax, that is, meeting all the requirements formulated for such a video sequence in [30].
At the current stage of the investigation, due to limitations of the experimental equipment, we show the results of an experiment on the transmission of one such 3D frame 291 times, which corresponds to a video sequence lasting slightly more than 11 s. It is evident that, once the necessary equipment is available, it will be quite possible to create videos of living, moving 3D objects within the framework of the procedure described above. In that case, for the full-scale implementation of the 3D TV and 3D augmented reality project, it will be necessary to focus efforts on the development of an efficient 3D holographic monitor, building on the review and pioneering works, some of which are referenced herein [4,6,7,8,14,17,23,27,30,31].

Author Contributions

Conceptualization, S.A.S. and A.L.P.; Software, A.L.P.; Validation, A.L.P.; Formal analysis, S.A.S. and A.L.P.; Data curation, S.A.S. All authors have contributed equally. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study, due to the fact that the texture, mask, and 3D images used in the study are the personal images of the first author of this article, Sergey Alexandrovich Shoydin. Sergey Alexandrovich Shoydin hereby declares that his images can be used in this article. The authors of other scientific articles can also use the 3D images, mask, and texture with the image that Sergey Alexandrovich Shoydin used in this work in their scientific articles with the obligatory reference to this work.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available from the authors upon request.

Acknowledgments

The authors thank all reviewers for their helpful comments and suggestions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Titar, V.P.; Bogdanova, T.V. Issues in creating holographic television system. Radioelectron. Inform. 1999, 2, 38–42. (In Russian) [Google Scholar]
  2. Denisyuk, Y.N. Are the fundamental principles of holography well-known enough for creating new types of three-dimensional films and artificial intelligence? Tech. Phys. 1991, 61, 149–161. (In Russian) [Google Scholar]
  3. Komar, V.G. Information-based evaluation of the image quality in cinematographic systems. Technol. Cine. Telev. 1971, 10, 9–22. (In Russian) [Google Scholar]
  4. Lucente, M. Computational holographic bandwidth compression. IBM Syst. J. 1996, 35, 349–365. [Google Scholar] [CrossRef]
  5. Trejos, S.; Gómez, M.; Velez-Zea, A.; Barrera-Ramírez, J.F.; Torroba, R. Compression of 3D dynamic holographic scenes in the Fresnel domain. Appl. Opt. 2020, 59, D230–D238. [Google Scholar] [CrossRef] [PubMed]
  6. Kang, H.; Ahn, C.; Lee, S.; Lee, S. Computer-generated 3D holograms of depth-annotated images. Proc. SPIE 2005, 5742, 234–241. [Google Scholar] [CrossRef]
  7. Shoydin, S.A. Method of Holographic Recording Remote Formation. Patent RF No. 2707582, 28 November 2019. [Google Scholar]
  8. Takeda, M. Fourier fringe analysis and its application to metrology of extreme physical phenomena: A review. Appl. Opt. 2013, 52, 20–29. [Google Scholar] [CrossRef] [PubMed]
  9. Takeda, M.; Mutoh, K. Fourier transform profilometry for the automatic measurement of 3-D object shapes. Appl. Opt. 1983, 22, 3977–3982. [Google Scholar] [CrossRef] [PubMed]
  10. Vishnyakov, G.N.; Loshchilov, K.E. Optical schemes for measuring the shape of three-dimensional objects by the stripe projection method. Opt. J. 2011, 78, 42–47. Available online: https://opticjourn.ru/vipuski/394-opticheskij-zhurnal-tom-78-02-2011.html (accessed on 1 March 2023). (In Russian).
  11. Burke, J.; Bothe, T.; Osten, W.; Hess, C.F. Reverse engineering by fringe projection. In Proceedings of the SPIE Interferometry XI, Applications, Seattle, WA, USA, 19 June 2002; Volume 4778. [Google Scholar] [CrossRef]
  12. Garbat, P.; Kujawinska, M. Combining fringe projection method of 3D object monitoring with virtual reality environment: Concept and initial results. In Proceedings of the First International Symposium on 3D Data Processing Visualization and Transmission, Padua, Italy, 19–21 June 2002; pp. 504–508. [Google Scholar] [CrossRef]
  13. Thorstensen, J.; Thielemann, J.; Risholm, P.; Gjessing, J. Compact interferometric projector for high accuracy 3D imaging in space. In Proceedings of the OSA Imaging and Applied Optics Congress 2021 (3D, COSI, DH, ISA, pcAOP), Washington, DC, USA, 19 July 2021; p. 3Th2D.1. [Google Scholar] [CrossRef]
  14. Shoydin, S.A.; Pazoev, A.L. Transmission of 3D holographic information via conventional communication channels and the possibility of multiplexing in the implementation of 3D hyperspectral images. Photonics 2021, 8, 448. [Google Scholar] [CrossRef]
  15. Shoydin, S.A.; Pazoev, A.L. Remote Formation of Holographic Record. Optoelectron. Instrum. Data Process. 2021, 57, 80–88. [Google Scholar] [CrossRef]
  16. Shoydin, S.A.; Pazoev, A.L.; Smyk, A.F.; Shurygin, A.V. Holograms of a 3D object synthesized at the receiving end of the communication channel in Dot Matrix technology. Comput. Opt. 2022, 46, 204–213. (In Russian) [Google Scholar] [CrossRef]
  17. Picart, P.; Li, J.-c. Digital Holography; John Wiley & Sons: Hoboken, NJ, USA, 2012; pp. 88–92. [Google Scholar]
  18. Poon, T.-C.; Liu, J.-P. Introduction to Modern Digital Holography with MATLAB; Cambridge University Press, University Printing House: Cambridge, UK, 2014; p. 215. ISBN 978-1-107-01670-5. [Google Scholar]
  19. Picart, P. New Techniques in Digital Holography; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2015; p. 303. ISBN 978-1-84821-773-7. [Google Scholar]
  20. Khare, K.; Butola, M.; Rajora, S. Fourier Optics and Computational Imaging; Springer International Publishing: New York, NY, USA, 2023; p. 294. ISBN 978-3-031-18353-9. [Google Scholar] [CrossRef]
  21. Gruzman, I.S.; Kirichuk, V.S.; Kosykh, V.P.; Peretiagin, G.I.; Spektor, A.A. Digital Image Processing in Information Systems: A Tutorial; NSTU Publishers: Novosibirsk, Russia, 2002; p. 352. ISBN 5-7782-0330-6. (In Russian) [Google Scholar]
  22. Dudgeon, D.E.; Mersereau, R.M. Multidimensional Digital Signal Processing; Prentice-Hall, Inc.: Englewood Cliffs, NJ, USA, 1984; p. 400. ISBN 5-03-000402-5. [Google Scholar]
  23. Shevkunov, I.A. The method of phase unwrapping by the restored field shift. Bull. St. Petersburg Univ. 2015, 60, 395–401. (In Russian) [Google Scholar]
  24. Su, X.; Chen, W. Reliability-guided phase unwrapping algorithm: A review. Opt. Lasers Eng. 2004, 42, 245–261. [Google Scholar] [CrossRef]
  25. Belashov, A.V.; Petrov, N.V.; Semenova, I.V. Method for calculating the dynamic phase delay in holographic interferometry without phase unwrapping. Comput. Opt. 2014, 38, 704–709. (In Russian) [Google Scholar] [CrossRef]
  26. Lucente, M.E. Holographic bandwidth compression using spatial subsampling. Opt. Eng. 1996, 35, 1529–1537. [Google Scholar] [CrossRef] [Green Version]
  27. Pazoev, A.L.; Shoydin, S.A. Transmission of 3D holographic information along the radio channel. In Proceedings of the XXXII International School Symposium on Holography, Coherent Optics and Photonics (HOLOSCHOOL XXXII) 2022, Cambridge, UK, 1–4 August 2022; pp. 132–134. (In Russian). [Google Scholar]
  28. Pazoev, A.L.; Shoydin, S.A. Transmission of holographic information on a single sideband. In Collection of Materials of the National Conference with International Participation “SibOptika-2021”. Section 2: Optical and Optoelectronic Instrumentation; SSUGT: Novosibirsk, Russia, 2021; Volume 8, pp. 109–117, (In Russian). [Google Scholar] [CrossRef]
  29. Pazoev, A.L.; Shoydin, S.A. Transmission of 3D holographic information over a radio channel by a method close to SSB. Sci. Tech. J. Inf. Technol. Mech. Opt. 2023, 23, 21–27. (In Russian) [Google Scholar] [CrossRef]
  30. Blinder, D.; Ahar, A.; Bettens, S.; Birnbaum, T.; Symeonidou, A.; Ottevaere, H.; Schretter, C.; Schelkens, P. Signal processing challenges for digital holographic video display systems. Signal Process. Image Commun. 2019, 70, 114–130. [Google Scholar] [CrossRef]
  31. Sheridan, J.T.; Kostuk, R.K.; Fimia Gil, A.; Wang, Y.; Lu, W.; Zhong, H.; Tomita, Y.; Neipp, C.; Francés, J.; Gallego, S.; et al. Roadmap on holography. J. Opt. 2020, 22, 123002. [Google Scholar] [CrossRef]
Figure 1. Schematic synthesis of a digital discrete hologram: x0—object plane; x1—hologram plane; z—the direction of object wave propagation; L0x—object width, mm; N0x—object width, pixels; θ—the incident angle of the reference wave creating the spatial carrier frequency of the hologram; θmin—the minimal spatial carrier deviation angle formed by the object; θmax—the maximal spatial carrier deviation angle formed by the object; Lx—the object field width, mm; Nx—the object field width, pixels; ∆x—discrete (pixel) width; dz—object depth; zwr—distance between the object base and the hologram; and zrec—the distance between the hologram and the restored image.
Figure 2. Initial frames of digital processing: (a) RGB image of the scene to be holographed (texture); (b) the image of the scene to be holographed, with the projection of structured fringes superposed on the holographic object; (c) the selected fringe array; (d) transformation of the horizontal shift to deep grey (depth map); and (e) virtual 3D model of the holographic object.
Figure 3. Hologram of focused images: (a) from the object—depth map; (b) from the full 3D object—depth map with superposed texture.
Figure 4. The patterns of structured fringes over the depth map from Figure 2d with a period of 3 pixels and the depth reduced to λ/2: (a) vertical structured fringes are projected onto the mask, (b) the fringes superposed on the mask are stitched on the texture.
Figure 5. Fresnel spectrum from the hologram shown in Figure 3: (a) the amplitude component of the spectrum with a colorbar from 0 to 100%; (b) the phase component of the spectrum with a colorbar from −π to +π.
Figure 6. Fresnel spectrum from the structured fringe pattern shown in Figure 4: (a) the amplitude component of the spectrum with a colorbar from 0 to 100%; (b) the phase component of the spectrum with a colorbar from −π to +π.
Figure 7. Inverse Fresnel transform from the minus-first spectral order of the hologram of focused images of the holographed 3D object: (a) the amplitude of the restored image; (b) the phase.
Figure 8. 3D object restored by the hologram of the focused image.
Figure 9. Inverse Fresnel transform from the minus-first diffraction order at the structured fringes (Figure 6): (a) the amplitude of the restored image; (b) the phase.
Figure 10. 3D object restored by means of the inverse Fresnel transform from the SLFP minus first order spectrum.
Figure 11. Difference between the image depth maps (in percent of the absolute depth of the object) restored by the hologram and by the SLFP.
Figure 12. Profiles of the central region of the depth map: blue—initial (Figure 2d), green—restored by the hologram (Figure 8), red—restored by SLFP (Figure 10); (a) for clarity, the profiles are shifted along the vertical direction; (b) the profiles are combined.
Figure 13. Projection of structured (a) and interference (b) fringes at an angle of θ onto the 3D object shaped as a prism. The shift ∆i of fringes is proportional to the local thickness of the 3D object. a—object beam; r—reference beam; dz—object depth.
Figure 14. Image of the living 3D object restored by the hologram: (a) horizontal parallax; (b) vertical parallax.
Figure 15. Image of the living 3D object restored by the system of structured fringes: (a) horizontal parallax; (b) vertical parallax.
Figure 16. The frame of the 3D image: (a) texture (2000 × 2000 pixels); (b) the pattern of structured fringes (25 × 2000 pixels).
Table 1. Time of packet transmission and transmission frame rate.

No. | Resolution, pix. (columns × rows) | Frame format | Number of frames | Packet volume, MB (T / M / total) | Channel capacity, MB/s | Transmission frame rate, frames/s
1 | T 500 × 500, M 25 × 500 | BMP | 291 | 69.68 / 4.18 / 73.86 | 11.2 | 44.12
2 | T 500 × 500, M 25 × 500 | PNG | 291 | 15.86 / 0.87 / 16.74 | 11.2 | 194.73
3 | T 500 × 500, M 25 × 500 | JPG | 291 | 4.22 / 0.84 / 5.06 | 11.2 | 644.07
4 | T 1000 × 1000, M 25 × 1000 | BMP | 291 | 277.82 / 8.07 / 285.89 | 11.2 | 11.40
5 | T 1000 × 1000, M 25 × 1000 | PNG | 291 | 52.94 / 1.22 / 54.16 | 11.2 | 60.18
6 | T 1000 × 1000, M 25 × 1000 | JPG | 291 | 11.78 / 1.36 / 13.14 | 11.2 | 248.09
7 | T 2000 × 2000, M 25 × 2000 | BMP | 291 | 1110.38 / 15.84 / 1126.22 | 11.2 | 2.89
8 | T 2000 × 2000, M 25 × 2000 | PNG | 291 | 115.20 / 1.58 / 116.78 | 11.2 | 27.91
9 | T 2000 × 2000, M 25 × 2000 | JPG | 291 | 30.53 / 2.34 / 32.87 | 11.2 | 99.14
