Article

An Ultrafast Optical Imaging System with Anamorphic Transformation Based on STEAM Structure

by Guoqing Wang, Yuan Zhou, Rui Min, Fang Zhao, E Du, Xingquan Li, Cong Qiu, Dongrui Xiao and Chao Wang

1 School of Microelectronics, Shenzhen Institute of Information Technology, Shenzhen 518172, China
2 Department of Periodontology, Shenzhen Stomatological Hospital, Southern Medical University, Shenzhen 518005, China
3 Center for Cognition and Neuroergonomics, State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University at Zhuhai, Zhuhai 519087, China
4 Zhongshan Institute of Changchun University of Science and Technology, Zhongshan 528437, China
5 School of Electrical Information Engineering, Hunan Institute of Technology, Hengyang 421002, China
6 School of Engineering and Digital Arts, University of Kent, Canterbury CT2 7NT, UK
* Authors to whom correspondence should be addressed.
Photonics 2024, 11(12), 1168; https://doi.org/10.3390/photonics11121168
Submission received: 25 October 2024 / Revised: 19 November 2024 / Accepted: 10 December 2024 / Published: 12 December 2024

Abstract

The time-stretch (TS) imaging system is an ultrafast optical imaging system that enables imaging at unprecedented line rates of tens to hundreds of megahertz. TS imaging achieves a linear one-to-one mapping between wavelength and time using a temporal dispersive medium. However, the high data throughput and the fixed resolution of TS imaging limit its wider application. In this paper, we propose an ultrafast optical imaging system with anamorphic transformation (AT) based on the STEAM structure, which offers the benefits of data compression and a tunable group delay-related resolution. AT is obtained by using a chirped fiber Bragg grating (CFBG) with a nonlinear group delay profile. An experimental demonstration shows that more of the acquired data are employed to describe the dense information region and that the group delay-related resolution is improved by 58% with the proposed system, without reducing the 50 MHz line scanning speed. The proposal increases the group delay-related resolution of the target image without requiring extra data acquisition or changes to the system setup, which gives it great potential in ultrafast optical imaging systems. Furthermore, the AT in our proposal could also be applied in data compression algorithms to mitigate the data burden of ultrafast optical imaging systems.

1. Introduction

As a promising and thriving technique in modern ultrafast optical imaging, the time-stretch (TS) technique exploits the resolvability of ultrafast broadband laser pulses in both the spectral and temporal domains and realizes continuous data acquisition at an ultra-high frame rate by encoding the spectral information into the temporal domain through a chromatic (dispersive) medium. The TS technique has already been extensively utilized in a variety of scientific, industrial, and biomedical applications, including optical coherence tomography (OCT) [1,2,3,4,5], observation of fast dynamic phenomena [6], biomedical tissue imaging with TS microscopy [7,8,9], cell imaging and blood screening [10,11,12], and compressive sensing (CS) optical imaging systems [13,14,15,16,17,18].
Serial time-encoded amplified microscopy (STEAM) is a typical TS imaging approach with an unprecedented imaging speed of tens to hundreds of millions of frames per second [19,20,21,22], overcoming the traditional tradeoff between imaging speed and sensitivity. STEAM mainly consists of two procedures. The first is wavelength/spectrum-to-time conversion, also called TS, dispersive Fourier transformation (DFT), or one-to-one linear wavelength/spectrum-to-time mapping, which results from the application of temporal dispersive devices such as long single-mode fibers or dispersive compensating fibers (DCF) [23]. Figure 1a shows the one-to-one linear mapping between wavelength and time. The second is space-to-wavelength/spectrum conversion, also known as one-to-one linear space-to-wavelength/spectrum mapping, which is induced by spatial dispersive devices such as diffraction gratings or virtually imaged phased arrays (VIPAs) [24,25]. Figure 1b shows the one-to-one linear mapping between wavelength and space. Combining the two procedures yields a linear mapping between space (imaging information) and time (1D time-series information); Figure 1c shows the one-to-one linear mapping between time and space via the wavelength. By analyzing the 1D time-series information, the original image can be reconstructed.
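To make the two cascaded mappings concrete, the following minimal sketch composes a linear space-to-wavelength mapping with a linear wavelength-to-time mapping and verifies that the resulting space-to-time mapping is itself linear; the temporal dispersion value matches the −0.48 ns/nm quoted later in the text, while the grating spatial dispersion is an illustrative assumption.

```python
import numpy as np

# Illustrative parameters: the spatial dispersion of the grating is assumed;
# the temporal dispersion reuses the DCF value quoted in the experiment.
spatial_dispersion = 0.2e-3    # m of transverse position per nm (assumed grating mapping)
temporal_dispersion = -0.48    # ns per nm (DCF dispersion from the text)
center_wavelength = 1550.0     # nm

x = np.linspace(-1.5e-3, 1.5e-3, 11)                              # transverse positions on the sample (m)
wavelength = center_wavelength + x / spatial_dispersion           # space -> wavelength (linear)
time = temporal_dispersion * (wavelength - center_wavelength)     # wavelength -> time (linear)

# The composed space -> time mapping is also linear, with a constant slope.
slope = np.polyfit(x, time, 1)[0]
print(f"space-to-time slope ≈ {slope:.1f} ns/m (constant, i.e., a linear mapping)")
```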
As a result, the basic principle of STEAM is to map the spatial information onto the spectrum of the optical pulse and then map the spectrally encoded information from the frequency domain to the time domain through a temporal dispersive medium, forming a one-dimensional serial data stream. The key advantage of this approach is that it overcomes the speed limit of traditional imaging methods and achieves ultrafast imaging. Recent research has shown that STEAM-based ultrafast imaging systems are constantly being optimized to improve image quality, enhance system stability, and expand their applications across a wide range of fields. Owing to its ultrafast, continuous-throughput imaging capability, STEAM has been successfully employed in ultrafast optical imaging [26,27], deep learning and classification [18], biomedical imaging [8,9,10,11,12,28], and CS optical imaging [13,14,15,16,17,18,29].
Despite its ultrafast imaging capability, STEAM-based systems still face challenges in practical applications. For example, efficient signal processing algorithms are needed to handle the large amount of data generated during ultrafast imaging. There are two main approaches to compressing and thereby mitigating this large data volume. The first is the CS method [13,14,15,16,17,18,28,29,30,31,32]. CS is an emerging information processing technology that exploits the sparsity of the imaging information to acquire and process high-resolution information from far fewer measurements than required by the Nyquist sampling theorem [28,29]. The core idea of CS is to reconstruct the original information from its sparse representation using a number of sampling points far below the Nyquist rate. However, CS requires a dedicated system configuration and complicated reconstruction algorithms, which increase the cost and complexity of the ultrafast imaging system [30,31,32]. The other approach is anamorphic transformation (AT) [33,34], also known as the anamorphic stretch transform (AST) or warped stretch transform (WST). AT is an advanced optical data compression technique that realizes real-time image compression by nonlinear time-domain stretching [35]. The AT technique is a variant of TS imaging that achieves real-time optical compression of images by deliberately superimposing highly nonlinear temporal dispersion on the linear frequency-to-time mapping process while maintaining the fast imaging speed. Unlike CS, AT requires neither feature-based detection nor iterative reconstruction, which makes it a strong candidate for handling large amounts of data. Furthermore, the nonlinear mapping between wavelength/spectrum and time can improve the group delay-related resolution, revealing more details of the target image.
Thus, in this work, we propose an ultrafast optical imaging system with AT based on the STEAM structure. AT is implemented with a purpose-designed dispersive device, namely a chirped fiber Bragg grating (CFBG) with a nonlinear group delay profile. Our proposal improves the group delay-related resolution while maintaining the same line scanning speed, laying a solid foundation for high-speed, high-throughput, and data-efficient imaging.

2. Methods

Thanks to the utilization of dispersive devices with a nonlinear group delay, AT can achieve a nonlinear mapping between wavelength and time. The nonlinear nature of this transformation opens new possibilities for data compression in both amplitude and phase. In imaging data compression, AT achieves compression by increasing the spatial coherence of the image; the compression results from a mathematical reshaping of the signal rather than from modifying the sampling process. AT has a wide range of potential applications in areas such as industrial and biomedical imaging, providing high-resolution, high-speed imaging while reducing the required data storage and processing through optical data compression [36,37]. Hence, the development of AT technology provides a new solution for high-speed, high-throughput imaging with the added benefit of optical data compression.
Figure 2 shows the schematic of AT imaging compared with normal TS imaging. A spatial dispersive device is employed to establish a one-to-one mapping between space and the spectrum/wavelength of the broadband optical pulse. The spectrum of the broadband pulse is spread into a rainbow that illuminates the image, so the image information is encoded into the spectrum. In normal TS imaging, a temporal dispersive device with a linear group delay profile, such as DCF, is then applied to achieve a linear mapping between time and spectrum/wavelength. Hence, a uniform one-to-one-to-one mapping among space, spectrum, and time is achieved. Figure 2a illustrates the uniform rainbow spectrum illuminating the image, and Figure 2c describes the one-to-one linear mapping between time and spectrum/wavelength when the broadband pulse illuminates the sample. The drawback of this approach is that it oversamples the sparse marginal area of the image with redundant data while undersampling the rest of the imaging area.
In contrast, in AT imaging, a temporal dispersive device with a nonlinear group delay profile, such as a CFBG, is utilized to perform a nonlinear mapping between time and spectrum/wavelength. AT therefore reshapes the data by nonlinear stretching in the temporal domain, and its output has properties that facilitate data compression and analysis. Figure 2b illustrates the unevenly distributed rainbow spectrum illuminating the image, and Figure 2d depicts the one-to-one nonlinear mapping between time and spectrum/wavelength when the broadband pulse illuminates the sample. Figure 2d indicates that the dense information region is assigned a low group delay, so more wavelength information is employed to describe the dense information region. In other words, the temporal dispersive device is designed with a group delay profile that matches the sparsity of the image: the information-dense area receives high-resolution sampling, while the sparse marginal area can tolerate low-resolution sampling.
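To make the sample-allocation argument concrete, the following sketch compares a linear group delay profile with a warped one and counts how many digitizer samples land in the same wavelength band in each case; the band edges, the tanh-shaped warping, and all numerical values are illustrative assumptions rather than the CFBG design used in the paper.

```python
import numpy as np

# Illustrative parameters (assumed for this sketch, not the CFBG used in the paper).
wl = np.linspace(1542.5, 1557.5, 2001)           # wavelength grid over a 15 nm band (nm)
total_delay = 7.2                                # total stretch over the band (ns)

# Linear (TS-like) group delay: constant slope.
tau_linear = total_delay * (wl - wl[0]) / (wl[-1] - wl[0])

# Warped (AT-like) group delay: steeper slope near the band centre,
# where the "dense" image region is assumed to be encoded.
warp = np.tanh((wl - wl.mean()) / 2.0)           # smooth S-shaped warping
tau_warped = total_delay * (warp - warp.min()) / (warp.max() - warp.min())

def samples_in_band(tau, lo, hi, fs=20.0):       # fs in GS/s, i.e., samples per ns
    """Digitizer samples landing in the time interval spanned by wavelengths [lo, hi]."""
    t = np.interp([lo, hi], wl, tau)
    return (t[1] - t[0]) * fs

band = (1549.0, 1551.0)                          # 2 nm band around the assumed dense region
print("linear mapping :", samples_in_band(tau_linear, *band), "samples")
print("warped mapping :", samples_in_band(tau_warped, *band), "samples")
```

With these assumed numbers, the warped mapping allocates several times more digitizer samples to the chosen band than the linear mapping does, which is exactly the behavior exploited in AT imaging.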
Figure 3 depicts the implementation of AT in an ultrafast optical imaging system using a CFBG, together with the linearly stretched and nonlinearly stretched optical pulses in the temporal and frequency domains. The CFBG with a nonlinear group delay profile is employed to obtain the nonlinear one-to-one mapping between time and spectrum/wavelength. The broadband femtosecond optical pulses produced by a mode-locked laser (MLL) are linearly stretched in the temporal domain after passing through the DCF. The pulses then enter circulator 1 at port 1, exit at port 2, and are reflected back to port 3 by the CFBG, where they are nonlinearly stretched. The nonlinearly stretched pulses then pass through circulator 2 from port 1 to port 2 and reach the target image after passing through the collimator. The pulses reflected by the target image return through the collimator to circulator 2, from port 2 to port 3, and are finally detected by the photodetector (PD). Figure 3b shows a linearly stretched optical pulse after the DCF, at port 1 of circulator 1, in the temporal domain, and Figure 3d shows a nonlinearly stretched optical pulse after the CFBG, at port 3 of circulator 1. From Figure 3b,d, the full width at half maximum (FWHM) of the pulse in the temporal domain is around 7 ns; the nonlinearly stretched pulse in Figure 3d has a low group delay region of around 7 ns~10 ns and a high group delay region of around 10 ns~14 ns, whereas the linearly stretched pulse in Figure 3b has a uniform group delay. Figure 3c,e show the fast Fourier transforms of the pulses in Figure 3b,d, respectively, in the frequency domain. From Figure 3c,e, the spectra of the linearly and nonlinearly time-stretched pulses differ slightly in the low-frequency region.
In the proposed imaging system, the temporal resolution is determined by the repetition rate of the MLL. The spatial resolution is a combination of several limits: the spatial-dispersion-limited resolution (set by the diffraction grating), the diffraction-limited resolution (set by the lenses), the dispersion-induced time-stretch resolution obtained through the stationary phase approximation (SPA), also known as the group delay-related spatial resolution, and the resolution limited by the temporal resolution of the digitizer [22].
The optical field at port 1 of circulator 1 can be expressed as [38]:

$$ q_1(t) = \exp\!\left( \frac{j t^2}{2\Phi} \right) \times G(\omega) $$

where Φ is the dispersion of the DCF, ω is the optical angular frequency, and G(ω) is the optical spectrum of the pulse. Without considering the CFBG, the optical field after the target image can be written as:

$$ q_2(t) = \exp\!\left( \frac{j t^2}{2\Phi} \right) \times G(\omega) \times I(t) $$

where I(t) is the imaging information encoded onto the spectrum of the optical pulse. When the CFBG is included, it can be regarded as a phase filter in the spectral domain, and the optical field before the PD is expressed as:

$$ E(\omega) = Q(\omega) \times e^{j\varphi(\omega)} $$

where Q(ω) is the Fourier transform of q₂(t). The phase response of the CFBG, φ(ω), is related to the group delay profile through τ(ω) = dφ(ω)/dω. The optical pulse e(t) in the temporal domain is obtained by the inverse Fourier transform of E(ω).
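As a numerical illustration of this spectral-phase-filter model (a minimal sketch with assumed pulse and filter parameters, not the experimental values), the snippet below builds a linearly dispersed pulse of the q₁(t) form, encodes a one-dimensional bar pattern onto it as in q₂(t), applies a CFBG-like phase e^{jφ(ω)} in the spectral domain, and recovers the warped temporal waveform by an inverse FFT.

```python
import numpy as np

# Time and angular-frequency grids (illustrative scales, not the experimental values).
n = 4096
t = np.linspace(-10e-9, 10e-9, n)                  # time axis (s)
dt = t[1] - t[0]
omega = 2 * np.pi * np.fft.fftfreq(n, d=dt)        # angular frequency axis (rad/s)

# Linearly dispersed pulse after the DCF, following the q1(t) form:
# quadratic temporal chirp multiplied by the spectral envelope G, with omega ~ t/Phi.
Phi = 2e-20                                        # assumed DCF dispersion (s^2)
sigma = 2e11                                       # assumed spectral width (rad/s)
G = np.exp(-((t / Phi) ** 2) / (2 * sigma ** 2))
q1 = np.exp(1j * t ** 2 / (2 * Phi)) * G

# Encode a simple 1D "image" (a bar pattern) onto the stretched pulse, as in q2(t).
image = np.where(np.sin(2 * np.pi * t / 2e-9) > 0, 1.0, 0.3)
q2 = q1 * image

# CFBG modeled as a spectral phase filter: E(omega) = Q(omega) * exp(j*phi(omega)).
# A cubic phase is one simple choice that yields a nonlinear group delay.
phi_cfbg = 1e-33 * omega ** 3                      # assumed phase profile (rad)
E = np.fft.fft(q2) * np.exp(1j * phi_cfbg)
e_t = np.fft.ifft(E)                               # warped temporal waveform at the PD

# Group delay of the assumed CFBG: tau(omega) = d(phi)/d(omega) = 3e-33 * omega^2.
tau = 3e-33 * omega ** 2
print(f"max group delay of the assumed CFBG: {tau.max() * 1e9:.2f} ns")
```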

3. Experiment and Results

The schematic of the proposed ultrafast optical imaging system with AT is shown in Figure 4. An MLL with an average power of 10 dBm and a repetition rate of 50 MHz is utilized as the laser source to generate broadband femtosecond optical pulses with a duration of less than 80 fs and a center wavelength of around 1550 nm. The femtosecond pulses, with an FWHM of 15 nm in the spectral domain, pass through the DCF with a dispersion of −0.48 ns/nm, where they are time-stretched to the nanosecond level and the linear one-to-one mapping between wavelength/spectrum and time is achieved. Hence, the wavelength/spectrum information is encoded into the time series. The encoded, time-stretched pulses then pass through circulator 1 (Cir1) from port 1 to port 2 and are reflected back to port 3 via the CFBG, where they are nonlinearly stretched; that is, the pulses are reshaped by the nonlinear group delay profile of the CFBG. The reshaped pulses then travel through circulator 2 (Cir2) from port 1 to port 2, are emitted into free space via a collimator (Coll), and pass through a pair of diffraction gratings (DG1 and DG2) with a groove density of 600 lines per millimeter. With the two DGs in a parallel arrangement, the rainbow pulses are collimated (as shown in Figure 4; the tilted angle of the rainbow pulses indicates that different wavelengths reach the sample at different times), which simplifies the subsequent beam handling and imaging. The DG acts as the spatial dispersive device in the proposed imaging system; thanks to the DGs, a linear one-to-one mapping between spectrum/wavelength and space is obtained.
A telescope consisting of two plano-convex lenses with focal lengths of 150 mm (PL1) and 50 mm (PL2) is placed after DG2 at distances of 100 mm and 300 mm, ensuring that the demagnified optical pulses focus on the imaging plane. A 1951 USAF resolution target acting as the sample is placed in the focal plane for imaging. The focused pulses are reflected by the sample and return along the original path. The red arrows show the forward propagation of the optical pulses and the green arrows show the backward propagation of the returned pulses. The returned pulses pass through the collimator and circulator 2 from port 2 to port 3 and then reach the PD. A PD with a bandwidth of 10 GHz is employed to detect the optical information, and an oscilloscope with a bandwidth of 10 GHz and a sampling rate of 20 GS/s displays the detected information in real time. A synchronization signal from the MLL and the detected imaging information are sent to the computer for data processing and image reconstruction. To achieve 2D imaging, the sample is manually translated in the vertical direction in our proposal. To increase the 2D imaging speed, a galvanometer scanner could be employed to adjust the vertical scan position in the proposed imaging system. Another route to 2D imaging is to change the spatial dispersive device, using a VIPA together with a DG instead of a DG alone; the combination of a DG and a VIPA is an intrinsically 2D imaging approach [25].
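As a quick consistency check on these parameters (plain arithmetic on the values quoted above, not additional measurements), the sketch below relates the 15 nm spectral width and the −0.48 ns/nm dispersion to the stretched pulse duration, the number of digitizer samples across that duration, and the pixel pitch implied by the stated field of view.

```python
# Back-of-the-envelope checks using only the parameters quoted in the text.
bandwidth_nm = 15.0          # FWHM of the pulse spectrum (nm)
dispersion_ns_per_nm = 0.48  # magnitude of the DCF dispersion (ns/nm)
sampling_rate_gsps = 20.0    # oscilloscope sampling rate (GS/s = samples per ns)
fov_mm = (2.4, 6.4)          # stated field of view (mm)
pixels = (120, 320)          # stated pixel size

stretch_ns = bandwidth_nm * dispersion_ns_per_nm        # ~7.2 ns, consistent with the ~7 ns FWHM in Figure 3b
samples_across_fwhm = stretch_ns * sampling_rate_gsps   # ~144 samples across the FWHM of one line
pixel_pitch_um = tuple(1000 * f / p for f, p in zip(fov_mm, pixels))  # ~20 um in each direction

print(f"stretched FWHM ≈ {stretch_ns:.1f} ns")
print(f"samples across FWHM ≈ {samples_across_fwhm:.0f}")
print(f"pixel pitch ≈ {pixel_pitch_um} um (vs. ~67 um TS spatial resolution)")
```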
The experimental results of the reconstructed images using the normal TS technique and the proposed AT approach are shown in Figure 5. Figure 5a shows the original standard 1951 USAF resolution target with the imaging region marked by the green square, which corresponds to a field of view (FOV) of 2.4 mm by 6.4 mm and a pixel size of 120 × 320. Figure 5b,c show the reconstructed 2D images using the normal TS technique and the proposed AT approach, respectively. Figure 5b indicates that the uniform group delay of the temporal dispersive device used in the TS technique produces an undistorted reconstruction of the original resolution target. In contrast, Figure 5c shows an anamorphic image with more detail in the dense information region (group 0, number 3) and coarser information in the sparse area of the target image, thanks to the nonuniform group delay of the temporal dispersive device, which is specially designed with a low group delay in the dense information region and a high group delay in the rest of the imaging area. Compared with Figure 5b, Figure 5c improves the group delay-related resolution by 58% [22], and more of the acquired data describe the dense information region of group 0, number 3, revealing more detail about the edges.
To show more detail of the reconstructed images, the line scans of row 63 (blue line) and row 93 (red line) in Figure 5b,c are plotted in Figure 6a,b, respectively. The spatial resolution in TS imaging is uniform and is around 67 μm, measured from the point spread function (PSF) of a sharp edge [22] (point M in Figure 6a). In AT imaging, however, the spatial resolution is nonuniform. Although the sampling rates are the same in AT imaging and TS imaging, the nonlinear mapping between wavelength and time introduced by the CFBG, combined with the linear space-to-wavelength mapping of the spatial dispersive device (DG), results in a nonlinear mapping between time and space. Consequently, the reconstructed image is nonlinearly stretched compared with the original image, and the spatial resolution varies with position (which may be difficult to discern with the naked eye because of the distorted image). Based on the PSF of a sharp edge, the average spatial resolutions at points N and P in Figure 6b are 43 μm and 75 μm, respectively. For this reason, a single resolution figure is often not quoted for AT imaging, since the spatial resolution differs from position to position. The anamorphic stretch in Figure 6b implies that the group delay-related resolution increases by 58% and that more of the acquired data are employed to describe the variations in the first third of the line scan. The results in Figure 6 also illustrate that the AT technique is a promising approach to data compression without iterative detection or a complicated imaging system setup.
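The following sketch illustrates how such an edge-based resolution estimate can be obtained (a minimal illustration under our own assumptions; the paper's measurement follows [22] and may differ in detail): the edge spread function is differentiated to approximate the PSF, whose FWHM is then reported in spatial units.

```python
import numpy as np

def edge_resolution_um(line_scan, pixel_pitch_um):
    """Estimate spatial resolution from a line scan containing a sharp edge.

    The edge spread function is differentiated to approximate the point
    spread function (PSF), and the PSF's full width at half maximum is
    returned in micrometres.
    """
    line_scan = np.asarray(line_scan, dtype=float)
    psf = np.abs(np.gradient(line_scan))            # derivative of the edge response
    half_max = psf.max() / 2.0
    above = np.where(psf >= half_max)[0]
    fwhm_pixels = above[-1] - above[0] + 1
    return fwhm_pixels * pixel_pitch_um

# Synthetic example: a smoothed step edge sampled at an assumed 20 um pixel pitch.
x = np.arange(200)
edge = 1.0 / (1.0 + np.exp(-(x - 100) / 1.5))       # sigmoid edge with a few-pixel transition
print(f"estimated resolution ≈ {edge_resolution_um(edge, 20.0):.0f} um")
```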

4. Discussion

In our experiment, the assignment of a low group delay to regions of dense information and a high group delay to other regions depends on the specific image, because the positions of the dense and sparse regions differ from image to image. Hence, the most common procedure is to perform TS imaging before AT imaging. TS imaging provides a uniformly reconstructed version of the original image, which identifies the dense and sparse regions. AT imaging can then be performed using a CFBG with a properly designed group delay profile. AT imaging therefore has great potential in ultrafast imaging applications that involve mitigating big data and require tunable resolution. Compared with TS imaging, its limitation is a more complex setup.
Table 1 compares TS imaging and AT imaging in terms of data compression efficiency, imaging speed, resolution enhancement capability, system complexity, and cost, and clearly shows the advantages and disadvantages of each. From Table 1, AT imaging offers data compression and resolution enhancement, which TS imaging does not provide.
Moreover, the design of the CFBG is crucial in the proposed AT imaging. CFBG designs include linear chirp profiles, nonlinear chirp profiles, and custom-designed chirp profiles; custom-designed chirp profiles tailored to specific applications are a likely future trend. For an unknown target image, a practical procedure is to first perform TS imaging to identify the dense and sparse imaging regions and then insert a CFBG with a properly designed group delay to achieve AT imaging, as sketched below. Furthermore, other emerging technologies, such as deep learning and pattern recognition, can readily be combined with AT imaging for image analysis.
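One plausible recipe for turning a prior TS image into a target group delay profile (a conceptual sketch under our own assumptions, not the authors' design procedure) is to make the local group-delay slope proportional to the local information density of the TS line scan, so that wavelengths carrying more detail are spread over more digitizer samples:

```python
import numpy as np

def target_group_delay(ts_line, total_delay_ns, min_weight=0.2):
    """Sketch a target group delay profile tau(lambda) from a prior TS line scan.

    The local gradient energy of the TS line scan is used as a crude proxy for
    information density; the group-delay slope is made proportional to it, so
    detailed regions are stretched over more time (more samples), and the
    profile is normalized so that the total stretch equals total_delay_ns.
    """
    density = np.abs(np.gradient(np.asarray(ts_line, dtype=float)))
    density = density / (density.max() + 1e-12) + min_weight  # keep a minimum slope everywhere
    tau = np.cumsum(density)
    return total_delay_ns * (tau - tau[0]) / (tau[-1] - tau[0])

# Synthetic prior: a flat background with a detailed (bar-pattern) region in the middle.
line = np.concatenate([np.full(100, 0.3),
                       0.3 + 0.7 * (np.sin(np.linspace(0, 20 * np.pi, 100)) > 0),
                       np.full(100, 0.3)])
tau = target_group_delay(line, total_delay_ns=7.2)
print(f"time allocated to the detailed middle third: {tau[199] - tau[100]:.2f} ns of 7.2 ns")
```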

5. Conclusions

In summary, we propose and experimentally demonstrate an ultrafast optical imaging system with AT based on the STEAM structure. AT is obtained by using a CFBG with a specially designed nonlinear group delay profile, so that a nonlinear one-to-one mapping between wavelength/spectrum and time is achieved. AT imaging is a variant of TS imaging that mitigates the high data throughput and allows the group delay-related resolution to be adjusted. Furthermore, compared with CS imaging for data compression, AT imaging requires neither a complicated imaging system setup nor iterative detection with drastically reduced imaging speed. An experimental demonstration of the proposed ultrafast imaging system is conducted, with a standard resolution target serving as the sample, a field of view (FOV) of 2.4 mm by 6.4 mm, and a pixel size of 120 × 320. The experimental results show that more of the acquired data are employed to describe the dense information section and that the group delay-related resolution is improved by 58% while maintaining the same 50 MHz line scanning speed. Hence, our proposal mitigates the high data throughput and increases the group delay-related resolution, revealing more detail in the dense information section, without changing the system configuration or requiring more data acquisition, and it holds great potential for data-efficient and resolution-flexible ultrafast optical imaging applications.

Author Contributions

Conceptualization, G.W. and Y.Z.; methodology, G.W.; validation, R.M., F.Z. and E.D.; formal analysis, X.L. and C.Q.; writing—original draft preparation, G.W.; writing—review and editing, G.W. and C.W.; supervision, G.W. and Y.Z.; funding acquisition, G.W. and D.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Scientific Research Foundation for High-Level Talents in Shenzhen (RC2023-005), and in part by the Educational Commission of Guangdong Province, China (2022KTSCX019), and in part by the Medical Scientific Research Foundation of Guangdong Province, China (A2023118), and in part by Hunan Provincial Natural Science Foundation of China (2023JJ30209), and in part by Hunan Provincial Education Department Science Research Fund of China (22B0862), and in part by Shenzhen Science and Technology Program (GJHZ20210705141805015), and in part by Guangdong Provincial Young Innovative Talents Research Project for Colleges and Universities (2024KQNCX172).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Xu, J.; Wei, X.; Yu, L.; Zhang, C.; Xu, J.; Wong, K.K.Y.; Tsia, K.K. High-performance multi-megahertz optical coherence tomography based on amplified optical time-stretch. Biomed. Opt. Express 2015, 6, 1340–1350. [Google Scholar] [CrossRef] [PubMed]
  2. Moon, Q.; Kim, D. Ultra-high-speed optical coherence tomography with a stretched pulse supercontinuum source. Opt. Express 2006, 14, 11575–11584. [Google Scholar] [CrossRef] [PubMed]
  3. Goda, K.; Fard, A.; Malik, O.; Fu, G.; Quach, A.; Jalali, B. High-throughput optical coherence tomography at 800 nm. Opt. Express 2012, 20, 19612–19617. [Google Scholar] [CrossRef] [PubMed]
  4. Xu, J.; Zhang, C.; Xu, J.; Wong, K.K.Y.; Tsia, K.K. Megahertz all-optical swept-source optical coherence tomography based on broadband amplified optical time-stretch. Opt. Lett. 2014, 39, 622–625. [Google Scholar] [CrossRef]
  5. Wang, G.; Zhou, Y.; Min, R.; Du, E.; Wang, C. Principle and Recent Development in Photonic Time-Stretch Imaging. Photonics 2023, 10, 817. [Google Scholar] [CrossRef]
  6. Goda, K.; Tsia, K.; Jalali, B. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 2009, 458, 1145–1149. [Google Scholar] [CrossRef]
  7. Wei, X.; Kong, C.; Sy, S.; Ko, H.; Tsia, K.K.; Wong, K.K.Y. Ultrafast time-stretch imaging at 932 nm through a new highly-dispersive fiber. Biomed. Opt. Express 2016, 7, 5208–5217. [Google Scholar] [CrossRef]
  8. Wei, X.; Lau, A.K.S.; Wong, T.T.W.; Zhang, C.; Tsia, K.M.; Wong, K.K.Y. Coherent Laser Source for High Frame-Rate Optical Time-Stretch Microscopy at 1.0 μm. IEEE J. Sel. Top. Quantum Electron. 2014, 20, 384–389. [Google Scholar] [CrossRef]
  9. Wong, T.T.W.; Lau, A.K.S.; Wong, K.K.Y.; Tsia, K.K. Optical time-stretch confocal microscopy at 1 μm. Opt. Lett. 2012, 37, 3330–3332. [Google Scholar] [CrossRef]
  10. Kobayashi, H.; Lei, C.; Wu, Y.; Mao, A.; Jiang, Y.; Guo, B.; Ozeki, Y.; Goda, K. Label-free detection of cellular drug responses by high-throughput bright-field imaging and machine learning. Sci. Rep. 2017, 7, 12454. [Google Scholar] [CrossRef]
  11. Lei, C.; Kobayashi, H.; Wu, Y.; Li, M.; Isozaki, A.; Yasumoto, A.; Mikami, H.; Ito, T.; Nitta, N.; Sugimura, T.; et al. High-throughput imaging flow cytometry by optofluidic time-stretch microscopy. Nat. Protoc. 2018, 13, 1603–1631. [Google Scholar] [CrossRef] [PubMed]
  12. Lau, A.K.S.; Wong, T.T.W.; Ho, K.K.Y.; Tang, M.T.H.; Chan, A.C.S.; Wei, X.; Lam, E.Y.; Shum, H.C.; Wong, K.K.Y.; Tsia, K.K. Interferometric time-stretch microscopy for ultrafast quantitative cellular and tissue imaging at 1 μm. J. Biomed. Opt. 2014, 19, 076001. [Google Scholar] [CrossRef] [PubMed]
  13. Wang, G.; Shao, L.; Liu, Y.; Xu, W.; Xiao, D.; Liu, S.; Hu, J.; Zhao, F.; Shum, P.; Wang, W.; et al. Low-cost compressive sensing imaging based on spectrum-encoded time-stretch structure. Opt. Express 2021, 29, 14931–14940. [Google Scholar] [CrossRef] [PubMed]
  14. Zhu, Z.; Chi, H.; Jin, T.; Zheng, S.; Jin, X.; Zhang, X. Single-pixel imaging based on compressive sensing with spectral-domain optical mixing. Opt. Commun. 2017, 402, 119–122. [Google Scholar] [CrossRef]
  15. Guo, Q.; Chen, H.; Weng, Z.; Chen, M.; Yang, S.; Xie, S. Compressive sensing based high-speed time- stretch optical microscopy for two-dimensional image acquisition. Opt. Express 2015, 23, 29639–29646. [Google Scholar] [CrossRef]
  16. Chen, C.; Mahjoubfar, A.; Jalali, B. Optical Data Compression in Time Stretch Imaging. PLoS ONE 2015, 10, 0125106. [Google Scholar] [CrossRef]
  17. Wang, G.; Zhou, Y.; Zhao, F.; Shao, L.; Liu, H.; Sun, L.; Jiao, S.; Wang, W.; Min, R.; Du, E.; et al. A Compact and Highly Efficient Compressive Sensing Imaging System Using In-Fiber Grating. IEEE Photon. Technol. Lett. 2023, 35, 195–198. [Google Scholar] [CrossRef]
  18. Chen, C.; Mahjoubfar, A.; Tai, L.; Blaby, I.; Huang, A.; Niazi, K.; Jalali, B. Deep Learning in Label-free Cell Classification. Sci. Rep. 2016, 6, 21471. [Google Scholar] [CrossRef]
  19. Wang, G.; Zhao, F.; Xiao, D.; Shao, L.; Zhou, Y.; Yu, F.; Wang, W.; Liu, H.; Wang, C.; Min, R.; et al. Highly efficient single-pixel imaging system based on the STEAM structure. Opt. Express 2021, 29, 43203–43211. [Google Scholar] [CrossRef]
  20. Wu, J.; Xu, Y.; Xu, J.; Wei, X.; Chan, A.C.S.; Tang, A.H.L.; Lau, A.K.S.; Chung, B.M.F.; Shum, H.; Lam, E.Y.; et al. Ultrafast laser-scanning time-stretch imaging at visible wavelengths. Light Sci. Appl. 2017, 6, e16196. [Google Scholar] [CrossRef]
  21. Pu, G.; Jalali, B. Neural network enabled time stretch spectral regression. Opt. Express 2021, 29, 20786–20794. [Google Scholar] [CrossRef] [PubMed]
  22. Wang, G.; Yan, Z.; Yang, L.; Zhang, L.; Wang, C. Improved Resolution Optical Time Stretch Imaging Based on High Efficiency In-Fiber Diffraction. Sci. Rep. 2018, 8, 600. [Google Scholar] [CrossRef] [PubMed]
  23. Chen, H.; Weng, Z.; Liang, Y.; Lei, C.; Xing, F.; Chen, M.; Xie, S. High speed single-pixel imaging via time domain compressive sampling. In Proceedings of the Conferences Lasers Electro-Optics, San Jose, CA, USA, 4–19 May 2017. [Google Scholar]
  24. Shuai, Y.; Zhou, Z.; Su, H. Toward Practical Optical Phased Arrays through Grating Antenna Engineering. Photonics 2023, 10, 520. [Google Scholar] [CrossRef]
  25. Xiao, S.; McKinney, J.D.; Weiner, A.M. Photonic microwave arbitrary waveform generation using a virtually imaged phased-array (VIPA) direct space-to-time pulse shaper. IEEE Photonics Technol. Lett. 2004, 16, 1936–1938. [Google Scholar] [CrossRef]
  26. Lei, C.; Guo, B.; Cheng, Z.; Goda, K. Optical time-stretch imaging: Principles and applications. Appl. Phys. Rev. 2016, 3, 011102. [Google Scholar] [CrossRef]
  27. Wang, G.; Wang, C. Diffraction Limited Optical Time-Stretch Microscopy Using an In-Fibre Diffraction Grating. In Proceedings of the Frontiers in Optics 2016, Rochester, NY, USA, 17–21 October 2016. [Google Scholar]
  28. Guo, Q.; Chen, H.; Wang, Y.; Guo, Y.; Liu, P.; Zhu, X.; Cheng, Z.; Yu, Z.; Yang, S.; Chen, M.; et al. High-Speed Compressive Microscopy of Flowing Cells Using Sinusoidal Illumination Patterns. IEEE Photonics J. 2017, 9, 3900111. [Google Scholar] [CrossRef]
  29. Matin, A.; Wang, X. Compressive Ultrafast Optical Time-Stretch Imaging. In Proceedings of the 22nd International Conference on Transparent Optical Networks (ICTON), Bari, Italy, 19–23 July 2020. [Google Scholar]
  30. Fordell, T. Real-time optical time interpolation using spectral interferometry. Opt. Lett. 2022, 47, 1194–1197. [Google Scholar] [CrossRef]
  31. Wang, G.; Xiao, D.; Shao, L.; Zhao, F.; Hu, J.; Liu, S.; Liu, H.; Wang, C.; Min, R.; Yan, Z. An Undersampling Communication System Based on Compressive Sensing and In-fiber Grating. IEEE Photonics J. 2021, 13, 7300507. [Google Scholar] [CrossRef]
  32. Huynh, N.T.; Zhang, E.; Francies, O.; Kuklis, F.; Allen, T.; Zhu, J.; Abeyakoon, O.; Lucka, F.; Betcke, M.; Jaros, J.; et al. A fast all-optical 3D photoacoustic scanner for clinical vascular imaging. Nat. Biomed. Eng. 2024, 10, 1–18. [Google Scholar] [CrossRef]
  33. Asghari, M.H.; Jalali, B. Anamorphic transformation and its application to time–bandwidth compression. Appl. Opt. 2013, 52, 6735–6743. [Google Scholar] [CrossRef]
  34. Yang, S.; Wang, J.; Chi, H.; Yang, B. Distortion compensation in continuous-time photonic time-stretched ADC based on redundancy detection. Appl. Opt. 2021, 60, 1646–1652. [Google Scholar] [CrossRef] [PubMed]
  35. Mahjoubfar, A.; Chen, C.; Jalali, B. Design of Warped Stretch Transform. Sci. Rep. 2015, 5, 17148. [Google Scholar] [CrossRef]
  36. Yang, B.; Ma, Z.; Yang, S.; Chi, H. Broadband and linearized photonic time-stretch analog-to-digital converter based on a compact dual-polarization modulator. Appl. Opt. 2023, 62, 921–926. [Google Scholar] [CrossRef] [PubMed]
  37. Jiang, T.; Wang, L.; Li, J. High-resolution timing jitter measurement based on the photonics time stretch technique. Opt. Express 2023, 31, 6722–6729. [Google Scholar] [CrossRef]
  38. Yao, J. Photonic generation of microwave arbitrary waveforms. Opt. Comm. 2011, 284, 3723–3736. [Google Scholar] [CrossRef]
Figure 1. Schematic diagram of wavelength-to-time conversion (a), space-to-wavelength conversion (b), and space-to-time conversion via the wavelength (c).
Figure 2. Schematic of AT imaging compared with normal TS imaging. An optical pulse with linear mapping (a) and nonlinear mapping (b) of wavelength-to-time conversion illuminates the sample; (c) describes normal TS imaging using linear wavelength-to-time mapping, and (d) shows the nonlinear wavelength-to-time mapping in AT imaging.
Figure 3. (a) The implementation of AT in an ultrafast optical imaging system using a chirped fiber Bragg grating (CFBG). An optical pulse linearly stretched without circulator 1 (b) and nonlinearly stretched with circulator 1 added (d), in the temporal domain, for comparison. (c,e) Fast Fourier transforms of the pulses in (b,d), respectively, in the frequency domain. MLL—mode-locked laser; DCF—dispersive compensating fiber; CFBG—chirped fiber Bragg grating; Cir—circulator; Coll—collimator; PD—photodetector.
Figure 4. The schematic of the proposed ultrafast optical imaging system with AT. MLL—mode-locked laser; DCF—dispersive compensating fiber; CFBG—chirped fiber Bragg grating; Cir—circulator; Coll—collimator; DG—diffraction grating; PL—plano-convex lens; PD—photodetector; OSC—oscilloscope.
Figure 5. (a) The 1951 USAF resolution target sample with rectangular imaging area. The experimental results of reconstructed images with normal TS technique (b) and AT approach (c).
Figure 6. (a,b) The reconstructed experimental results of line scanning of row 63 (blue line) and row 93 (red line) in Figure 5b,c, respectively.
Table 1. Comparison of TS imaging and AT imaging.
Parameters | TS Imaging | AT Imaging
Data compression efficiency | not applicable | high
Imaging speed | high | high
Resolution enhancement capability | not applicable | high
System complexity | medium | medium-high
Cost | medium | medium-high