Article

Optical Diffraction Tomography Using Nearly In-Line Holography with a Broadband LED Source

Optics Laboratory, École Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland
*
Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(3), 951; https://doi.org/10.3390/app12030951
Submission received: 22 December 2021 / Revised: 14 January 2022 / Accepted: 14 January 2022 / Published: 18 January 2022

Abstract

We present optical tomography methods for 3D refractive index reconstruction of weakly scattering objects using LED light sources. We are able to record holograms by minimizing the optical path difference between the signal and reference beams while separating the scattered field from its twin image. Multiple holograms were recorded by illuminating the LEDs sequentially, from which the 3D refractive index distribution of the sample was reconstructed. The reconstructions show a high signal-to-noise ratio, with speckle artifacts strongly suppressed thanks to the partially incoherent illumination of the LEDs. Results from combining different illumination wavelengths are also described, demonstrating higher acquisition speed.

1. Introduction

Optical Diffraction Tomography (ODT) is an emerging tool for label-free imaging of semi-transparent samples in three-dimensional space [1,2,3,4,5,6,7,8,9,10]. Being semi-transparent, such objects do not strongly alter the amplitude of the illuminating field. However, the total phase delay at a particular wavelength is a function of both the refractive index and the thickness of the sample. Due to this ambiguity, one cannot distinguish between these two parameters from 2D projections. Hence, to reconstruct the 3D refractive index (RI) map of semi-transparent samples, holographic detection is needed to extract the phase of the field after it passes through the sample. Then, by acquiring holograms at different illumination angles, the 3D RI map can be reconstructed using inverse scattering models [10].
Holographic detection was introduced by Gabor, who used “in-line” holography. He showed that the intensity image retrieved from in-line holography is composed of an “in-focus” image in addition to an “out-of-focus” image (i.e., the “Twin” image) [11]. Due to this “Twin” image problem, in-line holography usually encounters difficulties in retrieving the phase of the object. Leith and Upatnieks proposed “off-axis” holography [12]. In this configuration, a small tilt is introduced between the reference arm and the sample arm, which shifts the “out-of-focus” image with respect to the “in-focus” image in the Fourier domain. Since then, “off-axis” interferometry has been widely used in ODT by first extracting the phase and then applying the inverse models [13,14].
Several limitations restrict the use of ODT in biological imaging. These include phase instability due to interferometric drift and laser fluctuations, and speckle artifacts due to the high coherence of the laser source. To overcome these limitations, Lei Tian, Laura Waller, and co-workers used a relatively broadband source (i.e., LED illumination) to illuminate the sample for Fourier ptychographic and 3D imaging [15,16,17,18]. In particular, in [18], Tian and Waller used LED illumination and demonstrated an iterative reconstruction scheme with a multi-slice forward model that estimates the 3D complex RI distribution by minimizing an error function between the measured intensity patterns and those estimated from the forward model. Their approach showed in-focus reconstructions at different depths while taking multiple-scattering phenomena into account.
In recent years, this approach has been demonstrated in both reflection and transmission configurations [19,20,21,22,23,24]. For example, a motion-free illumination scanning scheme was demonstrated for 3D RI reconstruction using an LED ring that mimics a circular scanning approach [25]. Other approaches to intensity diffraction tomography reconstruct the 3D RI map from 2D intensity images using nonlinear iterative schemes that minimize an error function [26,27]. These models usually start with an initial guess of the 3D RI, which is then iteratively refined by minimizing the error between the actual experimental measurement and the intensity profile predicted by the forward physical model.
In the earlier work [18], the phase and absorption transfer functions were calculated in the spatial domain using the intensity images as a function of the illumination angle. In this work, we use LED illumination and apply the Fourier diffraction theorem, or Wolf transform [1,28], to reconstruct the 3D scattering potential in the 3D Fourier space, followed by a 3D inverse Fourier transform to produce the 3D RI distribution. The reconstructions show higher resolution, lower speckle noise, and higher contrast compared to the results we presented earlier for a Wolf-transform reconstruction applied to projections obtained with laser illumination [28].
We begin by discussing the use of the Fourier diffraction theorem on the “in-line” intensity data for the retrieval of the 3D RI map of the sample. We describe the theory behind our work and then show the reconstructed 3D RI map. After that, we show the effect of a slight misalignment of the illumination on the quality and contrast of the 3D RI reconstruction. Finally, we show the effect of adding a different wavelength on the final reconstruction.

2. Theory

The intensity pattern captured by the detector of the ODT system is denoted by $I_t(x, y)$, with $x$ and $y$ being the horizontal and vertical dimensions of the 2D intensity pattern. The detected intensity is given by:
$$I_t = |U_i|^2 + |U_s|^2 + 2|U_i||U_s|\cos(\varphi_s - \varphi_i) \tag{1}$$
where $|U_i|$ is the amplitude of the incident field, $|U_s|$ is the amplitude of the scattered field, and $\varphi_s - \varphi_i$ is the phase difference between the complex scattered and incident fields, which carries the phase information of the sample. For weakly scattering samples (i.e., under the Born approximation), we can assume $|U_s| \ll |U_i|$; defining $U_i = e^{j\varphi_i}$ with $|U_i| = 1$, Equation (1) can be simplified as follows:
$$I_t \approx 1 + 2|U_s|\cos(\Delta\varphi) \tag{2}$$
where $\Delta\varphi = \varphi_s - \varphi_i$. Equation (2) can be rewritten as:
$$I_t = 1 + 2|U_s|\cos(\Delta\varphi) = 1 + |U_s|e^{j\Delta\varphi} + |U_s|e^{-j\Delta\varphi} \tag{3}$$
Multiplying both sides of Equation (3) by the incident field $U_i = e^{j\varphi_i}$, we obtain:
$$I_t\, e^{j\varphi_i} = |U_s|e^{j\varphi_s} + e^{j\varphi_i} + |U_s|e^{-j\varphi_s}e^{2j\varphi_i} = U_s + e^{j\varphi_i} + U_s^*\, e^{2j\varphi_i} = U_s + e^{j\varphi_i}\left(1 + U_s^*\, e^{j\varphi_i}\right) \tag{4}$$
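The factorization in Equation (4) can be verified numerically for a single field point. The snippet below (with arbitrary illustrative values for the phase and amplitude) builds the intensity of Equation (3) and checks that multiplying it by $e^{j\varphi_i}$ reproduces the right-hand side of Equation (4):

```python
import numpy as np

# Illustrative values: any weak scattered field and incident phase will do
phi_i = 0.7                      # incident-field phase
U_s = 0.01 * np.exp(1j * 0.3)    # weak complex scattered field, |U_s| << 1

# Eq. (3): I_t = 1 + |U_s| e^{j dphi} + |U_s| e^{-j dphi}, dphi = phi_s - phi_i
dphi = np.angle(U_s) - phi_i
I_t = 1 + np.abs(U_s) * np.exp(1j * dphi) + np.abs(U_s) * np.exp(-1j * dphi)

# Eq. (4): I_t e^{j phi_i} = U_s + e^{j phi_i} (1 + U_s* e^{j phi_i})
lhs = I_t * np.exp(1j * phi_i)
rhs = U_s + np.exp(1j * phi_i) * (1 + np.conj(U_s) * np.exp(1j * phi_i))
assert np.isclose(I_t.imag, 0)   # the detected intensity is real
assert np.isclose(lhs, rhs)      # the factorization in Eq. (4) holds
```

Because the identity between Equations (3) and (4) is exact, the check holds to machine precision; only the step from Equation (1) to Equation (2) involves the weak-scattering approximation.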
Equation (4) includes the effect of the scattered field (i.e., $U_s$), which we refer to as the “Principal” image, and its complex conjugate (i.e., $U_s^*$), which we refer to as the twin image. As has been shown previously [28], at illumination angles smaller than that set by the numerical aperture, the two terms tend to cancel each other, which results in a low-contrast 3D reconstruction while the high-frequency features of the sample are maintained. Figure 1 shows the effect of changing the illumination angle on the 2D intensity image of a simulated digital phantom for several assumed illumination angles. Note that the 2D Fourier transform of each intensity image includes two circles in the Fourier domain. Each circle is the result of the spectral filtering applied by the limited numerical aperture of the objective lens. From Equation (3), we see that we have three terms: a Zero-order term (the first term on the right-hand side) and the two cross terms. The shift of the two circles from the Zero-order term depends on the illumination angle of the incident plane wave.
In other words, for normal incidence, the two circles completely overlap with each other. However, as we increase the illumination angle, the shift between the two circles increases until we reach the limit of the numerical aperture, at which point we see that the two circles are tangent to each other. Only when the illumination is at the maximum angle permitted by the NA of the objective lens can the complex field be retrieved from the intensity image as would be the case for an off-axis interferometric setup with a separate reference arm for holographic detection.
Figure 1 shows that for an accurate extraction of the scattered field, the sample should be illuminated with the maximum angle permitted by the NA of the objective lens [21]. The scattered field can be extracted by simply multiplying the intensity measurement with the incident plane wave, which results in shifting the spectrum in the Fourier domain. This is followed by the spatial filtering of the “Principal” image with a circular filter whose size is determined by the NA of the objective lens as shown in Figure 2.
From Figure 2, it can be seen that only when the illumination angle is at the edge of the imaging NA can we extract the complex scattered field from the intensity image. This can be experimentally demonstrated by illumination along a circular cone whose center is perfectly aligned with the imaging objective lens. This is demonstrated in the experimental setup described in the following section.
By multiplying the intensity image with the incident plane-wave to shift the spectrum in the Fourier domain, the scattered field spectrum becomes centered around the origin. To filter out the complex scattered field in the Fourier domain, we apply a low pass filter given by the following equation:
$$\tilde{U}_s(k_x, k_y) = \mathrm{LPF}\left\{\mathrm{FFT}_{2D}\left\{I_t\, e^{j\varphi_i}\right\}\right\} \tag{5}$$
where $\tilde{U}_s(k_x, k_y)$ is the 2D Fourier transform of $U_s(x, y)$, and $\mathrm{LPF}\{\cdot\}$ represents a circular low pass filter whose radius is given by $k_0 NA$, where $k_0$ is the wave number in free space.
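The extraction of Equation (5) can be sketched on a synthetic hologram. All parameter values below are illustrative; the illumination frequency is snapped to the FFT grid so that the demodulation is an exact spectral shift, and the unit background is assumed to have already been subtracted (as in the normalization described at the end of this section):

```python
import numpy as np

# Sketch of Eq. (5): demodulate the hologram by the incident plane wave and
# low-pass filter with a circular filter of radius k0*NA (illustrative values).
N, dx = 256, 0.1                      # grid size (pixels), pixel pitch (um)
lam, NA = 0.515, 0.65                 # wavelength (um), objective NA
k0 = 2 * np.pi / lam

k = 2 * np.pi * np.fft.fftfreq(N, dx)          # angular spatial frequencies
KX, KY = np.meshgrid(k, k)
pupil = KX**2 + KY**2 <= (k0 * NA)**2          # circular LPF, radius k0*NA

# Illumination at the edge of the NA, snapped to a grid frequency so that
# the demodulation is an exact spectral shift
dk = 2 * np.pi / (N * dx)
kx_in = np.round(k0 * NA / dk) * dk
x = np.arange(N) * dx
phi_i = kx_in * x[None, :]                     # incident plane-wave phase

# Weak, band-limited field standing in for the true scattered field
rng = np.random.default_rng(0)
U_s = 1e-3 * np.fft.ifft2(pupil * np.fft.fft2(rng.standard_normal((N, N))))

# Background-subtracted hologram: the two cross terms of Eq. (3)
I_n = 2 * np.real(U_s * np.exp(-1j * phi_i))

# Eq. (5): U~_s = LPF{ FFT2D{ I_t e^{j phi_i} } }, back in the spatial domain
U_s_rec = np.fft.ifft2(pupil * np.fft.fft2(I_n * np.exp(1j * phi_i)))

err = np.linalg.norm(U_s_rec - U_s) / np.linalg.norm(U_s)
```

With the illumination at the edge of the NA, the twin spectrum is tangent to the low-pass filter, so the recovered field matches the true one up to a few pixels at the tangency point (a relative error of a few percent in this example).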
By extracting the complex field along the direction $\mathbf{k} = (k_x, k_y, k_z)$ for each illumination k-vector $\mathbf{k}_{in} = (k_x^{in}, k_y^{in}, k_z^{in})$, one Fourier component of the 3D spectrum of the scattering potential $\tilde{F}(\boldsymbol{\kappa})$ (which is directly related to the index distribution) can be retrieved [1]:
$$\tilde{F}(\boldsymbol{\kappa}) = \frac{k_z}{2\pi j}\,\tilde{U}_s(k_x, k_y) \tag{6}$$
where
$$\boldsymbol{\kappa} = \mathbf{k} - \mathbf{k}_{in} = \begin{pmatrix} k_x - k_x^{in} \\ k_y - k_y^{in} \\ \sqrt{(k_0 n_0)^2 - k_x^2 - k_y^2} - k_z^{in} \end{pmatrix}$$
where $\boldsymbol{\kappa}$ is the 3D spatial frequency of the scattering potential, $k_0 = 2\pi/\lambda$, $\lambda$ is the wavelength of the illumination beam, and $n_0$ is the refractive index of the surrounding medium. We refer to Equation (6) as the Wolf transform [1,29].
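The mapping above can be checked numerically: every measured frequency $(k_x, k_y)$, mapped through $\boldsymbol{\kappa} = \mathbf{k} - \mathbf{k}_{in}$, must lie on a sphere of radius $k_0 n_0$ (the Ewald sphere) translated by $-\mathbf{k}_{in}$. A minimal sketch with illustrative values:

```python
import numpy as np

# Illustrative optical parameters (green LED, watery medium, NA of the text)
lam, n0, NA = 0.515, 1.33, 0.65
k0 = 2 * np.pi / lam

# Illumination wavevector at the edge of the NA
kx_in = k0 * NA
kz_in = np.sqrt((k0 * n0)**2 - kx_in**2)
k_in = np.array([kx_in, 0.0, kz_in])

# A few propagating detection frequencies inside the pupil (rad/um)
kx = np.array([0.0, 2.0, -3.0])
ky = np.array([0.0, 1.0, 2.5])
kz = np.sqrt((k0 * n0)**2 - kx**2 - ky**2)     # third component of kappa + k_in

kappa = np.stack([kx - k_in[0], ky - k_in[1], kz - k_in[2]], axis=-1)

# Each kappa + k_in lies on the Ewald sphere of radius k0*n0; the point
# k = k_in (the zero order) always maps to kappa = 0
radii = np.linalg.norm(kappa + k_in, axis=-1)
assert np.allclose(radii, k0 * n0)
```

This is why a single projection only fills a spherical cap of the 3D spectrum, and why many illumination angles are needed to cover the Ewald sphere.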
By applying Equation (6) to different 2D projections and accumulating the various spectral components in the 3D Fourier domain, the 3D spectrum $\tilde{F}(\boldsymbol{\kappa})$ can be measured. Subsequently, $F(\mathbf{r})$ can be spatially reconstructed using an inverse 3D Fourier transform. Finally, $n(\mathbf{r})$ is retrieved using the following equation:
$$n(\mathbf{r}) = \sqrt{\frac{4\pi}{k_0^2}F(\mathbf{r}) + n_0^2}$$
In summary, we obtain the 3D RI distribution through the following steps: (1) the raw images are processed to remove the background; (2) the illumination angle is calculated; (3) the intensity image is multiplied by the incident plane wave to shift the scattered-field spectrum to the center; (4) the resulting spectrum is low-pass filtered with a circular filter whose radius is proportional to the numerical aperture of the imaging objective lens; (5) the resulting low-pass-filtered spectrum is mapped onto the 3D Fourier space of the sample as a spherical cap (i.e., diffraction); (6) by applying steps 1–5 to all intensity images with different illumination angles, the 3D scattering potential is formed in the 3D Fourier space; and (7) by applying an inverse 3D Fourier transform, the spatial distribution of the scattering potential is calculated, from which the 3D RI distribution is retrieved.
Preprocessing of the images is performed by subtracting the background from the raw images to remove any noise from the camera or the ambient environment. The background is retrieved by applying a low pass filter onto the raw images. The normalized intensity profile is then calculated as follows:
$$I_n = \frac{I_t - I_{Bkg}}{I_{Bkg}}$$
where $I_{Bkg}$ is the background signal.
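A minimal sketch of this normalization, with the background estimated by a tight circular low-pass filter in the Fourier domain (the cutoff value and the synthetic test image are illustrative choices, not the parameters used in the experiment):

```python
import numpy as np

def normalize(I_raw, cutoff_frac=0.02):
    """Return (I_t - I_bkg) / I_bkg, with I_bkg a low-pass estimate of I_raw.
    cutoff_frac is the LPF radius in cycles per pixel (illustrative value)."""
    ny, nx = I_raw.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    lpf = fx**2 + fy**2 <= cutoff_frac**2          # tight circular LPF
    I_bkg = np.real(np.fft.ifft2(lpf * np.fft.fft2(I_raw)))
    return (I_raw - I_bkg) / I_bkg

# Synthetic test: a smooth background with weak fringes riding on it; the
# normalization removes the slow background while keeping the fringes
y, x = np.mgrid[0:128, 0:128]
I_raw = (1 + 0.3 * np.exp(-((x - 64)**2 + (y - 64)**2) / 2000.0)
         + 0.02 * np.cos(0.8 * x))
I_n = normalize(I_raw)
```

After normalization the slowly varying background is flattened to zero mean, and only the high-frequency fringe content, which carries the scattered field, remains.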

3. Experimental Setup and 3D RI Reconstructions

The experimental setup was based on a standard bright-field microscope (AmScope T490B-DK 40X-2000X, Irvine, CA, USA) in which an LED ring illumination unit (Adafruit, ID: 1586, 24 LED pixels, bandwidth = 20 nm, New York, NY, USA) replaced the bright-field illumination unit of the commercial microscope. The experimental setup is shown in Figure 3. The LED ring had a radius of approximately 30 mm. To approximate plane-wave illumination, the LED ring was placed far enough from the sample (at a distance of around 35 mm) that the wavefront illuminating the sample was sufficiently flat. For imaging, an objective lens with a magnification of 40× and an NA of 0.65 was used (Plan Achro, part of the AmScope system). Images were captured using a scientific color CMOS camera (Thorlabs Zelux, resolution: 1440 × 1080, pixel size: 3.45 μm, Bergkirchen, Germany). The LED ring was driven by an Arduino kit (Arduino Uno), and a Matlab script synchronized the LED ring with the camera for the different illumination angles. Each image took 0.5 s to acquire, for a total of 12 s for the acquisition of the 24 images. Using more powerful LEDs or laser diodes could shorten the acquisition time.
While an LED source was used in this study, the described technique is also applicable to laser sources. The advantage of using LEDs is the suppression of the speckle noise usually seen with highly coherent illumination; they are also practically advantageous (low-cost angle scanning by sequential LED illumination).
The center of the LED ring was aligned with the optical axis as described in the previous section. As we will show later, any misalignment will severely affect the quality of the reconstruction. In addition, the illumination and imaging NAs were matched by controlling the distance between the sample and the LED ring to ensure proper extraction of the “Principal” field only from the intensity measurement.
As shown in Figure 3b, the distance (h) was controlled so that the illumination and imaging NA were perfectly matched. In other words, the distance (h) should satisfy the following condition:
$$\sin\gamma = \frac{r}{\sqrt{r^2 + h^2}} = NA_{OBJ}$$
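Solving this condition for the ring height gives $h = r\sqrt{1/NA_{OBJ}^2 - 1}$. With the ring radius of 30 mm and the imaging NA of 0.65 quoted above:

```python
import numpy as np

# NA-matching condition sin(gamma) = r / sqrt(r^2 + h^2) = NA_obj, solved for
# the ring-to-sample distance h, using the values quoted in the text
r, NA_obj = 30.0, 0.65                   # LED-ring radius (mm), imaging NA
h = r * np.sqrt(1.0 / NA_obj**2 - 1.0)   # required ring-to-sample distance (mm)
print(round(h, 1))                       # prints 35.1
```

The result (about 35.1 mm) is consistent with the roughly 35 mm ring-to-sample distance used in the setup.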
Figure 4 shows an example of a raw image taken of a cheek cell extracted from a human mouth. Figure 4b shows the two circles (i.e., the cross terms) expected from the theory. As explained in the previous section, by carefully aligning the LED ring to match the imaging NA, we were able to completely decouple the two cross terms for a proper reconstruction of the 3D RI distribution.
We refer to the raw intensity images as holograms, since holography generally refers to a recording from which both the amplitude and the phase can be extracted, as first described by Dennis Gabor, in which the image recorded on film carries information about the complex field of the transparent sample being imaged. In Gabor’s original work, the reference beam propagated in the same direction as the signal beam; therefore, it was difficult to separate one from the other. Leith and Upatnieks placed the reference beam at a sufficiently large angle that the reconstructed signal beam was separated in Fourier space from the Zero-order beam. The holograms presented here (and also in the work of Waller and Tian) were recorded with a reference beam, namely the unscattered portion of the incident beam, which was chosen to be at an angle just sufficient to separate the signal beam from the twin image (the conjugate term) in Fourier space. Since the object in this case is a weakly scattering, phase-only object, the modulation of the incident beam by the sample did not appreciably affect the reconstructions. Fringes are not visible in the recording shown in Figure 4a due to their low contrast and to the camera sampling, which was close to the Nyquist rate. The fringes were weak, but they were certainly present, since the 2D Fourier transform shown in Figure 4b contains two orders.
To retrieve the illumination angle, we adopted a previously developed algorithm, which retrieves the illumination angle by detecting the boundaries of and the distance between the center of the circle and the Zero-order term [30]. After acquiring different projections by illuminating the LEDs individually and sequentially (see the Supplementary Video), the 3D RI reconstruction was retrieved by mapping the filtered Fourier transform shown in Figure 2 into the 3D Fourier space. This was followed by taking the 3D inverse Fourier transform of the scattering potential in the Fourier space to calculate the 3D RI distribution. Figure 5 shows the 3D RI reconstruction for the cheek cell shown in Figure 4 (see also the Supplementary Video).
Figure 6 shows another 3D RI reconstruction of a human cheek cell in which bacterial structures are clearly detected, with a refractive index higher than that of the cell cytoplasm, in agreement with results reported in the literature [17,31].

4. Effect of Misalignment on the Reconstruction Quality

In this section, we study the effect of optical misalignment on the quality of the 3D reconstruction. As described in the sections above, it is critical that the illumination and imaging NA be identical (i.e., conical illumination with its center aligned with the optical axis). Any misalignment will result in the “Principal” and the “Twin” images overlapping, which will degrade the contrast of the final 3D RI reconstruction. To study this, the same optical setup was used; however, the LED ring was slightly misaligned from the optical axis. Figure 7 shows images for two LED illuminations. As clearly seen, when the “Principal” and the “Twin” images overlap (i.e., in the Fourier domain), the intensity images show very low contrast, since the “Principal” and the “Twin” images tend to cancel each other [28]. However, in the other case, when they do not overlap, the contrast is enhanced. As a result of this effect, the final 3D RI reconstruction will not reflect the true 3D refractive index distribution of the sample, since the complex scattered field cannot be retrieved “alone” from the intensity images because of the overlapping circles for certain projections. Figure 8 shows the 3D RI distribution retrieved under optical misalignment. From the figure, it is observed that the bacteria highlighted in the blue box take an RI value lower than that of the surrounding medium (i.e., water), which does not agree with the literature [17,31]; this also contradicts the reconstruction shown in Figure 6, where the LED ring was perfectly aligned and the illumination and imaging NA were matched. This error in the 3D RI reconstruction can be attributed to the overlap of the twin and principal images, which resulted in an incorrect 3D reconstruction. In addition, the refractive index contrast decreased due to the cancelation of the low-frequency components caused by the overlap of the principal and the twin images.

5. ODT Using Wavelength Diversity

Finally, 3D RI reconstruction based on wavelength diversity is discussed in this section. Since the LED ring supports “RGB” colors, we captured images at three different wavelengths: red (623 nm), green (515 nm), and blue (468 nm). Theoretically, this corresponds to mapping onto Ewald spheres with different radii $k = 2\pi/\lambda_{illum}$, as shown in Figure 9. By following, at each wavelength, the same procedures discussed in the previous sections, the 3D refractive index reconstruction was retrieved.
Practically, objective lenses suffer from chromatic aberrations, which means the image is not in best focus at all wavelengths simultaneously, as shown in Figure 10. This leads to distortions (aberrations) when mapping the projections onto the Ewald sphere, resulting in an inaccurate estimation of the final 3D RI distribution. One can correct for this by acquiring intensity images at different focal planes and calculating the cross-correlation function between those images and a reference image [32].
To correct for the chromatic aberrations, a different approach based on Fresnel propagation was taken to refocus the image digitally, since we have access to the scattered field. This was performed in two steps. First, we reconstructed the 3D RI distribution at red and blue without calibrating for the aberrations (taking the green channel to be in focus).
Then, by inspecting the reconstructions, the fields were backpropagated by the distance by which the sample was displaced from the best plane of focus (z = 0). Figure 11 shows an example of the reconstructions along YZ for green and blue. Note how the reconstruction for the blue illumination is displaced from the best plane of focus, representing the error due to the chromatic aberrations introduced by the objective lens.
The second step was to backpropagate the complex field extracted from the intensity image by the distance $\Delta z$ to refocus it using Fresnel propagation as follows:
$$E_{calib} = \mathrm{IFFT}\left\{\mathrm{FFT}\left\{E_{uncalib}\right\}\, e^{jK_z \Delta z}\right\}$$
where $E_{uncalib}$ is the chromatically aberrated field and $E_{calib}$ is the calibrated field after removal of the chromatic aberration.
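A sketch of this digital refocusing with the angular-spectrum kernel $e^{jK_z\Delta z}$; the grid, wavelength, and propagation distance below are illustrative, and evanescent components are simply discarded:

```python
import numpy as np

def refocus(E, dz, lam, dx):
    """Propagate field E by distance dz (same units as lam and dx) using the
    angular-spectrum kernel e^{j kz dz}; evanescent components are dropped."""
    ny, nx = E.shape
    k = 2 * np.pi / lam
    kx = 2 * np.pi * np.fft.fftfreq(nx, dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, dx)
    KX, KY = np.meshgrid(kx, ky)
    kz2 = k**2 - KX**2 - KY**2
    kz = np.sqrt(np.maximum(kz2, 0.0))
    H = np.exp(1j * kz * dz) * (kz2 > 0)       # propagation transfer function
    return np.fft.ifft2(np.fft.fft2(E) * H)

# Self-test: a band-limited field propagated forward and then backward by the
# same distance must come back unchanged (illustrative parameters, in um)
lam, dx, dz = 0.468, 0.2, 5.0
rng = np.random.default_rng(1)
E0 = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
kx = 2 * np.pi * np.fft.fftfreq(64, dx)
KX, KY = np.meshgrid(kx, kx)
mask = KX**2 + KY**2 < 0.5 * (2 * np.pi / lam)**2   # keep propagating waves
E0 = np.fft.ifft2(mask * np.fft.fft2(E0))
E_back = refocus(refocus(E0, dz, lam, dx), -dz, lam, dx)
```

Propagating forward and then backward by the same distance returns the band-limited field, which is a convenient self-test of the kernel before applying it to the extracted scattered fields.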
Figure 12 shows the effect of refocusing on the XY slice at z = 0 (best plane of focus), in which the image comes into focus after calibrating for the chromatic aberrations.
After calibrating each wavelength channel, the 3D RI distribution was retrieved by combining all the calibrated projections in the 3D Fourier space. This, however, assumes that the sample does not have strong dispersion, so that the RI value is almost constant across the different wavelengths. Figure 13 shows the 3D frequency support in the Fourier space and the corresponding XY slices at different depths. While we did not observe an enhancement in optical resolution in going from one wavelength to three, imaging at multiple wavelengths can still be advantageous in other respects. For example, instead of capturing 24 projections, one per LED, we could simultaneously operate three LED pixels, each with a different color, and then decouple them in postprocessing, since we had an RGB camera. This increases the throughput of the system by a factor of three [31,33]. In addition, the incoherent superposition of the three RGB images increases the SNR of the reconstructed 3D object. This is not visually evident in Figure 13, because the image quality was already very good with a single color, but in cases where high-speed recording at low light levels is required, the aid of the RGB illumination can prove helpful. The color scanning influences the transverse spatial resolution of the 3D RI distribution, since the diffraction-limited resolution is proportional to the wavelength; this is, however, a relatively minor effect.
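The three-fold multiplexing can be illustrated schematically: three projections, one per color, share a single RGB frame and are separated again by channel selection (cross-talk between the camera's color channels is neglected in this sketch, and the projections are random placeholders):

```python
import numpy as np

# Placeholder projections standing in for red, green, and blue illuminations
rng = np.random.default_rng(2)
proj_r, proj_g, proj_b = (rng.random((64, 64)) for _ in range(3))

# One multiplexed RGB frame replaces three sequential monochrome frames,
# tripling the acquisition throughput
frame = np.stack([proj_r, proj_g, proj_b], axis=-1)

# Demultiplexing is simply channel selection (negligible cross-talk assumed)
rec_r, rec_g, rec_b = frame[..., 0], frame[..., 1], frame[..., 2]
```

In practice, residual cross-talk between the camera's color filters would have to be calibrated out before the channels can be treated as independent projections.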
Finally, wavelength scanning can also be used in a reflection or 90-degree configuration. This would result in a resolution enhancement along the axial direction, helping to resolve the missing-cone problem that arises when imaging in a transmission configuration through a limited-NA imaging system, where the spatial frequencies missed by the objective lens form a cone-shaped void in the 3D Fourier space, as indicated by the red arrows in Figure 13b. Regarding the number of projections, we believe that a higher number of projections would yield a more faithful 3D RI distribution, since more of the Ewald sphere (3D Fourier space) is filled. In addition, for coherent detection, we observed that a higher number of projections can enhance the “effective” resolution, as the coherent noise averages out as the number of projections increases, resulting in a better signal-to-noise ratio.

6. Conclusions

In conclusion, we presented a technique for 3D refractive index reconstruction using the Wolf transform based on intensity measurements. The technique relies on mapping the extracted scattered field into the 3D Fourier space and then taking an inverse 3D Fourier transform to retrieve the 3D RI in the spatial domain. The reconstructions showed an excellent signal-to-noise ratio due to the use of a partially incoherent illumination source, which minimizes the speckle noise usually seen in coherent detection. To retrieve the 3D RI distribution, the illumination and the imaging NA must be perfectly matched. Finally, we investigated the effect of adding other illumination wavelengths and showed how to correct for the chromatic aberrations of the objective lens.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/app12030951/s1, Video S1: Projections_Cheek_cell, Video S2: Cheek_cell_reconstruction_1, Video S3: Cheek_cell_reconstruction_2, Video S4: Cheek_cell_reconstruction_3.

Author Contributions

Conceptualization, A.B.A. and D.P.; methodology, A.B.A., A.R. and D.P.; writing—original draft preparation, A.B.A. and D.P.; writing—review and editing, A.B.A. and D.P.; supervision, D.P. All authors have read and agreed to the published version of the manuscript.

Funding

Swiss National Science Foundation (SNSF 514481).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wolf, E. Three-dimensional structure determination of semi-transparent objects from holographic data. Opt. Commun. 1969, 1, 153–156.
  2. Sung, Y.; Choi, W.; Fang-Yen, C.; Badizadegan, K.; Dasari, R.R.; Feld, M.S. Optical diffraction tomography for high resolution live cell imaging. Opt. Express 2009, 17, 266–277.
  3. Haeberlé, O.; Belkebir, K.; Giovaninni, H.; Sentenac, A. Tomographic diffractive microscopy: Basics, techniques and perspectives. J. Mod. Opt. 2010, 57, 686–699.
  4. Sung, Y.; Choi, W.; Lue, N.; Dasari, R.R.; Yaqoob, Z. Stain-Free Quantification of Chromosomes in Live Cells Using Regularized Tomographic Phase Microscopy. PLoS ONE 2012, 7, e49502.
  5. Kim, T.; Zhou, R.; Goddard, L.L.; Popescu, G. Solving inverse scattering problems in biological samples by quantitative phase imaging. Laser Photon. Rev. 2016, 10, 13–39.
  6. Shin, S.; Kim, K.; Yoon, J.; Park, Y. Active illumination using a digital micromirror device for quantitative phase imaging. Opt. Lett. 2015, 40, 5407–5410.
  7. Charrière, F.; Marian, A.; Montfort, F.; Kühn, J.; Colomb, T.; Cuche, E.; Marquet, P.; Depeursinge, C. Cell refractive index tomography by digital holographic microscopy. Opt. Lett. 2006, 31, 178–180.
  8. Cooper, K.L.; Oh, S.; Sung, Y.; Dasari, R.R.; Kirschner, M.W.; Tabin, C.J. Multiple phases of chondrocyte enlargement underlie differences in skeletal proportions. Nature 2013, 495, 375–378.
  9. Choi, W.; Fang-Yen, C.; Badizadegan, K.; Oh, S.; Lue, N.; Dasari, R.R.; Feld, M.S. Tomographic phase microscopy. Nat. Methods 2007, 4, 717–719.
  10. Slaney, M.; Kak, A.; Larsen, L. Limitations of Imaging with First-Order Diffraction Tomography. IEEE Trans. Microw. Theory Tech. 1984, 32, 860–874.
  11. Gabor, D. A New Microscopic Principle. Nature 1948, 161, 777–778.
  12. Leith, E.N.; Upatnieks, J. Reconstructed Wavefronts and Communication Theory. J. Opt. Soc. Am. 1962, 52, 1123–1128.
  13. Boas, D.A.; Pitris, C.; Ramanujam, N. (Eds.) Handbook of Biomedical Optics; CRC Press: Boca Raton, FL, USA, 2016.
  14. Cuche, E.; Bevilacqua, F.; Depeursinge, C. Digital holography for quantitative phase-contrast imaging. Opt. Lett. 1999, 24, 291–293.
  15. Tian, L.; Wang, J.; Waller, L. 3D differential phase-contrast microscopy with computational illumination using an LED array. Opt. Lett. 2014, 39, 1326–1329.
  16. Tian, L.; Waller, L. Quantitative differential phase contrast imaging in an LED array microscope. Opt. Express 2015, 23, 11394–11403.
  17. Tian, L.; Liu, Z.; Yeh, L.; Chen, M.; Zhong, J.; Waller, L. Computational illumination for high-speed in vitro Fourier ptychographic microscopy. Optica 2015, 2, 904–911.
  18. Tian, L.; Waller, L. 3D intensity and phase imaging from light field measurements in an LED array microscope. Optica 2015, 2, 104–111.
  19. Matlock, A.; Sentenac, A.; Chaumet, P.C.; Yi, J.; Tian, L. Inverse scattering for reflection intensity phase microscopy. Biomed. Opt. Express 2020, 11, 911–926.
  20. Matlock, A.; Tian, L. High-throughput, volumetric quantitative phase imaging with multiplexed intensity diffraction tomography. Biomed. Opt. Express 2019, 10, 6432–6448.
  21. Li, J.; Matlock, A.; Li, Y.; Chen, Q.; Zuo, C.; Tian, L. High-speed in vitro intensity diffraction tomography. Adv. Photon. 2019, 1, 066004.
  22. Li, J.; Matlock, A.; Li, Y.; Chen, Q.; Tian, L.; Zuo, C. Resolution-enhanced intensity diffraction tomography in high numerical aperture label-free microscopy. Photon. Res. 2020, 8, 1818.
  23. Liu, R.; Sun, Y.; Zhu, J.; Tian, L.; Kamilov, U. Zero-Shot Learning of Continuous 3D Refractive Index Maps from Discrete Intensity-Only Measurements. arXiv 2021, arXiv:2112.00002.
  24. Matlock, A.; Xue, Y.; Li, Y.; Cheng, S.; Tahir, W.; Tian, L. Model and learning-based computational 3D phase microscopy with intensity diffraction tomography. In Proceedings of the 2020 28th European Signal Processing Conference (EUSIPCO), Amsterdam, The Netherlands, 18–21 January 2021; pp. 760–764.
  25. Ling, R.; Tahir, W.; Lin, H.-Y.; Lee, H.; Tian, L. High-throughput intensity diffraction tomography with a computational microscope. Biomed. Opt. Express 2018, 9, 2130–2141.
  26. Pham, T.-A.; Soubies, E.; Goy, A.; Lim, J.; Soulez, F.; Psaltis, D.; Unser, M. Versatile reconstruction framework for diffraction tomography with intensity measurements and multiple scattering. Opt. Express 2018, 26, 2749–2763.
  27. Chowdhury, S.; Chen, M.; Eckert, R.; Ren, D.; Wu, F.; Repina, N.; Waller, L. High-resolution 3D refractive index microscopy of multiple-scattering samples from intensity images. Optica 2019, 6, 1211–1219.
  28. Ayoub, A.B.; Lim, J.; Antoine, E.E.; Psaltis, D. 3D reconstruction of weakly scattering objects from 2D intensity-only measurements using the Wolf transform. Opt. Express 2021, 29, 3976–3984.
  29. Born, M.; Wolf, E. Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light, 7th ed.; Cambridge University Press: Cambridge, UK, 1999.
  30. Eckert, R.; Phillips, Z.F.; Waller, L. Efficient illumination angle self-calibration in Fourier ptychography. Appl. Opt. 2018, 57, 5434–5442.
  31. Rylander, C.G.; Davé, D.P.; Akkin, T.; Milner, T.E.; Diller, K.R.; Welch, A.J. Quantitative phase-contrast imaging of cells with phase-sensitive optical coherence microscopy. Opt. Lett. 2004, 29, 1509–1511.
  32. Lee, W.; Jung, D.; Joo, C. Single-exposure quantitative phase imaging in color-coded LED microscopy (Conference Presentation). Proc. SPIE 2017, 10074, 1007408.
  33. Hosseini, P.; Sung, Y.; Choi, Y.; Lue, N.; Yaqoob, Z.; So, P. Scanning color optical tomography (SCOT). Opt. Express 2015, 23, 19752–19762.
Figure 1. Intensity images and their 2D Fourier transforms for on-axis and off-axis configurations at different illumination angles. As the incident illumination vector k_in approaches the numerical aperture of the objective lens, the two cross terms become decoupled and the principal term can be retrieved. Scale bar = 8 μm.
Figure 2. Processing of the 2D intensity images before mapping into the 3D Fourier space. The left-most panel shows the intensity measurements and the corresponding Fourier transform. The middle panel shows the effect of multiplying by the incident plane wave, which centers the scattered field, highlighted by the white circles. The final step is the filtering of the scattered field with a circular filter whose size matches the numerical aperture in Fourier space, indicated by the red circle. Scale bar = 8 μm.
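The pre-processing described in Figure 2 — demodulating the raw intensity image by the incident plane wave to center the scattered-field term, then band-limiting it with a circular filter matched to the objective NA — can be sketched in NumPy as follows. This is a minimal illustrative sketch, not the authors' code; the function and parameter names are our own.

```python
import numpy as np

def demodulate_and_filter(intensity, k_in_xy, wavelength, NA, pixel_size):
    """Center the scattered-field term and apply a circular NA filter.

    intensity : 2D raw camera image.
    k_in_xy   : (kx, ky) transverse components of the incident plane
                wave, in rad per length unit (illustrative convention).
    """
    ny, nx = intensity.shape
    y, x = np.indices((ny, nx)) * pixel_size
    # Multiplying by the conjugate of the incident plane wave shifts the
    # scattered-field term to the center of the Fourier plane.
    demod = intensity * np.exp(-1j * (k_in_xy[0] * x + k_in_xy[1] * y))
    spec = np.fft.fftshift(np.fft.fft2(demod))
    # Spatial-frequency grids (rad per length unit), centered like spec.
    fx = np.fft.fftshift(np.fft.fftfreq(nx, d=pixel_size)) * 2 * np.pi
    fy = np.fft.fftshift(np.fft.fftfreq(ny, d=pixel_size)) * 2 * np.pi
    KX, KY = np.meshgrid(fx, fy)
    # Circular low-pass filter whose radius matches the objective NA.
    k_cut = 2 * np.pi * NA / wavelength
    spec[KX**2 + KY**2 > k_cut**2] = 0
    # Back to real space: band-limited estimate of the scattered field.
    return np.fft.ifft2(np.fft.ifftshift(spec))
```

With normal incidence, k_in = (0, 0), the demodulation is a no-op and only the NA filter acts, which matches the on-axis case in Figure 1.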
Figure 3. Experimental setup. (a) Bright-field microscope equipped with an LED ring. (b) Schematic of the system and its components (OBJ: objective lens, L: lens, det.: detector). The LED ring is aligned with the optical axis to improve the quality of the reconstruction. The distance (h) is adjusted to match the illumination NA to the imaging NA.
Figure 4. Raw intensity image from the experimental setup. (a) 2D intensity image of a human cheek cell. (b) 2D Fourier transform of the image shown in (a).
Figure 5. 3D RI reconstruction at different depths. The left column shows the highlighted regions at z = −1 μm (dark blue) and z = 0 μm (red). The images show the high resolution achieved with the partially incoherent LED source, with speckle noise greatly reduced.
Figure 6. 3D RI reconstruction at different depths. The red box is a magnified view of the highlighted red region. It shows the higher refractive index of this bacterial structure compared to the cytoplasm of the cell.
Figure 7. Effect of misalignment on the captured intensity images under LED illumination. An illumination angle smaller than the maximum angle allowed by the NA of the objective lens results in overlapping spectra (circles) and low-contrast images. This is not the case when the illumination angle is maximized and the spectra do not overlap.
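The alignment condition illustrated in Figure 7 reduces to a simple inequality: the two cross-term circles in the Fourier plane (each of radius NA·k0, centered at ±k_in) overlap whenever the transverse illumination frequency is smaller than the NA cutoff, i.e., when n·sin(θ) < NA. A tiny illustrative helper (our own, not from the paper) makes the check explicit:

```python
import numpy as np

def cross_terms_overlap(theta_illum_deg, NA, n_medium=1.0):
    """True when the two cross-term circles in the Fourier plane overlap.

    The circles have radius NA*k0 and are centered at +/-k_in, so they
    overlap iff |k_in| < NA*k0, i.e. iff n*sin(theta) < NA.
    """
    return n_medium * np.sin(np.radians(theta_illum_deg)) < NA
```

For a 0.9-NA objective in air, a 10° illumination angle gives overlapping (low-contrast) spectra, while a 70° angle (sin 70° ≈ 0.94 > 0.9) separates them, consistent with the maximized-angle case in Figure 7.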
Figure 8. 3D RI reconstruction for a misaligned LED ring. As shown in the red box, the contrast is strongly suppressed as a result of the cancellation of the low spatial frequencies caused by the overlap between the two circles in some of the projections. In the dark blue box, we see an artifact where the contrast is inverted for the bacterial structure, which disagrees with the literature [17,31], where bacterial structures have a higher refractive index than the cytoplasm.
Figure 9. Filling of the Ewald sphere for two projections at different illumination wavelengths (i.e., red and blue). (a) Transverse spatial frequency plane at Kz = 0. (b) Kx(Ky)–Kz plane at Ky(Kx) = 0.
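The wavelength dependence of the Ewald sphere in Figure 9 follows from the sphere's radius n·k0 = 2πn/λ: each projection maps the transverse frequencies (Kx, Ky) onto a spherical cap with axial coordinate Kz = sqrt((n k0)² − Kx² − Ky²) − n k0 (for normal-incidence illumination). The following sketch, with illustrative names of our own, computes that cap; a shorter (blue) wavelength gives a larger sphere and hence a shallower cap at the same transverse frequency:

```python
import numpy as np

def ewald_cap(kx, ky, wavelength, n_medium=1.33):
    """Axial frequency Kz on the Ewald sphere for transverse (kx, ky),
    assuming normal-incidence illumination.

    Points with kx^2 + ky^2 > (n*k0)^2 are evanescent and returned as NaN.
    Units: wavelength in the same length unit as 1/k (e.g. um and rad/um).
    """
    k0 = 2 * np.pi * n_medium / wavelength
    arg = k0**2 - kx**2 - ky**2
    # Kz = sqrt(k0^2 - kx^2 - ky^2) - k0: the cap of the shifted sphere.
    return np.where(arg >= 0, np.sqrt(np.maximum(arg, 0)) - k0, np.nan)
```

At the origin the cap touches Kz = 0, and |Kz| grows with transverse frequency, which is the curvature visible in panel (b) of Figure 9.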
Figure 10. Chromatic aberrations of the objective lens. (a) The sample is in focus under green illumination, whereas in (b) the sample is out of focus under blue illumination due to the aberrations.
Figure 11. Effect of chromatic aberrations on the 3D RI reconstruction under green illumination (left) and blue illumination (right). Note how the 3D reconstruction under blue illumination is shifted from the z = 0 plane due to chromatic aberrations.
Figure 12. Calibration of the chromatic aberrations for blue illumination. (a) XY slice at z = 0 after calibration and (b) before calibration.
Figure 13. 3D RI reconstruction using wavelength diversity. (a) XY slices at different propagation depths. (b) Ewald sphere frequency support. The red arrows in (b) highlight the missing-cone problem due to the limited numerical aperture of the objective lens when imaging in the transmission configuration.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Ayoub, A.B.; Roy, A.; Psaltis, D. Optical Diffraction Tomography Using Nearly In-Line Holography with a Broadband LED Source. Appl. Sci. 2022, 12, 951. https://doi.org/10.3390/app12030951