Article

Super-Resolution Imaging of Neuronal Structures with Structured Illumination Microscopy

UCCS BioFrontiers Center, University of Colorado Colorado Springs, 1420 Austin Bluffs Parkway, Colorado Springs, CO 80918, USA
* Author to whom correspondence should be addressed.
Bioengineering 2023, 10(9), 1081; https://doi.org/10.3390/bioengineering10091081
Submission received: 2 September 2023 / Accepted: 9 September 2023 / Published: 13 September 2023
(This article belongs to the Special Issue Recent Advances in Biomedical Imaging)

Abstract
Super-resolution structured illumination microscopy (SR-SIM) is an optical fluorescence microscopy method which is suitable for imaging a wide variety of cells and tissues in biological and biomedical research. Typically, SIM methods use high spatial frequency illumination patterns generated by laser interference. This approach provides high resolution but is limited to thin samples such as cultured cells. Using a different strategy for processing raw data and coarser illumination patterns, we imaged through a 150-micrometer-thick coronal section of a mouse brain expressing GFP in a subset of neurons. The resolution reached 144 nm, an improvement of 1.7-fold beyond conventional widefield imaging.

1. Introduction

Recently developed methods for surpassing the diffraction limit in optical fluorescence microscopy include stimulated emission depletion microscopy (STED) [1], stochastic optical reconstruction microscopy (STORM) [2], photoactivated localization microscopy (PALM) [3], super-resolution optical fluctuation imaging (SOFI) [4], and structured illumination microscopy (SIM) [5,6]. These methods have had a large impact in many fields, with super-resolution microscopy applied in many settings, including imaging the mouse brain using STED [7], STORM [8], and SIM approaches [9]. SIM methods have also been used for the rapid imaging of clinical samples [10].
SIM is a method in which sets of images are acquired with shifting illumination patterns. The subsequent processing of these image sets results in images with optical sectioning, resolutions beyond the diffraction limit (super-resolution), or both [5,6,11,12,13,14]. Since its emergence over two decades ago [15], SIM has matured as an imaging technique, with multiple proposed methods for generating the structured illumination patterns [12,13,14,15,16,17,18,19,20,21,22,23,24,25] and processing the image data [6,13,26,27,28,29,30]. Compared to other super-resolution techniques, the speed, high signal-to-noise ratio, and low excitation light intensities characteristic of SIM make it a good choice for imaging a variety of samples in three dimensions. As shown here, SIM can be implemented on a standard fluorescence microscope with a few additional elements, whereas other approaches such as light-sheet microscopy require more specialized setups. There is an increasing interest in the methods and applications of SIM, with the field seeing many recent (2022–2023) improvements [31,32,33,34,35,36,37], including new methods involving deep learning approaches [38,39].
The imaging method we used, maximum a posteriori probability SIM (MAP-SIM), uses a Bayesian framework to reconstruct super-resolution SIM images [26,29,40]. This method has advantages including flexibility in the range of SIM illumination patterns which can be used and, in our case, the ability to use patterns with lower spatial frequencies. This, in turn, allows imaging deeper into samples in which scattering degrades the high spatial frequency patterns more commonly used in SIM. When the SIM pattern is out of focus, it blurs rapidly with increasing depth, producing a high intensity of out-of-focus light. This results in reduced pattern contrast in the acquired images. Because of this, traditional super-resolution SIM methods are typically limited to an imaging depth of 10–20 μm [41,42]. Here, we used MAP-SIM to image a fixed, optically cleared, ~150-μm-thick mouse brain coronal slice expressing a neuronal GFP marker while achieving a lateral resolution of 144 nm.
To overcome the challenges of imaging deeper into brain tissues, SIM has previously been combined with two-photon excitation [16] or with adaptive optics for in vivo studies [43,44]. These methods have offered impressive results, but they do involve additional costs and require additional optical devices and expertise, reducing the number of labs that can use these approaches. Here, we used a simpler and more economical approach with a non-laser light source and open-source software for SIM [45].

2. Methods

The sample used for this work was an optically cleared, green fluorescent protein (GFP)-labeled coronal mouse brain slice. The slice was approximately 150 μm thick and was obtained from SunJin Lab (Hsinchu City, Taiwan). The supplier used a Thy1-GFP mouse strain and stated that the sample was prepared as follows:
  • cardiac perfusion with cold, freshly prepared 4% paraformaldehyde (PFA)
  • fixation of the dissected brain with a 4% PFA solution on an orbital shaker overnight at 4 °C followed by washing three times with phosphate-buffered saline (PBS) at room temperature
  • sectioning the brain manually using a vibratome followed by clearing of the slice with RapiClear 1.52 (SunJin Lab) overnight at room temperature
  • mounting of the cleared sample with fresh RapiClear 1.52 reagent in a 0.25-mm-deep iSpacer microchamber (SunJin Lab)
For the SIM imaging, we used a home-built set-up based on the same design as described previously [20,26,40,46]. The current SIM system was based on an IX83 microscope equipped with several objectives (Olympus, Tokyo, Japan). Illumination was provided by a liquid light guide-coupled Spectra-X light source (Lumencor, Beaverton, OR, USA) using the cyan channel, which had an emission maximum at 470 nm. The illumination was collimated by an achromatic 50 mm focal-length lens (Thorlabs, Newton, NJ, USA) and vertically polarized with a linear polarizer (Edmund Optics, Barrington, NJ, USA) before entering a polarizing beam splitter (PBS) cube (Thorlabs) and reflecting onto a liquid-crystal-on-silicon (LCOS) microdisplay (Forth Dimension Displays, Dalgety Bay, Scotland, UK). This device is a ferroelectric reflective-type spatial light modulator. Pixels that were switched on rotated the polarization of the light by ~90 degrees, converting vertical polarization to horizontal polarization. The horizontally polarized output of the microdisplay then passed through the PBS and was imaged into the microscope using a 180 mm focal-length lens (SWTLU-C, Olympus). The emitted fluorescent light was filtered (using a GFP filter set with a T495lpxr dichroic and an ET525/50 emission filter; Chroma, Bellows Falls, VT, USA) and then imaged with an sCMOS camera (Zyla 4.2+, Andor). The illumination power density at the sample with a 100× objective was measured at 2.542 W/cm² without SIM patterning (widefield illumination) and at 0.214 W/cm² with the SIM pattern active. Sample movements and focusing were controlled by an XY piezo Z stage (Applied Scientific Instrumentation, Eugene, OR, USA).
The microdisplay was used to produce the SIM patterns, and it was controlled by the software supplied with the device (MetroCon, Forth Dimension Displays). Various SIM patterns were used as shown in the supplementary material in Table S6. The pattern position was shifted by one pixel after each image was acquired such that the sum of all illumination masks resulted in homogeneous illumination. Figure 1 shows a simplified diagram of the SIM optical system and a connection diagram illustrating how the microdisplay system was synchronized with the camera using IQ software (Andor) and a digital input/output computer card (DDA06/16, Measurement Computing, Concord, NH, USA). More details about the SIM system are given in the supplementary material. Table S1 shows a list of the components we used along with the manufacturer, part number, and vendor website. Tables S2–S5 show some of the relevant optical and performance characteristics of the camera, microdisplay, and light source. These details should be useful for those wishing to build their own SIM systems of this type. The supplementary text explains, and Figure S7 shows, a schematic of the timing scheme used by the SIM system, and they illustrate the function of the AND gates shown in Figure 1. The supplementary text also explains, and Figure S8 shows, additional details about the operation of the microdisplay.

3. Data Analysis

3.1. Optical Sectioning SIM (OS-SIM)

Several data processing methods are possible for generating optically sectioned images from SIM data (OS-SIM) [20,47]. The most commonly used implementation of this technique was introduced in 1997 by Neil et al. [15]. Their method worked by projecting a line illumination pattern onto a sample, followed by the acquisition of a set of three images with the pattern shifted by the relative spatial phases 0, 2π/3, and 4π/3, respectively. Using this method, an optically sectioned image can be recovered computationally as follows:
\[ I_\text{OS-SIM} = \left[ (I_1 - I_2)^2 + (I_1 - I_3)^2 + (I_2 - I_3)^2 \right]^{1/2}, \tag{1} \]
where IOS-SIM is an optically sectioned image and I1, I2, and I3 are the three images acquired with the different pattern positions. This type of optically sectioned image is expected to be similar to that obtained with a laser scanning confocal microscope. If the sum of the individual SIM patterns results in homogeneous illumination, as was the case in our setup, a widefield (WF) image can also be recovered from the SIM data by taking the average of all images In, as follows:
\[ I_\text{WF} = \frac{1}{N} \sum_{n=1}^{N} I_n. \tag{2} \]
This was the approach we used throughout this study to generate conventional widefield images.
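As a concrete sketch, Equations (1) and (2) can be implemented in a few lines of NumPy. The synthetic frames below (a constant out-of-focus background plus an in-focus signal modulated by the shifted pattern) are our own illustrative construction, not instrument data; note how the background cancels in the sectioned image.

```python
import numpy as np

def os_sim_three_phase(i1, i2, i3):
    # Eq. (1): optical sectioning from three frames at phases 0, 2π/3, 4π/3
    return np.sqrt((i1 - i2)**2 + (i1 - i3)**2 + (i2 - i3)**2)

def widefield(frames):
    # Eq. (2): widefield image recovered as the average of all SIM frames
    return np.mean(frames, axis=0)

# Synthetic demo: in-focus signal of strength s modulated by the pattern,
# plus a constant out-of-focus background bg that cancels in Eq. (1).
theta = np.linspace(0, 4 * np.pi, 256)   # pattern phase across the image
s, bg = 2.0, 10.0
frames = [bg + s * (1 + np.cos(theta + p)) for p in (0, 2*np.pi/3, 4*np.pi/3)]
sectioned = os_sim_three_phase(*frames)  # constant, proportional to s only
```

For three equally spaced phases, the sectioned output equals (3/√2)·s everywhere, independent of both the pattern phase and the background.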
Instead of using Equation (1), in this study, we used a method originally shown by Neil et al. [15] and later elaborated upon [20,47], as follows:
\[ I_\text{OS-SIM} = \left| \sum_{n=1}^{N} I_n \exp\!\left(\frac{2\pi i n}{N}\right) \right|. \tag{3} \]
We found that this method provided consistent results and could be applied with any number of patterns, rather than only the three patterns used in the original work. The actual positions of the illumination patterns in the camera images were determined using a calibrated camera, as in our previous work [15], based on a well-known method for the spatial calibration of a camera [48].
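Equation (3) generalizes the three-phase approach to any number of patterns. A minimal NumPy sketch, again using synthetic frames of our own construction:

```python
import numpy as np

def os_sim_homodyne(frames):
    # Eq. (3): |Σ_n I_n exp(2πin/N)| for an arbitrary number N of patterns
    n = len(frames)
    weights = np.exp(2j * np.pi * np.arange(1, n + 1) / n)
    return np.abs(sum(w * f for w, f in zip(weights, frames)))

# Synthetic demo with N = 5 equally spaced pattern phases; the constant
# background bg and the unmodulated part of the signal both sum to zero
# against the complex weights, leaving only the modulated in-focus term.
theta = np.linspace(0, 4 * np.pi, 256)
s, bg, N = 2.0, 10.0, 5
frames = [bg + s * (1 + np.cos(theta + 2*np.pi*k/N)) for k in range(1, N + 1)]
sectioned = os_sim_homodyne(frames)   # constant, equal to s*N/2
```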

3.2. SIM with Maximum a Posteriori Probability Estimation

MAP-SIM has been described previously [26]. In our study, the imaging process could be denoted as follows:
\[ y_k = H M_k x + n_k, \tag{4} \]
where Mk is a matrix whose elements represent the k-th illumination pattern; yk denotes a low-resolution image acquired using the k-th illumination pattern; x is the unknown high-resolution image; and nk is additive (Gaussian) noise. H is a matrix that models the convolution of the high-resolution image with the point-spread function (PSF) of the system. Each acquired SIM image yields an instance of Equation (4) with a different illumination pattern k. The linear system formed by Equation (4) in a SIM experiment must be solved in order to reconstruct a high-resolution image; this reconstruction amounts to inverting the system of equations. In the presence of noise (nk), the inversion becomes unstable, and the problem is ill-posed. A constraint must therefore be added to stabilize the inversion and ensure the uniqueness of the solution. In this imaging model, the low-resolution images (yk), the high-resolution image (x), and the noise (nk) are measurement-dependent.
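A small simulation of the forward model in Equation (4) makes the roles of H and Mk concrete. This sketch assumes Mk acts as an elementwise illumination mask and H as a circular (FFT-based) convolution; both are illustrative modeling choices, not the exact implementation used here.

```python
import numpy as np

def forward_model(x, pattern, psf, noise_sigma=0.0, rng=None):
    # Eq. (4): y_k = H M_k x + n_k. M_k is applied as an elementwise
    # illumination mask; H is a circular convolution with the PSF via the FFT.
    modulated = pattern * x
    blurred = np.real(np.fft.ifft2(np.fft.fft2(modulated) * np.fft.fft2(psf)))
    if noise_sigma > 0:
        rng = rng or np.random.default_rng(0)
        blurred = blurred + noise_sigma * rng.standard_normal(x.shape)
    return blurred

# Demo: with a delta-function PSF, H is the identity and y_k = M_k x exactly.
rng = np.random.default_rng(1)
x = rng.random((16, 16))
pattern = np.tile((np.arange(16) % 3 == 0).astype(float), (16, 1))
psf = np.zeros((16, 16)); psf[0, 0] = 1.0
y = forward_model(x, pattern, psf)
```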
We modeled the PSF as an Airy disk which, in Fourier space, would lead to an optical transfer function (OTF) of the form [49] as follows:
\[ \mathrm{OTF}(f) = \frac{1}{\pi} \left[ 2\cos^{-1}\!\left(\frac{f}{f_c}\right) - \sin\!\left(2\cos^{-1}\!\left(\frac{f}{f_c}\right)\right) \right], \tag{5} \]
where f is the spatial frequency. We estimated the cut-off frequency (fc) by calculating the radial average of the power spectral density (PSD) of a widefield image of 100 nm fluorescent beads [50]. This could also be calculated by taking the Rayleigh limit of the resolution d = 0.61λ/NA and expressing this value in terms of spatial frequency (1/d).
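Equation (5) and the Rayleigh-limit estimate of the cut-off frequency can be sketched as follows; the numerical example in the note below is our own.

```python
import numpy as np

def incoherent_otf(f, fc):
    # Eq. (5): diffraction-limited incoherent OTF for an Airy-disk PSF,
    # clipped to zero beyond the cut-off frequency fc
    rho = np.clip(np.abs(f) / fc, 0.0, 1.0)
    a = np.arccos(rho)
    return (1.0 / np.pi) * (2.0 * a - np.sin(2.0 * a))

def rayleigh_cutoff(wavelength_nm, na):
    # f_c estimated as 1/d with the Rayleigh limit d = 0.61 λ / NA
    return na / (0.61 * wavelength_nm)   # cycles per nm

fc = rayleigh_cutoff(525.0, 1.4)
```

For example, λ = 525 nm and NA = 1.4 give d = 0.61 × 525/1.4 ≈ 229 nm, i.e., f_c ≈ 4.4 cycles/μm; the OTF falls monotonically from 1 at f = 0 to 0 at f = f_c.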
Using a Bayesian approach [26,27,29,51,52,53,54], high-resolution image estimation can be expressed as the minimization of a cost function:
\[ x_\text{HR-MAP} = \underset{x}{\arg\min} \left[ \sum_{k=1}^{K} \left\| y_k - H M_k x \right\|^2 + \lambda \Gamma(x) \right]. \tag{6} \]
The cost function in Equation (6) consists of two terms. The first term describes the mean square error between the estimated HR image and the observed LR images. The second term (λΓ(x)) is a regularization term. To ensure positivity and promote a smoothness condition, we relied on quadratic regularization [54]. The contribution of Γ(x) was controlled by the parameter λ, a small positive constant defining the strength of the regularization (typically, λ = 0.01). We solved Equation (6) using gradient descent methods [54].
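A toy gradient-descent solver for Equation (6) can be sketched as follows. This is a simplified stand-in for the published MAP-SIM implementation: it uses a plain Tikhonov (quadratic) regularizer, absorbs constant factors into the step size, and enforces positivity by clipping.

```python
import numpy as np

def map_sim_gd(ys, patterns, psf, lam=0.01, lr=0.1, iters=200):
    # Gradient descent on Eq. (6). H is applied as a circular convolution with
    # the PSF via the FFT; its adjoint uses the conjugate transfer function.
    H = np.fft.fft2(psf)
    conv = lambda img, tf: np.real(np.fft.ifft2(np.fft.fft2(img) * tf))
    x = np.mean(ys, axis=0)               # initialize from the widefield average
    for _ in range(iters):
        grad = lam * x                    # gradient of the quadratic regularizer
        for y, m in zip(ys, patterns):
            r = conv(m * x, H) - y        # residual H M_k x - y_k
            grad += m * conv(r, np.conj(H))   # back-projection M_k^T H^T r
            # (constant factors of 2 are absorbed into the step size lr)
        x = np.clip(x - lr * grad, 0.0, None)  # descend and enforce positivity
    return x
```

With noiseless data, complementary binary patterns, and a delta PSF, this recovers the ground-truth image to high accuracy, which is a useful sanity check on the back-projection step.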

3.3. Spectral Merging

MAP estimation of high-resolution images obtained with structured illumination enables the reconstruction of high-resolution images (HR-MAP) with details that are unresolvable in a widefield microscope. However, MAP estimation, as described above, does not suppress out-of-focus light. On the other hand, the processing method according to Equation (3) used in optical sectioning SIM [15,20] provides images (LR-HOM) with optical sectioning. Noting that the unwanted, out-of-focus light is dominant at low spatial frequencies, we merged the LR-HOM and HR-MAP images in the frequency domain to obtain the final HR image (MAP-SIM). For 3D data, this is performed slice by slice, resulting in a Z-stack of SIM images. Frequency-domain Gaussian low-pass filtering was applied to the LR-HOM image, and a complementary high-pass filter was applied to the HR-MAP image. We used a weighting scheme described by the following equation:
\[ x_\text{MAP-SIM} = \mathcal{F}^{-1} \left\{ (1-\beta)\, \mathcal{F}\{x_\text{LR-HOM}\} \exp\!\left(-\frac{f^2}{2\sigma^2}\right) + \beta\, \mathcal{F}\{x_\text{HR-MAP}\} \left(1 - \exp\!\left(-\frac{f^2}{2\sigma^2}\right)\right) \right\}, \tag{7} \]
where \(\mathcal{F}\) and \(\mathcal{F}^{-1}\) denote the Fourier transform operator and its inverse, respectively; f is the spatial frequency; σ is the standard deviation of the Gaussian filter; and β is a weighting coefficient. We typically set β = 0.85, and we used a standard incoherent apodizing function to shape the MAP-SIM spectrum before the final inverse FFT.
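The merging step of Equation (7) reduces to a few FFT operations. The sketch below omits the apodization step and uses normalized spatial frequencies; the parameter values are illustrative.

```python
import numpy as np

def spectral_merge(lr_hom, hr_map, sigma, beta=0.85):
    # Eq. (7): Gaussian low-pass on the optically sectioned (LR-HOM) image,
    # complementary high-pass on the MAP (HR-MAP) image, merged in Fourier
    # space. The apodization mentioned in the text is omitted in this sketch.
    ny, nx = lr_hom.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    g = np.exp(-(fx**2 + fy**2) / (2.0 * sigma**2))    # Gaussian low-pass
    spectrum = ((1.0 - beta) * np.fft.fft2(lr_hom) * g
                + beta * np.fft.fft2(hr_map) * (1.0 - g))
    return np.real(np.fft.ifft2(spectrum))
```

One observable consequence of the weighting: the DC component of the merged image comes entirely from the low-passed LR-HOM term scaled by (1 − β), since the high-pass weight vanishes at zero frequency.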

4. Results

To acquire an overview of the slice with SIM methods, we first imaged using a 10×/0.4 NA water immersion objective. We acquired 60 image positions with a 20-percent overlap between each position and with 12 z-planes. In this image, the Z-plane spacing was 20 μm. Image stitching was accomplished using our lab’s methods and an ImageJ plugin [55], as shown in [46]. A composite image of the slice is shown in Figure 2, and it is color-coded based on depth using the isolum color table [56]. This image was acquired in 5 min and 30 s, with an additional 15 min and 35 s required for the OS-SIM processing, according to Equation (3). The final image was 8.4 GB in size, and it had 16,859 × 10,378 × 12 pixels.
This slice was matched to Paxinos and Franklin’s mouse brain atlas [57] to identify which section of the brain was being imaged. Our slice was visually matched with slice 64. We further matched our sample to slice 92 of 132 in the Allen brain atlas [58,59]. Second-order polynomial fits were made for both the horizontal and vertical directions using the edges and the central aqueduct as reference points. This allowed any point on this brain slice, recorded from the microscope stage coordinates, to be translated into the coordinates of the atlas. This method placed the neuron shown in Figure 3 in the temporal association area (TeA) of the mouse brain isocortex, as indicated by the yellow box in Figure 2a.
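The stage-to-atlas mapping described above can be sketched with independent second-order polynomial fits per axis. The calibration values below are hypothetical placeholders, not the actual landmark coordinates used in this work.

```python
import numpy as np

# Hypothetical calibration: stage x-coordinates (mm) of landmarks (slice edges
# and the central aqueduct) paired with their atlas x-coordinates. The numbers
# are illustrative only; a real calibration would use measured landmarks.
stage_x = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])
atlas_x = 0.02 * stage_x**2 + 1.1 * stage_x + 0.3   # a known quadratic, for the demo

# Second-order polynomial fit mapping stage coordinates to atlas coordinates;
# the vertical direction is fit independently in the same way.
to_atlas_x = np.poly1d(np.polyfit(stage_x, atlas_x, deg=2))
```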

Imaging Deep Neurons

To demonstrate MAP-SIM’s ability to image deeper into the sample than traditional SR-SIM, a TeA neuron 41–66 μm deep was imaged. The depth was measured using the closed-loop piezo stage. A 100×/1.4 NA oil immersion objective was used with an exposure time of 300 ms per SIM phase. This image is shown in Figure 3. The profile of a dendritic spine neck was also measured (Figure 3d,e). The profile was fit in MATLAB using a Gaussian function weighted by the square root of the counts, with nonlinear least squares methods. The full width at half-maximum (FWHM) was determined to be 164.0 ± 4.9 nm. To determine the image resolution, we calculated the power spectral density (PSD) as previously described [50]. We found that the WF image had a resolution of 247.6 nm while the MAP-SIM image had a resolution of 143.6 nm, an improvement of ~1.7-fold. These results are summarized in Table 1. Figures S1–S3 show additional images of cortical neurons imaged with MAP-SIM and a resolution analysis by Fourier ring correlation (FRC) [60,61] and PSD methods. The FRC measurements indicated a MAP-SIM resolution of approximately 150–160 nm, in good agreement with the 144 nm measured by the PSD methods.
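The spine-profile measurement was performed in MATLAB; an equivalent Python sketch using SciPy's curve_fit (our translation, not the authors' code) might look like this:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma, c):
    return a * np.exp(-(x - mu)**2 / (2.0 * sigma**2)) + c

def profile_fwhm(x, counts):
    # Nonlinear least-squares Gaussian fit of a line profile. Weighting by the
    # square root of the counts is expressed here as per-point uncertainties
    # proportional to 1/sqrt(counts).
    p0 = [counts.max() - counts.min(), x[np.argmax(counts)],
          (x[-1] - x[0]) / 4.0, counts.min()]
    popt, _ = curve_fit(gaussian, x, counts, p0=p0,
                        sigma=1.0 / np.sqrt(np.maximum(counts, 1.0)))
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(popt[2])  # FWHM = 2*sqrt(2 ln 2)*sigma
```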
A comparison of widefield, basic OS-SIM (Equation (3)), and MAP-SIM (Equations (6) and (7)) for this same cortical neuron is shown in Figure 4 and further analyzed in Table 1. As is evident in the figure, widefield had the largest background due to out-of-focus light, with basic OS-SIM providing optical sectioning and MAP-SIM providing both optical sectioning and super-resolution. The imaging depth of 41–66 μm exceeded the depth limit of traditional SIM by approximately three-fold.
We further imaged a neuron at a depth of 71–83 μm. This is shown in Figure 5. In this particular image, the resolution, measured by calculating the PSD, was 161 nm. While this was a decrease in resolution from the shallower neuron shown in Figure 3, it still surpassed the diffraction limit. Imaging at approximately 100 μm using these methods often resulted in images with large amounts of noise, and so the maximum imaging depth with a 100× objective in this sample appeared to be approximately 85 μm. Using a 60× objective, we were able to image up to 113 μm, as shown in Figure 6.
In addition to the cortical neurons imaged in Figure 3, Figure 4 and Figure 5 (and in Figures S1, S2 and S4), we also imaged an area of the brain in which a higher proportion of the neurons expressed the GFP marker (SUBv-sp subiculum, ventral part, pyramidal layer, also see Figure S5). This is shown in Figure 6. The maximum-intensity projection images, shown at different depths, showed good imaging at all depths. In addition, the imaging quality remained high even at depths past 100 μm using this objective (60× oil immersion).
Typically, SIM uses high-frequency patterns to maximize the obtainable resolution, but this limits the imaging depth due to scattering and the generation of large amounts of background fluorescence. The pattern used here had a lower spatial frequency to penetrate deeper into the mouse brain while maintaining the pattern integrity. A comparison of (cropped) images acquired using a high-frequency pattern (i.e., one out of three microdisplay pixels activated) and our lower-frequency pattern (i.e., two out of ten microdisplay pixels activated) is shown in Figure 7. Also shown is a plot of the measured modulation of the SIM pattern vs. depth for various SIM patterns using a thick fluorescent plastic slide (obtained from Chroma). The modulation, measured as the average of (max − min)/(max + min) in a region of interest, fell as the pattern spatial frequency increased and as the depth increased. This was expected for the incoherent illumination used here, because the incoherent optical transfer function applies [49]. The higher-frequency pattern resulted in a weaker signal and poorer image reconstruction when imaging deep into the sample, as shown in Figure 7c.
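The modulation measurement can be sketched as follows; the sinusoidal test pattern is synthetic and chosen so the expected contrast is known in advance.

```python
import numpy as np

def pattern_modulation(roi, axis=1):
    # Michelson contrast (max - min)/(max + min) computed per line across the
    # pattern direction, then averaged over the region of interest.
    mx = roi.max(axis=axis)
    mn = roi.min(axis=axis)
    return float(np.mean((mx - mn) / (mx + mn)))

# Synthetic ROI: sinusoidal SIM pattern with offset 2 and amplitude 1,
# giving an expected modulation of (3 - 1)/(3 + 1) = 0.5.
phase = np.linspace(0.0, 4.0 * np.pi, 201)
roi = 2.0 + np.cos(phase)[None, :] * np.ones((8, 1))
```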

5. Discussion

By combining a structured illumination microscope with a large field-of-view and an image reconstruction method based on Bayesian statistics, we demonstrated synapse-resolving meso- and micro-scale volumetric imaging in an optically cleared coronal slice of adult mouse brain. The use of MAP-SIM and sample-optimized illumination patterns allowed us to collect super-resolution images well beyond the typical depth limit for SIM.
Compared to other super-resolution methods, SIM has poorer resolution. For example, the 144 nm lateral resolution achieved here is worse than the approximately 20 nm resolution typically achieved with STORM. However, SIM requires only ~15 images (or fewer, depending on the method used) to reconstruct a super-resolution image. This is far fewer than the 20,000 (or more) images usually required for STORM, making SIM much faster and, therefore, a viable option for live-cell imaging. The excitation power needed for SIM is also much lower than that used in STORM: here, we used 0.214 W/cm² for SIM compared to the 2 kW/cm² we previously used for STORM [62]. We found that photobleaching in our experiments was minimal (Figure S6).
Most of the progress in super-resolution SIM has been in the acquisition and processing of images, but SIM has also been applied in detailed biological studies; for example, in a study of dendritic spines [9], the authors developed a method for reconstructing and measuring the surface geometries of dendritic spines from 3D-SIM images. By adopting more flexible strategies for image acquisition and processing, such as the methods shown here, SIM is expected to be used more frequently in biological studies, including those of dense tissues such as brain tissue.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/bioengineering10091081/s1, Table S1. Main components of the SIM system; Table S2. Camera parameters; Table S3. Light source parameters; Table S4. Microdisplay parameters; Table S5. Camera parameters for the machine vision camera used (only) in Figures S4 and S5; Table S6. Parameters of the imaging data; Figure S1: Cortical neuron; Figure S2: Enlarged views; Figure S3: Resolution analysis; Figure S4: Cortical neuron; Figure S5: Neurons of the midbrain; Figure S6: Photobleaching analysis; Figure S7: SIM system details; Figure S8: SIM system diagrams.

Author Contributions

T.C.P.: acquired the data, analyzed the data, and wrote the paper; K.A.J.: acquired the data and analyzed the data; G.M.H.: conceived the project, acquired the data, analyzed the data, supervised the research, and wrote the paper. All authors have read and agreed to the published version of the manuscript.

Funding

The research reported in this publication was supported by the National Institute of General Medical Sciences of the National Institutes of Health under award number 2R15GM128166-02. This work was also supported by the UCCS BioFrontiers center.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data are available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hell, S.W.; Wichmann, J. Breaking the diffraction resolution limit by stimulated emission: Stimulated-emission-depletion fluorescence microscopy. Opt. Lett. 1994, 19, 780. [Google Scholar] [CrossRef] [PubMed]
  2. Rust, M.J.; Bates, M.; Zhuang, X. Sub-diffraction-limit imaging by stochastic optical reconstruction microscopy (STORM). Nat. Methods 2006, 3, 793–795. [Google Scholar] [CrossRef] [PubMed]
  3. Betzig, E.; Patterson, G.H.; Sougrat, R.; Lindwasser, O.W.; Olenych, S.; Bonifacino, J.S.; Davidson, M.W.; Lippincott-Schwartz, J.; Hess, H.F. Imaging intracellular fluorescent proteins at nanometer resolution. Science 2006, 313, 1642–1645. [Google Scholar] [CrossRef] [PubMed]
  4. Dertinger, T.; Xu, J.; Naini, O.; Vogel, R.; Weiss, S. SOFI-based 3D superresolution sectioning with a widefield microscope. Opt. Nanoscopy 2012, 1, 2. [Google Scholar] [CrossRef] [PubMed]
  5. Gustafsson, M.G.L. Surpassing the lateral resolution limit by a factor of two using structured illumination microscopy. J. Microsc. 2000, 198, 82–87. [Google Scholar] [CrossRef]
  6. Heintzmann, R.; Cremer, C. Laterally modulated excitation microscopy: Improvement of resolution by using a diffraction grating. Proc. SPIE 1998, 3568, 185–196. [Google Scholar]
  7. Berning, S.; Willig, K.I.; Steffens, H.; Dibaj, P.; Hell, S.W. Nanoscopy in a Living Mouse Brain. Science 2012, 335, 551. [Google Scholar] [CrossRef]
  8. Mlodzianoski, M.J.; Cheng-Hathaway, P.J.; Bemiller, S.M.; McCray, T.J.; Liu, S.; Miller, D.A.; Lamb, B.T.; Landreth, G.E.; Huang, F. Active PSF shaping and adaptive optics enable volumetric localization microscopy through brain sections. Nat. Methods 2018, 15, 583–586. [Google Scholar] [CrossRef]
  9. Kashiwagi, Y.; Higashi, T.; Obashi, K.; Sato, Y.; Komiyama, N.H.; Grant, S.G.N.; Okabe, S. Computational geometry analysis of dendritic spines by structured illumination microscopy. Nat. Commun. 2019, 10, 1285. [Google Scholar] [CrossRef]
  10. Wang, M.; Tulman, D.B.; Sholl, A.B.; Kimbrell, H.Z.; Mandava, S.H.; Elfer, K.N.; Luethy, S.; Maddox, M.M.; Lai, W.; Lee, B.R.; et al. Gigapixel surface imaging of radical prostatectomy specimens for comprehensive detection of cancer-positive surgical margins using structured illumination microscopy. Sci. Rep. 2016, 6, 27419. [Google Scholar] [CrossRef]
  11. Gao, L.; Shao, L.; Higgins, C.D.; Poulton, J.S.; Peifer, M.; Davidson, M.W.; Wu, X.; Goldstein, B.; Betzig, E. Noninvasive imaging beyond the diffraction limit of 3D dynamics in thickly fluorescent specimens. Cell 2012, 151, 1370–1385. [Google Scholar] [CrossRef]
  12. Kner, P.; Chhun, B.B.; Griffis, E.R.; Winoto, L.; Gustafsson, M.G.L. Super-resolution video microscopy of live cells by structured illumination. Nat. Methods 2009, 6, 339–342. [Google Scholar] [CrossRef]
  13. Gustafsson, M.G.L.; Shao, L.; Carlton, P.M.; Wang, C.J.R.; Golubovskaya, I.N.; Cande, W.Z.; Agard, D.A.; Sedat, J.W. Three-dimensional resolution doubling in wide-field fluorescence microscopy by structured illumination. Biophys. J. 2008, 94, 4957–4970. [Google Scholar] [CrossRef]
  14. Schermelleh, L.; Carlton, P.M.; Haase, S.; Shao, L.; Winoto, L.; Kner, P.; Burke, B.; Cardoso, M.C.; Agard, D.A.; Gustafsson, M.G.L.; et al. Subdiffraction multicolor imaging of the nuclear periphery with 3D structured illumination microscopy. Science 2008, 320, 1332–1336. [Google Scholar] [CrossRef]
  15. Neil, M.A.A.; Juškaitis, R.; Wilson, T. Method of obtaining optical sectioning by using structured light in a conventional microscope. Opt. Lett. 1997, 22, 1905. [Google Scholar] [CrossRef]
  16. Pilger, C.; Pospíšil, J.; Müller, M.; Ruoff, M.; Schütte, M.; Spiecker, H.; Huser, T. Super-resolution fluorescence microscopy by line-scanning with an unmodified two-photon microscope. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2021, 379, 20200300. [Google Scholar] [CrossRef] [PubMed]
  17. Liu, W.; Liu, Q.; Zhang, Z.; Han, Y.; Kuang, C.; Xu, L.; Yang, H.; Liu, X. Three-dimensional super-resolution imaging of live whole cells using galvanometer-based structured illumination microscopy. Opt. Express 2019, 27, 7237. [Google Scholar] [CrossRef] [PubMed]
  18. Brown, P.T.; Kruithoff, R.; Seedorf, G.J.; Shepherd, D.P. Multicolor structured illumination microscopy and quantitative control of polychromatic light with a digital micromirror device. Biomed. Opt. Express 2021, 12, 3700. [Google Scholar] [CrossRef] [PubMed]
  19. Chang, B.J.; Meza, V.D.P.; Stelzer, E.H.K. csiLSFM combines light-sheet fluorescence microscopy and coherent Structured illumination for a lateral resolution below 100 nm. Proc. Natl. Acad. Sci. USA 2017, 114, 4869–4874. [Google Scholar] [CrossRef] [PubMed]
  20. Křížek, P.; Raška, I.; Hagen, G.M. Flexible structured illumination microscope with a programmable illumination array. Opt. Express 2012, 20, 24585. [Google Scholar] [CrossRef] [PubMed]
  21. Rossberger, S.; Best, G.; Baddeley, D.; Heintzmann, R.; Birk, U.; Dithmar, S.; Cremer, C. Combination of structured illumination and single molecule localization microscopy in one setup. J. Opt. 2013, 15, 094003. [Google Scholar] [CrossRef]
  22. Young, L.J.; Ströhl, F.; Kaminski, C.F. A Guide to Structured Illumination TIRF Microscopy at High Speed with Multiple Colors. J. Vis. Exp. 2016, 111, e53988. [Google Scholar]
  23. Poher, V.; Zhang, H.X.; Kennedy, G.T.; Griffin, C.; Oddos, S.; Gu, E.; Elson, D.S.; Girkin, M.; French, P.M.W.; Dawson, M.D.; et al. Optical sectioning microscope with no moving parts using a micro-stripe array light emitting diode. Opt. Express 2007, 15, 11196–11206. [Google Scholar] [CrossRef] [PubMed]
  24. Pospíšil, J.; Wiebusch, G.; Fliegel, K.; Klíma, M.; Huser, T. Highly compact and cost-effective 2-beam super-resolution structured illumination microscope based on all-fiber optic components. Opt. Express 2021, 29, 11833. [Google Scholar] [CrossRef]
  25. Hinsdale, T.A.; Stallinga, S.; Rieger, B. High-speed multicolor structured illumination microscopy using a hexagonal single mode fiber array. Biomed. Opt. Express 2021, 12, 1181. [Google Scholar] [CrossRef]
26. Lukeš, T.; Křížek, P.; Švindrych, Z.; Benda, J.; Ovesný, M.; Fliegel, K.; Klíma, M.; Hagen, G.M. Three-dimensional super-resolution structured illumination microscopy with maximum a posteriori probability image estimation. Opt. Express 2014, 22, 29805–29817.
27. Orieux, F.; Sepulveda, E.; Loriette, V.; Dubertret, B.; Olivo-Marin, J.C. Bayesian estimation for optimized structured illumination microscopy. IEEE Trans. Image Process. 2012, 21, 601–614.
28. Huang, X.; Fan, J.; Li, L.; Liu, H.; Wu, R.; Wu, Y.; Wei, L.; Mao, H.; Lal, A.; Xi, P.; et al. Fast, long-term, super-resolution imaging with Hessian structured illumination microscopy. Nat. Biotechnol. 2018, 36, 451–459.
29. Lukeš, T.; Hagen, G.M.; Křížek, P.; Švindrych, Z.; Fliegel, K.; Klíma, M. Comparison of image reconstruction methods for structured illumination microscopy. Proc. SPIE 2014, 9129, 91293J.
30. Chakrova, N.; Rieger, B.; Stallinga, S. Deconvolution methods for structured illumination microscopy. J. Opt. Soc. Am. A 2016, 33, B12.
31. Chen, X.; Zhong, S.; Hou, Y.; Cao, R.; Wang, W.; Li, D.; Dai, Q.; Kim, D.; Xi, P. Superresolution structured illumination microscopy reconstruction algorithms: A review. Light Sci. Appl. 2023, 12, 172.
32. Mo, Y.; Wang, K.; Li, L.; Xing, S.; Ye, S.; Wen, J.; Duan, X.; Luo, Z.; Gou, W.; Chen, T.; et al. Quantitative structured illumination microscopy via a physical model-based background filtering algorithm reveals actin dynamics. Nat. Commun. 2023, 14, 3089.
33. Cao, R.; Li, Y.; Chen, X.; Ge, X.; Li, M.; Guan, M.; Hou, Y.; Fu, Y.; Xu, X.; Jiang, S.; et al. Open-3DSIM: An open-source three-dimensional structured illumination microscopy reconstruction platform. Nat. Methods 2023, 20, 1183–1186.
34. Hannebelle, M.T.; Raeth, E.; Leitao, S.M.; Lukeš, T.; Pospíšil, J.; Toniolo, C.; Venzin, O.F.; Chrisnandy, A.; Swain, P.P.; Ronceray, N.; et al. OpenSIM: Open source microscope add-on for structured illumination microscopy. bioRxiv 2023, 2023.06.16.545316.
35. Wen, G.; Li, S.; Liang, Y.; Wang, L.; Zhang, J.; Chen, X.; Jin, X.; Chen, C.; Tang, Y.; Li, H. Spectrum-optimized direct image reconstruction of super-resolution structured illumination microscopy. PhotoniX 2023, 4, 19.
36. Li, X.; Wu, Y.; Su, Y.; Rey-Suarez, I.; Matthaeus, C.; Updegrove, T.B.; Wei, Z.; Zhang, L.; Sasaki, H.; Li, Y.; et al. Three-dimensional structured illumination microscopy with enhanced axial resolution. Nat. Biotechnol. 2023, 1–13.
37. Johnson, K.A.; Noble, D.; Machado, R.; Paul, T.C.; Hagen, G.M. Flexible Multiplane Structured Illumination Microscope with a Four-Camera Detector. Photonics 2022, 9, 501.
38. Luo, F.; Zeng, J.; Shao, Z.; Zhang, C. Fast structured illumination microscopy via transfer learning with correcting. Opt. Lasers Eng. 2023, 162, 107432.
39. Burns, Z.; Liu, Z. Untrained, physics-informed neural networks for structured illumination microscopy. Opt. Express 2023, 31, 8714–8724.
40. Pospíšil, J.; Lukeš, T.; Bendesky, J.; Fliegel, K.; Spendier, K.; Hagen, G.M. Imaging tissues and cells beyond the diffraction limit with structured illumination microscopy and Bayesian image reconstruction. Gigascience 2018, 8, giy126.
41. Wu, Y.; Shroff, H. Faster, sharper, and deeper: Structured illumination microscopy for biological imaging. Nat. Methods 2018, 15, 1011–1019.
42. Mandula, O.; Kielhorn, M.; Wicker, K.; Krampert, G.; Kleppe, I.; Heintzmann, R. Line scan—Structured illumination microscopy super-resolution imaging in thick fluorescent samples. Opt. Express 2012, 20, 24167–24174.
43. Li, Z.; Zhang, Q.; Chou, S.W.; Newman, Z.; Turcotte, R.; Natan, R.; Dai, Q.; Isacoff, E.Y.; Ji, N. Fast widefield imaging of neuronal structure and function with optical sectioning in vivo. Sci. Adv. 2020, 6, eaaz3870.
44. Lu, R.; Liang, Y.; Meng, G.; Zhou, P.; Svoboda, K.; Paninski, L.; Ji, N. Rapid mesoscale volumetric imaging of neural activity with synaptic resolution. Nat. Methods 2020, 17, 291–294.
45. Křížek, P.; Lukeš, T.; Ovesný, M.; Fliegel, K.; Hagen, G.M. SIMToolbox: A MATLAB toolbox for structured illumination fluorescence microscopy. Bioinformatics 2015, 32, 318–320.
46. Johnson, K.A.; Hagen, G.M. Artifact-free whole-slide imaging with structured illumination microscopy and Bayesian image reconstruction. Gigascience 2021, 9, giaa035.
47. Heintzmann, R. Structured illumination methods. In Handbook of Biological Confocal Microscopy; Pawley, J.B., Ed.; Springer: New York, NY, USA, 2006; pp. 265–279.
48. Šonka, M.; Hlaváč, V.; Boyle, R. Image Processing, Analysis and Machine Vision, 2nd ed.; PWS Publishing: Boston, MA, USA, 1998.
49. Goodman, J.W. Frequency Analysis of Optical Imaging Systems. In Introduction to Fourier Optics; McGraw-Hill: New York, NY, USA, 1968; pp. 126–171; ISBN 0-07-024254-2.
50. Pospíšil, J.; Fliegel, K.; Klíma, M. Assessing resolution in live cell structured illumination microscopy. In Proceedings of SPIE—The International Society for Optical Engineering; Páta, P., Fliegel, K., Eds.; SPIE: Prague, Czech Republic, 2017; Volume 10603, p. 39.
51. Verveer, P.J.; Jovin, T.M. Efficient superresolution restoration algorithms using maximum a posteriori estimations with application to fluorescence microscopy. J. Opt. Soc. Am. A 1997, 14, 1696.
52. Verveer, P.J.; Gemkow, M.J.; Jovin, T.M. A comparison of image restoration approaches applied to three-dimensional confocal and wide-field fluorescence microscopy. J. Microsc. 1999, 193, 50–61.
53. Vermolen, B.J.; Garini, Y.; Young, I.T. 3D restoration with multiple images acquired by a modified conventional microscope. Microsc. Res. Tech. 2004, 64, 113–125.
54. Chaudhuri, S. Super-Resolution Imaging; Milanfar, P., Ed.; CRC Press: Boca Raton, FL, USA, 2011; ISBN 978-1-4398-1931-9.
55. Preibisch, S.; Saalfeld, S.; Tomancak, P. Globally optimal stitching of tiled 3D microscopic image acquisitions. Bioinformatics 2009, 25, 1463–1465.
56. Geissbuehler, M.; Lasser, T. How to display data by color schemes compatible with red-green color perception deficiencies. Opt. Express 2013, 21, 9862.
57. Paxinos, G.; Franklin, K.B.J. The Mouse Brain in Stereotaxic Coordinates, 2nd ed.; Academic Press: New York, NY, USA, 2001; ISBN 9780128161579.
58. Allen Institute for Brain Science. Allen Mouse Brain Atlas [Dataset]. Available online: https://mouse.brain-map.org/ (accessed on 24 May 2023).
59. Lein, E.S.; Hawrylycz, M.J.; Ao, N.; Ayres, M.; Bensinger, A.; Bernard, A.; Boe, A.F.; Boguski, M.S.; Brockway, K.S.; Byrnes, E.J.; et al. Genome-wide atlas of gene expression in the adult mouse brain. Nature 2007, 445, 168–176.
60. Nieuwenhuizen, R.P.J.; Lidke, K.A.; Bates, M.; Puig, D.L.; Grünwald, D.; Stallinga, S.; Rieger, B. Measuring image resolution in optical nanoscopy. Nat. Methods 2013, 10, 557–562.
61. Culley, S.; Albrecht, D.; Jacobs, C.; Pereira, P.M.; Leterrier, C.; Mercer, J.; Henriques, R. Quantitative mapping and minimization of super-resolution optical imaging artifacts. Nat. Methods 2018, 15, 263–266.
62. Smirnov, E.; Borkovec, J.; Kováčik, L.; Svidenská, S.; Schröfel, A.; Skalníková, M.; Švindrych, Z.; Křížek, P.; Ovesný, M.; Hagen, G.M.; et al. Separation of replication and transcription domains in nucleoli. J. Struct. Biol. 2014, 188, 259–266.
Figure 1. Simplified optical diagram (left) and connection diagram (right). The connection setup for the two-wavelength acquisition is shown, and in this study, only 470 nm illumination was used.
Figure 2. (a) Overview of the OS-SIM image. The yellow box indicates the temporal association area where the neurons were imaged with super-resolution MAP-SIM. (b) Nissl (left) and anatomical annotations (right) from the Allen Mouse Brain Atlas and the Allen Reference Atlas—Mouse Brain, at the same slice position as (a) (slice 92 of 132, Allen Mouse Brain Atlas, mouse.brain-map.org and atlas.brain-map.org).
Figure 3. (a) TeA neuron imaged at a depth of 41 μm to 66 μm using a 100×/1.4 NA oil immersion objective. (b,c) Zoomed-in views of the selected areas indicated in (a) by yellow boxes. The width of the spine neck, selected in (d), was fit to a Gaussian function (FWHM 164.0 ± 4.9 nm).
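The spine-neck width in Figure 3d is obtained by fitting a line profile with a Gaussian and reporting its full width at half maximum. The sketch below illustrates how such a fit could be performed with SciPy; the function names and the synthetic profile are our own illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    """1D Gaussian on a constant background."""
    return amp * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) + offset

def fwhm_from_profile(positions_nm, intensities):
    """Fit a Gaussian to an intensity line profile and return its FWHM
    in the same units as `positions_nm`."""
    p0 = [intensities.max() - intensities.min(),       # amplitude guess
          positions_nm[np.argmax(intensities)],        # center guess
          (positions_nm[-1] - positions_nm[0]) / 4,    # width guess
          intensities.min()]                           # background guess
    popt, _ = curve_fit(gaussian, positions_nm, intensities, p0=p0)
    # FWHM = 2*sqrt(2*ln 2) * sigma for a Gaussian
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(popt[2])

# Synthetic, noiseless profile resembling a ~164 nm wide spine neck
x = np.linspace(0.0, 1000.0, 101)  # positions in nm
true_sigma = 164.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))
y = gaussian(x, 100.0, 500.0, true_sigma, 10.0)
print(round(fwhm_from_profile(x, y), 1))  # prints 164.0
```

On real data the profile is noisy, so `curve_fit` also returns a covariance matrix from which an uncertainty such as the ±4.9 nm quoted in the caption can be derived.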
Figure 4. (a) TeA neuron shown in widefield, basic OS-SIM, and MAP-SIM. (b) MAP-SIM image color-coded by depth.
Figure 5. (a) TeA neuron imaged at a depth of 71 μm to 83 μm using a 100×/1.4 NA oil immersion objective. The inset shows the fast Fourier transform (FFT) of the image in (a), the boundary of which indicates the resolution. (b) Zoomed-in view of the selected area indicated in (a) by a yellow box. (c) A measurement of the resolution determined by measuring the power spectral density. (d) The MAP-SIM image color-coded by depth.
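Figure 5c estimates resolution from the power spectral density: the radially averaged PSD decays until it meets the noise floor at a cutoff frequency, and the resolution is taken as the reciprocal of that cutoff. A minimal sketch of this idea follows; the helper names and the specific noise-floor criterion are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def radial_psd(image, pixel_size_nm):
    """Radially averaged power spectral density of a 2D image.
    Returns spatial frequencies (cycles/nm) and the mean power per bin."""
    n = min(image.shape)
    img = image[:n, :n] - image[:n, :n].mean()
    psd = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    fx = np.fft.fftshift(np.fft.fftfreq(n, d=pixel_size_nm))
    fxx, fyy = np.meshgrid(fx, fx)
    r = np.sqrt(fxx ** 2 + fyy ** 2).ravel()
    bins = np.linspace(0.0, fx.max(), n // 2)
    idx = np.digitize(r, bins)
    radial = np.array([psd.ravel()[idx == i].mean()
                       for i in range(1, len(bins))])
    return bins[1:], radial

def resolution_from_psd(freqs, radial, threshold=2.0):
    """Estimate resolution as 1/f_c, where f_c is the highest frequency whose
    power stays above `threshold` times the high-frequency noise floor
    (a simple illustrative criterion)."""
    noise_floor = radial[-len(radial) // 10:].mean()
    above = np.nonzero(radial > threshold * noise_floor)[0]
    return 1.0 / freqs[above[-1]]
```

In practice the cutoff is read from the point where the radial PSD curve flattens into the noise floor, which corresponds to the visible boundary of the FFT inset in Figure 5a.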
Figure 6. Neurons of the subiculum, ventral part, pyramidal layer (SUBv-sp), with an imaging depth of 0 to 113 μm (60×/1.42 NA oil immersion objective). The maximum-intensity projections of the imaged area have depths of (a) 0.2–28.4 μm, (b) 28.6–56.6 μm, (c) 56.8–84.8 μm, and (d) 85.0–113.0 μm. (e) X-Z projection of the imaged area.
Figure 7. (a) Plot of modulation vs. axial depth for the different SIM patterns. (b) High-frequency pattern imaging of the TeA cortical neurons at the surface of the slice (0–10 μm). (c) High-frequency pattern imaging of the TeA cortical neurons at a depth of 41–45 μm. (d) Low-frequency pattern imaging of the same field of view shown in (c).
Table 1. SNR and resolution measurements.

            SNR (dB)    Resolution (nm)
Widefield   43.87       247.6
Basic SIM   29.33       251.6
MAP-SIM     39.27       143.6
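The SNR values in Table 1 are expressed in decibels. One common convention, shown here purely as an illustration (the paper's exact definition is not given in this excerpt), is 20·log10 of the background-subtracted mean signal over the background standard deviation:

```python
import numpy as np

def snr_db(signal_region, background_region):
    """Signal-to-noise ratio in decibels: 20*log10 of the mean
    background-subtracted signal divided by the background standard
    deviation. This is one common convention; the definition used to
    produce Table 1 may differ."""
    signal = np.mean(signal_region) - np.mean(background_region)
    noise = np.std(background_region)
    return 20.0 * np.log10(signal / noise)

# Toy example: signal 100 counts above background, background std of 1
sig = np.full(100, 101.0)
bg = np.array([0.0, 2.0] * 50)  # mean 1.0, std 1.0
print(snr_db(sig, bg))  # prints 40.0
```

Under this convention, an amplitude ratio of 100:1 corresponds to 40 dB, so the roughly 10 dB drop from widefield to MAP-SIM in Table 1 reflects the noise amplification inherent in reconstruction, while the resolution improves markedly.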

Share and Cite

Paul, T.C.; Johnson, K.A.; Hagen, G.M. Super-Resolution Imaging of Neuronal Structures with Structured Illumination Microscopy. Bioengineering 2023, 10, 1081. https://doi.org/10.3390/bioengineering10091081