Article

One-Dimensional High-Resolution Wavefront Sensor Enabled by Subwavelength Compound Gratings

School of Electronic Science and Engineering, Nanjing University, Nanjing 210023, China
*
Author to whom correspondence should be addressed.
Photonics 2023, 10(4), 420; https://doi.org/10.3390/photonics10040420
Submission received: 21 February 2023 / Revised: 3 April 2023 / Accepted: 4 April 2023 / Published: 7 April 2023

Abstract

Angle sensors are widely used for wavefront measurements owing to their compactness and robustness. Commercial sensors are now available with pixel sizes on the order of the wavelength; however, the spatial resolution of angle sensors still lags far behind. Here, we report a one-dimensional, high-resolution wavefront sensor, produced by introducing subwavelength compound gratings above the pixels. The gratings can be implemented by the sensor's intrinsic readout circuitry without additional processing steps. The experimental results showed a spatial resolution of 1.4 µm, two orders of magnitude higher than that of the Shack–Hartmann wavefront sensor. This significant increase in spatial resolution enables angle sensors to reconstruct complex wavefronts accurately.

1. Introduction

The measurement of the wavefront of light is now widely applied in many research areas, such as optical characterization, neuroscience, astronomical observation, visual science, and free-space optical communications [1,2,3]. The wavefront is determined by the phase of the light, which cannot be measured directly. One of the most common ways to measure a wavefront is therefore to convert it into a light intensity distribution.
Interferometry is a widely used technique for transforming phase measurements into light intensity measurements. It obtains information through interference caused by the superposition of waves and is still actively used in the field of wavefront measurement [4,5,6,7]. Bon and coworkers demonstrated that quadriwave lateral shearing interferometry enables precise measurement of the phase shift introduced by live cells in a single imaging process [4]. Kwon et al. achieved quantitative phase gradient imaging based on two dielectric metasurface layers; the volume of the proposed optical system was on the order of 1 mm³, promoting the further development of interferometry on the path to miniaturization [7].
The phase recovery method is based on a mathematical algorithm. Compared with interferometry, the phase recovery method can achieve relatively high recovery accuracy without requiring precision equipment; thus, the method is widely used. There are two main types of phase recovery techniques: (1) The iterative method [8]. The main representative algorithms of the iterative method are the Gerchberg–Saxton (GS) algorithm [9], the ptychographic iterative engine (PIE) [10], and the hybrid input–output (HIO) algorithm [11]. (2) Solving the transport of intensity equation (TIE) [12]. The main representative algorithms for solving the TIE are the Zernike polynomial expansion method [13], the fast Fourier transform (FFT) method [14], and the Green’s function method [15].
Wavefront sensors based on angle measurement are also attracting increasing attention. They are widely used in commercial applications due to their simple setup, robustness, and high processing speed. The Shack–Hartmann wavefront sensor (SHWFS) is the most commonly used [16,17,18]. The SHWFS relies on an array of microlenses to generate a set of grid points on the sensor. The displacement of the focal points can be used to measure the slopes of different subregions of the wavefront, and these slopes together determine the wavefront [19,20,21]. However, conventional optical elements cannot be miniaturized indefinitely because diffraction seriously weakens the accuracy [22]. Thus, the microlens array severely limits the spatial resolution and dynamic range of the SHWFS. For a typical SHWFS (Thorlabs WFS20-5C), the spatial resolution is 150 µm, with an angular dynamic range of less than 1°. As a result, such sensors are unsuitable for applications demanding high spatial resolution, such as quantitative phase imaging.
New angle sensors have been proposed to further develop the potential of wavefront sensors based on angle measurement. These angle sensors enable the direct measurement of the incident light angle without additional devices. They integrate gratings or metal aperture arrays with the sensor to form a chip-level solution [23,24,25,26,27]. This solution inherits the miniaturization and high speed of the SHWFS while avoiding the size limitation of microlens arrays, which makes it more competitive. The most common angle sensor is based on the Talbot effect. This method requires several periods of gratings above each pixel, which results in pixel sizes typically larger than 10 µm [23,24,25]. Introducing an aperture grid to the sensor is another way to implement angle sensors. In general, this approach requires four pixels sharing a single aperture, and limitations of aperture array fabrication techniques result in a pixel size generally around 5 µm [26,27]. Recently, the pixel size of the latest commercial sensors reached the wavelength scale. This raises the question of whether angle-sensing pixels (ASPs) can be implemented at this scale or smaller. We note that Yi et al. detected light angles using two Si wires, each 105 nm wide and separated by 105 nm [22]. Their approach was based on the near-field coupling effect between nanoscale resonators and offers a perspective for implementing ASPs at the wavelength scale. However, it cannot be integrated with today's advanced complementary metal-oxide-semiconductor (CMOS) technology, which makes further adoption challenging.
In this work, our interest focuses on implementing a one-dimensional, high-resolution wavefront sensor. The proposed wavefront sensor based on angle measurement is implemented by introducing subwavelength compound gratings above the pixels. As the two adjacent pixels share one period of the gratings, ASPs can be obtained at the wavelength scale. It has a high spatial resolution and high dynamic range. Compared with SHWFS, its spatial resolution has been improved by two orders of magnitude. The high resolution allows it to measure complex wavefronts, which used to be challenging for SHWFS.
The rest of this paper is organized as follows. In Section 2, we describe the principle of the wavefront sensor. In Section 3, we describe the algorithm for wavefront reconstruction. In Section 4, we demonstrate the wavefront reconstruction capability of the sensor. In Section 5, we discuss the significance of this work as a conclusion.

2. Principle

The basic idea of our technique is to use the relationship between the energy ratio and the incidence angle of light. Figure 1 shows the working principle of the angle sensor. It is achieved by introducing subwavelength compound gratings above the pixels (Figure 1a). In this paper, the gratings that superimpose two or more one-dimensional gratings with different slit widths are defined as compound gratings [28,29,30]. Compared with common circular aperture structures or other two-dimensional structures, gratings are easier to fabricate, which makes them more competitive at the wavelength scale. It is worth noting that the proposed compound gratings are implemented by the sensor’s intrinsic readout circuitry. The exact dimensions of the compound gratings can be found in previous work [31].
We illustrate the idea using finite-difference time-domain (FDTD) simulations (Figure 1c,e) [27]. At normal incidence, the light energy shows a symmetrical distribution. When the incident light tilts at a certain angle, the light energy becomes concentrated on one side rather than symmetrically distributed. The energy distribution can be detected by placing two photodetectors directly after the gratings. The energy can then be calculated as $\int_{s} |E|^2 \, ds$, where $E$ is the electric field intensity and $s$ is the area of the active region of the pixel.
For any incident light, we can measure the angle of the incident light in terms of the energy ratio. Assuming that A and B represent the light intensity measured by two photodetectors, the energy ratio can be defined as [23,24]:
$$\mathrm{ratio} = \frac{A - B}{A + B}. \tag{1}$$
Using full-wave simulations, we studied how the energy ratio of the ASPs changes with the illumination direction [22]. As shown in Figure 1b, the monotonic relationship between the incident angle and the energy ratio holds over the range of −10° to +10°. Under experimental conditions, we measured the energy ratio of the ASP, which was consistent with the simulation result. For comparison purposes, the data were normalized.
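As a minimal sketch (assuming two scalar or array-valued pixel readings, not the actual sensor interface), the differential readout of Equation (1) can be written as:

```python
import numpy as np

def energy_ratio(a, b):
    """Differential energy ratio of two adjacent pixels: (A - B) / (A + B)."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return (a - b) / (a + b)

# Symmetric illumination (normal incidence) gives a zero ratio;
# tilted illumination shifts energy toward one pixel.
print(float(energy_ratio(0.5, 0.5)))    # 0.0
print(float(energy_ratio(0.75, 0.25)))  # 0.5 (more energy on pixel A)
```

The ratio is bounded in [−1, 1] and changes sign with the tilt direction, which is what makes it usable as a look-up key for the incidence angle.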
To clarify the working principle of the proposed angle sensor, we explain it in terms of the sensor architecture. The sensor adopts a NOR architecture [32]. We focus on its circuit layout. Its circuits can be divided into two orthogonal layers, top and bottom. We demonstrated that the layout shown in Figure 1a is sensitive to the angle of the incident light. It can also easily be shown with FDTD that changing this grating layout to a uniform distribution and changing the grating pitch to the size of a pixel (0.7 µm) eliminates the sensitivity to the angle of the incident light. In the design, we laid out the bottom circuits as shown in Figure 1a, while the top circuits are uniformly distributed and spaced at 0.7 µm. Although the circuits and gratings differ, the angle sensor works normally with some loss of accuracy.
In conclusion, we only adapted the sensor's circuits without introducing additional components, which means that it works like a standard CMOS sensor. It should be noted that we provided detailed dimensions of the gratings in our previous work. The layout and spacing of the gratings are complex issues that we will study in detail in subsequent work.

3. Method

In optics, wavefront reconstruction is the recovery of the wavefront from data measured by the wavefront sensor. Currently, there are two dominant methods for wavefront reconstruction: the zonal method and the modal method [33]. The zonal method can be understood as a linear integration method; more specifically, wavefront reconstruction is performed by the successive integration of slope values. This method is quite simple but unavoidably results in cumulative errors [34]. The modal method represents the wavefront by a set of polynomials, so reconstructing the wavefront is equivalent to determining the coefficients of the polynomials. There are different series of polynomials, such as Zernike polynomials, Taylor monomials, and Fourier series [35]. As Zernike polynomials have many advantages, such as high efficiency and simplicity, we use them in our wavefront reconstruction.
Zernike polynomials are a set of polynomials defined on a unit circle [36,37,38]. In polar coordinates, they can be conveniently expressed as a product of radial polynomials and angular functions, as follows
$$Z_i(r, \theta) = R_n^m(r)\,\Theta_n^m(\theta), \tag{2}$$
where the radial polynomial is defined as
$$R_n^m(r) = \sum_{s=0}^{(n-m)/2} \frac{(-1)^s (n-s)!}{s!\,[(n+m)/2 - s]!\,[(n-m)/2 - s]!}\, r^{\,n-2s}, \tag{3}$$
and the triangular function is defined as
$$\Theta_n^m(\theta) = \begin{cases} \sqrt{2(n+1)}\,\cos m\theta, & m \neq 0 \ \text{(even Zernike term)} \\ \sqrt{2(n+1)}\,\sin m\theta, & m \neq 0 \ \text{(odd Zernike term)} \\ \sqrt{n+1}, & m = 0. \end{cases} \tag{4}$$
The ordering indices m and n represent the azimuthal frequency and radial degree, respectively. They must be integers satisfying $m \le n$ with $n - m$ even. The index i orders the polynomials and is a function of m and n [37,38]. For a given value of n, priority is given to modes with a smaller value of m.
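As a hedged illustration (not code from the paper), the radial polynomial defined above can be evaluated directly from its factorial sum:

```python
from math import factorial

def zernike_radial(n, m, r):
    """Radial polynomial R_n^m(r): sum over s of
    (-1)^s (n-s)! / (s! [(n+m)/2 - s]! [(n-m)/2 - s]!) * r^(n-2s)."""
    assert 0 <= m <= n and (n - m) % 2 == 0, "require integer m <= n, n - m even"
    total = 0.0
    for s in range((n - m) // 2 + 1):
        coeff = ((-1) ** s * factorial(n - s)
                 / (factorial(s)
                    * factorial((n + m) // 2 - s)
                    * factorial((n - m) // 2 - s)))
        total += coeff * r ** (n - 2 * s)
    return total

# Defocus term: R_2^0(r) = 2 r^2 - 1
print(zernike_radial(2, 0, 0.0), zernike_radial(2, 0, 1.0))  # -1.0 1.0
```

For low orders the sum reproduces the familiar closed forms (e.g., $R_1^1(r) = r$, $R_2^0(r) = 2r^2 - 1$), which is a quick sanity check on the indexing constraints above.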
The wavefront W can be expanded using the Zernike polynomials as
$$W(x, y) = \sum_{j=0}^{j_{\max}} A_j Z_j(x, y), \tag{5}$$
where $A_j$ is the coefficient of the term $Z_j$ and $j_{\max}$ is the highest mode number included in the expansion. Taking the derivative of both sides of Equation (5), we find that [39,40]
$$\frac{\partial W(x,y)}{\partial x} = \sum_j A_j \frac{\partial Z_j(x,y)}{\partial x} \qquad \text{and} \qquad \frac{\partial W(x,y)}{\partial y} = \sum_j A_j \frac{\partial Z_j(x,y)}{\partial y}. \tag{6}$$
Let
$$\frac{\partial W(x,y)}{\partial x} = b(x,y) \qquad \text{and} \qquad \frac{\partial W(x,y)}{\partial y} = c(x,y), \tag{7}$$
$$\frac{\partial Z_j(x,y)}{\partial x} = g_j(x,y) \qquad \text{and} \qquad \frac{\partial Z_j(x,y)}{\partial y} = h_j(x,y). \tag{8}$$
Equation (6) can be expressed in matrix form
$$\begin{bmatrix} b(x_1,y_1) \\ \vdots \\ b(x_k,y_k) \\ c(x_1,y_1) \\ \vdots \\ c(x_k,y_k) \end{bmatrix} = \begin{bmatrix} g_1(x_1,y_1) & g_2(x_1,y_1) & \cdots & g_{j_{\max}}(x_1,y_1) \\ \vdots & \vdots & & \vdots \\ g_1(x_k,y_k) & g_2(x_k,y_k) & \cdots & g_{j_{\max}}(x_k,y_k) \\ h_1(x_1,y_1) & h_2(x_1,y_1) & \cdots & h_{j_{\max}}(x_1,y_1) \\ \vdots & \vdots & & \vdots \\ h_1(x_k,y_k) & h_2(x_k,y_k) & \cdots & h_{j_{\max}}(x_k,y_k) \end{bmatrix} \begin{bmatrix} A_1 \\ A_2 \\ \vdots \\ A_{j_{\max}} \end{bmatrix}, \tag{9}$$
or, in compact form,
$$\beta = \alpha \omega. \tag{10}$$
Least-squares estimation can be used to solve the system of linear equations in (10); the Zernike coefficient vector $\omega$ is then found as
$$\omega_{LS} = (\alpha^T \alpha)^{-1} \alpha^T \beta. \tag{11}$$
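The least-squares step can be sketched with synthetic data (random stand-in matrices, not measured slopes); `np.linalg.lstsq` computes the same estimate as the normal-equation form above, but more stably:

```python
import numpy as np

rng = np.random.default_rng(0)
k, jmax = 50, 6                          # sample points and number of modes
alpha = rng.normal(size=(2 * k, jmax))   # stand-in for stacked g_j, h_j derivatives
omega_true = rng.normal(size=jmax)       # "true" Zernike coefficients
beta = alpha @ omega_true                # noise-free slope measurements

# omega_LS = (alpha^T alpha)^{-1} alpha^T beta, computed via lstsq
omega_ls, *_ = np.linalg.lstsq(alpha, beta, rcond=None)
print(np.allclose(omega_ls, omega_true))  # True
```

With noisy slopes the same call returns the least-squares best fit, which is exactly the role Equation (11) plays in the reconstruction pipeline.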
In the above derivation, the Zernike polynomials are expressed in the Cartesian coordinate system. The connection of the Zernike polynomials in polar and Cartesian coordinates can be expressed as [41]
$$\frac{\partial Z_n^m(r,\theta)}{\partial x} = \frac{\partial R_n^m(r)}{\partial r}\,\Theta_n^m(\theta)\cos\theta - \frac{R_n^m(r)}{r}\,\frac{\partial \Theta_n^m(\theta)}{\partial \theta}\sin\theta, \tag{12}$$
$$\frac{\partial Z_n^m(r,\theta)}{\partial y} = \frac{\partial R_n^m(r)}{\partial r}\,\Theta_n^m(\theta)\sin\theta + \frac{R_n^m(r)}{r}\,\frac{\partial \Theta_n^m(\theta)}{\partial \theta}\cos\theta. \tag{13}$$

4. Results

Our research aims to produce a one-dimensional high-resolution wavefront sensor. We first characterized the sensor’s performance by measuring the wavefront of a plano-convex cylindrical lens. The sensor’s ability to measure a complex wavefront was then verified using the one-dimensional wavefront generated by a spatial light modulator (SLM).

4.1. Characterization of the Wavefront Sensor

As shown in Figure 2, a plano-convex cylindrical lens was used to evaluate the performance of the proposed wavefront sensor. We used a laser diode (Thorlabs DJ532-10) with wavelength λ = 532   nm as the light source. The beam was first collimated by a collimator lens to produce parallel light. A plano-convex cylindrical lens (Thorlabs LJ1695RM-A) with focal length f = 50   mm was used to produce a certain phase distribution, which was measured by the proposed wavefront sensor (pixel size 0.7   μ m × 0.7   μ m , 10240 H × 10240 V ).
Due to randomness in the manufacturing process, the sensor must be calibrated before wavefront measurement. We used a collimated laser diode light source for calibration. Using a rotating stage, we measured the sensor's response from −10° to +10° (red curve in Figure 1b). The experimental setup for calibration can be found in previous work [31]. The acquired experimental curve serves as a look-up table for the angle sensor to determine the incidence angle of light: once the angle sensor has measured an energy ratio, we find the corresponding incidence angle from the acquired curve. The energy ratio is calculated from Equation (1).
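The look-up step can be sketched as interpolation on the calibration curve; the monotonic curve below is a made-up stand-in for the measured response in Figure 1b, not real calibration data:

```python
import numpy as np

angles_deg = np.linspace(-10, 10, 201)     # calibration angles, 0.1° steps
ratios = np.tanh(angles_deg / 6.0)         # assumed monotonic ratio-vs-angle curve

def angle_from_ratio(r):
    """Invert the calibration curve: measured energy ratio -> incidence angle."""
    return float(np.interp(r, ratios, angles_deg))  # ratios must be ascending

# A measured ratio maps back to the angle that produced it
print(round(angle_from_ratio(np.tanh(5.0 / 6.0)), 6))  # 5.0
```

Because the calibration curve is monotonic over ±10°, the inversion is unique on that range, which is what makes a simple interpolated look-up table sufficient.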
Figure 3a shows the intensity image acquired by the proposed wavefront sensor. The image contains phase information, as shown in Figure 2. Therefore, we can take advantage of the adjacent pixels’ energy ratios to measure the light’s incidence angle. The tangent of this incidence angle is equivalent to the wavefront gradient [26]. To make it easier to understand how the wavefront sensor works, we show the gradient information in Figure 3c. The arrows represent gradient information about the wavefront. The direction and length of the arrows represent the direction and magnitude of the gradient, respectively. Once the sensor has measured the light’s incidence angle, the wavefront gradient can be calculated. The wavefront gradient corresponds to β in Equation (10), and the Zernike polynomials correspond to α in this Equation. Both of these quantities are known quantities. Then, the coefficients ω of the Zernike polynomials can be estimated using least-squares based on Equation (11). The coefficients ω of the Zernike polynomials and A j in Equation (5) are equivalent. Finally, we can reconstruct the wavefront by combining Equation (5). The reconstructed wavefront is shown in Figure 3b.
The quality of the wavefront reconstruction is directly dependent on the gradient information measured by the sensor. Meanwhile, our goal is to evaluate the performance of the sensor. Therefore, we focus on assessing the accuracy of the gradient measurement. A comparison between the experimentally measured gradients and the theoretical values is shown in Figure 3d. The line profile of the measured gradients was obtained along the line marked in the intensity image. It can be seen that the experimental values (red dots) were in agreement with the theoretical values (black line). The root-mean-square error (RMSE) was used to quantify the accuracy of the measured gradients. We show the residual error map in Figure 3e. The RMSE of the measured gradients was 2.6 mrad. These results demonstrate the ability of our sensor to measure wavefronts accurately.
Here, we provide a brief analysis of the error sources. The sensor's energy ratio and signal-to-noise ratio determine the accuracy of the incidence angle measurement. Near the normal direction, the minimum detectable angle can be expressed as $\delta\theta = 2D / (\mathrm{SNR} \cdot R_{\max})$, where SNR is the signal-to-noise ratio of the angle sensor, $R_{\max}$ is the maximum energy ratio between two adjacent pixels in Figure 1b, and $D$ is the corresponding angle at $R_{\max}$ [22,27]. The SNR is calculated as $\mathrm{SNR} = 10 \lg (P_{signal} / P_{noise})$. The signal power $P_{signal}$ and noise power $P_{noise}$ were obtained by irradiating the angle sensor with uniform light from an integrating sphere; $R_{\max}$ was obtained by experimental measurement. As shown in Figure 3d, our sensor performed slightly worse at large angles than at small angles, which can be attributed to the relatively low rate of change in the energy ratio R at large angles.
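A toy numeric version of this error model, reading the expression as $\delta\theta = 2D/(\mathrm{SNR} \cdot R_{\max})$ and using placeholder powers rather than measured values, is:

```python
import math

p_signal, p_noise = 1.0, 1e-4            # assumed powers under uniform illumination
snr_db = 10 * math.log10(p_signal / p_noise)   # SNR = 10 lg(P_signal / P_noise)
r_max = 1.0                              # normalized maximum energy ratio
d_deg = 10.0                             # angle D at which R_max occurs
delta_theta = 2 * d_deg / (snr_db * r_max)     # minimum detectable angle, degrees
print(round(snr_db, 6), round(delta_theta, 6))  # 40.0 0.5
```

The sketch makes the scaling explicit: raising the SNR or steepening the ratio curve (larger $R_{\max}$ at a given $D$) both shrink the minimum detectable angle.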
The high resolution is the main feature of the proposed wavefront sensor. Compared with commercial SHWFS, the spatial resolution is two orders of magnitude higher. The high resolution is the foundation for resolving fine features in the wavefront. The maximum phase gradient can be used to quantify the ability of the wavefront sensor to resolve fine features; it is determined by the ratio between the angular dynamic range and the spatial resolution. For a commercial SHWFS (Thorlabs WFS20-5C), the spatial resolution is 150 µm and the angular dynamic range is less than 1°, giving a maximum phase gradient of approximately 0.1 mrad/µm ($1°/150\ \mu\mathrm{m} \approx 0.1\ \mathrm{mrad}/\mu\mathrm{m}$). The spatial resolution of the proposed wavefront sensor was 1.4 µm and the angular dynamic range was 10°. As a result, its maximum phase gradient was 125 mrad/µm, roughly 1250 times larger than that of the SHWFS. With such high-resolution capabilities, angle sensors can be used in more complex application scenarios.

4.2. Measurement of Complex Wavefront

The measurement of complex wavefronts is significant in adaptive optics, which are widely used in atmospheric turbulence correction, retinal imaging, and free space optical communications. Here, we show that the proposed wavefront sensor enables the measurement of the complex wavefront.
The experiment setup is presented in Figure 4. Light from a laser (Thorlabs DJ532-10) emitting at 532 nm was collimated with a lens. The diameter of the laser beam was 20 mm. The collimated light passed through a polarizer and a half-wave plate to make the beam’s polarization direction parallel to the long axis of the SLM. The SLM reflected the light, which traveled through a 4f system formed by two lenses with a 60 mm focal length. Finally, the light was projected onto our wavefront sensor. The SLM (Holoeye PLUTO-2.1-NIR-134, 8.0   μ m × 8.0   μ m , 1920 × 1080 pixels) was used to generate a certain phase distribution.
Figure 5a presents the grayscale map loaded onto the SLM. For each SLM pattern in our experiment, we first generated a random curve by summing three random sine functions. We then used the repmat function in MATLAB to expand the random curve into a grayscale image with a resolution of 1920 × 1080. These grayscale maps are continuous and can be used to model complex wavefronts. Our wavefront sensor provided direct access to gradient information, and the Zernike polynomials could be used to reconstruct the wavefront. As can be seen from Figure 5b, the wavefront was faithfully reconstructed. We also show the gradient information in Figure 5c. These arrows indicate rapid changes in the wavefront. Our sensor can measure large phase gradients, making it possible to resolve these rapid variations.
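A hedged sketch of this pattern generation, with NumPy's `np.tile` standing in for MATLAB's `repmat` and assumed amplitude/frequency ranges for the random sines:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
x = np.linspace(0.0, 1.0, 1920)

# Sum three random sine functions into one random curve
curve = sum(rng.uniform(0.2, 1.0) * np.sin(2 * np.pi * rng.uniform(1, 5) * x
                                           + rng.uniform(0, 2 * np.pi))
            for _ in range(3))

# Normalize to 8-bit grayscale and replicate rows into a 1920x1080 map
lo, hi = curve.min(), curve.max()
gray = np.round(255 * (curve - lo) / (hi - lo)).astype(np.uint8)
pattern = np.tile(gray, (1080, 1))
print(pattern.shape, pattern.min(), pattern.max())  # (1080, 1920) 0 255
```

Because every row is identical, the resulting phase pattern varies along only one axis, matching the one-dimensional wavefronts the sensor is designed to measure.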
Again, we focus on the accuracy of the gradient measurement by the wavefront sensor. Figure 5d shows the line profile of the measured gradients along the line marked in Figure 5a (red line). In the same region, the gradients of the grayscale map are also plotted as a reference (black line). It can be seen that the experimentally measured gradients are consistent with the reference values. We also show the residual error map in Figure 5e. The RMSE of the measured gradients was 1.4 mrad. These results prove that our wavefront sensor can measure complex wavefronts.
The distinctive feature of the angle sensor is its fast speed. Our sensor took 100 ms to reconstruct the wavefront image shown in Figure 5b. The fast speed allowed us to perform wavefront reconstruction at a video-frame rate. This capability to detect in real-time is helpful in many fields, such as studying atmospheric turbulence or observing live biological samples.
Here, we demonstrated the real-time detection capabilities of the angle sensor. The PLUTO SLM provides a 60 Hz synchronization signal for triggering the angle sensor. To match the sensor frame rate, the SLM pattern update rate was set to 10 Hz. The pre-created wavefronts were loaded onto the SLM in a specified sequence, and the sensor acquired one frame every 100 ms. Figure 6a–c show a time-lapse of the reconstructed wavefront. In simple terms, the SLM changed patterns following the specified sequence; in this way, we dynamically simulated changing wavefronts.
The frame acquisition time of the sensor was the main restriction on temporal resolution. High-speed sensors can easily reach frame rates above 1000 Hz, which would alleviate this limitation. In future development, a high-speed sensor could be used to observe a dynamically changing wavefront with sub-millisecond temporal resolution.

5. Discussion

This paper presents a one-dimensional, high-resolution wavefront sensor based on subwavelength compound gratings and illustrates its potential uses in the field of wavefront measurement. Although we demonstrated the high spatial resolution of the proposed angle sensor in wavefront measurements, it would be helpful to discuss some critical issues.
Firstly, the relationship between the proposed angle sensor and the SHWFS needs to be clarified. In terms of design principles, the proposed angle sensor is consistent with the SHWFS. The SHWFS captures the wavefront of the entire optical pupil by measuring the average gradient over the sub-apertures subtended by the corresponding lenslet. The lenslets divide the optical pupil into different sub-apertures, the size of which is determined by the lenslets. For each of the sub-apertures, the signal is captured by the corresponding lenslet. In our angle sensor, the subwavelength compound gratings covered the entire sensor, allowing the sensor to sample the entire optical pupil. Similar to the lenslets of the SHWFS, the gratings divide the optical pupil into different sub-apertures and capture the signal. The size of the sub-apertures is determined by the space of the gratings covering the two adjacent pixels. In other words, the subwavelength compound gratings replace the microlens array of the SHWFS. It is important to note that this type of substitution is common in angle sensors. For example, angle sensors based on flat optics use circular apertures to complete sub-aperture division and signal acquisition. Angle sensors based on the Talbot effect use a combination of gratings to achieve the same function. In conclusion, the design principles of these angle sensors mentioned above are the same. The different methods of dividing the sub-apertures are the main reason for their performance differences.
Second, the significance of subwavelength compound gratings for the development of high-resolution wavefront sensors is illustrated by comparing typical angle sensors’ limitations in improving spatial resolution. For the SHWFS, diffraction constrains further improvements in spatial resolution. Diffraction can severely weaken accuracy when conventional optical elements are reduced to optical wavelengths. For angle sensors based on flat optics, the narrowing of the aperture leads to a reduction in the quantum efficiency of the sensor. At an aperture size of 10 µm, the quantum efficiency is reduced by a factor of four. Angle sensors based on the Talbot effect require the introduction of gratings above each pixel. Several grating periods (approximately ten periods) are necessary to use the Talbot effect effectively. Two pixels in our sensor share one period of the grating. This approach greatly improves the spatial resolution compared with the Talbot effect. Diffraction and quantum efficiency have less influence on angle sensors based on subwavelength compound gratings, which allows the proposed angle sensor to be implemented at the wavelength scale. In addition, the easy processability of the grating increases its competitiveness in high-resolution wavefront sensors.
Third, we discuss extending the proposed method to measure two-dimensional wavefronts. Compared with the SHWFS, the proposed angle sensor can only detect angles in one direction and lacks full azimuth and elevation capability, limiting its application in more complex scenarios. This limitation has been widely discussed [22,23,24]. A common solution is to use a pair of orthogonal angle sensors for combined azimuth and elevation measurements. Let us illustrate this with an example. Assume an incident angle of θ = φ = 5°. Light at this incident angle produces a definite light intensity ratio on the angle sensor. However, different incident angles can generate the same light intensity ratio, such as θ = φ = 8° or θ = 4°, φ = 3°. This ambiguity means that a single light intensity ratio cannot determine the angle of the incident light. To solve this problem, we can place a second angle sensor next to the first, rotated by 90°. The second sensor also produces a definite light intensity ratio, again shared by multiple incident directions. Overlapping the candidate directions from both sensors, the actual incident angle is the one direction that generates both light intensity ratios simultaneously.
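The overlap argument can be sketched numerically with a toy response model (the function below is an illustrative assumption, not the sensor's actual response): each sensor alone admits many (θ, φ) candidates, but only the true direction matches both readings.

```python
import numpy as np

def response_x(theta_deg, phi_deg):
    """Toy intensity-ratio model for the first 1-D sensor (assumed form)."""
    return np.sin(np.radians(theta_deg)) + 0.5 * np.sin(np.radians(phi_deg))

def response_y(theta_deg, phi_deg):
    """Second sensor, rotated 90 degrees: the roles of the two angles swap."""
    return response_x(phi_deg, theta_deg)

grid = np.arange(-10.0, 10.5, 0.5)           # candidate angles in degrees
T, P = np.meshgrid(grid, grid, indexing="ij")
rx = response_x(5.0, 3.0)                    # ratios produced by the true direction
ry = response_y(5.0, 3.0)

# Directions consistent with sensor 1 alone vs. with both sensors
only_x = np.abs(response_x(T, P) - rx) < 1e-3
both = only_x & (np.abs(response_y(T, P) - ry) < 1e-3)
print(int(only_x.sum()), T[both], P[both])   # many candidates -> one direction
```

The first count is greater than one (a whole curve of directions matches one ratio), while the intersection collapses to the single true direction (5°, 3°).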
Fourthly, we consider the effect of the rate of change in the light intensity ratio on the measurement accuracy. As shown in Figure 1b, the light intensity ratio R is a monotonic function of the angle of incidence θ. The rate of change in the light intensity ratio can be expressed as $dR = \Delta R / \Delta\theta$. For a given $\Delta R$, a larger $dR$ yields a smaller $\Delta\theta$, meaning that the sensor can distinguish smaller angles. In other words, the larger $dR$ is, the higher the measurement accuracy of the sensor. It is clear that $dR$ is larger for θ ∈ [0°, 4°] than for θ ∈ (4°, 10°], which shows that our sensor is more accurate for θ ∈ [0°, 4°]. As shown in Figure 3e, the characterization results clearly show this difference in measurement accuracy. Yi et al. [22] reported that the material and spatial layout of the grating have a significant effect on the rate of change in the light intensity ratio. These two perspectives could serve as entry points for subsequent work. A careful exploration of the rate of change in the light intensity ratio is a nontrivial task that we will actively pursue in the future.
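This sensitivity argument can be sketched with a toy monotonic curve (a stand-in for the measured response in Figure 1b, not real data):

```python
import numpy as np

theta = np.linspace(0.0, 10.0, 101)      # incidence angles, degrees
R = np.tanh(theta / 4.0)                 # assumed saturating ratio curve
dR = np.gradient(R, theta)               # finite-difference dR = dR/dtheta

# The slope, and hence the angular resolution, is highest near 0 degrees
print(dR[0] > dR[-1])  # True
```

On a saturating curve like this one, the slope near 0° is far larger than near 10°, reproducing the accuracy difference visible in Figure 3e.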
Finally, we discuss the implications of the proposed wavefront sensor for artificial intelligence. The rapid development of artificial intelligence is changing how people live in many areas, such as self-driving vehicles, drones, and intelligent robots. In these applications, the measurement of rotational components (e.g., a self-driving steering wheel or aircraft engines) is necessary to ensure the safe operation of the entire system. Angle is a critical parameter in these measurements; therefore, angle sensors are becoming an increasingly important research topic. Various electrical, optoelectronic, and optical approaches have been proposed for angular displacement measurements. Among them, optical techniques provide the merits of precision, high resolution, and noncontact operation. Fiber optic displacement sensors use interference fringes formed by the measurement wave and the reference wave [42,43,44]. They offer high resolution but also inherit the disadvantages of the interferometric method, such as instrumentation complexity and poor immunity to interference. It has been shown that angle sensors can match the measurement accuracy of commercial white light interferometers [27]. In addition, angle sensors can provide real-time measurements, significantly increasing their competitiveness. These studies show the potential of high-precision angle sensors in artificial intelligence.

6. Conclusions

In summary, we have achieved a one-dimensional, high-resolution wavefront sensor. Subwavelength compound gratings are used to implement a differential distribution of light intensity, which can be used to measure the angle of incident light. Combined with the Zernike polynomials, we can achieve one-dimensional wavefront reconstruction. The experimental results show that our wavefront sensor has a spatial resolution of 1.4 µm and an angular dynamic range of ±10°.
The advantages of this approach are as follows. Firstly, an angle-sensing pixel of 0.7 µm was achieved, reaching the wavelength scale. To the best of our knowledge, this is the smallest angle-sensing pixel reported to date. Secondly, the subwavelength compound gratings can be implemented by the sensor's readout circuitry without requiring additional processes, enabling angle sensors to be produced at a much lower cost.
There are several issues deserving further study in the future. First, we achieved wavefront detection in one dimension. Wavefront sensors that simultaneously perform wavefront detection in both dimensions are more attractive. In this respect, angle sensors based on the Talbot effect provide a good idea for research. Second, the rate of change in the light intensity ratio directly determines the accuracy of the sensor measurement. Further increases in the rate of change in this ratio are significant to improve the performance of the angle sensor. Third, it will be interesting to develop new applications for advanced optical sensing, such as self-driving vehicles and artificial intelligence systems.

Author Contributions

Idea conceptualization and methodology, C.Y.; validation, X.S. (Xinyu Shen); formal analysis, J.X.; writing—original draft preparation, Y.M.; writing—review and editing, C.Y. and Y.M.; project administration, Y.P. and X.S. (Xiaowen Shao); funding acquisition, F.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Program of China (2016YFA0202102).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Booth, M.J. Adaptive optical microscopy: The ongoing quest for a perfect image. Light Sci. Appl. 2014, 3, e165.
  2. Mello, G.R.; Rocha, K.M.; Santhiago, M.R.; Smadja, D.; Krueger, R.R. Applications of wavefront technology. J. Cataract Refract. Surg. 2012, 38, 1671–1683.
  3. Chen, S.; Xue, S.; Zhai, D.; Tie, G. Measurement of freeform optical surfaces: Trade-off between accuracy and dynamic range. Laser Photonics Rev. 2020, 14, 1900365.
  4. Bon, P.; Maucort, G.; Wattellier, B.; Monneret, S. Quadriwave lateral shearing interferometry for quantitative phase microscopy of living cells. Opt. Express 2009, 17, 13080–13094.
  5. Popescu, G.; Ikeda, T.; Dasari, R.R.; Feld, M.S. Diffraction phase microscopy for quantifying cell structure and dynamics. Opt. Lett. 2006, 31, 775–777.
  6. Park, Y.; Depeursinge, C.; Popescu, G. Quantitative phase imaging in biomedicine. Nat. Photonics 2018, 12, 578–589.
  7. Kwon, H.; Arbabi, E.; Kamali, S.M.; Faraji-Dana, M.; Faraon, A. Single-shot quantitative phase gradient microscopy using a system of multifunctional metasurfaces. Nat. Photonics 2020, 14, 109–114.
  8. Guo, C.; Wei, C.; Tan, J.; Chen, K.; Liu, S.; Wu, Q.; Liu, Z. A review of iterative phase retrieval for measurement and encryption. Opt. Lasers Eng. 2017, 89, 2–12.
  9. Gerchberg, R.W. A practical algorithm for the determination of phase from image and diffraction plane pictures. Optik 1972, 35, 237–246.
  10. Rodenburg, J.M.; Faulkner, H.M.L. A phase retrieval algorithm for shifting illumination. Appl. Phys. Lett. 2004, 85, 4795–4797.
  11. Fienup, J.R. Phase retrieval algorithms: A comparison. Appl. Opt. 1982, 21, 2758–2769.
  12. Teague, M.R. Deterministic phase retrieval: A Green’s function solution. J. Opt. Soc. Am. 1983, 73, 1434–1441.
  13. Gureyev, T.E.; Roberts, A.; Nugent, K.A. Phase retrieval with the transport-of-intensity equation: Matrix solution with use of Zernike polynomials. J. Opt. Soc. Am. A 1995, 12, 1932–1941.
  14. Gureyev, T.E.; Nugent, K.A. Rapid quantitative phase imaging using the transport of intensity equation. Opt. Commun. 1997, 133, 339–346.
  15. Woods, S.C.; Greenaway, A.H. Wave-front sensing by use of a Green’s function solution to the intensity transport equation. J. Opt. Soc. Am. A 2003, 20, 508.
  16. Primot, J. Theoretical description of Shack–Hartmann wave-front sensor. Opt. Commun. 2003, 222, 81–92.
  17. Platt, B.C.; Shack, R. History and principles of Shack–Hartmann wavefront sensing. J. Refract. Surg. 2001, 17, S573–S577.
  18. Davies, R.; Kasper, M. Adaptive optics for astronomy. Annu. Rev. Astron. Astrophys. 2012, 50, 305–351.
  19. Pathak, B.; Boruah, B.R. Improved wavefront reconstruction algorithm for Shack–Hartmann type wavefront sensors. J. Opt. 2014, 16, 55403.
  20. Fried, D.L. Least-square fitting a wave-front distortion estimate to an array of phase-difference measurements. J. Opt. Soc. Am. 1977, 67, 370–375.
  21. Cubalchini, R. Modal wave-front estimation from phase derivative measurements. J. Opt. Soc. Am. 1979, 69, 972–977.
  22. Yi, S.; Zhou, M.; Yu, Z.; Fan, P.; Behdad, N.; Lin, D.; Wang, K.X.; Fan, S.; Brongersma, M. Subwavelength angle-sensing photodetectors inspired by directional hearing in small animals. Nat. Nanotechnol. 2018, 13, 1143–1147.
  23. Varghese, V.; Chen, S. Polarization-based angle sensitive pixels for light field image sensors with high spatio-angular resolution. IEEE Sens. J. 2016, 16, 5183–5194.
  24. Varghese, V.; Qian, X.; Chen, S.; Shen, Z.; Tao, J.; Liang, G.; Wang, Q.J. Track-and-tune light field image sensor. IEEE Sens. J. 2014, 14, 4372–4384.
  25. Wang, A.; Molnar, A. A light-field image sensor in 180 nm CMOS. IEEE J. Solid-State Circuits 2012, 47, 257–271.
  26. Cui, X.; Ren, J.; Tearney, G.J.; Yang, C. Wavefront image sensor chip. Opt. Express 2010, 18, 16685–16701.
  27. Yi, S.; Xiang, J.; Zhou, M.; Wu, Z.; Yang, L.; Yu, Z. Angle-based wavefront sensing enabled by the near fields of flat optics. Nat. Commun. 2021, 12, 6002.
  28. Skigin, D.C.; Depine, R.A. Diffraction by dual-period gratings. Appl. Opt. 2007, 46, 1385–1391.
  29. Skigin, D.C.; Depine, R.A. Narrow gaps for transmission through metallic structured gratings with subwavelength slits. Phys. Rev. E 2006, 74, 046606.
  30. Skigin, D.C.; Depine, R.A. Resonances on metallic compound transmission gratings with subwavelength wires and slits. Opt. Commun. 2006, 262, 270–275.
  31. Meng, Y.; Hu, X.; Yang, C.; Shen, X.; Cao, X.; Lin, L.; Yan, F.; Yue, T. Angle-sensitive pixels based on subwavelength compound gratings. Curr. Opt. Photonics 2022, 6, 359–366.
  32. Bez, R.; Camerlenghi, E.; Modelli, A.; Visconti, A. Introduction to flash memory. Proc. IEEE 2003, 91, 489–502.
  33. Southwell, W.H. Wave-front estimation from wave-front slope measurements. J. Opt. Soc. Am. 1980, 70, 998–1006.
  34. Kuria, J.M.; Schon, R.; Börret, R. A flatbed scanner based wavefront sensing unit for optics quality control. In Proceedings of the 18th World Conference on Nondestructive Testing, Durban, South Africa, 16–20 April 2012.
  35. Dai, G.-M. Wavefront Optics for Vision Correction; SPIE Press: Bellingham, WA, USA, 2008; ISBN 9780819469663.
  36. Lakshminarayanan, V.; Fleck, A. Zernike polynomials: A guide. J. Mod. Opt. 2011, 58, 545–561.
  37. Mahajan, V.N. Zernike annular polynomials and optical aberrations of systems with annular pupils. Appl. Opt. 1994, 33, 8125–8127.
  38. Noll, R.J. Zernike polynomials and atmospheric turbulence. J. Opt. Soc. Am. 1976, 66, 207–211.
  39. Dai, G.-M. Modal wave-front reconstruction with Zernike polynomials and Karhunen–Loève functions. J. Opt. Soc. Am. A 1996, 13, 1218–1225.
  40. Liang, J.; Grimm, B.; Goelz, S.; Bille, J.F. Objective measurement of wave aberrations of the human eye with the use of a Hartmann–Shack wave-front sensor. J. Opt. Soc. Am. A 1994, 11, 1949–1957.
  41. Dai, G. Modified Hartmann–Shack wavefront sensing and iterative wavefront reconstruction. In Adaptive Optics in Astronomy; SPIE: Bellingham, WA, USA, 1994.
  42. Wang, Y.-C.; Huang, C.-S. Flexible linear and angular displacement sensor based on a gradient guided-mode resonance filter. IEEE Sens. J. 2018, 18, 9925–9930.
  43. Shan, M.; Min, R.; Zhong, Z.; Wang, Y.; Zhang, Y. Differential reflective fiber-optic angular displacement sensor. Opt. Laser Technol. 2015, 68, 124–128.
  44. Jia, B.; He, L.; Yan, G.; Feng, Y. A differential reflective intensity optical fiber angular displacement sensor. Sensors 2016, 16, 1508.
Figure 1. Performance of the proposed angle-sensitive pixels (ASPs). (a) Schematic diagram of the ASPs. Yellow lines represent the subwavelength compound gratings and blue squares represent the active regions of the pixels. (b) Angle dependence of the energy ratio between two adjacent pixels, measured at a wavelength of 532 nm. The energy ratio is a function of the incidence angle of light. (c–e) Light intensity distribution on two adjacent pixels for different incidence angles, where the red rectangles represent the active regions of the pixels.
Figure 2. Schematic diagram of the experimental setup for characterizing the performance of the wavefront sensor.
Figure 3. Characterization of the wavefront sensor. (a) Intensity image captured by the wavefront sensor. (b) Wavefront of the plano-convex cylindrical lens reconstructed using Zernike polynomials. (c) Enlarged image of the content in the black rectangle of Figure 3. The arrows represent the gradient information of the wavefront. (d) Profiles of the wavefront gradients along the center of the intensity image. The red circles represent the experimentally measured gradients and the solid black line represents the theoretical values. (e) Residual error map for measured wavefront gradients.
Figure 4. Schematic diagram of the experimental setup for measuring complex wavefronts. The SLM is used to generate a certain wavefront.
Figure 5. Reconstruction of the complex wavefront. (a) Grayscale image loaded onto the SLM. (b) Reconstructed wavefront. (c) Close-up view of the black rectangle in (b), where the arrows represent the wavefront gradients. (d) Comparison between measured (red line) and theoretical (black line) wavefront gradient profiles along the red dashed line marked in (a). (e) Residual error map for measured wavefront gradients.
Figure 6. Wavefront reconstruction at a video frame rate. (a–c) Selected time lapse of wavefronts reconstructed at a video frame rate. The SLM changed patterns in a specified sequence to simulate dynamically changing wavefronts.
Share and Cite

MDPI and ACS Style

Meng, Y.; Shen, X.; Xie, J.; Peng, Y.; Shao, X.; Yan, F.; Yang, C. One-Dimensional High-Resolution Wavefront Sensor Enabled by Subwavelength Compound Gratings. Photonics 2023, 10, 420. https://doi.org/10.3390/photonics10040420
