Article

A Method for Medical Microscopic Images’ Sharpness Evaluation Based on NSST and Variance by Combining Time and Frequency Domains

1 School of Mathematics and Computer Science, Zhejiang A & F University, Hangzhou 311300, China
2 Zhejiang Provincial Key Laboratory of Forestry Intelligent Monitoring and Information Technology, Hangzhou 311300, China
3 College of Information Science & Electronic Engineering, Zhejiang University, Hangzhou 310027, China
4 State Key Laboratory of CAD & CG, Hangzhou 310027, China
* Author to whom correspondence should be addressed.
Sensors 2022, 22(19), 7607; https://doi.org/10.3390/s22197607
Submission received: 15 September 2022 / Revised: 4 October 2022 / Accepted: 6 October 2022 / Published: 7 October 2022
(This article belongs to the Section Sensing and Imaging)

Abstract
An algorithm for the sharpness evaluation of microscopic images based on the non-subsampled shearlet transform (NSST) and variance is proposed in the present study for the purpose of improving the noise immunity and accuracy of a microscope's image autofocus. First, images are decomposed with the NSST algorithm; then, the variance of each decomposed sub-band image is computed to obtain the energy of its sub-band coefficients; and finally, the evaluation value is obtained as the ratio of the energies of the high- and low-frequency sub-band coefficients. The experimental results show that the proposed algorithm delivers better noise immunity than the other methods reviewed in this study while maintaining high sensitivity.

1. Introduction

As a complex and orderly system, life and its mysteries have always attracted the attention of scientists. Every major discovery about life is inseparable from technological innovation. As an important branch, microscopic imaging technology makes conducting life science research easier. Microscopic imaging technology developed from the birth of the microscope. In the late 17th century, Leeuwenhoek built his microscopes, although the lenses produced at that time were relatively rough and the magnification was low. In the 19th century, Carl Zeiss began to manufacture compound microscopes; due to the lack of scientific guidance, the optical quality of the early instruments was extremely unstable. In the late 1860s, Abbe cooperated with Zeiss to complete the design of the optical system, laying the foundation of the Abbe imaging principle, and thus, the theory of microscopic imaging. On this basis, microscope technology has experienced rapid development.
With the further development of medical technology and the transformation of the medical model, people are increasingly concerned about their own health, and the requirements for the quality of medical services are also increasing. Electron microscopy, as a tool to study the microworld, has been widely applied in many areas of medicine, opening up a new field of vision for medical research. This is true especially in virology, cell biology, histology, pathology, and molecular pathology. In the practical application of clinical medicine, electron microscopy plays an important role in the identification of disease conditions and causes, especially in the classification and diagnosis of tumors, kidney diseases, blood diseases, etc. [1] Medical students, researchers, and medical personnel alike need to master the relevant knowledge of biomedical electron microscopy technology. In addition, urinary sediment examination is one of the most important means for the medical diagnosis of human diseases, whether in a routine physical examination or a patient examination at the time of medical treatment. Urine sediment contains red blood cells, white blood cells, crystals, sperm, epithelial cells, casts, and other components. The examination of urinary sediment plays a very important reference role in the diagnosis and treatment of kidney, bladder, and other diseases. If the urine contains a large number of white blood cells, the patient may suffer from pyelitis, cystitis, or other diseases; if the urine contains a large number of red blood cells, the patient may suffer from nephrotic nephritis, acute (or chronic) glomerulonephritis, or other diseases.
If epithelial cells appear in clusters and leukocytes also increase, this indicates that the tissues and organs lined with that type of epithelial cell are inflamed; if calcium urate or oxalate crystals are detected in fresh urine together with an elevated number of red blood cells, the patient may suffer from urinary stone disease. When the urine contains more Gram-negative bacteria, the patient may be suffering from cystitis, pyelitis, or other diseases.
However, when microscope imaging technology is used to obtain images of urine sediment cells, the microscope platform often deviates from the best imaging position due to machining imperfections or prolonged use; the image thus becomes blurred, which directly affects subsequent image processing and related research, and may even cause researchers to misjudge the data. Therefore, in order to obtain a clearer and more accurate image of urine sediment cells, it is necessary to choose a good autofocusing algorithm.
The innovation of this study lies in applying an improved NSST algorithm to autofocus the image of urinary sediment cells, and to obtain a clear image of urinary sediment cells. In order to verify the feasibility of the algorithm, different types of images are added for the experiments. The main contributions of this study are summarized as follows:
  • The NSST algorithm can decompose the image at multiple scales to obtain a low-frequency sub-band and several high-frequency sub-bands, and the image information contained in different sub-bands is also different. By calculating the variance of the different sub-band coefficients, the interference caused by the background of the urine sediment image can be further reduced and the performance of the NSST algorithm can be improved, so as to obtain a better clarity evaluation curve.
  • Microscopic imaging technology inevitably generates noise in images due to factors such as the environment, equipment, and improper operation. In order to simulate realistic noise conditions as far as possible, different noises are added to the experimental images, and a bilateral filter and a Gaussian filter are applied to the noisy images to improve the noise resistance of the algorithm. Finally, the noise resistance of the improved NSST algorithm and of the other algorithms in this study is tested under the same operating conditions, with the results showing that the improved NSST algorithm has better anti-noise performance than the other algorithms used in this study.
The organization of the remaining sections is as follows: The related work is presented in Section 2. In Section 3, the principle and motivation of the NSST algorithm are briefly introduced, some improvements to the algorithm are proposed to boost its performance, and some theoretical foundations of the NSST algorithm are explored, while implementation steps, flowcharts, and pseudo-codes of the algorithm will also be presented. In Section 4, the experimental results are discussed, including for the comparison experiment and noise immunity experiment. The last section is the conclusion of this study.

2. Related Work

To obtain sharp images by controlling a motor to adjust the lens or the stage in a computer-controlled microscope imaging process, autofocusing adopts two main methods [2]: active focusing, based on the principle of distance measurement, and passive focusing, based on image processing. Because microscopic images are often taken with a high-magnification, small-aperture objective lens, the depth of field stays at the micron level, and thus, the demands on the mechanical system are very high. In view of the defects of the active focus mode, the passive focus method based on image processing has been widely implemented: real-time images from the microscope are acquired, the focus state of the current image is analyzed, and the stepper motor is controlled to modify the step size according to a specific search strategy so as to obtain as sharp an image as possible. As indicated by Qu et al. [3], even for the same sample, the performance of different focus criterion functions can be quite different. The key question in this process, therefore, is how to choose a suitable evaluation function to judge the sharpness of images.
In terms of image sharpness evaluation, Xia et al. [4] summarized and compared 16 traditional functions, including Laplace, Tenengrad, spatial frequency, wavelet transform, and fast Fourier transform, finding that Tenengrad performs best as a focus measure for both global and local search, but has weak noise immunity. In addition, Xia [5] proposed a new fusion algorithm that combines information entropy and Tenengrad by division to enhance noise resistance, but it requires more computation time. Liu et al. [6] designed a complex imaging-process modeling and image sharpness evaluation method based on fuzzy entropy, a method that is robust to changes in noise, lens magnification, and illumination conditions, but which has a slow test speed, relatively fixed fuzzy control results, and poor flexibility. Cabazos-Marín et al. [7] offered a new autofocus and fusion algorithm (AFA) that selects the best focus image (BFI) from a stack of images taken at different distances from a target; experimental results demonstrate that this algorithm differs from traditional fusion methods, for example, in that the AFA can boost image quality in a shorter time, but its resistance to noise interference is unknown. In order to accurately identify the contours of cells and count them, Hore et al. [8] proposed a novel method for the automatic qualification of cell contour identification, which necessitates marking cells; it employs two new 4 × 4 kernels for edge detection applied after a pre-processing step. The proposed method was implemented and evaluated on light microscope images of rats' hippocampi, as a sample of brain cells, and a comparative study was performed against other edge detection techniques, such as the Canny, Roberts, Prewitt, and Sobel operators.
The experimental results proved that the proposed method is superior to the other edge detection methods in terms of accuracy, specificity, and false alarm count. Akiyama et al. [9] proposed a sharpness evaluation function based on the Daubechies wavelet transform to deal with the high-pass filtering effect; this function has been used in ground vehicle infrared image guidance systems, but it has an impact on the accuracy. Based on wavelets for blood smears under a microscope, Makkapati et al. [10] further proposed an improved image autofocusing method, which can obtain a smooth sharpness evaluation curve without fluctuations, thus improving autofocusing accuracy; however, its focusing speed is reduced. Chen and van Beek [11] designed a supervised machine learning method that uses two decision-tree classifiers to control the focusing process and find the clearest image, but also at the expense of some focusing speed. For Depth from Defocus (DFD) estimation, Matsui and Taniguchi [12] proposed a new imaging technology, a half-sweep imaging and processing method, which has a high image signal-to-noise ratio, flexible adaptation to scene depth, and full compatibility with conventional imaging, among other advantages; however, although easy to implement in cameras, it is still not widely deployed in other scenes. By removing a large number of unnecessary image sub-blocks through thresholding of the microscope image, Lv and Wang [13] determined the final focus window according to the sum of the gradient amplitudes of the image sub-blocks, finally calculating the gradient and variance of the window sub-blocks while taking the pixel weighting method as the focus evaluation function. With noise added, it shows good noise resistance and stability, but whether it can be applied to non-microscope images is still unknown. To improve the denoising ability of the algorithm, Samsad Beagum et al. [14] proposed an automated technique, LPA-ICI-PSO, by studying the local polynomial approximation (LPA) filter supported by the intersection of confidence intervals (ICI) rule (LPA-ICI) and combining it with particle swarm optimization (PSO); the performance metrics established that it guarantees less computational time along with optimal denoising compared to the classical LPA-ICI filter. In the same year, Amira S. Ashour et al. [15] proposed a new automated technique, "standard optimized LPA-ICI" (SO-LPA-ICI), based on LPA-ICI-PSO; the denoising effects of different methods on rats' renal microscopic images in the presence of Poisson noise were compared, and the results showed that SO-LPA-ICI achieved faster denoising than LPA-ICI-PSO. Shahdoosti and Khayat [16] proposed an image denoising method which uses a sparse unmixing by variable splitting and augmented Lagrangian (SUnSAL) classifier in the non-subsampled shearlet transform (NSST) domain; their experiments demonstrate that the approach improves image quality in terms of both subjective and objective inspections, compared with some other state-of-the-art denoising techniques. Based on the above literature review, in order to improve the autofocus recognition of urinary sediment cells, this study proposes a sharpness evaluation function based on the non-subsampled shearlet transform (NSST). On this basis, the performance of the algorithm is further improved by a bilateral filter [17] and variance. The experimental results show that this method is effective and that the interference of noise with images can be appropriately curbed. The ratio of the high- and low-frequency energy coefficients is calculated by combining the frequency and time domains; calculated with real-valued coefficients, it has a lower complexity [18].
Theoretical analysis of the algorithm shows that the method exhibits a distinct single peak, high accuracy, and few local peaks, while delivering high noise immunity and a high focusing speed.
The datasets generated during and/or analyzed during the current study are available in the figshare repository (https://doi.org/10.6084/m9.figshare.19524283 accessed on 6 April 2022).

3. Materials and Methods

3.1. Non-Subsampled Shearlet Transform (NSST)

The NSST [19,20,21,22] includes two parts: multiscale decomposition and multidirectional decomposition. In fact, the NSST is an optimized and improved version of the NSCT algorithm proposed by Cunha et al. [23,24,25], with the characteristics of multiscale, multidirectional, and translation-invariant transformation, as well as high computational efficiency. The NSST can capture the geometric and mathematical properties of an image, such as scales, directionality, elongated shapes, and oscillations. The NSST is an optimal transform with sparse coefficients, in which a Parseval frame of well-localized waveforms at various scales and directions is formed. During the decomposition of an image, the NSST places no limitation on the number of shearing directions [26]. After an image is decomposed with k-level non-subsampled pyramid (NSP) multiscale decomposition, k + 1 sub-band images are obtained, including one low-frequency image and k high-frequency images at different scales. After multiscale decomposition, each sub-band image is decomposed in multiple directions at one level. NSST directional decomposition adopts a shearing filter (SF) [27] to ensure that the images are not distorted, providing translation invariance while effectively suppressing the pseudo-Gibbs effect. Moreover, the computational burden of the NSST is lower than that of the NSCT [26]. In addition, directional selectivity in the NSCT is produced artificially by the special sampling rule of its filter banks, which often introduces artifacts, whereas this defect has been resolved in the NSST. The NSST decomposition structure diagram is shown in Figure 1. Therefore, the NSST is selected in this study for the sharpness evaluation of medical microscopic images, with appropriate improvements.
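To make the translation-invariant, k-level NSP structure concrete, the following is a minimal NumPy sketch of a toy non-subsampled pyramid built from differences of Gaussian blurs without any downsampling. This is only an illustration of the multiscale structure under our own simplifying assumptions; it is not the shearlet transform itself, and all function names (`blur`, `nsp_decompose`) are ours, not from the paper.

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    """Discrete 1-D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur with edge padding (no downsampling)."""
    k = gaussian_kernel1d(sigma, radius=int(3 * sigma))
    pad = len(k) // 2
    # Filter rows, then columns; 'valid' convolution on padded rows keeps size.
    tmp = np.apply_along_axis(
        lambda r: np.convolve(np.pad(r, pad, mode="edge"), k, mode="valid"),
        1, img)
    return np.apply_along_axis(
        lambda c: np.convolve(np.pad(c, pad, mode="edge"), k, mode="valid"),
        0, tmp)

def nsp_decompose(img, levels=3):
    """Toy non-subsampled pyramid: k high-frequency bands + 1 low band.

    Every band has the same size as the input, mirroring the
    translation-invariant property of the NSP stage of the NSST.
    """
    highs, current = [], img.astype(float)
    for k in range(levels):
        low = blur(current, sigma=2.0 ** k)   # coarser blur at each level
        highs.append(current - low)           # detail retained at this scale
        current = low
    return highs, current

img = np.random.rand(64, 64)
highs, low = nsp_decompose(img, levels=3)
# All bands keep the input size, and summing every band recovers the input.
assert all(h.shape == img.shape for h in highs) and low.shape == img.shape
assert np.allclose(sum(highs) + low, img)
```

Because the bands telescope (each high band is the difference of consecutive low-pass images), the decomposition is exactly invertible by summation, which is the property that lets sub-band energies be compared meaningfully.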

3.2. Analyses of Algorithm Theories

In the microscopic field of vision, the cell distribution in images is scarce and discrete, and the gray level of an image is similar. The cell image in the defocused state can be approximately regarded as a uniformly distributed circular light spot, and thus, the background would greatly influence the judgment of clarity. The low- and high-frequency information after NSST decomposition has different physical meanings. The coefficient with a large absolute value in the high-frequency information corresponds to the detailed information, such as edge, which is very sensitive to changes in image sharpness. If an image is mixed with noise, it will be mixed in with the high-frequency information after decomposition. In contrast, the low-frequency information can retain the general information of the original image and is not very sensitive to changes in image sharpness. Therefore, the characteristics of low-frequency sub-band coefficients can be used to improve the noise resistance of the evaluation function. In addition, the variance processing of the decomposed sub-bands after NSST decomposition can further diminish the impact of noise.

3.3. Algorithm Improvement

The variance function [28] represents the degree of dispersion of the image's gray-level distribution. For out-of-focus images, the range of gray-value variation is small, the degree of dispersion is low, and the variance is small; for focused images, the opposite holds. This focus measure computes the variation of pixel intensities and uses squaring to amplify larger deviations from the mean image intensity, as defined below, where $MN$ is the total number of pixels, $g(i,j)$ is a pixel value, and $\bar{g}$ is the mean pixel value of the image:
$$V = \frac{1}{MN} \sum_{i,j} \left[ g(i,j) - \bar{g} \right]^2$$
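The variance focus measure above reduces to a one-line NumPy computation. The sketch below (our own helper names, not the paper's code) also demonstrates the stated behavior: a box blur pulls pixels toward the mean, shrinking the variance relative to a sharp pattern.

```python
import numpy as np

def variance_measure(img):
    """V = (1/MN) * sum_{i,j} [g(i,j) - g_bar]^2 over a grayscale image."""
    g = img.astype(float)
    return np.mean((g - g.mean()) ** 2)

# A 0/1 checkerboard has maximal spread around its mean of 0.5 (V = 0.25)...
sharp = np.indices((32, 32)).sum(axis=0) % 2
# ...while a 3x3 box blur (edge-padded) collapses values toward the mean.
padded = np.pad(sharp.astype(float), 1, mode="edge")
blurred = sum(padded[i:i + 32, j:j + 32]
              for i in range(3) for j in range(3)) / 9.0
assert variance_measure(sharp) > variance_measure(blurred)
```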
The sub-band coefficient energy is obtained by calculating the variance of sub-band images in different frequency bands and directions acquired in NSST decomposition. This can better extract the features of the image and improve the accuracy of autofocusing.

3.4. Algorithm Implementation

Based on the above discussions, certain implementation steps of the image sharpness evaluation algorithm combining NSST decomposition and variance processing are proposed as follows:
  • Perform NSST decomposition on an image to obtain one low-frequency sub-band and several high-frequency sub-bands.
  • Obtain the variance processing coefficient $V_L$ of the low-frequency sub-band image in the NSST transform domain, as defined below:
$$E_L = V_L$$
Then, calculate the variance processing coefficient $V_{k,l}'$ at each level $k$ of the high-frequency sub-band image in each direction $l$, and add the $V_{k,l}'$ over the different directions of the same frequency band to obtain the variance processing coefficient $SV_k'$ at each level of the high-frequency sub-band image, as defined below:
$$SV_k' = \sum_{l} V_{k,l}'$$
The image energy at each level of the high-frequency sub-band, $E_k^H$, is defined as follows:
$$E_k^H = \frac{1}{2^k} SV_k'$$
The weighted total energy of the high-frequency sub-band images is defined as follows:
$$E_H = s E_1^H + (1 - s) \frac{1}{w} \sum_{k=2}^{N} E_k^H$$
  • Combine the energy of the high- and low-frequency components to calculate the sharpness evaluation value, as defined below:
$$h = \frac{E_H}{E_L}$$
In this experiment, the parameters are w = 3 and s = 0.8. The specific flow chart of the experiment and the pseudo-code of the algorithm are shown in Figure 2 and Algorithm 1, respectively. In practical applications, owing to factors such as the image acquisition equipment and the natural environment, the processed image differs from the real image. In order to simulate this difference, we add noise to test the noise immunity of the algorithm; for how the noise is added, see the experimental results in Section 4. SNVRO denotes the operation flow without added noise.
Algorithm 1. Pseudo-Code of the Algorithm
Input: N1, the number of pictures to be processed.
Output: H, the sharpness evaluation value.
1: L ← 1
2: N ← 3 (number of decomposition levels)
3: for L to N1 by 1 do
4:   im ← imread(image path)
5:   im ← rgb2gray(im)
6:   im ← im2double(im)
7:   Decompose im with the NSST to obtain the low-frequency component f1 and the high-frequency components f2k, f3k, and f4k.
8:   [V1, V2, V3, V4] ← get_variance(im, f1, f2k, f3k, f4k)
9:   Process and add the energy coefficients of each high-frequency sub-band to obtain E2, E3, and E4; the low-frequency coefficient is EL.
10:  EH ← s·E2 + (1 − s)(E3 + E4)/N
11:  H ← EH/EL
12: end for
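The structure of Algorithm 1 can be sketched in Python/NumPy as follows. Note a loud assumption: the NSST is replaced here by a crude difference-of-box-blurs band split, so only the shape of the measure is reproduced, per-band variances combined into h = E_H/E_L with s = 0.8 and w = 3 as in the text; `box_blur` and `sharpness` are our hypothetical helpers, not the authors' implementation.

```python
import numpy as np

def box_blur(img, r):
    """Box blur of radius r via a zero-prefixed integral image (edge-padded)."""
    p = np.pad(img, r, mode="edge")
    c = p.cumsum(0).cumsum(1)
    c = np.pad(c, ((1, 0), (1, 0)))           # prepend a zero row/column
    n = 2 * r + 1
    s = c[n:, n:] - c[:-n, n:] - c[n:, :-n] + c[:-n, :-n]
    return s / n ** 2

def sharpness(img, s=0.8, w=3, levels=3):
    """Energy ratio h = E_H / E_L, with variance as each band's 'energy'."""
    g = img.astype(float)
    highs, low = [], g
    for k in range(levels):
        lowk = box_blur(low, 2 ** k)          # coarser blur per level
        highs.append(low - lowk)              # high-frequency residual
        low = lowk
    E = [h.var() for h in highs]              # per-band variances (energies)
    E_L = low.var()
    E_H = s * E[0] + (1 - s) * sum(E[1:]) / w
    return E_H / E_L

rng = np.random.default_rng(0)
focused = rng.random((64, 64))                # rich in high-frequency detail
defocused = box_blur(focused, 3)              # simulated out-of-focus frame
assert sharpness(focused) > sharpness(defocused)
```

Even with this stand-in decomposition, the measure behaves as the paper describes: defocusing drains energy from the high-frequency bands while leaving the low band largely intact, so h drops.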

3.5. Analysis of the Algorithm’s Performance

To showcase the benefits of this algorithm, this study compares it with the clarity evaluation algorithm based on high-frequency sub-band coefficient energy. The comparison results are shown in Figure 3, where the vertical axis is the mapping of the sharpness evaluation value onto [0, 1], and the horizontal axis represents the serial number of the pictures; for example, "5" indicates the fifth picture.
It can be seen from Figure 3 that the sharpness evaluation algorithm based on the NSST still shows strong judgment ability when there is only a small amount of image information. Compared with the algorithms that use high-frequency sub-band coefficient energy as the sharpness evaluation, the new algorithm has good unimodality and noise immunity, with all the advantages of spectral functions. However, due to the frequency domain decomposition of images into sub-images with different frequency bands in multiple directions, the running time of the algorithm will be slightly longer than that of the time-domain sharpness evaluation algorithm.

4. Results

The experimental environment: CPU—AMD Ryzen 7, 4800H, with Radeon Graphics, 2.90 GHz; RAM—16.00 GB; MATLAB (R2019b).

4.1. Analyses and Comparison of Algorithms

In order to make the experiment more rigorous and the algorithm more convincing, 10 groups of defocus–focus–defocus urinary sediment cell images with 800 × 600 pixels are taken as test samples under 40× microscope magnification, with the object distance adjusted. Each group contains 19 to 22 images. In this study, four sets of experimental images and results are presented, and 10 of the images are listed, as shown in Figure 4. The sharpness evaluation value of each image is calculated in MATLAB and plotted as a sharpness evaluation function curve. In addition, in order to analyze and view the results more conveniently and intuitively, this study normalizes all the results, as shown in Figure 5. The non-subsampled contourlet transform (NSCT), Tenengrad algorithm [29], Roberts algorithm [30,31], discrete cosine transform (DCT) [32,33], energy of gradient (EOG) algorithm [34], Canny algorithm [35,36], and Laplacian algorithm [37,38] are selected for this comparative experiment. The Tenengrad algorithm uses a Sobel operator [39,40] to extract gradient values in the horizontal and vertical directions, and then calculates the sum of the squared gradients over all pixels. In addition, since the datasets involved in the experiment are small, deep learning methods cannot be used for comparative experiments; therefore, deep learning is not introduced in this study.
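The Tenengrad measure described above, Sobel gradients squared and summed, admits a short NumPy sketch (our own slicing-based implementation over the image interior, not the comparison code used in the experiments):

```python
import numpy as np

def tenengrad(img):
    """Sum of squared 3x3 Sobel gradient magnitudes over the image interior."""
    g = img.astype(float)
    # Horizontal Sobel: column j+1 minus column j-1, row-weighted 1-2-1.
    gx = (g[:-2, 2:] + 2 * g[1:-1, 2:] + g[2:, 2:]
          - g[:-2, :-2] - 2 * g[1:-1, :-2] - g[2:, :-2])
    # Vertical Sobel: row i+1 minus row i-1, column-weighted 1-2-1.
    gy = (g[2:, :-2] + 2 * g[2:, 1:-1] + g[2:, 2:]
          - g[:-2, :-2] - 2 * g[:-2, 1:-1] - g[:-2, 2:])
    return np.sum(gx ** 2 + gy ** 2)

# A hard step edge scores higher than the same edge smoothed over 4 columns.
edge = np.zeros((16, 16)); edge[:, 8:] = 1.0
soft = np.zeros((16, 16))
soft[:, 6:10] = np.linspace(0.2, 0.8, 4); soft[:, 10:] = 1.0
assert tenengrad(edge) > tenengrad(soft)
```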
In order to verify that the algorithm is feasible on other types of images, two sets of public data and two sets of biomedical cell images with different contents are selected for testing. The experimental images and results are shown in Figure 6. Furthermore, Redondo et al. [41] defined the ratio of the width of the evaluation curve at 40% and 80% as a narrow width α / β , which can objectively reflect the performance of the algorithm. The values of the narrow width of the algorithms in the urinary sediment cell dataset, the public dataset, and biomedical cell images are shown in Table 1 and Table 2 below. The average time consumed by each algorithm is given in the tables.
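The narrow-width metric of Redondo et al. can be computed mechanically from a normalized evaluation curve. The sketch below is one plausible reading of the α/β definition (width counted as the number of frames above each threshold, ratio taken as width at 80% over width at 40%, so that steep-flanked curves score near 1); the exact convention in [41] may differ, and both function names are ours.

```python
import numpy as np

def width_at_level(curve, level):
    """Number of frames whose normalized evaluation value exceeds `level`."""
    c = np.asarray(curve, dtype=float)
    c = (c - c.min()) / (c.max() - c.min())   # min-max normalize to [0, 1]
    return int(np.count_nonzero(c > level))

def narrow_width(curve):
    """Ratio of curve widths at the 80% and 40% levels (assumed convention)."""
    return width_at_level(curve, 0.8) / width_at_level(curve, 0.4)

# A steep peak keeps the two widths similar; a broad triangle does not.
steep = [0, 0, 0, 0.1, 0.9, 1.0, 0.9, 0.1, 0, 0, 0]
gentle = [0, 0.2, 0.4, 0.6, 0.8, 1.0, 0.8, 0.6, 0.4, 0.2, 0]
assert narrow_width(steep) > narrow_width(gentle)
```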
Both advantages and disadvantages of an algorithm can be analyzed based on whether its sharpness evaluation curve has a single peak, whether the curve is steep, and whether there are local peaks and how high the local peaks are. It can be concluded from Figure 5 that in the four sets of comparison experiments, the algorithm proposed in this study can deliver steeper curves, lower local peaks, and higher sensitivity for defocus and in-focus image recognition than other algorithms. Specifically, the normalized evaluation value of the defocus images is close to 0, and that of the focused ones is close to 1, indicating that the algorithm proposed by this study is better than other algorithms reviewed by this study.
As shown in Table 1, the narrow width can objectively reflect the steepness of the sharpness evaluation curve: the larger the narrow width, the steeper the curve, in other words, the better the algorithm's performance. Among the 11 groups of test data, the narrow width of the improved NSST algorithm is the highest in 8 groups and the second highest in 2 groups; moreover, its narrow width can be about 100% higher than the lowest value and about 10–20% higher than the second highest value, also indicating that the improved NSST algorithm is superior to the other algorithms in terms of the steepness of the sharpness evaluation curve, i.e., its curve is narrower. In the other two sets of data, the narrow width of the improved NSST algorithm delivers the highest or second highest value, about 5% higher than the second highest value on the ball dataset. On the other set of biomedical image datasets, it can be clearly seen that the improved NSST algorithm handles such images well: in terms of the narrow width obtained, the improved NSST algorithm is 60–80% higher than the other algorithms mentioned in this study. This demonstrates that the improved NSST algorithm is also appropriate for other microscope images, although with an effect not as good as that for urine sediment images.

4.2. Noise Immunity Test

While obtaining an image, microscopes often produce some noise due to environment and equipment problems. In order to precisely analyze the advantages and disadvantages of the experimental algorithm as well as its noise immunity performance, this study adds Gaussian, salt-and-pepper, and Poisson noise. After the noises are added, the images are processed via a bilateral filter and guided filter in order to enhance the anti-noise performance of the algorithm. The images with noises are shown in Figure 7, and the results have been normalized. The normalized curves are shown in Figure 8. The images’ normalization curves for the public dataset are shown in Figure 9. The images’ normalization curves for the biomedical cell image sets are shown in Figure 10.
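The noise models used in this test can be reproduced with standard NumPy generators. The sketch below adds zero-mean Gaussian noise and salt-and-pepper noise at the density value of 0.1 used in the experiment; the helper names are ours, and images are assumed to be floats in [0, 1].

```python
import numpy as np

rng = np.random.default_rng(42)

def add_gaussian_noise(img, sigma=0.1):
    """Additive zero-mean Gaussian noise, clipped back into [0, 1]."""
    return np.clip(img + rng.normal(0.0, sigma, img.shape), 0.0, 1.0)

def add_salt_and_pepper(img, density=0.1):
    """Replace a `density` fraction of pixels with 0 (pepper) or 1 (salt)."""
    out = img.copy()
    mask = rng.random(img.shape) < density
    out[mask] = rng.integers(0, 2, img.shape)[mask].astype(float)
    return out

img = np.full((100, 100), 0.5)
noisy = add_salt_and_pepper(img, density=0.1)
frac = np.mean(noisy != 0.5)          # fraction of corrupted pixels
assert 0.05 < frac < 0.15             # roughly 10%, as requested
g = add_gaussian_noise(img, sigma=0.1)
assert g.shape == img.shape and g.min() >= 0.0 and g.max() <= 1.0
```

Running the sharpness measure on such corrupted stacks, with and without the bilateral/guided pre-filtering, is what produces the normalized curves in Figures 8 through 10.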
It can be observed in the figures above that when salt-and-pepper noise with a density value of 0.1 is added to the image, the clarity evaluation curves of the existing algorithms fluctuate strongly and exhibit multiple local maxima, so the focus position cannot be intuitively identified; in other words, they no longer exhibit unimodality and unbiasedness. In addition, since the principle of the NSCT algorithm is similar to that of the NSST algorithm, the NSCT provides better noise immunity than the other existing algorithms, but it remains weaker than the NSST algorithm, as mentioned above. When Gaussian noise with a density value of 0.1 is added to the image, the existing algorithms can find the global maximum, but fluctuations still appear in the sharpness curve; in contrast, the improved NSST algorithm proposed in this study does not have such problems. In addition, the narrow width α/β can determine the steepness of the curve. After the noises are added, all the existing algorithms fluctuate upward or downward to varying degrees, so it is impossible or difficult to measure the narrow width. The improved NSST used in this study does not encounter this problem, demonstrating that its anti-noise performance is better than that of the other algorithms reviewed in this study.

5. Conclusions

This study proposes an improved sharpness evaluation algorithm based on the improved NSST for the sharpness evaluation of microscope images, and compares it with the other algorithms reviewed in this study. It is concluded that the other algorithms are less feasible for the sharpness evaluation of microscope images than the algorithm proposed here: after different noises are added, they cannot maintain unimodality and noise resistance, whereas the improved NSST algorithm reflects both properties well. Additionally, the improved NSST algorithm is especially suitable for urine sediment cell images, because the background of such images is monotonous, the texture of the cell objects is remarkable, and the degree of color interference is weak. In addition, the improved NSST delivers better results than the other methods referred to in this study. However, the improved NSST algorithm still has certain shortcomings in noise immunity; for example, when the noise intensity is sufficiently strong, the sharpness evaluation curve of the improved NSST will inevitably fluctuate upward and downward. Furthermore, because the improved NSST performs multidirectional decomposition in the frequency domain, the complex decomposition of bands leads to a slightly longer calculation time than that of time-domain clarity evaluation algorithms. Since this study aims to find an algorithm with good noise immunity and sensitivity, the impact of time consumption is ignored here; however, in the future, it is necessary to conduct more in-depth research on optimizing the algorithm's running time and anti-noise performance.

Author Contributions

Conceptualization, X.W., H.Z., J.H. and T.H.; methodology, X.W., H.Y., R.H. and G.Z.; software, X.W.; validation, H.Z., J.H. and T.H.; formal analysis, X.W., H.Y., R.H. and G.Z.; investigation, X.W.; resources, X.W., H.Y., R.H. and G.Z.; data curation, X.W.; writing—original draft preparation, X.W.; writing—review and editing, X.W., H.Z., J.H. and T.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data available on request.

Acknowledgments

The authors would like to thank the anonymous reviewers for their constructive comments and suggestions, which significantly contributed to improving the manuscript. This work was supported by the National Natural Science Foundation of China (No. 31971493, No. 31570629, No. 61471321) and the Zhejiang Provincial Natural Science Foundation of China (No. LY19F020048, No. LY16C160007, No. LY16F010004).

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. NSST breakdown structure diagram.
Figure 2. Flow chart of sharpness evaluation curve.
Figure 3. Comparison results of algorithm performance with and without noise: (a) image without noise; (b) image with noise.
Figure 4. Four groups of test experiment diagrams: (a) Group 1; (b) Group 3; (c) Group 4; (d) Group 9.
Figure 5. Sharpness evaluation curve of each algorithm for the four groups of images: (a) Group 1; (b) Group 3; (c) Group 4; (d) Group 9.
Figure 6. Sharpness evaluation curve of each algorithm for the public data: (a1) ball image; (b1) insect image; (c1,d1) biomedical cell images; (a2) ball sharpness evaluation curve; (b2) insect sharpness evaluation curve; (c2,d2) biomedical cell image sharpness evaluation curve.
Figure 7. Image comparison: (a1,b1,c1,d1) are the datasets used in this study; (a2)–(d5) are two public datasets and two biomedical cell image sets; (a1–a5) clear cell images; (b1–b5) images with Gaussian noise added; (c1–c5) images with salt-and-pepper noise; (d1–d5) images with Poisson noise.
Figure 8. Curves after noises and filters are added: (a1) Gaussian noise and bilateral filter of Group 1; (b1) Gaussian noise and guided filter of Group 1; (c1) Gaussian noise and bilateral filter of Group 9; (d1) Gaussian noise and guided filter of Group 9; (a2) salt-and-pepper noise and bilateral filter of Group 1; (b2) salt-and-pepper noise and guided filter of Group 1; (c2) salt-and-pepper noise and bilateral filter of Group 9; (d2) salt-and-pepper noise and guided filter of Group 9; (a3) Poisson noise and bilateral filter of Group 1; (b3) Poisson noise and guided filter of Group 1; (c3) Poisson noise and bilateral filter of Group 9; (d3) Poisson noise and guided filter of Group 9.
Figure 9. Curves after noises and filters are added: (a1) Gaussian noise and bilateral filter of insects; (b1) Gaussian noise and guided filter of insects; (c1) Gaussian noise and bilateral filter of ball; (d1) Gaussian noise and guided filter of ball; (a2) salt-and-pepper noise and bilateral filter of insects; (b2) salt-and-pepper noise and guided filter of insects; (c2) salt-and-pepper noise and bilateral filter of ball; (d2) salt-and-pepper noise and guided filter of ball; (a3) Poisson noise and bilateral filter of insects; (b3) Poisson noise and guided filter of insects; (c3) Poisson noise and bilateral filter of ball; (d3) Poisson noise and guided filter of ball.
Figure 10. Curves after noises and filters are added: (a1) Gaussian noise and bilateral filter of cell 2; (b1) Gaussian noise and guided filter of cell 2; (c1) Gaussian noise and bilateral filter of cell 1; (d1) Gaussian noise and guided filter of cell 1; (a2) salt-and-pepper noise and bilateral filter of cell 2; (b2) salt-and-pepper noise and guided filter of cell 2; (c2) salt-and-pepper noise and bilateral filter of cell 1; (d2) salt-and-pepper noise and guided filter of cell 1; (a3) Poisson noise and bilateral filter of cell 2; (b3) Poisson noise and guided filter of cell 2; (c3) Poisson noise and bilateral filter of cell 1; (d3) Poisson noise and guided filter of cell 1.
Table 1. Comparison of narrow widths of noise-free images in different algorithms. The maximum value is given in bold.
Algorithm  | 1      | 2      | 3      | 4      | 5      | 6      | 7      | 8      | 9      | 10     | 11     | Time
NSCT       | 0.4545 | 0.3771 | 0.3238 | 0.2672 | 0.4120 | 0.3974 | 0.4322 | 0.3541 | 0.2642 | 0.4577 | 0.3778 | 135.54
Sobel      | 0.2433 | 0.2699 | 0.2398 | 0.2853 | 0.3552 | 0.2696 | 0.3659 | 0.1550 | 0.3197 | 0.2915 | 0.2473 | 2.40
Roberts    | 0.2247 | 0.2254 | 0.2249 | 0.2861 | 0.2879 | 0.2823 | 0.2019 | 0.2009 | 0.2994 | 0.2366 | 0.2423 | 2.33
DCT        | 0.2631 | 0.2872 | 0.2906 | 0.2518 | 0.4054 | 0.2662 | 0.3717 | 0.1876 | 0.4235 | 0.2989 | 0.2759 | 10.98
EOG        | 0.2635 | 0.3048 | 0.2972 | 0.3224 | 0.3537 | 0.2431 | 0.1902 | 0.1790 | 0.3892 | 0.2819 | 0.2929 | 2.34
Laplacian  | 0.2874 | 0.3093 | 0.2959 | 0.3639 | 0.2859 | 0.2139 | 0.1614 | 0.2420 | 0.4768 | 0.2274 | 0.2666 | 2.34
Canny      | 0.2733 | 0.2981 | 0.3116 | 0.2766 | 0.3921 | 0.2625 | 0.2123 | 0.1875 | 0.2985 | 0.3092 | 0.2703 | 9.36
NSST       | 0.4258 | 0.5170 | 0.4500 | 0.3248 | 0.4899 | 0.4224 | 0.4417 | 0.3587 | 0.3478 | 0.5029 | 0.5263 | 33.97
Table 2. Comparison of images’ narrow widths of noise-free public dataset in different algorithms, where ball (left peak) represents the peak on the left and ball (right peak) represents the peak on the right. The maximum value is given in bold.
Algorithm  | Ball (Left Peak) | Ball (Right Peak) | Insect | Cells  | Time
NSCT       | 0.4318           | 0.4026            | 0.4085 | 0.1897 | 394.67
Sobel      | 0.4361           | 0.3454            | 0.2667 | 0.1304 | 5.19
Roberts    | 0.3586           | 0.2988            | 0.3245 | 0.2042 | 5.95
DCT        | 0.2412           | 0.2111            | 0.3131 | 0.1970 | 31.79
EOG        | 0.4473           | 0.3889            | 0.2441 | 0.2288 | 5.96
Laplacian  | 0.4492           | 0.4078            | 0.2586 | 0.2429 | 5.97
Canny      | 0.4393           | 0.3780            | 0.2738 | 0.2653 | 52.98
NSST       | 0.4719           | 0.4196            | 0.3956 | 0.3146 | 77.86
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Citation:
Wu, X.; Zhou, H.; Yu, H.; Hu, R.; Zhang, G.; Hu, J.; He, T. A Method for Medical Microscopic Images’ Sharpness Evaluation Based on NSST and Variance by Combining Time and Frequency Domains. Sensors 2022, 22, 7607. https://doi.org/10.3390/s22197607
