Article

Uniformity Detection for Straws Based on Overlapping Region Analysis

Junteng Ma, Feng Wu, Huanxiong Xie, Fengwei Gu, Hongchen Yang and Zhichao Hu
1 Nanjing Institute of Agricultural Mechanization, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
2 Key Laboratory of Modern Agricultural Equipment, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
* Author to whom correspondence should be addressed.
Agriculture 2022, 12(1), 80; https://doi.org/10.3390/agriculture12010080
Submission received: 16 November 2021 / Revised: 21 December 2021 / Accepted: 4 January 2022 / Published: 9 January 2022
(This article belongs to the Section Digital Agriculture)

Abstract

The comprehensive utilization of crop straw and the prohibition of straw burning in fully covered croplands have become increasingly important issues in agricultural engineering. As a direct straw-mulching method used in China, conservation tillage with straw smashing is an effective way to reduce pollution and enhance soil fertility. In view of the high straw-returning yields, the complexity of manual operation, and the poor performance of existing machine-vision straw detection, this study introduces a novel form of uniformity detection for straws based on overlapping region analysis. The proposed image-processing technique overcomes the inefficiency and low precision of manual identification of straw uniformity. Debris in the grayscale image is removed according to region characteristics, and by combining morphological operations with overlapping region analysis in low-density cases, straws of the appropriate length can be identified and their uniformity then evaluated. Compared with traditional threshold segmentation methods, the innovative overlapping region analysis offers accurate identification, fast operation, and high efficiency. Finally, the proposed algorithm was verified by detecting uniformity in low-density cases, achieving an average accuracy rate of 97.69% and providing a novel image recognition solution for automatic straw-mulching systems.

1. Introduction

To reduce the impact of straw burning and make full use of the resource, some countries have issued targeted policies on the comprehensive utilization of straw and achieved positive results, especially in terms of new energy and conservation tillage with straw mulching [1]. As the leading method of straw recycling in European and North American countries, straw mulching takes two forms: direct straw mulching during harvesting, or mulching with straw that has first been fed to livestock. Specifically, the amount of straw returned to the field is about 68% of the total straw produced in the United States. More than 60% of the total straw produced in Canada, 73% of that produced in the United Kingdom, more than 66% of the rice straw in Japan, and almost all of the rice-wheat straw in South Korea are directly returned to the field.
Due to high rice-wheat production in China, a large quantity of field straw will seriously affect sowing quality during the rice–wheat rotation period [2,3]. As a kind of direct straw-mulching method in China, conservation tillage with straw smashing is an effective method to reduce pollution and enhance fertility in agriculture. This is the main existing production pattern among the grain-producing areas in the middle and lower reaches of the Yangtze River [4,5]. In practice, nonuniform straw mulch will affect the growth of seedlings and hinder the smooth operation of fertilizer and sowing machines. Therefore, the uniform coverage of straw mulching in the field is still an urgent practical problem. Moreover, there is a lack of technology and equipment for straw crushing and returning satisfying such agronomic requirements. In recent years, reasonable mechanized smashing and returning technology has attracted the attention of researchers, where the straw uniformity was the index of the smashing level [6,7,8,9,10]. As shown in Figure 1, the straw-mulch machine, which was developed by Nanjing Institute of Agricultural Mechanization (Ministry of Agriculture and Rural Affairs, China), adjusts the spraying device to realize straw mulching with smashed straw after the manual identification of the shredded-straw uniformity. This machine includes a straw-crushing device, a straw-spraying device and a straw-dispersing device [11]. The straw-spraying device is mainly composed of the throwing impeller and the throwing pipeline (Figure 2). The crushed straw is blown along the pipeline under the action of the rotation of the impeller. During operation, the straw-crushing device and the straw-spraying device are powered by the tractor. After the residual straw in the field is picked and smashed by the straw-crushing device, the straw is lifted by the straw-spraying device, and it is evenly thrown back under the action of the dispersing device. Through this, the ultimate intention is to maintain shredded-straw uniformity in the coverage region. However, this traditional straw-mulching method requires manual operation, which not only consumes manpower and material resources, but also has a low working efficiency [6,7,8].
With the development of intelligent equipment and Internet of Things technology [12,13], machine vision and image-processing technologies have received extensive attention, especially for crop disease detection [14,15], spraying machines [16], and weed robots [17]. Advanced image-processing methods can greatly reduce costs and improve productivity [18]. From the perspective of agricultural product harvesting, the quantity and quality parameters can be considered as important indicators of agricultural products. By taking photos with electronic cameras, straws can be accurately identified to reduce artificial errors and provide technical solutions for the intelligent straw-mulching system.
Over the past few years, researchers have studied the advantages of agricultural image-processing technology. Tian et al. [19] applied machine vision technology to estimate weed density and size in real time, which can effectively reduce the amount of herbicide in corn and soybean fields. Coy et al. [20] proposed an unsupervised segmentation algorithm for the accurate estimation of vegetation coverage, which converted the original image into a certain color space (brightness, red-green, and blue-yellow), and the results showed that the accuracy rate was more than 85%. Hamuda et al. [21] summarized image-processing techniques for the extraction and segmentation of wild plants, including color-based, threshold-based, and learning-based segmentation. Hernández et al. [22] proposed a novel color-processing probability method, which can not only create a best optimal color model for plant/soil segmentation, but can also select the most appropriate color space in different situations.
However, studies on straw recognition technology are lacking. Kujawa et al. [23] utilized computer image analysis and a neural model to construct a classification of sewage sludge composted with maize straw and identify the early maturity stage. Jia et al. [24] introduced machine vision to obtain the numbers of corn plants and demarcated the geometric centers. Liu et al. [25] established a high-throughput quantitative image-processing method to analyze the decomposition degree of straws based on their color degradation in soil over a few days. Sun et al. [26] proposed a multi-scale retinal-enhanced image technology to obtain images of a cucumber canopy and improve crop performance. Based on a binary image-processing method, Eisa et al. [27] estimated the coverage rate of rice and wheat straws. Liu et al. [28] detected straw coverage by a multi-threshold segmentation algorithm, which combined the searching mechanism of the gray wolf algorithm with the differential evolution algorithm.
Although experts have made progress in estimating the straw-coverage rate as described above, no published research on estimating straw uniformity with a vision-based process is currently available. In practice, uniformity detection for straws is a challenging task because of overlapping, which makes the detection of targets difficult and cumbersome when using machine vision. Considering the high straw-returning yields, the complexity of manual operation, and the poor performance of existing straw detection with machine vision, this study focused on a novel morphological image recognition technology for solving overlapping problems in low-density cases, which also offers a valuable reference for investigating high-density overlapping issues. A fast visual-processing method based on color vectors and shape features, with the ability to remove debris according to region characteristics, was designed. Notably, straws of the appropriate length were identified by a mathematical morphological operation algorithm, and the uniformity was then calculated immediately. Above all, the straw detection approach in this study provides a novel image analysis solution for automatic straw-mulching systems, which can improve recognition accuracy and reduce working time.

2. Materials and Methods

Experiments were performed in an artificial laboratory field at the Nanjing Institute of Agricultural Mechanization, Jiangsu Province, China, during the daytime under dim sunlight or cloudy, windless conditions. The experimental soil was a loam. Shredded rice straws in the low-density case were investigated under uniform illumination. The Intel RealSense D435 camera is a stereo depth device that provides high-quality color and depth data for a variety of applications; with the advantages of low cost, compact size, and a wide field of view, it is well suited to robotics and to applications such as augmented and virtual reality. This digital camera, with a maximum resolution of 1920 × 1080 pixels, was fixed at a height of about 30 cm above the targets, and sixty RGB images were processed in Matlab 2019b (The MathWorks, Inc., Natick, MA, USA). Images were captured vertically whenever possible, and the strong contrast between the straws and the loam guaranteed reliable target extraction. In this study, the exposure parameter was set within 80~150, and the RGB frame rate of the camera was 30 fps.

3. Digital Image Pre-Processing

An RGB image uses a full-color model synthesized from the three primary colors of red, green, and blue. A typical original RGB image of the straws, with a resolution of 960 × 720 pixels, is shown in Figure 3. Working with a grayscale image significantly reduces storage space and accelerates the subsequent recognition; however, a grayscale image contains only brightness and no color information. The weighted average method is suitable for converting the full-color RGB image to grayscale while keeping the targets easy for human eyes to distinguish [29].
For convenience, the color value of any pixel in the RGB model is represented by:
$$C(i,j) = \{\, x \mid x \in [\,R(i,j)\ \ G(i,j)\ \ B(i,j)\,] \,\}$$
where $i$ ($j$) denotes the row (column) index of the pixel matrix, $R(i,j)$ denotes the red component value, $G(i,j)$ denotes the green component value, and $B(i,j)$ denotes the blue component value.
According to the indicator of the pixel components in RGB images, the weighted average value of the three primary color components is taken as the gray value, and its luminance formula is established as follows [29]:
$$C_g(i,j) = 0.299 \times R(i,j) + 0.587 \times G(i,j) + 0.114 \times B(i,j)$$
Based on the weighted average method, the straw image in grayscale was obtained (Figure 4).
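As an illustration, the sketch below reimplements the weighted-average conversion in the luminance formula above; it assumes the image has already been loaded into a NumPy array and is only an equivalent of the Matlab processing used by the authors, not their code.

```python
import numpy as np

def rgb_to_gray(rgb: np.ndarray) -> np.ndarray:
    """Weighted-average grayscale conversion: C_g = 0.299 R + 0.587 G + 0.114 B.

    `rgb` is an (H, W, 3) array; the result is an (H, W) float array.
    """
    weights = np.array([0.299, 0.587, 0.114])
    return rgb[..., :3].astype(float) @ weights
```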
The HSV and HSI models are usually powerful tools for distinguishing targets, since they present color and intensity intuitively [30]. Hue and saturation are regarded as the main indices in such models; however, they are not well suited to this lighting condition. To compare identification accuracy, the HSV and HSI models were both tested in this study, and the results are shown in Figure 5.
As presented in Figure 5, the luminance variation between the background and the straws in the laboratory images is relatively obvious and may serve as an effective cue for detecting the straws. The H and S components show that the straws are similar to the background, so the characteristics of the crop cannot be highlighted clearly. In contrast, the I component shows that the luminance variation enhances the contrast, and it was therefore regarded as the evaluation criterion in this study.
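For reference, a minimal sketch of extracting the HSV channels and the HSI intensity component (I is the mean of the R, G, and B channels) is given below; the use of matplotlib's rgb_to_hsv helper is an illustrative choice and not part of the original pipeline.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv  # HSV conversion; HSI intensity is computed directly

def hsv_and_intensity(rgb01: np.ndarray):
    """Return the H, S, V channels and the HSI intensity I = (R + G + B) / 3.

    `rgb01` is an (H, W, 3) float array scaled to [0, 1].
    """
    hsv = rgb_to_hsv(rgb01)
    intensity = rgb01.mean(axis=-1)
    return hsv[..., 0], hsv[..., 1], hsv[..., 2], intensity
```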

4. Image Smoothing by Spatial Filtering

Generally, random noise exists in images, causing random fluctuations in pixel values. To eliminate such noise and thus smooth the image, spatial filtering is necessary [31].
To compare filtering results, various methods were applied, as listed in Table 1. Average filtering is a popular linear method that averages the pixel values over the whole window; however, it fails to protect image details and blurs the image, as shown in Figure 6a. Frequency-domain filtering is another widespread method in engineering and is carried out with the Fourier transform [32]. The flow diagram of frequency-domain filtering is presented in Figure 7, with the filter transfer function defined as $H(u,v)$. To eliminate noise in an image, low-pass filtering is usually adopted to remove the high-frequency components. As shown in Table 1, the parameters of the frequency-domain filters are defined as follows: $D_0 > 0$ is the cutoff distance, $D(u,v)$ represents the distance between the point $(u,v)$ and the center of the filter template, and $\sigma$ is the standard deviation. The classical low-pass filtering result is shown in Figure 6b. Gaussian filtering is also widely used in image processing to reduce Gaussian noise [15]; its result is presented in Figure 6c.
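A minimal sketch of frequency-domain Gaussian low-pass filtering with the transfer function from Table 1 is shown below, using NumPy's FFT routines; the value of sigma is an illustrative assumption rather than the setting used in the paper.

```python
import numpy as np

def gaussian_lowpass(gray: np.ndarray, sigma: float = 30.0) -> np.ndarray:
    """Frequency-domain Gaussian low-pass filter, H(u, v) = exp(-D^2(u, v) / (2 sigma^2)),
    where D(u, v) is the distance from the centre of the (shifted) spectrum."""
    rows, cols = gray.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D2 = u[:, None] ** 2 + v[None, :] ** 2           # squared distance to the spectrum centre
    H = np.exp(-D2 / (2.0 * sigma ** 2))             # Gaussian transfer function
    F = np.fft.fftshift(np.fft.fft2(gray))           # centred spectrum of the image
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
```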
In brief, linear spatial filtering computes a sum of products (i.e., a linear operation), whereas nonlinear spatial filtering sorts the pixel values and updates the center pixel based on the sorted values. For example, nonlinear adaptive median filtering replaces the center pixel with the median of a filter window of adaptive size [33]; this approach protects sharp edges and reasonably preserves the original information. The adaptive median filter was therefore remarkably effective in this study, and the filtering result is shown in Figure 8, where $S_{\max}$ denotes the maximum allowed size of the adaptive filter window.
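The sketch below outlines one common formulation of the adaptive median filter, with the window growing up to $S_{\max} \times S_{\max}$; it follows the standard textbook algorithm rather than the authors' exact implementation and is written for clarity rather than speed.

```python
import numpy as np

def adaptive_median_filter(gray: np.ndarray, s_max: int = 7) -> np.ndarray:
    """Adaptive median filtering: enlarge the window until the local median is not an
    impulse (strictly between the local min and max); keep the pixel if it is itself
    not an impulse, otherwise replace it by the median."""
    pad = s_max // 2
    padded = np.pad(gray.astype(float), pad, mode="edge")
    out = gray.astype(float).copy()
    rows, cols = gray.shape
    for i in range(rows):
        for j in range(cols):
            for s in range(3, s_max + 1, 2):          # window sizes 3, 5, ..., s_max
                half = s // 2
                win = padded[i + pad - half:i + pad + half + 1,
                             j + pad - half:j + pad + half + 1]
                z_min, z_med, z_max = win.min(), np.median(win), win.max()
                if z_min < z_med < z_max:             # median is not an impulse
                    if not (z_min < gray[i, j] < z_max):
                        out[i, j] = z_med             # replace the impulsive centre pixel
                    break
                if s == s_max:                        # window limit reached
                    out[i, j] = z_med
    return out
```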

5. Image Threshold Segmentation Algorithm

Image thresholding is one of many active segmentation methods; it divides the image pixels into several classes. Owing to its simple calculation, high efficiency, and stable performance, thresholding has been widely applied in image processing.

5.1. Threshold Segmentation by Setting Manually

In fact, different targets are separated by setting a grayscale threshold, after which image binarization can be carried out. To distinguish the target from the background, the straw image threshold was assumed to be $Y$, and the binarized image was defined as:
$$B(x,y) = \begin{cases} a, & C_g(x,y) > Y \\ b, & C_g(x,y) \le Y \end{cases}$$
where $a = 1$ and $b = 0$. By setting the threshold value manually, the original typical straw image was segmented as shown in Figure 9. Obviously, the segmentation accuracy then depends on manual tuning rather than automatic selection.
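A minimal sketch of this manual binarization is given below; it assumes a grayscale image normalized to [0, 1], and the threshold of 0.4 mirrors the setting used in Figure 9.

```python
import numpy as np

def binarize(gray01: np.ndarray, threshold: float = 0.4) -> np.ndarray:
    """Manual thresholding: pixels above the threshold become 1 (straw), the rest 0."""
    return (gray01 > threshold).astype(np.uint8)
```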

5.2. Adaptive Global Thresholding

In the adaptive global threshold method, an initial estimate is combined with an iterative optimization algorithm. The basic steps can be described as follows:
(1) Select an initial global threshold $Y_{ini} = \frac{Y_{\min} + Y_{\max}}{2}$, where $Y_{\min}$ and $Y_{\max}$ denote the minimum and maximum grayscale values of the image.
(2) Segment the image into two sets of pixels according to the threshold: $C_{g1} = \{C_g(x,y) > Y_{ini}\}$ and $C_{g2} = \{C_g(x,y) \le Y_{ini}\}$.
(3) Calculate the mean gray value of each pixel set: $AV_1 = \frac{\sum_{(x,y) \in C_{g1}} C_g(x,y)}{\mathrm{num}(C_{g1})}$ and $AV_2 = \frac{\sum_{(x,y) \in C_{g2}} C_g(x,y)}{\mathrm{num}(C_{g2})}$.
(4) Update the threshold as $Y_{up} = \frac{AV_1 + AV_2}{2}$ and repeat steps (2)–(3) until the change in $Y_{up}$ between iterations falls below a preset tolerance, giving the final global threshold.
Adaptive global threshold methods perform threshold segmentation based on the variation of gray levels, and then execute the binarization procedure. This method fully considers the characteristics of each region in an image and can easily highlight the target as distinct from the background. The segmentation result with adaptive global thresholding is shown in Figure 10.
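The iterative procedure of steps (1)–(4) can be sketched as follows; the convergence tolerance is an illustrative assumption.

```python
import numpy as np

def iterative_global_threshold(gray: np.ndarray, tol: float = 0.5) -> float:
    """Adaptive global threshold: start from the mid-range value and repeatedly
    average the mean gray levels of the two classes until the threshold converges."""
    t = (float(gray.min()) + float(gray.max())) / 2.0
    while True:
        above = gray[gray > t]
        below = gray[gray <= t]
        if above.size == 0 or below.size == 0:        # degenerate split: keep current value
            return t
        t_new = (above.mean() + below.mean()) / 2.0
        if abs(t_new - t) < tol:
            return t_new
        t = t_new
```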

5.3. Otsu’s Optimal Global Thresholding

In addition, Otsu’s optimal global threshold processing has been adopted in this study [34], and the basic principle of the algorithm is described as follows:
(1) According to the chosen threshold $Y = k$, the pixels are divided into two sets, $C_1$ and $C_2$, with gray levels $[1, 2, \ldots, k]$ and $[k+1, k+2, \ldots, L]$, where $L$ denotes the number of gray levels. The occurrence probabilities of sets $C_1$ and $C_2$ are $P_1(k) = \sum_{i=1}^{k} p_i$ and $P_2(k) = \sum_{i=k+1}^{L} p_i = 1 - P_1(k)$, where $p_i$ is the occurrence probability of pixels with level $i$.
(2) Define the within-class, between-class, and total variances as $\sigma_w^2(k) = P_1(k)\sigma_1^2 + P_2(k)\sigma_2^2$, $\sigma_b^2(k) = P_1(k)[\eta_1 - \eta_t]^2 + P_2(k)[\eta_2 - \eta_t]^2$, and $\sigma_t^2 = \sum_{i=1}^{L} (i - \eta_t)^2 p_i$, where $\sigma_1^2 = \sum_{i=1}^{k} (i - \eta_1)^2 p_i / P_1(k)$, $\sigma_2^2 = \sum_{i=k+1}^{L} (i - \eta_2)^2 p_i / P_2(k)$, $\eta_1 = \sum_{i=1}^{k} i\, p_i / P_1(k)$, $\eta_2 = \sum_{i=k+1}^{L} i\, p_i / P_2(k)$, and $\eta_t = \sum_{i=1}^{L} i\, p_i$ is the global mean gray level.
By searching for the optimal threshold $Y$, Otsu's method maximizes the objective functions $\lambda = \sigma_b^2 / \sigma_w^2$, $\kappa = \sigma_t^2 / \sigma_w^2$, and $\gamma = \sigma_b^2 / \sigma_t^2$. Among these, the discriminant $\gamma$ as a function of $k$ serves as the separability indicator for evaluating the threshold at level $k$. Using Otsu's threshold algorithm described above, the processing result shown in Figure 11 was obtained.
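For illustration, a compact NumPy sketch of Otsu's method on an 8-bit grayscale image is shown below; it maximizes the between-class variance, which is equivalent to maximizing the discriminant $\gamma$ because the total variance does not depend on $k$.

```python
import numpy as np

def otsu_threshold(gray_u8: np.ndarray) -> int:
    """Otsu's method: choose the level k that maximizes the between-class variance."""
    hist, _ = np.histogram(gray_u8, bins=256, range=(0, 256))
    p = hist.astype(float) / hist.sum()               # level probabilities p_i
    levels = np.arange(256)
    eta_t = (levels * p).sum()                        # global mean gray level
    best_k, best_sigma_b2 = 0, -1.0
    P1, m1 = 0.0, 0.0                                 # cumulative probability and first moment
    for k in range(256):
        P1 += p[k]
        m1 += k * p[k]
        P2 = 1.0 - P1
        if P1 == 0.0 or P2 == 0.0:
            continue
        eta1, eta2 = m1 / P1, (eta_t - m1) / P2       # class means
        sigma_b2 = P1 * (eta1 - eta_t) ** 2 + P2 * (eta2 - eta_t) ** 2
        if sigma_b2 > best_sigma_b2:
            best_k, best_sigma_b2 = k, sigma_b2
    return best_k
```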

6. Segmentation Algorithm Based on Overlapping Region Analysis

6.1. Target Extraction with Region Characteristics

The image segmentation algorithm based on region characteristics divides the image into a series of independent regions according to the similarity of the pixel features, so that each region possesses consistent characteristics.
Furthermore, the area-based filter of connected regions can remove the miscellaneous noise. According to the morphology of the straws, the target recognition can be completed through calculating the external rectangular aspect ratio of the targets [35,36]. The whole image region can be represented by R, which is divided into n regions (R1, R2, …, Rn). If adjacent pixel values in the image are nonzero, these pixels can be considered as a connected region. According to the shape characteristic of the pixel matrix, the connected directions of two adjoining pixels can be defined as horizontal, vertical, and diagonal directions, as shown in Figure 12.
Obviously, discontinuous parts remain in the straw image and do not contribute to the objects. Based on the shape feature of the straws, binary denoising can be used to obtain the slender target. In this study, the 8-connected template was applied to mark the grayscale image, where the background pixels were denoted as 0 and the other pixels were denoted as 1, 2, ..., n [37]. According to the area sizes and region shapes of segmented targets, the discontinuous parts were excluded to obtain the result in Figure 13.
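A minimal sketch of this debris-removal step is given below, using SciPy's connected-component labeling with an 8-connected structure; the area and aspect-ratio limits are illustrative assumptions, not the thresholds used by the authors.

```python
import numpy as np
from scipy import ndimage

def remove_debris(binary: np.ndarray, min_area: int = 200, min_aspect: float = 3.0) -> np.ndarray:
    """Keep only slender 8-connected regions whose area and external-rectangle
    aspect ratio mark them as straw-like; everything else is treated as debris."""
    structure = np.ones((3, 3), dtype=int)            # 8-connected neighbourhood
    labels, _ = ndimage.label(binary, structure=structure)
    cleaned = np.zeros_like(binary)
    for lab, box in enumerate(ndimage.find_objects(labels), start=1):
        if box is None:
            continue
        region = labels[box] == lab                   # pixels of this region inside its bounding box
        h = box[0].stop - box[0].start
        w = box[1].stop - box[1].start
        aspect = max(h, w) / max(1, min(h, w))        # external rectangle aspect ratio
        if region.sum() >= min_area and aspect >= min_aspect:
            cleaned[box][region] = 1
    return cleaned
```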

6.2. Morphological Operation Based on Overlapping Region Analysis

According to the operational specifications and requirements provided in the Chinese National Standard GB/T 24675.6-2009 “Conservation tillage equipment-Smashed straw machine”, the length qualification rate of shredded straws can be a critical factor affecting the uniformity level [4]. Traditional region segmentations are inefficient, particularly in occluded situations. As a result, this study concentrates on separating the straws one by one with a novel morphological approach.
To apply the overlapping region algorithm, a three-step process on the binary image was proposed:
(i) A thinning-based skeletonization algorithm is applied to extract the straw skeletons.
(ii) Endpoints are identified in the skeleton image with the Harris–Stephens detector.
(iii) Each straw skeleton is extracted by the overlapping region optimization, and the corresponding straw length is calculated.
The first step is to skeletonize the straws with the one-pixel-thick skeleton method [38]; a representative skeleton image is shown in Figure 14. The second step is to find all endpoints in the skeleton image with the Harris–Stephens algorithm [39]: a feature point with only one skeleton pixel among its adjacent positions is regarded as an endpoint. The identified endpoints are marked with red dots in Figure 14.
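The sketch below illustrates steps (i) and (ii) under two stated substitutions: skeletonization via scikit-image's thinning-based skeletonize, and endpoint detection by counting 8-connected skeleton neighbours instead of the Harris–Stephens detector used in the paper. It approximates the described pipeline and is not the authors' implementation.

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize            # thinning to a one-pixel-wide skeleton

def skeleton_endpoints(binary: np.ndarray):
    """Skeletonize the straw mask and locate endpoints, i.e. skeleton pixels with
    exactly one skeleton neighbour in the 8-connected neighbourhood (a simple
    substitute for the Harris-Stephens corner detector)."""
    skel = skeletonize(binary > 0)
    kernel = np.ones((3, 3), dtype=int)
    kernel[1, 1] = 0                                   # exclude the centre pixel
    neighbours = ndimage.convolve(skel.astype(int), kernel, mode="constant", cval=0)
    endpoints = skel & (neighbours == 1)
    return skel, np.argwhere(endpoints)                # skeleton mask and endpoint coordinates
```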
The third step is critical: extracting each skeleton with the novel overlapping region optimization. As shown in Figure 14, the image contains nine straw skeletons, some of which intersect; four endpoints of the intersecting skeletons are denoted as A, B, C, and D. To split each skeleton, the line segments AB and CD should be extracted. For instance, endpoint A can serve as the initial searching point, with its adjacent pixels in the 8-connected neighborhood taken as candidate points. If the number of candidate points is more than two, the most suitable candidate should be selected as the follow-up searching point based on the optimization rule in Equation (6).
As shown in Figure 15, the slant angle of the line that passes through the initial searching point and the ith candidate point can be established by:
$$\mathrm{angle}_i = \tan^{-1}(k_i), \qquad k_i = \frac{y_i - y_{initial}}{x_i - x_{initial}}$$
where $(x_{initial}, y_{initial})$ represents the coordinates of the initial searching pixel; $(x_i, y_i)$ denotes the coordinates of the $i$th candidate point around the previous searching point; and $k_i$ and $\mathrm{angle}_i$ denote, respectively, the slope and the slant angle of the line that passes through the initial searching point and the $i$th candidate point.
In Figure 15, the slant angle of the line that passes through the previous searching point and the initial searching point can be obtained by:
$$\mathrm{angle}_{ps} = \tan^{-1}(k_{ps}), \qquad k_{ps} = \frac{y_{ps} - y_{initial}}{x_{ps} - x_{initial}}$$
where $(x_{ps}, y_{ps})$ represents the coordinates of the previous searching pixel, and $k_{ps}$ and $\mathrm{angle}_{ps}$ denote, respectively, the slope and the slant angle of the line that passes through the previous searching point and the initial searching point.
Notably, the future searching point can be established with the optimization approach as follows:
$$\text{If } |\mathrm{angle\_error}_k| = \min\big(|\mathrm{angle\_error}|\big) \quad (k = 1, \ldots, n), \text{ then } (x_k, y_k) \text{ is chosen as the follow-up searching point,}$$
where $\mathrm{angle\_error}_i = \mathrm{angle}_i - \mathrm{angle}_{ps}$ $(i = 1, \ldots, n)$, $n$ is the total number of candidate points around the previous searching point, $\mathrm{angle\_error}_k$ denotes the $k$th element of $\mathrm{angle\_error}$, $|\cdot|$ denotes the absolute value, and $\min(\cdot)$ denotes the minimum.
Similarly, the searching point around the branchpoint can be determined, and the searching program will stop once one of the other endpoints is found. As shown in Figure 16a, the line segment AB can be obtained by using the novel overlapping region analysis, and the length of AB can be calculated at once. Then, the rest of the straw skeletons in this image can be precisely extracted by using the same principle, as shown in Figure 16b,c. Finally, the length qualification rate of the shredded straws, as well as the uniformity of the straws, can be easily estimated.
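A minimal sketch of the follow-up searching-point selection in Equation (6) is given below; the function and variable names are hypothetical, and a small epsilon guards against vertical lines.

```python
import math

def next_search_point(prev_pt, initial_pt, candidates):
    """Pick the follow-up searching point: the candidate whose slant angle through the
    initial point deviates least from the angle of the line through the previous
    searching point and the initial point (the minimum |angle_error| rule)."""
    eps = 1e-9                                         # avoids division by zero for vertical lines

    def slope_angle(p, q):
        # tan^-1 of the slope of the line through points p = (x, y) and q = (x, y)
        return math.atan((q[1] - p[1]) / (q[0] - p[0] + eps))

    angle_ps = slope_angle(initial_pt, prev_pt)        # slant angle of the incoming line
    errors = [abs(slope_angle(initial_pt, c) - angle_ps) for c in candidates]
    return candidates[errors.index(min(errors))]
```

For example, next_search_point((10, 12), (11, 12), [(12, 12), (12, 13), (11, 13)]) returns (12, 12), the candidate that best continues the incoming line.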

7. Results and Discussion

To verify the novel overlapping region method, shredded straws in low-density overlapping situations were considered. For each of the three cases introduced in Table 2, Figure 17 shows that the length of each straw was accurately calculated with the proposed optimization algorithm, and the results demonstrate that the approach is effective for the uniformity detection of straws, with an average accuracy rate of 97.69%.
The pattern recognition technology described here is new and simple, yet quite effective. One can therefore conclude that the detection of slender bodies is an efficient way to overcome the low-density overlapping problem, because the nonvisible parts of the objects can be calculated and approximated accurately. In addition, the problems caused by target intersection can easily be eliminated with the proposed scheme. Determining the correct searching points is essential before starting the following optimization step, in which the algorithm attempts to find the best searching point belonging to the same body. This step prevents the merging of pixels that do not belong to the same body, thereby ensuring high-quality detection. In practice, the high-performance approach presented in this study can be applied to the low-density overlapping problems of slender objects and has broad application prospects in straw uniformity detection for automatic straw-mulching systems.

8. Conclusions

This study investigated a novel image recognition technique for straw uniformity in low-density cases based on overlapping region analysis. The adaptive global threshold method and Otsu's optimal global threshold segmentation, which are convenient means of separating objects from the background, were adopted to extract the straws, and the debris in the grayscale map was removed according to region characteristics. Additionally, a novel morphological optimization approach was developed to extract the pattern information of overlapping straws, and the results verified the effectiveness of the proposed algorithm, which provides a novel image recognition solution for automatic straw-mulching systems.
Compared with traditional threshold segmentation methods, the innovative overlapping region analysis offers accurate identification, fast operation, and high efficiency. In conclusion, this approach is suitable for the high-quality detection of slender bodies in low-density overlapping cases and offers a valuable reference for investigating high-density overlapping issues. In future work, uniformity detection in high-density overlapping cases will be investigated for better application in actual production.

Author Contributions

Conceptualization, F.G.; methodology, J.M.; software, J.M.; validation, F.G. and F.W.; formal analysis, J.M.; investigation, J.M.; resources, F.G.; data curation, H.X.; writing—original draft preparation, J.M.; writing—review and editing, Z.H.; visualization, H.Y.; supervision, Z.H. and F.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Jiangsu Agricultural Science and Technology Innovation Fund, grant number CX(20)3066, National Natural Science Foundation of China, grant number 31901418, and Central Public-interest Scientific Institution Basal Research Fund, grant number S202120.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Wang, H.; Wang, F.; Sun, R.; Gao, C.; Wang, Y.; Sun, N.; Wang, L.; Bi, Y. Policies and regulations of crop straw utilization of foreign countries and its experience and inspiration for China. Trans. Chin. Soc. Agric. Eng. 2016, 32, 216–222. (In Chinese with English Abstract)
2. Xiao, L.G.; Zhao, R.Q.; Kuhn, N.J. Straw mulching is more important than no tillage in yield improvement on the Chinese Loess Plateau. Soil Tillage Res. 2019, 194, 104314.
3. Akhtar, K.; Wang, W.; Ren, G.; Khan, A.; Feng, Y.; Yang, G. Changes in soil enzymes, soil properties, and maize crop productivity under wheat straw mulching in Guanzhong, China. Soil Tillage Res. 2018, 182, 94–102.
4. Shi, Y.Y.; Sun, X.; Wang, X.C.; Hu, Z.C.; David, N.; Hu, Z.C. Numerical simulation and field tests of minimum-tillage planter with straw smashing and strip laying based on EDEM software. Comput. Electron. Agric. 2019, 166, 105021.
5. Zhao, S.H.; Wang, J.Y.; Yang, C.; Chen, J.Q.; Yang, R.Q. Design and experiment of stubble chopper for interaction with subsoiler. Trans. Chin. Soc. Agric. Mach. 2019, 50, 59–67. (In Chinese with English Abstract)
6. Gu, F.W.; Hu, Z.C.; Chen, Y.Q.; Wu, F. Development and experiment of peanut no-till planter under full wheat straw mulching based on "clean area planting". Trans. Chin. Soc. Agric. Eng. 2016, 32, 15–23. (In Chinese with English Abstract)
7. Luo, W.W.; Hu, Z.C.; Wu, F.; Gu, F.; Xu, H.; Chen, Y. Design and optimization for smashed straw guide device of wheat clean area planter under full straw field. Trans. Chin. Soc. Agric. Eng. 2019, 35, 1–10. (In Chinese with English Abstract)
8. Luo, W.W.; Gu, F.W.; Wu, F.; Xu, H.; Chen, Y.; Hu, Z. Design and experiment of wheat planter with straw crushing and inter-furrow collecting-mulching under full amount of straw and root stubble cropland. Trans. Chin. Soc. Agric. Mach. 2019, 50, 42–52. (In Chinese with English Abstract)
9. Khokan, K.S.; Xu, C.L.; Wang, X.Y.; Li, M.; Li, L.; Liu, G. Band tillage with fertilizer application for unpuddled transplanting rice in northeast of China. Int. J. Agric. Biol. Eng. 2016, 9, 73–83.
10. Wang, W.W.; Zhu, C.X.; Chen, L.Q.; Li, Z.; Huang, X.; Li, J. Design and experiment of active straw-removing anti-blocking device for maize no-tillage planter. Trans. Chin. Soc. Agric. Eng. 2017, 33, 10–17. (In Chinese with English Abstract)
11. Yan, W.; Wu, N.; Gu, F.W.; Lin, D.; Zhou, X.; Hu, Z. Parameter optimization and experiment for the power consumption of impeller-blower. J. China Agric. Univ. 2017, 22, 99–106. (In Chinese with English Abstract)
12. Tzounis, A.; Katsoulas, N.; Bartzanas, T.; Kittas, C. Internet of Things in agriculture, recent advances and future challenges. Biosyst. Eng. 2017, 164, 31–48.
13. Cicioğlu, M.; Çalhan, A. Smart agriculture with internet of things in cornfields. Comput. Electr. Eng. 2021, 90, 106982.
14. Patrício, D.I.; Rieder, R. Computer vision and artificial intelligence in precision agriculture for grain crops: A systematic review. Comput. Electron. Agric. 2018, 153, 69–81.
15. Sharif, M.; Khan, M.A.; Iqbal, Z.; Azam, M.F.; Lali, M.I.U.; Javed, M.Y. Detection and classification of citrus diseases in agriculture based on optimized weighted segmentation and feature selection. Comput. Electron. Agric. 2018, 150, 220–234.
16. Tang, J.L.; Chen, X.Q.; Miao, R.H.; Dong, W. Weed detection using image processing under different illumination for site-specific areas spraying. Comput. Electron. Agric. 2016, 122, 103–111.
17. Wang, A.C.; Zhang, W.; Wei, X.H. A review on weed detection using ground-based machine vision and image processing techniques. Comput. Electron. Agric. 2019, 158, 226–240.
18. Xuan, G.T.; Gao, C.; Shao, Y.Y.; Wang, X.; Wang, Y.; Wang, K. Maturity determination at harvest and spatial assessment of moisture content in okra using Vis-NIR hyperspectral imaging. Postharvest Biol. Technol. 2020, 180, 111597.
19. Tian, L.; Reid, J.F.; Hummel, J.W. Development of a precision sprayer for site-specific weed management. Trans. ASAE 1999, 42, 893–900.
20. Coy, A.; Rankine, D.; Taylor, M.; Nielsen, D.C.; Cohen, J. Increasing the accuracy and automation of fractional vegetation cover estimation from digital photographs. Remote Sens. 2016, 8, 474.
21. Hamuda, E.; Glavin, M.; Jones, E. A survey of image processing techniques for plant extraction and segmentation in the field. Comput. Electron. Agric. 2016, 125, 184–199.
22. Hernández-Hernández, J.L.; García-Mateos, G.; González-Esquiva, J.M.; Escarabajal-Henarejos, D.; Ruiz-Canales, A.; Molina-Martínez, J.M. Optimal color space selection method for plant/soil segmentation in agriculture. Comput. Electron. Agric. 2016, 122, 124–132.
23. Kujawa, S.; Nowakowski, K.; Tomczak, R.J.; Dach, J.; Boniecki, P.; Weres, J.; Mueller, W.; Raba, B.; Piechota, T.; Carmona, P.C.R. Neural image analysis for maturity classification of sewage sludge composted with maize straw. Comput. Electron. Agric. 2014, 109, 302–310.
24. Jia, H.L.; Wang, G.; Guo, M.Z.; Shah, D.; Jiang, X.; Zhao, J. Methods and experiments of obtaining corn population based on machine vision. Trans. Chin. Soc. Agric. Eng. 2015, 31, 215–220. (In Chinese with English Abstract)
25. Liu, Y.T.; Zhao, S.Q.; Zhu, Q.; Ding, W. Image grey value analysis for estimating the effect of microorganism inoculants on straws decomposition. Comput. Electron. Agric. 2016, 128, 120–126.
26. Sun, G.X.; Li, Y.B.; Wang, X.C.; Hu, G.Y.; Wang, X.; Zhang, Y. Image segmentation algorithm for greenhouse cucumber canopy under various natural lighting conditions. Int. J. Agric. Biol. Eng. 2016, 9, 130–138.
27. Eisa, B.; Cedric, Q.; Ding, Q.S.; Zahir, T. Mass-based image analysis for evaluating straw cover under high-residue farming conditions in rice-wheat cropping system. Agric. Res. 2017, 6, 359–367.
28. Liu, Y.Y.; Wang, Y.Y.; Yu, H.Y.; Qin, M.; Sun, J. Detection of straw coverage rate based on multi-threshold image segmentation algorithm. Trans. Chin. Soc. Agric. Mach. 2018, 49, 27–35, 55. (In Chinese with English Abstract)
29. Grundland, M.; Dodgson, N.A. Decolorize: Fast, contrast enhancing, color to grayscale conversion. Pattern Recognit. 2007, 40, 2891–2896.
30. Zhi, S.Q.; Cui, Y.N.; Deng, J.X.; Du, W. An FPGA-based simple RGB-HSI space conversion algorithm for hardware image processing. IEEE Access 2020, 8, 173838–173853.
31. Lin, G.C.; Tang, Y.C.; Zou, X.J.; Xiong, J.; Fang, Y. Color-, depth-, and shape-based 3D fruit detection. Precis. Agric. 2020, 21, 1–17.
32. Bansal, R.; Lee, W.; Satish, S. Green citrus detection using fast Fourier transform (FFT) leakage. Precis. Agric. 2013, 14, 59–70.
33. Eng, H.L.; Ma, K.K. Noise adaptive soft-switching median filter. IEEE Trans. Image Process. 2001, 10, 242–251.
34. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66.
35. Koc, A.B. Determination of watermelon volume using ellipsoid approximation and image processing. Postharvest Biol. Technol. 2007, 45, 366–371.
36. Jin, C.Q.; Cai, Z.Y. A circular arc approximation algorithm for cucumber classification with image analysis. Postharvest Biol. Technol. 2020, 165, 111184.
37. Haralick, R.M.; Shapiro, L.G. Computer and Robot Vision; Prentice Hall: Englewood Cliffs, NJ, USA, 1993; Volume 2.
38. Schmitt, M. One Pixel Thick Skeletons; Serra, J., Soille, P., Eds.; Springer: Berlin/Heidelberg, Germany, 1994.
39. Harris, C.; Stephens, M. A combined corner and edge detector. In Proceedings of the 4th Alvey Vision Conference, Manchester, UK, 31 August–2 September 1988.
Figure 1. The straw-mulching machine with straw-spraying device.
Figure 2. The structure of straw-spraying device.
Figure 3. Original RGB image of the straws.
Figure 4. Grayscale map obtained by weighted average method.
Figure 5. (a) HSV color model. (b) HSI color model. (c) H component value. (d) S component value. (e) I component value.
Figure 6. (a) Average filtering. (b) Classical low-pass filtering. (c) Gaussian low-pass filtering.
Figure 7. Block diagram of frequency-domain filtering.
Figure 8. Adaptive median filtering ($S_{\max} = 7$).
Figure 9. Threshold segmentation by setting manually (Y = 0.4).
Figure 10. Adaptive global thresholding.
Figure 11. Straw image obtained by Otsu's thresholding.
Figure 12. The 4-connected and 8-connected pixels.
Figure 13. Target extraction by using morphological principles.
Figure 14. Skeleton image with endpoints.
Figure 15. Diagram of the proposed searching algorithm.
Figure 16. (a) Line segment AB with length = 443 pixels. (b) Line segment CD with length = 699 pixels. (c) Other line segments with different length values in the skeleton image.
Figure 17. Identification results with the proposed algorithm in three cases. (a) Case 1: seven targets. (b) Case 2: eight targets. (c) Case 3: nine targets.
Table 1. Common filters in practice.

Method | Classification | Transfer Function
Average filter | linear | —
Classical low-pass filter | frequency domain | $H(u,v) = \begin{cases} 1, & D(u,v) \le D_0 \\ 0, & D(u,v) > D_0 \end{cases}$
Gaussian low-pass filter | frequency domain | $H(u,v) = e^{-D^2(u,v)/2\sigma^2}$
Adaptive median filter | nonlinear | —
Table 2. Identification accuracy rate in low-density overlapping cases.

Case | Branchpoint Number | Accuracy Rate (%)
Seven targets | 7 | 98.23
Eight targets | 10 | 97.68
Nine targets | 12 | 97.16
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
