Article

Pattern Orientation Finder (POF): A Robust, Bio-Inspired Light Algorithm for Pattern Orientation Measurement

by
Alessandro Carlini
* and
Michel Paindavoine
Laboratory for Research on Learning and Development (LEAD), CNRS UMR 5022, University of Burgundy, 21000 Dijon, France
*
Author to whom correspondence should be addressed.
Electronics 2023, 12(20), 4354; https://doi.org/10.3390/electronics12204354
Submission received: 12 September 2023 / Revised: 12 October 2023 / Accepted: 17 October 2023 / Published: 20 October 2023
(This article belongs to the Special Issue Machine Vision and 3D Sensing in Smart Agriculture)

Abstract

We present the Pattern Orientation Finder (POF), an innovative, bio-inspired algorithm for measuring the orientation of patterns of parallel elements. The POF was developed to obtain an autonomous navigation system for drones inspecting vegetable cultivations. The main challenge was to obtain an accurate and reliable measurement of orientation despite the high level of noise that characterizes aerial views of vegetable crops. The POF algorithm is computationally light and operable on embedded systems. We assessed the performance of the POF algorithm using images of different cultivation types. The outcomes were examined in light of the accuracy and reliability of the measurement; special attention was paid to the relationship between performance and parameterization. The results show that the POF guarantees excellent performance, even under the most challenging conditions, and that it remains highly reliable and robust in high-noise contexts. Finally, tests on images from different sectors suggest that the POF has excellent potential for application in other fields as well.

1. Introduction

Autonomous vehicles, autonomous robots, and AI agents are becoming increasingly common [1,2]. The effectiveness of these autonomous systems strongly depends on the reliability of the information and the robustness of its extraction from the environment [3,4,5]. For these reasons, the conception of effective and efficient algorithms is becoming increasingly important. However, the effective extraction of information and features from real-world images, especially in the presence of high levels of noise, remains an open challenge [5,6].
In order to face this challenge and provide a solution for autonomous navigation, the present work introduces a new algorithm for measuring the orientation of patterns, named the Pattern Orientation Finder (POF). The orientation measurement of vegetable rows was the initial context for the development of the POF, which is intended to enable the autonomous flight of drones over cultivations. Nevertheless, subsequent tests revealed a broader field of effective application in which the functional characteristics of the POF offer a significant advantage. Orientation can be measured on any type of 2D image, including images captured via UAV systems such as drones, as well as satellite images, medical or technical imaging, or pictures of everyday life. The tests performed show that the POF algorithm is computationally robust and reliable (even under difficult and noisy conditions, such as in vegetable fields) and that it also achieves good results in the presence of disturbing elements. The Pattern Orientation Finder is the object of Patent Application INPI n. 1660598.
The next section describes the existing methodologies that serve as today's reference. Section 3 describes the POF algorithm and its application to the specific case of crop orientation. Section 4 describes the practical application of the POF, optimized for real images; Section 5 presents the results and their analysis. Section 6 shows a real application with generic parameters and with perturbations present in the image. Section 7 and the two Appendices A and B summarize the characteristics of the POF, highlight the possibilities of its application in different contexts, and introduce a variant characterized by the absence of convolution (the LOF).

2. Existing Methodologies

Existing methodologies: an unsatisfied need. Although research on autonomous navigation systems has proposed a large number of methods for the guidance of terrestrial wheeled robots [7,8,9,10], only a few methodologies appear to be available for their specific implementation in flying systems. Developing a dedicated solution therefore appears to be the most appropriate route if an effective and robust system is to be obtained. An image-based flying system collects images from aerial points of view, solves specific problems, must base its algorithm on varying available information, and requires a physically and computationally lightweight solution [11,12,13]. Methodologies that perform similar functions have been developed in other domains and are adaptable to the present purpose, but they generally offer insufficient degrees of efficiency, versatility, and/or robustness to noise. The Hough transform (HT) is certainly the most famous and widely adopted technique for geometric feature extraction (of lines and circles, for example). The classical HT detects structures sharing the same specific features, expressed by a mathematical formulation and a set of parameters, while the generalized HT is employed when an analytical description of the feature is not possible [14,15,16]. The HT is a powerful tool in shape analysis; it can be adapted to perform different types of shape detection and feature analyses [8,15,17,18,19]. Nevertheless, the HT has some important disadvantages, such as large memory requirements and, more importantly, a poor noise resistance that prevents reliable measurements in noisy environments.
Among the best-known methods, we should also mention the Histogram of Oriented Gradients (HOG). The philosophy underlying the HOG and its field of application differ from the purpose of the present work; however, there is a point in common in the first phases of calculating the HOG, which aim to define the orientation of certain elements in an image [20,21,22]. The difference between the HOG and the POF is nevertheless fundamental: the first is based on the calculation of the chromatic gradient, whereas the second estimates the maximum variance in luminance alternations. Being based on gradient calculation, the HOG, like the HT, is sensitive to noise, and its performance and reliability degrade on noisy images.
Previous work in other specific sectors required the development of dedicated methodologies for measuring the orientation of relevant elements within images. Some methods of interest are devoted to detecting the direction of arrows, for example, in medical imaging. These studies aim to determine the presence and the direction of single or multiple arrows within images, using different techniques such as evolving artificial neural networks, genetic algorithms, or the algorithmic detection of the geometric properties of the arrows [23,24,25,26]. Although their levels of effectiveness are interesting, the proposed methodologies are computationally expensive, and their transposition for the purpose of the present research appears arduous.
Last but not least, it is important to mention Deep Learning networks. Deep Learning networks are increasingly proposed as solutions for many different problems, thanks to their ability to learn, even under very complex conditions, and their high success rates. These networks have proven capable of carrying out even complex tasks [27,28,29]. However, Deep Learning networks also have some disadvantageous features, such as the high complexity of their computational structure, a long and complex training procedure, the need for large amounts of training data to ensure accurate and reliable results, and, above all, a very high computational cost [30,31,32,33]. Therefore, their use seems unsuitable for our needs. As shown below, the POF guarantees good results, usually better than those of previous methods, while typically requiring at least one order of magnitude fewer arithmetic operations (compared with Deep Networks, the computational advantage is even greater). This excellent result is achieved through the use of a computational structure inspired by biological image processing: a convolution with a Gabor filter, which constitutes the first processing layer of the POF. This processing structure takes inspiration from the work of Hubel and Wiesel concerning hierarchical information processing in the cat's visual cortex [34]. This structure was already the basis of previous bio-inspired architectures for computer vision, such as HMAX [35,36].
In the present context, the use of this architecture for the POF naturally falls within the field of Precision Agriculture. The POF allows for the reliable measurement of the orientation of rows, even in very noisy environments. By aligning the flight of the drone with the rows, the POF permits detailed monitoring over large areas without the high costs and lengthy times of inspection by an operator on foot or in a land vehicle, and without the high costs and safety risks involved in the use of larger aircraft. The aim is to locally monitor the condition of plants and soil and to provide them with water, fertilizer, and anti-parasite products in a quantified and localized manner, improving both the effectiveness and efficiency of production in accordance with environmental and sustainable development requirements [2,37,38,39].

3. Method: Pattern Orientation Finder Algorithm

We developed the Pattern Orientation Finder in order to provide (i) a computationally light algorithm able to run on mobile systems such as ARM processors and (ii) a reliable system for measuring pattern orientation that guarantees accurate results even in noisy and challenging conditions. The POF measurement relies on recurrent variations between adjacent elements of the pattern. The Orientation Index (OIN) parameter is introduced to “quantify” these variations. Repeatedly calculating the OIN along different orientations and then identifying the largest OIN value enables the orientation of the pattern to be obtained.
The algorithm can be applied in two equivalent ways:
-
The calculation of the OIN can be performed on just one image by performing the analysis according to the different orientations on the plane of the image;
-
Alternatively, it can be calculated while maintaining the same orientation of analysis but using different images, each of which reproduces the same pattern in a different orientation (for example, by rotating the image-capturing device).
We consider the second approach to be easier for aerial photographic systems where the aircraft or mounted camera could easily turn on its yaw axis (Figure 1, Panel b).
Having knowledge of the OIN values along a small number of orientations is sufficient to achieve good results: the interpolation of the available data allows us to search for the maximum value on the interpolating curve, which generally provides very accurate results. Given the symmetry of the orientation field, an analysis between 0 and 180 degrees is enough to explore the whole range of solutions. In this interval, five images usually seem to be sufficient to obtain an accurate result, but a larger number of samples provides a measurement with greater accuracy and precision.
The entire calculation of the OIN comprises two steps: the convolution of the image based on the Gabor filter to enhance the alternations between the elements of the pattern (Step 1) and a subsequent computation based on the pixel values in order to obtain a scalar value for each orientation, the OIN (Step 2). Figure 2 presents a flowchart of the whole procedure.
From top to bottom, the six panels summarize the main stages of the Pattern Orientation Finder's procedure. The procedures in Panels (a) to (d) are performed on each image (corresponding to Steps 1 and 2; see the manuscript for more details) and produce an Orientation Index (OIN) value for each orientation; Panels (e) and (f) then summarize the procedure that identifies the pattern's orientation. In detail, (a) any input image featuring three color layers is converted into a single layer. (b) The measurement procedure can be performed on the whole image or on a selected area (the “analysis window”; dotted rectangle). (c) The convolution, using a Gabor filter, is conducted on the analysis window. (d) The elements of the resulting matrix are summed by column to obtain a row vector (the Alignment Vector, AVe); the standard deviation of the elements of this vector provides an index of the alignment between the orientation of the pattern and the orientation of the analysis (Panel (c), orange arrows). (e) Knowledge of the OIN values for different directions enables a diagram of its distribution to be created. (f) The maximum of the OIN fitting curve identifies the orientation of the pattern.
  • Step 1: Image convolution
First, three-layer color images are reduced to single-layer images (Figure 2, Panel a). The way the image is converted into a single layer implicitly defines the criteria for identifying and differentiating elements of the pattern and, consequently, their orientation. The “classical” conversion to grayscale is the most common approach, but other methods that enhance the relevant information contained in the image are also possible. Our tests show that when applying the procedure to vegetal rows, simply selecting the blue layer improves the precision of the measurement.
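As a minimal illustration of this conversion step, the following sketch (in Python with NumPy; the function name and structure are our own illustrative choices, not the implementation used in the tests) shows both options: the blue-channel selection and the classical luminance-weighted grayscale conversion.

```python
import numpy as np

def to_single_layer(rgb, mode="blue"):
    """Reduce an HxWx3 color image to a single layer.

    mode="blue" simply selects the blue channel (which improved the
    precision of the measurement on vegetal rows in our tests);
    mode="gray" is the classical luminance-weighted conversion.
    """
    rgb = np.asarray(rgb, dtype=float)
    if mode == "blue":
        return rgb[..., 2]
    return rgb @ np.array([0.299, 0.587, 0.114])  # ITU-R BT.601 weights
```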
A convolution using a Gabor filter is then applied to the image (Figure 2, Panel c) (for information about convolution, see [14,15]). The aim of the convolution is to perform spatial filtering on the image, enhancing the pattern and filtering noise and irrelevant spatial scales. To this end, a single filter size—defined with respect to the pattern features—appears sufficient to produce accurate results. The Gabor filter equation is presented in Appendix A, and its parameters are described in Table 1. The convolution is performed along a single orientation with the single-size filter; consequently, each recurring convolution is computationally inexpensive and fast. As a convention, we arranged the Gabor filter in a vertical orientation, as indicated by the orange arrows in Figure 2 Panel (c).
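A possible implementation of Step 1 is sketched below under the same assumptions (Python with NumPy/SciPy; the function names, the default parameter values, and the use of an FFT-based convolution are our illustrative choices). The kernel follows EQ_1 in Appendix A, with θ fixed to a single value taken as the conventional vertical orientation.

```python
import numpy as np
from scipy.signal import fftconvolve

def make_gabor_kernel(size, lam, sigma, gamma, theta=0.0):
    """Gabor filter following EQ_1 (Appendix A); theta is fixed to a
    single value (the 'vertical' convention depends on the image axes)."""
    ax = np.arange(size) - (size - 1) / 2.0
    u1, u2 = np.meshgrid(ax, ax)
    u1r = u1 * np.cos(theta) + u2 * np.sin(theta)
    u2r = -u1 * np.sin(theta) + u2 * np.cos(theta)
    return (np.exp(-(u1r**2 + gamma**2 * u2r**2) / (2.0 * sigma**2))
            * np.cos(2.0 * np.pi * u1r / lam))

def step1_convolve(window, lam=20, sigma=4, gamma=2, g_size=10):
    """Step 1: one convolution of the analysis window with a single-size,
    single-orientation Gabor filter (defaults are the GOSet values)."""
    kernel = make_gabor_kernel(g_size, lam, sigma, gamma)
    return fftconvolve(window, kernel, mode="valid")
```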
  • Step 2: The Calculation of the OIN
The outcome of Step 1 is the matrix generated via the convolution; summing its elements by column generates the Alignment Vector (AVe) (Figure 2, Panel d). The sum is performed along the columns because it must correspond to the filter's orientation (i.e., the vertical orientation) in the initial convolution. Depending on the alignment between the orientations of the pattern and the analysis, the AVe is characterized either by an alternation of larger and smaller values or by roughly uniform values close to the average. Therefore, calculating the standard deviation of the AVe yields the OIN value, which provides a measurement of the alignment between the pattern's orientation and the orientation of the analysis.
As indicated previously, the repetition of Step 1 and Step 2 along different orientations allows for the identification of the pattern’s orientation, which corresponds to the higher value of the OIN. The search for the maximum is more effective and provides more accurate results when performed on a curve interpolating the measurement data. In our tests, the spline interpolation provided the best results.
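Putting the two steps together, the orientation search could look like the sketch below (illustrative Python, reusing step1_convolve from the previous sketch). Rotating the analysis window emulates the second of the two equivalent approaches described above, i.e., turning the camera on its yaw axis; five OIN samples over 0–180 degrees are completed by the 0-degree sample repeated at 180 degrees (the field is symmetric) and interpolated with a cubic spline, whose maximum gives the pattern orientation.

```python
from scipy.interpolate import CubicSpline
from scipy.ndimage import rotate

def orientation_index(conv):
    """Step 2: sum the convolved matrix by column to obtain the
    Alignment Vector (AVe); its standard deviation is the OIN."""
    return conv.sum(axis=0).std()

def find_pattern_orientation(window, angles_deg=np.arange(0.0, 180.0, 36.0),
                             **gabor_params):
    """Compute the OIN at each sampled orientation and return the angle
    maximizing a spline fitted through the samples."""
    oins = [orientation_index(step1_convolve(
                rotate(window, a, reshape=False, order=1, mode="reflect"),
                **gabor_params))
            for a in angles_deg]
    # 180-degree symmetry: the 0-degree sample also holds at 180 degrees
    ang = np.append(angles_deg, angles_deg[0] + 180.0)
    oin = np.append(oins, oins[0])
    spline = CubicSpline(ang, oin, bc_type="periodic")
    fine = np.linspace(0.0, 180.0, 1801)
    return fine[np.argmax(spline(fine))] % 180.0
```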
As described in Table 1, five parameters define the computational structure of the POF: three parameters define the Gabor filter’s shape (“λ”, “σ”, and “γ”), one parameter refers to the Gabor Filter Size (“G_Size”), and finally, the fifth parameter, Window Size (“Win_Size”), refers to the size of the area selected for the measurement of the pattern’s orientation.
Despite the apparent simplicity of the algorithm and the limited number of parameters that regulate its operation, the POF method exhibits interesting capabilities in terms of its accuracy and robustness to noise. In order to assess the performance of the method and to evaluate the relationship between its parameters and its effectiveness, we performed the battery of tests presented in the following paragraph. Our tests show that the POF provides very interesting results, even when its parameters are non-optimized for a specific work context; nevertheless, correct parameterization guarantees accurate measurements even under difficult conditions. Subsequently, we also present the results of a comparison with the HT method in which the POF shows a higher resistance to noise.

4. Tests

To assess the performance of the Pattern Orientation Finder, we performed a battery of tests using the ten images presented in Figure 1. This set of images represents different types of crops cultivated in rows. The images are characterized by different row widths, inter-row spacings, and features, different points of view and focal lengths, and different luminance and contrast levels. All original images are color, compressed jpg files measuring 600 × 450 pixels in size. This set of images also enables an assessment of the great variability that characterizes vegetal rows: each cultivation is different, and within a given cultivation, the natural irregularity of the growth of plants and leaves acts as “noise” superimposed on the row-orientation information (we shall name this effect “vegetal noise”).
In the present assessment, all five parameters of the POF were manipulated to assess their influence on the POF’s performance. Among them, Win_Size deserves special interest because of its impact on the number of patterns being processed (i.e., the number of rows in the present test), as well as on the computational load. Indeed, the larger the Win_Size, the higher the number of pixels that the convolution has to process. At the same time, preliminary tests on regular patterns showed that the larger the Window Size, the more accurate the measure of the orientation is, mainly because of the larger amount of available information. To evaluate the effect of this parameter under real working conditions, we tested six different Window Size values, ranging from 20 pixels up to 220 pixels in size, using square-shaped windows. It is important to note that depending on the image features, smaller Window Sizes can contain few rows: two rows, a single row (Figure 1—Image 3), or only part of a row (Figure 1—Image 5). Table 1 presents all five independent parameters taken into account in the present assessment and their tested values.
For each image, the measurement was performed at three different points of the image. For each one of these points of analysis, all combinations of the five parameters were tested, giving 4500 combinations for each point and 13,500 measurements for each image.
We assessed the POF’s performance using three different indexes. The Measure Error (ME) evaluated the accuracy of the orientation measurement; it is defined as the difference between the POF’s measurement and the pattern’s actual orientation (ME = Measured orientation − True orientation). The pattern orientation measurements and the ME are expressed in degrees. Figure 4 shows how ME varies as result of POF parameters. In order to provide information about the robustness and the reliability of the calculation, we defined two parameters: the Peak Saliency (PkS) is defined as the distance in magnitude between the maximum value of the Alignment Vector (AVe) and its mean value (PkS = AVe_MAX/AVe_AVER). The Peak Distance from the second peak (PkD2) is defined as the distance in magnitude between the maximum value and the second largest peak of the AVe vector (PkD2 = AVe_PEAK1/AVe_PEAK2). The two parameters are graphically presented in Panel (a) of Figure 5. The PkS and PkD2 are expressed as percentages of the mean AVe value and of the value of the second peak, respectively. For instance, a PkS equal to 300% means that the main peak (corresponding to the identified orientation) has a magnitude equal to three times the mean AVe value; in the same manner, a PkD2 equal to 200% means that the magnitude of the main peak is double that of the second peak.
Concerning the identification of the reference “correct orientation” of the rows within each image, it should be pointed out that this parameter is constant only locally and varies along the same row. The irregular growth of plants and leaves, in addition to the optical deformations of the image due to the camera lens, naturally produces different row orientations in different parts of the same image, sometimes even considerably, as in Images 5 and 8 (Figure 1). Consequently, the measurement of the row-orientation reference could itself be affected by an error, which we estimate to be less than half a degree in the present assessment.

5. Results

The results of the tests on the POF show that by using a set of parameters optimized for the specific spatial scale to be measured, it is possible to obtain a measurement error of nearly zero with every image, even under the most difficult conditions (ME < 1°, including for Images 3, 5, and 8 of Figure 1). Based on a functional analysis, the worst condition appears to arise when the filter shape becomes wider than it is long. In this case, the envelope of wavelet peaks generates a convolution response orthogonal to the nominal orientation and, consequently, leads to the incorrect identification of the pattern's orientation. Usefully, this type of failure can easily be prevented through the correct choice of parameters, guaranteeing the functionality of the POF regardless of the image features. In the present example, for instance, to avoid the Gabor filter degenerating into a wavelet, it is advantageous to choose higher values for Lambda and smaller values for Sigma, independently of the pattern features. Moreover, in some cases, a smaller Gabor Filter Size can reduce this unfavorable effect by selecting just the central wave of the wavelet.
For each image, Table 2 reports both the best results obtained with an individual optimization (Columns B) and the results referring to the Generally Optimized Set of parameters, the “GOSet” (Columns C). The GOSet is defined as the set of values for the five POF parameters that provides the best average overall result for the collection of ten images. The GOSet corresponds to the following values: Sigma = 4; Gamma = 2; Lambda = 20; G_Size = 10; and Win_Size = 180. Figure 3 graphically represents the ME values for each image under the GOSet parameterization. Although these results refer to a set of parameters that is not optimized per image, the measurements remain remarkably accurate. Even under these conditions, the mean error remains lower than 2 degrees for all images (except for Image 5, which represents the most extreme and challenging conditions: this image is characterized by an important optical deformation and a high level of vegetal noise, and the Window Size contains only one vegetable row).
For each image, the graph shows the Measure Error (ME) values corresponding to the best condition (solid line) and to the Generally Optimized Set of parameters (dotted line, blue markers). The best condition corresponds to the parameterization providing the best possible accuracy for each individual image. The Generally Optimized Set of parameters, the “GOSet”, is defined as the set of parameter values providing the best average result over the whole collection of images; it corresponds to the following values: Sigma = 4; Gamma = 2; Lambda = 20; G_Size = 10; and Win_Size = 180. The results show that the POF guarantees accurate results for all images, even when using a set of parameters that is not optimized for each individual context. The Measure Error is defined as the difference between the measured and real orientations of the pattern.
An analysis of the results for each image reveals that three parameters mainly influence the Measure Error: the Window Size, Lambda, and the Gabor Filter Size. Panel (b) of Figure 4 presents the qualitative relationship between the Measure Error (ME) and each of these three parameters. Regarding the Gabor Size plot, the solid line represents the majority of cases; the dashed line shows that the relationship can reverse, depending on the characteristics of the image (in our tests, this happened in about 25% of cases).
The overall results suggest that the method remains precise and stable, even when the parameters are not optimized for a specific application condition. A deeper analysis indicates that the main unfavorable factors are (i) a Window Size that is too narrow, enclosing too few rows or just one single row, resulting in the loss of the pattern’s nature and features; (ii) an irregular or deformed image of the pattern due to photographic/optical effects, such as the “fish-eye” deformation, or too close a point of view; (iii) the presence of perturbing elements within the pattern; (iv) the presence of marked shadows; when the sun is low in the sky, and depending on row features, plants can cast long shadows, creating a secondary and noisy pattern. To avoid these issues, it is important to adopt an adequate Window Size in order to guarantee the correct amount of relevant information and its primacy over the possible noise; it is equally important to adopt larger sizes for the Gabor filter so as to reduce the effect of disturbing elements and noise.
Panel (a) of Figure 4 shows the relationship between the mean ME values and each of the five POF parameters; Panel (b) summarizes the general trends in the ME with respect to the three most influential parameters: Window Size, Lambda, and Gabor Size. The results confirm that larger Window Sizes provide smaller Measure Errors; in the presence of regular patterns and in the absence of disturbing elements, larger Window Sizes actually provide larger quantities of pertinent information, as well as more favorable information-to-noise ratios. The Lambda parameter controls the wavelength of the Gabor filter. Larger Lambda values reduce the harmonic frequency of the Gabor filter and, in the present application, improve the effectiveness of the filter and the accuracy of the orientation measurement.
Finally, the results concerning the Gabor Size parameter deserve special consideration: under some conditions, smaller sizes can reduce the ME, as shown in the third trend chart of Figure 4, Panel (b). In our tests with images of vegetable rows, this apparently favorable condition occurred in 75% of the cases in which the images contained regular patterns. Consequently, one might be tempted to select smaller filter sizes to reduce computational costs while expecting a more accurate measurement of the orientation. Unfortunately, this strategy only works under ideal or optimal conditions. In practice, a small Gabor filter size also entails a concurrent reduction in reliability, as shown by the dashed line in Panel (b) of Figure 4. Moreover, too small a filter is less effective at eliminating the noise and/or the perturbing elements present in the image, potentially making the measurement of direction less accurate. Images 3 and 5 provide practical examples of this condition (Figure 1).
The numerical results of the PkS and PkD2 for each image tested are presented in Table 2; Panel (b) of Figure 5 shows the PkS and PkD2 values under the Generally Optimized Set of parameters (“GOSet”) conditions. The PkD2 values refer to the mean values among the conditions for which a second peak is present (corresponding to 92.5% of the trials). The blue line in the chart corresponds to the PkS values and shows that, for all images but Image 6, the PkS is larger than 200%, and in eight cases, it is in the order of 300% or greater. The gray line shows the PkD2 values. For Image 6 alone, the PkD2 appears lower than 300%. The comparison of characteristics between different images confirms that both the geometrical and color features of the images influence the POF's performance: the POF provides higher values of PkS and PkD2 when images contain regular and well-defined pattern structures and a greater luminance contrast between rows and inter-row spacing.
Panel (c) of Figure 5 presents the PkS and PkD2 trends versus the three most influential parameters: Window Size, Lambda, and Gabor Size. These trends are generally the opposite of the ME tendencies in Figure 4. Hence, larger values of Window Size and Lambda improve both the accuracy and the reliability of the measurement. Regarding the trends in the PkS and PkD2 versus the Gabor Filter Size, similar to the previous results, the relationship is co-variant in some cases and contra-variant in others. In the majority of cases, a larger Gabor Filter Size improves the reliability and robustness of the measurement (the solid line in the chart), filtering the noise and some secondary patterns more effectively. More rarely, a larger Gabor Size leads to reductions in the PkS and PkD2 (the dotted line in the chart).
Two parameters allow for an evaluation of the robustness and the reliability of the POF measurement: the Peak Saliency (PkS) and the Peak Distance (PkD2). Panel (a) graphically presents the meaning of each parameter: the Peak Saliency is defined as the ratio between the maximum value and the average value of the Alignment Vector (AVe), where the maximum value corresponds to the pattern orientation identified via the POF. The Peak Distance is defined as the ratio between the maximum value (corresponding to the “first peak”) and the second peak (i.e., the second-highest maximum value). The PkS and PkD2 are expressed as percentages. The larger the PkS and PkD2 values, the more robust and reliable the orientation measured by the POF is. Panel (b) shows the results of the tests in terms of the PkS and PkD2 for each image. The results presented refer to the Generally Optimized Set of parameters (“GOSet”), corresponding to the following values: Sigma = 4; Gamma = 2; Lambda = 20; G_Size = 10; and Win_Size = 180. Even when adopting this non-optimized set of parameters, the POF guarantees PkS and PkD2 values above 200% for all images. The only exception is Image 6 (PkS = 177% and PkD2 = 143%), in which the useful information, consisting of the vegetable row lines, has a strength close to that of the “vegetal noise” in the image. The mean overall values for the ten images are PkS = 374% and PkD2 = 449%. Panel (c) represents the qualitative relation between both the Peak Saliency and the Peak Distance (the two parameters are characterized by the same trends) and the POF parameters Window Size, Lambda, and Gabor Filter Size.
The POF algorithm demonstrates great robustness and efficiency, due mainly to its statistical nature. The use of only the image brightness channel optimizes the required computational resources. These operational characteristics generally ensure full effectiveness, even when the space between rows is occupied by vegetation, thanks to the information contributed by ambient light. Only a grayscale conversion that nullifies the difference between rows and gaps would reduce the effectiveness of the POF.

6. Typical Applications

6.1. Application in the Field of Farming

To provide an example of a typical application of the Pattern Orientation Finder in the field of farming, let us consider Panel (a) of Figure 6. The image schematizes the flight of a drone over a vineyard and the measurement of row orientation at three subsequent points above the crop to provide information relevant for autonomous navigation. Using the POF, we measured the orientation of the rows in each area centered on the three points. The parameter values adopted in this example remain those of the previously defined “GOSet”, in order to demonstrate the generality and the robustness of the algorithm even with non-optimized parameterization (Sigma = 4; Gamma = 2; Lambda = 20; G_Size = 10; and Win_Size = 180). The size of the original crop image is 1024 × 682 pixels; each of the three analysis windows is 180 × 180 pixels. Two perturbing elements (two farm tractors) are present within the image; the analysis windows at Points 2 and 3 partially overlap them.
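A hypothetical driver for this scenario, reusing the sketches from Section 3, could look as follows; the file name and way-point coordinates are invented for the example.

```python
from matplotlib import pyplot as plt

# Illustrative way-points along the flight path (coordinates invented);
# each analysis window is 180 x 180 pixels, as in the GOSet.
img = to_single_layer(plt.imread("vineyard.jpg"))  # hypothetical file
for i, (cx, cy) in enumerate([(200, 340), (512, 340), (824, 340)], 1):
    win = img[cy - 90:cy + 90, cx - 90:cx + 90]
    print(f"Point {i}: rows at {find_pattern_orientation(win):.1f} deg")
```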
The OIN distribution generated by the POF during the orientation measurement at Point 1 is presented in Panel (b) of Figure 6 (Points 2 and 3 provided equally shaped distributions). The OIN was calculated every 10 degrees, and the measurements were interpolated using the spline method (a MATLAB function). The dotted line highlights the peak of the OIN fitting curve at 7 degrees, which corresponds to the identified pattern orientation. Panel (b) also shows the ME, PkS, and PkD2 values for each point of measurement. In this simple example, the results show that the measurement remains accurate even if the analysis window contains parts of disturbing elements (ME < 1°). The increase in the ME at Points 2 and 3 seems to be attributable to a different inflection of the curve interpolating the measured OIN values, due to the presence of the disturbing elements. We also tested and analyzed the Gabor Filter Size parameter over an extended range (1–100 pixels), in order to verify its impact on performance. The trends of the ME and the PkS versus the Gabor Filter Size are presented in Panel (c) (on the left- and right-hand sides, respectively). In this case, an increase in the Gabor Filter Size leads to increases in both the ME and the PkS. The G_Size initially chosen (i.e., 10 × 10 pixels) therefore appears to be an appropriate compromise between the need for accuracy and for reliability/robustness.
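With the sketches above, this extended-range analysis amounts to sweeping the g_size argument (the values are illustrative, and win is the analysis window from the previous sketch):

```python
# Illustrative sweep of the Gabor Filter Size over the extended range
for g in (1, 5, 10, 20, 50, 100):
    print(g, round(find_pattern_orientation(win, g_size=g), 1))
```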
To evaluate the robustness to noise of the POF, we compared it with the Hough method, using the images in Figure 1. The results show that, in the present application, the Hough method encounters significant difficulties. Indeed, the Hough method produces orientation measurements with lower levels of accuracy, and in some cases, it is incapable of producing a result at all. An analysis revealed that the presence of a high level of noise disturbs its orientation measurement. The statistical nature of the POF, however, allows this algorithm to remain effective even in the presence of a high level of noise. Figure 7 shows, as an example, a comparison of the results for Image 7 (low noise) and Image 3 (high noise). In the second case, only the POF algorithm is capable of providing a correct measure of the orientation (white arrow).

6.2. Other Applications

A further analysis shows the POF's high accuracy and robustness even in contexts other than crops. Figure 8 presents seven examples of its application; the purpose of this collection, composed of images from very different fields, is to demonstrate the flexibility and the reliability of the POF algorithm. Once again, in these cases, the POF was parameterized using the GOSet values (Sigma = 4; Gamma = 2; Lambda = 20; G_Size = 10; and Win_Size = 180). All images were 600 × 450 pixels in size. For each image, two or three measurements were performed in different areas containing a pattern to measure (orange dotted rectangles). Blue arrows show the measured orientations at each station. As can be seen, the accuracy remains high even when using the same set of parameters for all images and patterns.

7. Conclusions and Future Work

The effective extraction of features from images is a major challenge in artificial vision. This capability is a key point for any system that needs to be autonomous or provide reliable image-based services.
Here, we presented an effective method of measuring the orientation of patterns. More specifically, we introduced a new algorithm to measure the orientation of parallel elements. The main application is the autonomous flight of drones for the monitoring of crops. This method, named the Pattern Orientation Finder (POF), is based on the analysis of images of a row pattern; the procedure consists of the convolution of the original image and the subsequent calculation of the Orientation Index, a statistically based parameter. The adoption of just one size and one orientation for the Gabor filter used in the convolution makes the algorithm computationally light. This feature is very important, especially for a flying system such as a drone: one of the most important factors for such a system is the operational range, which decreases as the on-board components consume energy. In addition, a simple algorithm guarantees not only a higher computational velocity but also a generally greater compatibility with hardware (e.g., processor, storage, and memory), a higher degree of reliability, and a generally better cost–benefit relationship. In the present work, the POF was tested with ten images representing plant cultivations in rows; the images differed from each other in terms of the type of crop and the image features. The results demonstrate that the POF guarantees high accuracy and reliability, even under difficult conditions and in the presence of high levels of noise. The tests showed that in order to obtain a high level of accuracy, the selection of the optimal parameters is important but not mandatory. With regular patterns, even when the parameters are not optimized, the POF provides accurate and reliable results across a wide range of conditions. When irregularities or perturbing elements are present within the image, such as intrusive elements within the pattern or superimposed secondary patterns, the choice of a correct parameterization becomes more important in order to guarantee a correct measurement.
The tests performed show that the choice of the parameter values also influences the computational load of the measurement, and some parameters are more influential than others. Enlarging the image analysis window and the size of the Gabor filter results in more than proportional increases in the computation time and load. However, the choice of small sizes for these two factors must be weighed against the consequent reductions in the accuracy and reliability of the outcome. We also provided an example of a typical POF application, consisting of a hypothetical drone flying over a crop and measuring the row orientation at three subsequent points.
It is now advisable to continue research into the use of the POF under embedded, real-time conditions, using a real drone flying over vegetable crops. This forthcoming study should lead to the functional optimization of the POF in actual flight use and to the evaluation of its operational performance. Moreover, the study of the functional coupling between the POF and the modules of the flight control system (the inertial system, for instance) should lead to an effective system that is even more efficient from an energetic point of view.
Finally, the POF also demonstrated interesting accuracy and reliability characteristics when applied in other contexts: secondary tests on different images showed that the POF can be utilized in many other fields, such as the life and Earth sciences, medical imaging, biometric data processing, the feature analysis of photographed objects or animals, or, more generally, the classification of images. We recommend pursuing research in these areas, where the use of the POF could enable new levels of performance or even new capabilities.

Author Contributions

M.P. and A.C. conceived the presented method; A.C. performed tests and analyzed the data; M.P. performed a comparative assessment between the Hough and POF methods; M.P. and A.C. wrote the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the French National Research Agency ANR (Agence Nationale de la Recherche) [ANR-12-INSE-0009] and the Conseil Régional Bourgogne Franche Comté [COGSTIM Project]. The funding sources did not play a role in any of the research phases, in the data collection or analysis, or the decision making for publication.

Data Availability Statement

All the images used in the present work are available at the following URL: http://leadserv.u-bourgogne.fr/POF/ (accessed on 1 October 2023).

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Gabor Parameters

EQ_1 represents the system equations defining the Gabor filter implemented in the convolution (Step 1 of the POF):
EQ_1:
$$F(u_1, u_2) = \exp\!\left(-\frac{\hat{u}_1^{2} + \gamma^{2}\hat{u}_2^{2}}{2\sigma^{2}}\right) \times \cos\!\left(\frac{2\pi}{\lambda}\,\hat{u}_1\right)$$
$$\hat{u}_1 = u_1\cos\theta + u_2\sin\theta, \qquad \hat{u}_2 = -u_1\sin\theta + u_2\cos\theta$$
Four parameters are present in the equations: “λ”, “σ”, and “γ” define the geometry of the filter, and “θ” designates the orientation. “λ” represents the wavelength of the sinusoidal factor; “σ” controls the radius of the Gaussian envelope; and “γ” is the spatial aspect ratio, which specifies the ellipticity of the Gabor function. In the present implementation, the “θ” parameter assumes a single, fixed value corresponding to the unique orientation of the filter in the convolution; by default, we chose the vertical orientation. In order to optimize the measurement performance, the other parameters should be set according to the image and the pattern features, so as to enhance the pattern's elements and to filter any noise present in the image. Table 1 shows, for all parameters, the interval of values tested in the present work.
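As an illustrative sanity check of the guidance given in Section 5 (higher λ and smaller σ to prevent the filter degenerating into a wavelet), one can verify that the GOSet values yield a single-lobed kernel, reusing make_gabor_kernel from the Section 3 sketch:

```python
# With lambda = 20, the cosine's first sign change (lambda/4 = 5 pixels
# from the center) lies outside a 10 x 10 support, so the kernel keeps
# a single positive lobe rather than a multi-peak wavelet shape.
k = make_gabor_kernel(10, lam=20, sigma=4, gamma=2)
print(k.shape, (k < 0).any())  # (10, 10) False
```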

Appendix B. Local Orientation Finder

Preliminary tests on different images suggested that, in most applications, the POF still provides good results even with small Gabor filter sizes. To provide an example, in the “typical application” assessment (Figure 6), we extended the testing conditions to a wider range of values for the Gabor Filter Size, between 1 and 100 pixels. It is important to note that the condition in which the filter size is unitary (a single pixel) is special, because the filter becomes ineffective and no spatial filtering is performed. The results presented in Panel (c) of Figure 6 show that, in this case, the performance with smaller Gabor Filter Size values is partially reduced but remains satisfactory. The example presented is a particularly favorable case, because the ME decreases with a reduction in the filter's size. When the Gabor Size is unitary (i.e., without filtering), the PkS does not fall below 330%.
Excluding the first step of the POF algorithm, the image convolution, modifies the computational structure but leaves the overall philosophy of the algorithm unchanged. The version of the algorithm thus obtained no longer searches for and accentuates a pattern and becomes a local descriptor; we name this new algorithm the “Local Orientation Finder” (LOF). In practice, the Local Orientation Finder works well on small portions of images, measuring the orientation of the element characterized by the greatest contrast in luminance.
The LOF, deprived of convolution, remains effective when a pattern is present and provides a “general” and even faster measurement than the POF. At the same time, it is advisable to bear in mind its limits: the LOF does not emphasize a pattern before the measurement, and consequently, it appears more sensitive to any perturbing element present in the images. The POF is not affected by this issue.
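A minimal sketch of the LOF under the same assumptions (it reuses orientation_index, rotate, CubicSpline, and np from the Section 3 sketches) simply drops Step 1:

```python
def lof_orientation(window, angles_deg=np.arange(0.0, 180.0, 36.0)):
    """Local Orientation Finder: the POF search loop with the Gabor
    convolution (Step 1) removed; the OIN is computed on the raw window."""
    oins = [orientation_index(rotate(window, a, reshape=False, order=1,
                                     mode="reflect"))
            for a in angles_deg]
    ang = np.append(angles_deg, angles_deg[0] + 180.0)
    oin = np.append(oins, oins[0])
    spline = CubicSpline(ang, oin, bc_type="periodic")
    fine = np.linspace(0.0, 180.0, 1801)
    return fine[np.argmax(spline(fine))] % 180.0
```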

References

  1. Botta, A.; Cavallone, P.; Baglieri, L.; Colucci, G.; Tagliavini, L.; Quaglia, G. A Review of Robots, Perception, and Tasks in Precision Agriculture. Appl. Mech. 2022, 3, 830–854. [Google Scholar] [CrossRef]
  2. Mogili, U.R.; Deepak, B.B.V.L. Review on Application of Drone Systems in Precision Agriculture. Procedia Comput. Sci. 2018, 133, 502–509. [Google Scholar] [CrossRef]
  3. Bai, Y.; Zhang, B.; Xu, N.; Zhou, J.; Shi, J.; Diao, Z. Vision-based navigation and guidance for agricultural autonomous vehicles and robots: A review. Comput. Electron. Agric. 2023, 205, 107584. [Google Scholar] [CrossRef]
  4. Marwah, N.; Singh, V.K.; Kashyap, G.S.; Wazir, S. An analysis of the robustness of UAV agriculture field coverage using multi-agent reinforcement learning. Int. J. Inf. Technol. 2023, 15, 2317–2327. [Google Scholar] [CrossRef]
  5. Bhat, S.A.; Huang, N.F. Big Data and AI Revolution in Precision Agriculture: Survey and Challenges. IEEE Access 2021, 9, 110209–110222. [Google Scholar] [CrossRef]
  6. Oliveira, L.F.P.; Moreira, A.P.; Silva, M.F. Advances in agriculture robotics: A state-of-the-art review and challenges ahead. Robotics 2021, 10, 52. [Google Scholar] [CrossRef]
  7. Xue, J.; Xu, L. Autonomous Agricultural Robot and its Row Guidance. In Proceedings of the 2010 International Conference on Measuring Technology and Mechatronics Automation, Changsha, China, 13–14 March 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 725–729. [Google Scholar]
  8. Marchant, J.A.; Brivot, R. Real-Time Tracking of Plant Rows Using a Hough Transform. Real-Time Imaging 1995, 1, 363–371. [Google Scholar] [CrossRef]
  9. Jiang, G.-Q.; Zhao, C.-J.; Si, Y.-S. A machine vision based crop rows detection for agricultural robots. In Proceedings of the 2010 International Conference on Wavelet Analysis and Pattern Recognition, Qingdao, China, 11–14 July 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 114–118. [Google Scholar]
  10. Rahmadian, R.; Widyartono, M. Machine Vision and Global Positioning System for Autonomous Robotic Navigation in Agriculture: A Review. J. Inf. Eng. Educ. Technol. 2017, 1, 46–54. [Google Scholar] [CrossRef]
  11. Xue, J.; Zhang, L.; Grift, T.E. Variable field-of-view machine vision based row guidance of an agricultural robot. Comput. Electron. Agric. 2012, 84, 85–91. [Google Scholar] [CrossRef]
  12. Duffy, J.P.; Cunliffe, A.M.; DeBell, L.; Sandbrook, C.; Wich, S.A.; Shutler, J.D.; Myers-Smith, I.H.; Varela, M.R.; Anderson, K. Location, location, location: Considerations when using lightweight drones in challenging environments. Remote Sens. Ecol. Conserv. 2017, 4, 7–19. [Google Scholar] [CrossRef]
  13. Zhang, J.; Hu, J.; Lian, J.; Fan, Z.; Ouyang, X.; Ye, W. Seeing the forest from drones: Testing the potential of lightweight drones as a tool for long-term forest monitoring. Biol. Conserv. 2016, 198, 60–69. [Google Scholar] [CrossRef]
  14. Duda, R.O.; Hart, P.E. Use of the Hough transformation to detect lines and curves in pictures. Commun. ACM 1972, 15, 11–15. [Google Scholar] [CrossRef]
  15. Ballard, D.H. Generalizing the Hough transform to detect arbitrary shapes. Pattern Recognit. 1981, 13, 111–122. [Google Scholar] [CrossRef]
  16. Illingworth, J.; Kittler, J. A survey of the hough transform. Comput. Vision Graph. Image Process. 1988, 44, 87–116. [Google Scholar] [CrossRef]
  17. Wolfson, H.J. Generalizing the generalized hough transform. Pattern Recognit. Lett. 1991, 12, 565–573. [Google Scholar] [CrossRef]
  18. Barinova, O.; Lempitsky, V.; Kholi, P. On detection of multiple object instances using hough transforms. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 34, 1773–1784. [Google Scholar] [CrossRef] [PubMed]
  19. Leemans, V.; Destain, M.-F. Line cluster detection using a variant of the Hough transform for culture row localisation. Image Vis. Comput. 2006, 24, 541–550. [Google Scholar] [CrossRef]
  20. Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
  21. Dalal, N.; Triggs, B. Histograms of Oriented Gradients for Human Detection. In Proceedings of the Computer Vision and Pattern Recognition, San Diego, CA, USA, 20–25 June 2005; pp. 886–893. [Google Scholar]
  22. Déniz, O.; Bueno, G.; Salido, J.; De La Torre, F. Face recognition using Histograms of Oriented Gradients. Pattern Recognit. Lett. 2011, 32, 1598–1603. [Google Scholar] [CrossRef]
  23. Santosh, K.C.; Wendling, L.; Antani, S.; Thoma, G.R. Overlaid Arrow Detection for Labeling Regions of Interest in Biomedical Images. IEEE Intell. Syst. 2016, 31, 66–75. [Google Scholar] [CrossRef]
  24. Cheng, B.; Stanley, R.J.; De, S.; Antani, S.; Thoma, G.R. Automatic Detection of Arrow Annotation Overlays in Biomedical Images. Int. J. Healthc. Inf. Syst. Inform. 2011, 6, 23–41. [Google Scholar] [CrossRef]
  25. Wendling, L.; Tabbone, S. A new way to detect arrows in line drawings. IEEE Trans. Pattern Anal. Mach. Intell. 2004, 26, 935–941. [Google Scholar] [CrossRef] [PubMed]
  26. You, D.; Simpson, M.; Antani, S.; Demner-Fushman, D.; Thoma, G.R. A Robust Pointer Segmentation in Biomedical Images toward Building a Visual Ontology for Biomedical Article Retrieval; Zanibbi, R., Coüasnon, B., Eds.; International Society for Optics and Photonics: Bellingham, WA, USA, 2013; Volume 8658, p. 86580Q. [Google Scholar]
  27. Smolyanskiy, N.; Kamenev, A.; Smith, J.; Birchfield, S. Toward low-flying autonomous MAV trail navigation using deep neural networks for environmental awareness. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Vancouver, BC, Canada, 24–28 September 2017. [Google Scholar]
  28. Jung, S.; Lee, H.; Hwang, S.; Shim, D.H. Real Time Embedded System Framework for Autonomous Drone Racing using Deep Learning Techniques. In Proceedings of the 2018 AIAA Information Systems-AIAA Infotech @ Aerospace, Kissimmee, FL, USA, 8–12 January 2018. [Google Scholar]
  29. Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep learning for real-time fruit detection and orchard fruit load estimation: Benchmarking of ‘MangoYOLO’. Precis. Agric. 2019, 20, 1107–1135. [Google Scholar] [CrossRef]
  30. Glorot, X.; Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, Sardinia, Italy, 13–15 May 2010. [Google Scholar]
  31. Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
  32. Sutskever, I.; Martens, J.; Dahl, G.; Hinton, G. On the importance of initialization and momentum in deep learning. In Proceedings of the 30th International Conference on Machine Learning, Atlanta, GA, USA, 16–21 June 2013. [Google Scholar]
  33. Srivastava, N.; Hinton, G.; Krizhevsky, A.; Salakhutdinov, R. Dropout: A Simple Way to Prevent Neural Networks from Overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958. [Google Scholar]
  34. Hubel, D.H.; Wiesel, T.N. Receptive fields, binocular interaction and functional architecture in the cat’s visual cortex. J. Physiol. 1962, 160, 106. [Google Scholar] [CrossRef]
  35. Chikkerur, S.; Poggio, T. Approximations in the HMAX Model. Comput. Sci. Artif. Intell. Lab. Tech. Rep. 2011, 1–12. [Google Scholar]
  36. Carlini, A.; Boisard, O.; Paindavoine, M. Analysis of HMAX algorithm on black bar image dataset. Electronics 2020, 9, 567. [Google Scholar] [CrossRef]
  37. Stehr, N.J. Drones: The Newest Technology for Precision Agriculture. Nat. Sci. Educ. 2015, 44, 89. [Google Scholar] [CrossRef]
  38. Pederi, Y.A.; Cheporniuk, H.S. Unmanned Aerial Vehicles and new technological methods of monitoring and crop protection in precision agriculture. In Proceedings of the 2015 IEEE International Conference Actual Problems of Unmanned Aerial Vehicles Developments (APUAVD), Kyiv, Ukraine, 13–15 October 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 298–301. [Google Scholar]
  39. Collins, C.A.; Roberson, G.T.; Hale, S.A. The Assessment of Accuracy and Stability for a UAS Sensor Platform as a Precision Agriculture Management Tool in Detecting and Mapping Geospatial Field Variability. In Proceedings of the 2018 ASABE Annual International Meeting, Detroit, MI, USA, 29 July–1 August 2018; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2018; p. 1. [Google Scholar]
Figure 1. Images used for the tests. Panel (a) presents the ten images adopted for the assessment of the Pattern Orientation Finder. The images are characterized by different row widths, spacings between rows, optical focal lengths, orientations, distances from the ground, luminance contrasts, levels of noise, and the types of crops cultivated. Images 1 to 6 represent vineyards; images 7 and 8 represent crops of flowers, tulips and sunflowers, respectively; image 9 represents a cotton crop; and finally, image 10 represents a cabbage crop. Some images intentionally contain irregularities or disturbing elements. All images are color images, compressed jpg files measuring 600 × 450 pixels. Panel (b) illustrates image acquisition using a flying system (e.g., a drone), and the different directions obtained via the rotation of the drone or the camera.
Figure 2. Pattern Orientation Finder: procedure flowchart.
Figure 3. Measure error: the accuracy of the pattern orientation measurement.
Figure 4. Effect of the POF's parameters on the Measure Error. Panel (a): each box presents one of the five independent parameters of the POF, its tested values, and the related ME value in terms of the mean value and standard deviation across the ten images. The results presented here refer to the average variation of each individual parameter on the basis of the GOSet parameterization. Panel (b) presents the qualitative relation between the Measure Error (ME) and each of the three most influential parameters. Regarding the Gabor Size plot, the solid line represents the majority of cases; the dashed line shows that the relationship can reverse, depending on the characteristics of the image (in our tests, this happened in about 25% of cases).
Figure 5. Analysis of robustness and reliability. Panel (a) graphically presents the definition of the two parameters measuring robustness and reliability. The blue curve represents a possible trend of OIN values between 0 and 180 degrees. The Peak Saliency is defined as the ratio between the maximum and average values, while the Peak Distance is defined as the ratio between the first and second peaks. Panel (b) presents the PkS and PkD2 values for the ten images in Figure 1. It is interesting to note that for all images the values are significantly higher than 200%, except for Image 6. Panel (c) presents the qualitative relation between the PkS and PkD2 and the three parameters Window Size, Lambda, and Gabor Size.
Figure 5. Analysis of robustness and reliability. Panel (a) graphically presents the definition of the two parameters for measuring Robustness and Reliability. The blue curve represents a possible trend of OIN values between 0 and 180 degrees. Peack Saliency is defined as the distance between the maximum and average values, while Peak Distance is defined as the difference between the first and second peaks. Panel (b) presents the PkS and PhD2 values for the ten images in Figure 1. It is interesting to note that for all images the values are significantly higher than 200%, except for image n. 6. Panel (c) presents the qualitative relation between PkS and PkD2, and the three parameters Window size, Lamba, and Gabor Size.
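In code, the two metrics defined in panel (a) can be sketched as follows. Both definitions follow the caption above; the normalization to percentages (here, relative to the mean of the OIN curve) and the exclusion window used to locate the second peak are our assumptions for illustration.

```python
import numpy as np

def peak_saliency(oin):
    """PkS: distance between the maximum and the mean of the OIN curve,
    expressed as a percentage of the mean (normalization assumed)."""
    return 100.0 * (oin.max() - oin.mean()) / oin.mean()

def peak_distance_2(oin, step=1, exclusion_deg=10):
    """PkD2: difference between the first and second OIN peaks, as a percentage
    of the mean. The second peak is searched outside a +/- exclusion_deg
    neighborhood of the first (the neighborhood width is a hypothetical choice)."""
    angles = np.arange(oin.size) * step
    i1 = int(np.argmax(oin))
    # circular angular distance on the 180-degree orientation axis
    dist = np.abs((angles - angles[i1] + 90) % 180 - 90)
    second = oin[dist > exclusion_deg].max()
    return 100.0 * (oin[i1] - second) / oin.mean()
```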
Figure 6. POF typical application. Panel (a)—Left: the figure schematizes a typical application, the measurement of row direction via a drone operating along its path and flying over a vineyard. As an example, we consider a drone obtaining three successive measurements; each area of analysis (black-line boxes) measures 180 × 180 pixels, in grayscale, and is a compressed JPEG. Two disturbing elements—two farm tractors—are present among the rows of plants. Stations 2 and 3 partially include the images of the disturbing elements. Right: the results of the POF measurement and a comparison with the precision offered by the Hough method. The POF parameterization corresponds to the “GOSet” values previously introduced: Sigma = 4; Gamma = 2; Lambda = 20; G_Size = 10; and Win_Size = 180. Panel (b)—Left: the diagram presents the Orientation INdex (OIN) distribution generated via the POF and corresponding to Point 1 (Points 2 and 3 have the same trend). Only one large peak is present, corresponding to a 7° orientation. Right: the three diagrams provide the results of the row direction measurement in terms of the Measure Error (ME), Peak Saliency (PkS), and Peak Distance to the 2nd peak (PkD2). Panel (c)—Left: the effect of the Gabor Filter Size (G_Size) on the Measure Error. Right: the effect of the Gabor Filter Size (G_Size) on the Peak Saliency.
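As a usage illustration of the scenario in panel (a), the fragment below samples three analysis windows along the flight path and estimates the row direction in each, reusing the pof_orientation sketch given after Figure 2 with the GOSet values. The station coordinates and the image file are hypothetical.

```python
from imageio.v3 import imread  # any image loader will do

STATIONS = [(40, 60), (220, 60), (400, 60)]  # hypothetical window top-left corners

field = imread("vineyard.jpg").astype(float).mean(axis=2)  # convert to grayscale
for sx, sy in STATIONS:
    window = field[sy:sy + 180, sx:sx + 180]               # 180 x 180 analysis window
    angle, _ = pof_orientation(window, lam=20, sigma=4, gamma=2, g_size=10)
    print(f"station at ({sx}, {sy}): row direction = {angle:.0f}°")
```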
Figure 7. Comparison of the Hough and POF methods. The upper and lower pairs of images show the results of orientation measurements using, respectively, the Hough method (A,B) and the POF method (C,D). As evidenced by the images, when the pattern exhibits a low noise level (A,C), both methods produce a good result (the white arrow indicates that the measured orientation is correct). However, when the image has a high level of noise (B,D), the Hough method loses its effectiveness (the red arrow indicates the incorrect orientation generated by Hough), while the POF method remains effective.
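For readers who want to reproduce a comparison along these lines, the snippet below sketches a conventional Hough-based estimate of the dominant line orientation using OpenCV. The Canny thresholds, the vote threshold, and the use of the median angle are illustrative choices of ours, not the configuration used in the paper.

```python
import cv2
import numpy as np

def hough_orientation(window_gray_u8):
    """Dominant line orientation (degrees) from the standard Hough transform,
    or None if no line reaches the vote threshold."""
    edges = cv2.Canny(window_gray_u8, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=80)
    if lines is None:
        return None
    # HoughLines returns (rho, theta) pairs; theta is the angle of the line's
    # normal, so the line direction itself is theta - 90 degrees (modulo 180)
    thetas = np.rad2deg(lines[:, 0, 1])
    return float(np.median((thetas - 90.0) % 180.0))
```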
Figure 8. Examples of different applications. The seven panels provide different examples of the POF applied to the measurement of orientation. From top to bottom and from left to right: (1) different directions in vineyards; (2) zebra stripes; (3) cardiac muscle tissue; (4) solar panel rows; (5) mineral layers; (6 and 7) striped clothes and a body contour. The POF demonstrated a high level of accuracy under all conditions. We include the intentionally erroneous result on the zebra fur as an example of error due to a secondary luminance contrast coexisting within the analysis window and competing with the pattern rows; the same condition is exploited positively in silhouette detection (7). On each picture, we highlight the two or three analysis points and the analysis windows (180 × 180 pixels); blue arrows show the orientation measured by the POF. The wide range of contexts and subjects demonstrates the versatility of the POF algorithm, and the ability to provide correct results using the same “generic” GOSet parameterization presented above demonstrates its reliability, robustness, and accuracy.
Table 1. POF parameters.
Parameter | Description | Tested Values
λ (Lambda) | Wavelength of the Gabor filter | 2, 4, 7, 12, 20
σ (Sigma) | Envelope of the Gabor filter | 2, 4, 7, 12, 20
γ (Gamma) | Aspect ratio of the Gabor filter | 0.5, 1, 1.5, 2, 2.5, 3
G_Size (Gabor Filter Size) | Size of the Gabor filter | 10, 20, 35, 60, 100
Win_Size (Window Size) | Selected and processed area size | 20, 60, 100, 140, 180, 220
Five parameters characterize the POF algorithm and were manipulated in the present assessment: the first three, “λ”, “σ”, and “γ”, define the shape of the Gabor filter; the Gabor Filter Size (“G_Size”) refers to the size of the filter matrix; and the Window Size (“Win_Size”) refers to the size of the image (or part of the image) initially selected for the orientation measurement. G_Size and Win_Size are expressed in pixels.
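Figure 4 reports the effect of varying each parameter individually around the GOSet values. A one-at-a-time sweep of that kind can be organized as follows; the tested values are those of Table 1, while the dictionary layout and function are our own illustration.

```python
# GOSet parameterization reported in the paper
GOSET = {"lam": 20, "sigma": 4, "gamma": 2, "g_size": 10, "win_size": 180}

# Tested values from Table 1
TESTED = {
    "lam":      [2, 4, 7, 12, 20],
    "sigma":    [2, 4, 7, 12, 20],
    "gamma":    [0.5, 1, 1.5, 2, 2.5, 3],
    "g_size":   [10, 20, 35, 60, 100],
    "win_size": [20, 60, 100, 140, 180, 220],
}

def one_at_a_time_sweep():
    """Yield (varied parameter, full parameter set), keeping all other
    parameters fixed at their GOSet values, as in Figure 4."""
    for name, values in TESTED.items():
        for v in values:
            yield name, dict(GOSET, **{name: v})
```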
Table 2. Results of tests.
(A) Image Features; (B) Best Results (Each Image, Individually); (C) Generally Optimized Parameters “GOSet”.

Image Number | (A) Row W | (A) Row D | (B) ME | (B) PkS | (B) PkD2 | (C) ME | (C) PkS | (C) PkD2
1 | 18 | −0.44 | 0.00 | 173.6 | 149.8 | −0.5 | 570.9 | 519.6
2 | 22 | −0.37 | 0.00 | 318.4 | 187.5 | −0.3 | 526.6 | 446.1
3 | 73 | 0.57 | 0.00 | 151.3 | 134.7 | 0.8 | 251.4 | 399.3
4 | 20 | 2.51 | 0.00 | 157.9 | 134.7 | 1.9 | 439.0 | 398.2
5 | 43 | 1.98 | 0.00 | 136.9 | 114.4 | 2.8 | 307.7 | 630.4
6 | 7 | −0.26 | 0.00 | 180 | 168.6 | −0.5 | 176.7 | 142.9
7 | 40 | −0.70 | 0.00 | 176.8 | 184.6 | −0.1 | 437.8 | 372.7
8 | 36 | 1.05 | 0.00 | 237.8 | 230.1 | 1.0 | 317.6 | 307.6
9 | 22 | 1.93 | 0.00 | 142.8 | 128.5 | 0.8 | 375.8 | 365.8
10 | 17 | −0.53 | 0.00 | 160.7 | 148.2 | −0.4 | 340.2 | 907.6
Columns A present the main spatial features of the rows/patterns tested (the images are presented in Figure 1): the mean width of the rows (Row W) and the mean spacing between two consecutive rows (Row D), both expressed in pixels. Columns B present the best POF results in terms of the Measure Error (ME) for each image optimized individually, together with the related values of robustness and reliability (PkS and PkD2). Columns C present the results corresponding to the Generally Optimized Set of parameters (“GOSet”), defined as the set of values providing the best average result over the whole group of images; it corresponds to the following values: Sigma = 4; Gamma = 2; Lambda = 20; G_Size = 10; and Win_Size = 180. The ME is defined as the difference between the measured and the actual orientation of the rows; it is expressed in degrees, and larger ME values mean less accurate measurements. PkS and PkD2 measure the reliability and robustness of the measurement (see the text for more details); they are expressed as percentages, and larger values indicate more reliable and robust measurements.
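One detail worth making explicit: because pattern orientation is defined modulo 180°, the difference between the measured and actual orientation should be wrapped into [−90°, 90°) before being reported as the ME. A minimal sketch (the wrapping convention is our assumption):

```python
def measure_error(measured_deg, actual_deg):
    """Signed orientation difference wrapped into [-90, 90) degrees."""
    return (measured_deg - actual_deg + 90.0) % 180.0 - 90.0
```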