Article

Implementation of Grading Method for Gambier Leaves Based on Combination of Area, Perimeter, and Image Intensity Using Backpropagation Artificial Neural Network

1 Electrical Engineering Department, Faculty of Engineering, Universitas Andalas, Padang City 25163, Indonesia
2 Computer System Department, Faculty of Information System, Universitas Andalas, Padang City 25163, Indonesia
3 Agriculture Engineering Department, Faculty of Agriculture Technology, Universitas Andalas, Padang City 25163, Indonesia
4 Physics Department, Faculty of Science, National University of Singapore, Singapore 117546, Singapore
* Author to whom correspondence should be addressed.
Electronics 2019, 8(11), 1308; https://doi.org/10.3390/electronics8111308
Submission received: 3 October 2019 / Revised: 29 October 2019 / Accepted: 4 November 2019 / Published: 7 November 2019
(This article belongs to the Section Computer Science & Engineering)

Abstract:
Gambier leaves are widely used in cosmetics, beverages, and medicine. Tarantang Village in West Sumatera, Indonesia, is famous for its gambier commodity. Farmers usually classify gambier leaves by area and color, an ability passed down through generations. This research develops a tool that imitates the farmers' skill in classifying gambier leaves. The tool is a box shielded from outside light. Two LEDs are mounted inside the box to maintain a constant light intensity. A camera captures the leaf image and a Raspberry Pi processes the leaf features. A mini monitor is provided to operate the system. Six hundred and twenty-five gambier leaves were classified into five grades. Leaves in grades 1, 2, and 3 are forbidden to be picked; grade 4 leaves are allowed to be picked, and grade 5 leaves are the recommended ones for picking. The leaf features are the area, perimeter, and intensity of the leaf image. Three artificial neural networks were developed, one for each feature. One thousand leaf images were used for training and 500 for testing. The accuracies are about 93%, 96%, and 97% for area, perimeter, and intensity, respectively. A combination of rules based on the feature accuracies is introduced into the system. These rules give 100% agreement with the farmers' recommendations. A real-time application can thus classify leaves with the same decisions as the farmers.

1. Introduction

Gambier is a commodity plant with good economic value. It is used for several purposes such as medicine, food, and paint, and Indonesia has been one of the main gambier producers in the world [1]. Indonesia's total gambier production in 2013 was 20,507 tons, and West Sumatera dominated with 13,809 tons, or 67% of the total [2]. Lima Puluh Kota, a region in West Sumatera, is the center of gambier cultivation; it supplies almost 70% of the gambier from West Sumatera [3]. Gambier is used in various industries such as cosmetics, medicine, beverages, and paint [4,5,6,7]. The leaves are the part of the gambier plant that is taken to be processed [8,9,10].
The farmers in a village in the 50-Kota regency pick the leaves manually. Leaves are picked selectively based on area and color, an ability inherited from generation to generation. This skill is developed on site rather than through formal education; acquiring it requires considerable time and depends heavily on the availability of experienced farmers [11]. Over the years, this routine has become more challenging and complicated because significantly fewer young people are willing to work in agriculture. In 2018, only about 10% of Indonesia's 27.5 million farming household heads were under 35 years old [12]. Therefore, the gambier leaf classifying skill needs to be modeled to prevent this knowledge from being lost.
Several clustering methods have been introduced, such as the Extreme Learning Machine (ELM) [13] for speech [14] and energy classification [15], fuzzy clustering for detecting plant disease [16], and backpropagation artificial neural networks for maize detection [17]. Leaf recognition is one application of clustering methods using image processing [18]. The leaf is one of the objects used to recognize a plant [19,20]. Image color as a feature has also proven effective for identifying leaves [21]. Plant classification on the Flavia dataset based on Hu moments and a uniform local binary pattern histogram was conducted in Reference [22]; these features worked well with a support vector machine. Laos leaves were specifically studied in Reference [20], which used moment invariants as features to recognize Laos leaves in different positions; Euclidean and Canberra distances performed with acceptable accuracies of about 86% and 72%, respectively.
Feature combinations have been developed to improve the accuracy of computer vision in classifying leaves. A moment-invariant feature with a neural network classifier achieved about 93% accuracy; adding a centroid representation increased this to 100% [20]. In Reference [23], gross shape, Fourier descriptors, and the Multiscale Distance Matrix-Average Distance (MDM-A) were used to recognize leaves from the Flavia and Swedish Leaf Image Database (SLID) datasets; the combined features produced an average accuracy above 95%. In Reference [24], using more leaf features in a computer vision recognition system likewise resulted in better accuracy.
In this research, the farmers' skill in Tarantang Village, West Sumatera, in deciding whether a gambier leaf is ready to be picked is modeled based on three features: area, perimeter, and intensity. An artificial neural network was constructed for each feature. The three trained networks were combined into a set of rules to determine leaf status. This algorithm was implemented in a real-time application using a camera and a mini processor, with a monitor serving as the user-machine interface.

2. Aim and Contribution

The goal of this research is to implement an artificial neural network to classify gambier leaves in Tarantang Village based on three features: area, perimeter, and intensity. The main contributions of this study are four-fold, as follows:
  • Farmers in Tarantang Village have traditionally picked gambier leaves manually. They learn this skill from senior farmers by directly testing the gambier trees, a process that requires considerable time and depends on the availability of experienced farmers. Over the years, this routine has become more challenging and complicated because the number of young people wanting to work in farming, especially with the gambier plant, is decreasing significantly. In this study, we introduce the first model of the farmers' knowledge for classifying gambier leaves in Tarantang Village. Gambier leaves from other areas may differ from those in Tarantang Village, which would require further research.
  • This research implements the farmers' knowledge about gambier leaves in a real-time system. A leaf can be placed inside a designed box and analyzed in real time to determine whether it is ready to be picked.
  • The three features used in this research are area, perimeter, and intensity. These features reflect the farmers' knowledge, since they classify gambier leaves based on size and color. An artificial neural network was designed for each feature. Each feature has its own strengths in classifying; however, the highest single-feature accuracy is only about 97%.
  • A combination of the three artificial neural networks, one per feature, was designed as the final rule of the real-time system. The combination reached 100% accuracy: the system made the same decision as the farmers for all 500 classified leaves.

3. Materials and Methods

3.1. Sample of Leaf

This study was conducted on gambier leaves from Tarantang Village, West Sumatera, Indonesia, a village famous for its gambier production. According to the farmers, gambier leaves in Tarantang Village are usually picked when they have reached the age of 5 months. The farmers use the area and color of the leaf to simplify the selection process. Table 1 describes the color and area of the five grades and their classes. Leaves that are old enough are classified as grade 5, the only grade recommended for picking, although a few grade 4 leaves are also carried away in the picking process. Grade 1, 2, and 3 leaves must not be picked, because they would degrade product quality.

3.2. System Design

A group of farmers with approximately 30 years of experience with the gambier plant was asked to classify leaves into grades 1 to 5. The grade is related to the age of the leaf: the farmers' age estimates for grades 1 to 5 were 1, 2, 3, 4, and 5 months, respectively. There were 300 leaves in each grade.
A system was designed to predict leaf maturity. It has four main parts, as shown in Figure 1. The input is the gambier leaf image captured by a camera, and the output is the leaf grade shown on an LCD. A Raspberry Pi processes the image features and recognizes the grade. A user interface on a mini monitor was designed to operate the system.
Hardware Design
A rectangular box with a footprint of 19 cm by 14 cm was designed, as shown in Figure 2. The electrical components consist of an LCD, a camera, and the Raspberry Pi. They are attached to the top of the box, and a mini monitor sits beside it, as shown in Figure 2. Two LEDs are located on opposite sides, facing each other, 13 cm from the bottom of the box. Each LED is supplied with 3.6 V at 150 mA. The color temperature is 6000 K to 6500 K with a 140° viewing angle. A camera is attached to the roof exactly at its center point (9.5 cm × 7 cm). The camera resolution is 5 MP and the image size is 720 pixels (horizontal) × 480 pixels (vertical).
A leaf must be placed inside the box to be photographed. Figure 3a shows the box cover opened to position the leaf, and Figure 3b illustrates the leaf position on the bottom of the box, a plane of 19 cm × 14 cm. The center viewpoint of the camera is at the intersection of the diagonals of the bottom side, and the starting pixel is the (0, 0) pixel coordinate. An imaginary auxiliary line is used to center the leaf on the horizontal axis, with the tip of the leaf facing the camera viewpoint. A horizontal pixel is 0.026 cm long and a vertical pixel is 0.029 cm long.
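The per-pixel dimensions above let pixel counts be converted to physical units. A minimal sketch (the function names are illustrative, not from the paper's code):

```python
# Per-pixel dimensions stated in the text.
PX_W_CM = 0.026  # width of one horizontal pixel in cm
PX_H_CM = 0.029  # height of one vertical pixel in cm

def area_pixels_to_cm2(n_pixels: int) -> float:
    """Each pixel covers PX_W_CM x PX_H_CM of the box floor."""
    return n_pixels * PX_W_CM * PX_H_CM

def length_pixels_to_cm(n_horizontal: int, n_vertical: int) -> float:
    """Approximate a contour length from its horizontal/vertical pixel runs."""
    return n_horizontal * PX_W_CM + n_vertical * PX_H_CM

# Sanity check: a 720 x 480 image covers roughly the 19 cm x 14 cm floor.
print(round(720 * PX_W_CM, 2), round(480 * PX_H_CM, 2))  # 18.72 13.92
```

The check confirms the stated scale factors are consistent with the box dimensions.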
Software Design
The software was designed as the interface between machine and human. The user chooses an input type to select the image source and a feature to classify the leaf. There are three available input sources: camera, file, and folder. This process is shown in Figure 4.
The file input type processes a single saved image file at a time. The folder input type, on the other hand, gives the user the flexibility to process multiple files in one run. Saved images for both input types must be pictures taken with the camera at the top of the box.
After choosing the input type, the user selects the feature used to predict the leaf grade. The three features available on the menu are area, perimeter, and color. For the file and camera input types, the result appears in the detail section, which shows the area, perimeter, and color, together with the grade of the leaf. The software layout is illustrated in Figure 5.

3.3. Leaf Features

The leaf features used in this research were the area, perimeter, and intensity of the leaf image. The image was produced by a camera installed at the top of the cover box and represented as a bitmap in JPEG format. The three features were computed from the image pixels. Captured images were saved in the Raspberry Pi memory.
Area
The original image is processed using the grayscale method provided by OpenCV. The RGB value is converted to grayscale using Equation (1). After the grayscale image is obtained, the next step is to obtain the binary image using the threshold in Equation (2). The threshold value (T) is set to 70. Consistent with Equation (2), a pixel is labelled "1" for black (leaf) if its grayscale value is smaller than the threshold and "0" for white (background) if it is greater than or equal to the threshold. The image is divided into two parts, area 1 and area 2, as shown in Figure 6a,b. Each part has a size of 720 pixels × 240 pixels. The leaf area is calculated using horizontal integral projection for both parts, based on Equation (3) for area 1 and Equation (4) for area 2. The total leaf area in pixels is given by Equation (5).
Grayscale = 0.299R + 0.587G + 0.114B (1)
P(i, j) = 1 if grayscale < T; 0 if grayscale ≥ T (2)
A1 = Σ_{i=1}^{720} Σ_{j=1}^{240} P(i, j) (3)
A2 = Σ_{i=1}^{720} Σ_{j=241}^{480} P(i, j) (4)
Ap = A1 + A2 (5)
  • A1 = area 1 in pixels
  • A2 = area 2 in pixels
  • Ap = total area in pixels
  • P(i, j) = pixel label in row i and column j.
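The thresholding and summation steps above can be sketched with NumPy (in production, OpenCV's cvtColor/threshold calls would do the same work); the synthetic test image and threshold usage here are illustrative:

```python
import numpy as np

T = 70  # grayscale threshold from the paper

def leaf_area_pixels(rgb: np.ndarray) -> int:
    """Label dark (leaf) pixels 1 and light (background) pixels 0
    (Equation (2)), then sum over the two 720 x 240 halves
    (Equations (3)-(5)). `rgb` has shape (480, 720, 3)."""
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    p = (gray < T).astype(int)   # Equation (2)
    a1 = p[:240, :].sum()        # area 1, Equation (3)
    a2 = p[240:, :].sum()        # area 2, Equation (4)
    return int(a1 + a2)          # Equation (5)

# Synthetic 720 x 480 image: white background with a dark 100 x 100 "leaf".
img = np.full((480, 720, 3), 255, dtype=np.uint8)
img[190:290, 190:290] = 30
print(leaf_area_pixels(img))  # 10000
```

The dark square straddles the boundary between the two halves, showing that the split does not affect the total count.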
The surface area of the leaf is also measured manually to validate the method, as shown in Figure 7. The outline of the leaf is projected onto millimeter grid paper, where each major square is 1 cm². There are two types of filled squares: full grids (A_full_grid) and half grids (A_half_grid). The total area (A_r) in cm² is the sum of the full-grid and half-grid areas (Equation (6)). The full-grid area equals the number of fully covered squares, n_full_grid (Equation (7)), and the half-grid area is the number of partially covered squares divided by two (Equation (8)).
A_r = A_full_grid + A_half_grid (6)
A_full_grid = n_full_grid (7)
A_half_grid = n_half_grid / 2 (8)
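Equations (6)-(8) amount to a simple count; a one-function sketch (example counts are made up):

```python
def grid_area_cm2(n_full: int, n_half: int) -> float:
    """Manual validation area (Equations (6)-(8)): each fully covered
    1 cm^2 square counts once, each partially covered square counts
    as half."""
    return n_full + n_half / 2.0

print(grid_area_cm2(40, 14))  # 47.0
```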
Perimeter
The Canny method, provided by the cv.Canny() function in OpenCV, is used to detect the edges of the leaf image. The hysteresis thresholds were set to 70 (minimum) and 150 (maximum). This method converts the original image into a binary image, as shown in Figure 8, where black pixels are labeled "0" and white (edge) pixels "1". As in the area procedure, the image is divided into area 1 and area 2 to calculate perimeter 1 (Equation (9)) and perimeter 2 (Equation (10)), respectively. The total leaf perimeter in pixels is given by Equation (11).
C1 = Σ_{i=1}^{720} Σ_{j=1}^{240} P(i, j) (9)
C2 = Σ_{i=1}^{720} Σ_{j=241}^{480} P(i, j) (10)
Cp = C1 + C2 (11)
  • C1 = perimeter 1 in pixels
  • C2 = perimeter 2 in pixels
  • Cp = leaf perimeter in pixels
  • P(i, j) = pixel label in row i and column j.
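Once the edge map is produced (in the paper, by cv.Canny(image, 70, 150)), Equations (9)-(11) reduce to counting edge pixels in each half. A sketch that assumes the edge map is already available, with a synthetic square outline standing in for a real leaf:

```python
import numpy as np

def perimeter_pixels(edges: np.ndarray) -> int:
    """Count edge pixels (label 1) over the two 720 x 240 halves
    (Equations (9)-(11)). `edges` is the binary map that, in the paper,
    comes from Canny edge detection (edge pixels are nonzero)."""
    p = (edges > 0).astype(int)
    c1 = p[:240, :].sum()   # perimeter 1, Equation (9)
    c2 = p[240:, :].sum()   # perimeter 2, Equation (10)
    return int(c1 + c2)     # Equation (11)

# Synthetic edge map: the outline of a 100 x 100 square.
edges = np.zeros((480, 720), dtype=np.uint8)
edges[190:290, 190] = 255   # left side
edges[190:290, 289] = 255   # right side
edges[190, 190:290] = 255   # top
edges[289, 190:290] = 255   # bottom
print(perimeter_pixels(edges))  # 396 (4 * 100 minus the 4 shared corners)
```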
The perimeter of the leaf is also measured manually to validate the Canny method. The leaf is projected onto millimeter grid paper, and a string is laid along the traced outline, as shown in Figure 9a,b. The length of the string, measured with a ruler, equals the perimeter of the leaf.
Intensity
Image intensity is used to recognize the leaf grade. The intensity of a pixel is calculated using Equation (12). Ten pixels are used as samples for the intensity feature; their coordinates were set from the lengths and widths of leaves of all grades. Figure 10 shows the ten sample pixels. There are five areas, each containing two sample pixels. The horizontal boundary lines in the camera frame are vertical pixel 230 (Y_low) and vertical pixel 260 (Y_high); the vertical boundary lines are constructed from horizontal pixel values. Only grade 5 leaves had all ten sample pixels on the leaf; for grade 1, only two sample pixels fall on the leaf and the remaining eight fall on the image background. Complete information about the vertical area boundaries and the distribution of the sample locations is provided in Table 2.
I = (R + G + B) / 3 (12)
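Sampling Equation (12) at the ten fixed coordinates can be sketched as follows. The exact coordinates come from Table 2 (not reproduced here), so the x positions below are placeholders, and the green patch is a synthetic stand-in for a leaf:

```python
import numpy as np

def sample_intensities(rgb: np.ndarray, points):
    """Mean of R, G, B (Equation (12)) at each sample pixel.
    `points` are (x, y) coordinates."""
    return [float(rgb[y, x].mean()) for (x, y) in points]

# Two pixels per area, near rows 230 and 260 as in the text;
# the x positions are illustrative, not the paper's Table 2 values.
SAMPLE_POINTS = [(60, 230), (60, 260), (200, 230), (200, 260), (360, 230),
                 (360, 260), (520, 230), (520, 260), (660, 230), (660, 260)]

img = np.full((480, 720, 3), 255, dtype=np.uint8)  # white background
img[200:300, 150:600] = (40, 120, 80)              # green-ish "leaf" patch
print(sample_intensities(img, SAMPLE_POINTS))
```

Points on the patch return its mean channel value (80.0), while points off the leaf return the background intensity (255.0) — the contrast the grading relies on.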

3.4. Neural Network

Each feature has its own artificial neural network (ANN). Each network has an input layer (X_i), one hidden layer (Z_j), and an output layer (Y_k). The hidden and output layers have biases b_Zj and b_Yk, respectively. The input and hidden layers are connected by weights c_ij, and the hidden and output layers by weights d_jk. The log-sigmoid function is used for the hidden layer and purelin is the activation function for the output layer. The ANN is illustrated in Figure 11. The hidden layer output is calculated using Equation (13) and activated by Equation (14); the output layer is given by Equation (15) and activated by Equation (16).
Z_j = b_Zj + Σ_{i=1}^{n} c_ij · X_i (13)
OZ_j = logsig(Z_j) (14)
Y_k = b_Yk + Σ_{j=1}^{n} d_jk · OZ_j (15)
OY_k = purelin(Y_k) (16)
The number of neurons in each layer is shown in Table 3. The area and perimeter features each have two ANN inputs: the area feature uses area 1 and area 2, and the perimeter feature uses perimeter 1 and perimeter 2. The intensity feature uses ten inputs, the ten sample pixels. Three binary output neurons encode the grades in the output layer; the relationship between the output condition and the grade is shown in Table 4.
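The forward pass of Equations (13)-(16) is straightforward to sketch. The random weights below are stand-ins (the trained values are in Tables 7-12 of the paper), and the 0.5 threshold used to binarize the outputs into the Table 4 code is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def logsig(x):
    """MATLAB's logsig: 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, c, b_z, d, b_y):
    """One-hidden-layer forward pass. x: inputs; c: input->hidden
    weights; d: hidden->output weights; b_z, b_y: biases."""
    z = b_z + c @ x    # Equation (13)
    oz = logsig(z)     # Equation (14)
    y = b_y + d @ oz   # Equation (15)
    return y           # Equation (16): purelin(y) = y

# Area/perimeter network shape: 2 inputs, 5 hidden, 3 binary outputs.
c = rng.normal(size=(5, 2)); b_z = rng.normal(size=5)
d = rng.normal(size=(3, 5)); b_y = rng.normal(size=3)
out = forward(np.array([0.4, 0.6]), c, b_z, d, b_y)
grade_bits = (out > 0.5).astype(int)  # binarize toward the Table 4 code
print(out.shape, grade_bits.shape)
```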
There were 1000 training data sets used to determine the neuron weights and biases, and 500 data sets for testing, 100 per grade. The ANN was trained using MATLAB to obtain the weights and biases, and the trained ANN was evaluated based on its accuracy.

4. Results

4.1. Hardware and Software

The hardware implementation is shown from the front and top in Figure 12a,b. The tool is a black box that blocks outside light. The user operates the tool through a user interface application; example results are shown in Figure 13a for grade 1 and Figure 13b for grade 5. There are options for input and feature, and the user can see the detail and grade of the leaf.

4.2. Leaf Features

The area and perimeter of each leaf were obtained both manually and with the image processing method. The data are shown in Table 5: mean values of area and perimeter from 1500 leaves for both methods. The absolute difference between the image processing and manual results is stated as the error (%), and the spread of the data is given as the standard deviation (SD). The maximum area error was 2.76 cm² for grade 3; in percentage terms, however, grade 1 gave the largest area error, at about 5.25%. The image processing method worked best for the grade 5 area, with only 1.43% error; on average, the area error was about 3.68% across the five grades. The opposite held for the perimeter feature, whose smallest error occurred in grade 1, at 5.20%. The average perimeter error across all grades was 1.74 cm, or 5.98%.
Figure 14a shows the trend of area from grade 1 to grade 5 and Figure 14b shows the perimeter. The manual and image processing methods gave similar trends for both features. The leaf area and perimeter grow roughly linearly with grade, indicating that gambier leaves in Tarantang Village grow gradually from 1 month to 5 months old.

4.3. Neural Network

4.3.1. Training Performance

Each feature had its own neural network, trained using 1000 data sets. Table 6 shows the training performance for the three features, with the mean square error (MSE) between target and network output as the performance indicator. Training for the intensity feature gave the best performance, while area training resulted in the lowest accuracy. Perimeter had better accuracy than area, but its training required the longest time.
The training process produced weights and biases as the constants used to calculate the output. For the area and perimeter features there were two neurons in the input layer, five in the hidden layer, and three in the output layer. Input weights for the area and perimeter features are given in Table 7, layer weights in Table 8, and the hidden and output layer biases in Table 9.
The neural network for intensity had ten neurons in the input layer, twelve neurons in the hidden layer, and three neurons in the output layer. The neural network input weights for the intensity feature are provided in Table 10. There were 120 input weights and 36 layer weights generated for this system, as shown in Table 11 and Table 12, respectively.

4.3.2. Testing Performance

Each feature was tested using 100 samples per grade, 500 test samples in total. The results for the area, perimeter, and intensity features are shown in Table 13, Table 14 and Table 15, respectively. The area feature achieved 100% accuracy only for the lowest grade. It disagreed with the farmers' recommendation in five cases, assigning grade 3 to leaves the farmers placed in grade 2, and in five more cases, assigning grade 2 to leaves the farmers placed in grade 3. The perimeter feature was better than the area feature at separating grades 4 and 5: its decisions matched the farmers' recommendation perfectly for those grades. The average accuracy was 93% for area and 96% for perimeter. The intensity network, using the ten sample pixels, only made mistakes for grade 2 and had 100% accuracy for grades 1, 3, 4, and 5. The intensity feature can therefore compensate for the weaknesses of the area and perimeter features.
These results show the potential of the three features to help farmers decide whether a leaf is ready or forbidden to be picked. A set of rules combining the three features was developed to decide the status of a leaf, as shown in Table 16. This algorithm was implemented in the designed tool as a real-time leaf grading application. Its decisions matched the farmers' rules in Table 1 in 100% of cases.
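Table 16's exact rules are not reproduced here, so the sketch below is a hypothetical combination rule consistent with the Conclusions (recommended only when the intensity network, the most accurate, says grade 5 and the other two features say grade 4 or above); the grade-4 fallback to "available" is our assumption:

```python
def classify_leaf(area_grade: int, perimeter_grade: int,
                  intensity_grade: int) -> str:
    """Hypothetical combination rule (the paper's actual rules are in
    Table 16). Intensity, the most accurate feature, dominates: only a
    leaf it grades 5, with the other features at grade >= 4, is
    recommended for picking."""
    if intensity_grade == 5 and min(area_grade, perimeter_grade) >= 4:
        return "recommended"
    if intensity_grade >= 4:
        return "available"
    return "forbidden"

print(classify_leaf(5, 5, 5))  # recommended
print(classify_leaf(4, 4, 4))  # available
print(classify_leaf(2, 3, 2))  # forbidden
```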
Table 17 compares the proposed method with others. This research addresses a local agricultural problem: grading gambier leaves from Tarantang Village, a village in Indonesia famous for its gambier commodity. A box-shaped tool was designed with a camera to capture the image and two LEDs (6000 K to 6500 K color temperature) to control the lighting inside the box. Image processing determines whether a leaf is recommended, available, or forbidden to be picked. The image size is 720 pixels × 480 pixels, and the three features are area, perimeter, and intensity. Their combination was also investigated. A backpropagation neural network was used as the classifier. The results show that the combination of these features with the backpropagation neural network performs very well in grading gambier leaves.

5. Conclusions

Gambier leaves in Tarantang Village were classified, based on the farmers' knowledge, into three classes: recommended, available, and forbidden to be picked. Five-month-old gambier leaves are recommended for picking and four-month-old leaves are available to be picked; younger leaves are forbidden. The farmers' ability to classify gambier leaves was reproduced using three artificial neural networks, one for each of the area, perimeter, and intensity features. The results show that the network with the intensity input was the most accurate, at 97%; the area and perimeter features achieved 93% and 96% accuracy, respectively. Based on these performances, a set of grading rules was built from the combination of the three networks. The recommended class is given only to leaves whose intensity grade is 5 and whose other features have grade ≥ 4; a leaf is forbidden to be picked if its intensity grade is lower than 5. These rules classify gambier leaves with 100% agreement with the farmers' knowledge in this village.

Author Contributions

Conceptualization, M.I.R. and B.R.; methodology, M.I.R.; software, A.A.; validation, K.F.; writing—original draft preparation, M.I.R.; writing—review and editing, A.R.; supervision, A.R.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Fauza, H. Gambier: Indonesia Leading Commodities in The Past. Int. J. Adv. Sci. Eng. Inf. Technol. 2014, 4, 455. [Google Scholar] [CrossRef]
  2. Directorate General of Estate Crops. Tree Crop Estate Statistics of Indonesia 2012–2014, 2013 ed.; Juga, B.S., Sukriya, L.L., Eds.; Directorate General of Estate Crops: Jakarta, Indonesia, 2014.
  3. BPS West Sumatera. Sumatera Barat in Figures; BPS West Sumatera: Padang, Indonesia, 2012.
  4. Labanni, A.; Handayani, D.; Arief, S. Uncaria gambir Roxb. mediated green synthesis of silver nanoparticles using diethanolamine as capping agent. IOP Conf. Ser. Mater. Sci. Eng. 2018, 299, 012067. [Google Scholar] [CrossRef] [Green Version]
  5. Rauf, A.; Rahmawaty; Siregar, A.Z. The Condition of Uncaria Gambir Roxb. as One of Important Medicinal Plants in North Sumatra Indonesia. Procedia Chem. 2015, 14, 3–10. [Google Scholar] [CrossRef] [Green Version]
  6. Zebua, E.A.; Silalahi, J.; Julianti, E. Hypoglicemic activity of gambier (Uncaria gambir robx.) drinks in alloxan-induced mice. IOP Conf. Ser. Earth Environ. Sci. 2018, 122, 012088. [Google Scholar] [CrossRef]
  7. Nurdin, E.; Fitrimawati, F. The Effect of the Gambir (Uncaria gambir (Hunt.) Roxb.) Leaves Waste and White Turmeric (Curcuma zedoaria) for the Productivity, Antioxidant Content and Mastitis Condition of the Fries Holland Dairy Cows. IOP Conf. Ser. Earth Environ. Sci. 2018, 119, 012041. [Google Scholar] [CrossRef]
  8. Bahri, S.; Endaryanto, T. Gambier extracts as an inhibitor of calcium carbonate (CaCO3) scale formation. Desalination 2011, 265, 102–106. [Google Scholar]
  9. Yunarto, N.; Aini, N. Effect of purified gambir leaves extract to prevent atherosclerosis in rats. Health Sci. J. Indones. 2016, 6, 105–110. [Google Scholar] [CrossRef]
  10. Andasuryani, A.; Purwanto, Y.A.; Budiastra, I.W.; Syamsu, K. Determination of Catechin Content in Gambir Powder from Dried Gambir Leaves Quickly using FT NIR PLS Model. Int. J. Adv. Sci. Eng. Inf. Technol. 2014, 4, 303. [Google Scholar] [CrossRef]
  11. Codizar, A.L.; Solano, G. Plant leaf recognition by venation and shape using artificial neural networks. In Proceedings of the 7th International Conference on Information, Intelligence, Systems & Applications (IISA), Chalkidiki, Greece, 13–15 July 2016. [Google Scholar]
  12. BPS-Statistic Indonesia. The Result of Inter-Census Agriculture Survey; Team of Sutas 2018, Ed.; BPS-Statistic Indonesia: Jakarta, Indonesia, 2018; ISBN 9786024382551.
  13. Huang, G.; Huang, G.; Song, S.; You, K. Trends in extreme learning machines: A review. Neural Netw. 2015, 61, 32–48. [Google Scholar] [CrossRef] [PubMed]
  14. Hussain, T.; Siniscalchi, S.M.; Lee, C.C.; Wang, S.S.; Tsao, Y.; Liao, W.H. Experimental study on extreme learning machine applications for speech enhancement. IEEE Access 2017, 5, 25542–25554. [Google Scholar] [CrossRef]
  15. Salerno, V.M.; Rabbeni, G. An extreme learning machine approach to effective energy disaggregation. Electronics 2018, 7, 235. [Google Scholar] [CrossRef]
  16. Bai, X.; Li, X.; Fu, Z.; Lv, X.; Zhang, L. A fuzzy clustering segmentation method based on neighborhood grayscale information for defining cucumber leaf spot disease images. Comput. Electron. Agric. 2017, 136, 157–165. [Google Scholar] [CrossRef]
  17. Dimililer, K.; Kiani, E. Application of back propagation neural networks on maize plant detection. Procedia Comput. Sci. 2017, 120, 376–381. [Google Scholar] [CrossRef]
  18. Hsiao, J.K.; Kang, L.W.; Chang, C.L.; Hsu, C.Y.; Chen, C.Y. Learning sparse representation for leaf image recognition. In Proceedings of the IEEE International Conference on Consumer Electronics—Taiwan, Taipei, Taiwan, 26–28 May 2014; pp. 209–210. [Google Scholar]
  19. Gao, L.; Lin, X.; Zhong, M.; Zeng, J. A neural network classifier based on prior evolution and iterative approximation used for leaf recognition. In Proceedings of the Sixth International Conference on Natural Computation, Yantai, China, 10–12 August 2010; pp. 1038–1043. [Google Scholar]
  20. Isnanto, R.R.; Zahra, A.A.; Julietta, P. Pattern recognition on herbs leaves using region-based invariants feature extraction. In Proceedings of the 3rd International Conference on Information Technology, Computer, and Electrical Engineering (ICITACEE), Semarang, Indonesia, 19–20 October 2017; pp. 455–459. [Google Scholar]
  21. Sahay, A.; Chen, M. Leaf analysis for plant recognition. In Proceedings of the 7th IEEE International Conference on Software Engineering and Service Science (ICSESS), Beijing, China, 26–28 August 2017; pp. 914–917. [Google Scholar]
  22. Lukic, M.; Tuba, E.; Tuba, M. Leaf recognition algorithm using support vector machine with Hu moments and local binary patterns. In Proceedings of the IEEE 15th International Symposium on Applied Machine Intelligence and Informatics (SAMI), Herl’any, Slovakia, 26–28 January 2017; pp. 485–490. [Google Scholar]
  23. Sari, C.; Akgül, C.B.; Sankur, B. Combination of gross shape features, fourier descriptors and multiscale distance matrix for leaf recognition. In Proceedings of the ELMAR, Zadar, Croatia, 25–27 September 2013; pp. 23–26. [Google Scholar]
  24. Thanikkal, J.G.; Kumar, D.; Ashwani, T.M.T. Whether Color, Shape and Texture of Leaves are the Key Features for Image Processing based Plant Recognition? An Analysis! In Proceedings of the Recent Development in Control, Automation & Power Engineering, Noida, India, 26–27 October 2017; IEEE: Noida, India, 2017; pp. 404–409. [Google Scholar]
  25. Turkoglu, M.; Hanbay, D. Recognition of plant leaves: An approach with hybrid features produced by dividing leaf images into two and four parts. Appl. Math. Comput. 2019, 352, 1–14. [Google Scholar] [CrossRef]
  26. Choudhury, S.D.; Yu, J.G.; Samal, A. Leaf recognition using contour unwrapping and apex alignment with tuned random subspace method. Biosyst. Eng. 2018, 170, 72–84. [Google Scholar] [CrossRef]
  27. Khmag, A.; Al-Haddad, S.A.R.; Kamarudin, N. Recognition system for leaf images based on its leaf contour and centroid. In Proceedings of the IEEE 15th Student Conference on Research and Development (SCOReD), Putrajaya, Malaysia, 13–14 December 2018; pp. 467–472. [Google Scholar]
  28. Chaki, J.; Parekh, R. Plant Leaf Recognition using Shape based Features and Neural Network classifiers. Int. J. Adv. Comput. Sci. Appl. 2011, 2, 41–47. [Google Scholar] [CrossRef]
  29. Jeon, W.S.; Rhee, S.Y. Plant leaf recognition using a convolution neural network. Int. J. Fuzzy Log. Intell. Syst. 2017, 17, 26–34. [Google Scholar] [CrossRef]
  30. Raut, S.P.; Bhalchandra, A.S. Plant Recognition System Based on Leaf Image. In Proceedings of the Second International Conference on Intelligent Computing and Control Systems (ICICCS), Madurai, India, 14–15 June 2018; pp. 1579–1581. [Google Scholar]
Figure 1. Designed system to detect the age of gambier leaf.
Figure 2. System hardware design.
Figure 3. (a) Cover box opened to position the leaf inside; (b) leaf position on the bottom side.
Figure 4. Feature processing flow.
Figure 5. Software system design.
Figure 6. (a) Full area of the image and (b) area 1 and area 2 of the leaf image.
Figure 7. Printed leaf on a millimeter grid paper.
Figure 8. (a) Full area of the image and (b) area 1 and area 2 of the leaf image.
Figure 9. The length of the leaf on millimeter grid paper. (a) A string follows the edge of the leaf picture on millimeter grid paper; (b) the length of the string gives the leaf perimeter.
Figure 10. Designed image-processing system based on RGB color at ten pixel points.
Figure 11. ANN architecture for the recognition system.
Figure 12. Hardware implementation system: (a) Box front view; (b) box top view.
Figure 13. Software implementation system for the leaf: (a) Grade 1 and (b) grade 5.
Figure 14. Feature pattern for (a) area feature and (b) perimeter feature.
Table 1. Criteria of leaf groups.
| Grade | Age (months) | Color | Area | Class |
|---|---|---|---|---|
| Grade 1 | 1 | Very light green | Smallest | Forbidden to be picked |
| Grade 2 | 2 | Light green | Small | Forbidden to be picked |
| Grade 3 | 3 | Green | Medium | Forbidden to be picked |
| Grade 4 | 4 | Dark green | Big | Available |
| Grade 5 | 5 | Very dark green | Biggest | Recommended |
Table 2. Vertical boundaries for the five grades.
| Grade | Xlow | Xhigh | On Leaf | On Background |
|---|---|---|---|---|
| Grade 1 | 30 | 195 | 2 | 8 |
| Grade 2 | 276 | 342 | 4 | 6 |
| Grade 3 | 405 | 479 | 6 | 4 |
| Grade 4 | 532 | 548 | 8 | 2 |
| Grade 5 | 601 | 632 | 10 | 0 |
Table 3. Artificial neural network structure based on leaf features.
| Leaf Feature | Total Neurons on Input Layer (i) | Total Neurons on Hidden Layer (j) | Total Neurons on Output Layer (k) |
|---|---|---|---|
| Area | 2 | 5 | 3 |
| Perimeter | 2 | 5 | 3 |
| Intensity | 10 | 12 | 3 |
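The layer sizes in Table 3 can be sketched as a plain feedforward pass. This is an illustrative shape check only: the hidden activation (assumed here to be tanh) and the 0.5 output threshold are assumptions, since the excerpt does not state the transfer functions.

```python
import numpy as np

# Layer sizes (input i, hidden j, output k) from Table 3.
SIZES = {"area": (2, 5, 3), "perimeter": (2, 5, 3), "intensity": (10, 12, 3)}

def forward(x, W_in, b_hidden, W_layer, b_out):
    """One forward pass: hidden = tanh(W_in @ x + b); output thresholded to bits."""
    z = np.tanh(W_in @ x + b_hidden)   # hidden layer (j neurons)
    y = W_layer @ z + b_out            # output layer (k = 3 neurons)
    return (y > 0.5).astype(int)       # 3-bit code, decoded via Table 4

# Shape check for the area network with random weights: 2 -> 5 -> 3.
i, j, k = SIZES["area"]
rng = np.random.default_rng(0)
code = forward(rng.normal(size=i),
               rng.normal(size=(j, i)), rng.normal(size=j),
               rng.normal(size=(k, j)), rng.normal(size=k))
print(code.shape)  # (3,)
```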
Table 4. Relationship between output and leaf cluster.
| Output | Grade |
|---|---|
| 001 | 1 |
| 010 | 2 |
| 011 | 3 |
| 100 | 4 |
| 101 | 5 |
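The output-to-grade mapping of Table 4 is a direct lookup; a minimal sketch (the function name `decode_grade` is illustrative, not from the paper):

```python
# Table 4: 3-bit network output -> leaf grade.
OUTPUT_TO_GRADE = {
    (0, 0, 1): 1,
    (0, 1, 0): 2,
    (0, 1, 1): 3,
    (1, 0, 0): 4,
    (1, 0, 1): 5,
}

def decode_grade(bits):
    """Return the leaf grade for a 3-bit output, or None for an unused code."""
    return OUTPUT_TO_GRADE.get(tuple(bits))

print(decode_grade([1, 0, 1]))  # 5
```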
Table 5. Mean of area and perimeter for five grades.
| Grade | Area: Manual Mean | Area: Manual SD | Area: IP Mean | Area: IP SD | Area: Error (%) | Perimeter: Manual Mean | Perimeter: Manual SD | Perimeter: IP Mean | Perimeter: IP SD | Perimeter: Error (%) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 15.25 | 3.12 | 14.45 | 2.75 | 5.25 | 14.05 | 1.53 | 13.32 | 1.42 | 5.20 |
| 2 | 41.3 | 6.21 | 39.3 | 6.35 | 4.84 | 23.47 | 1.23 | 22.12 | 1.96 | 5.75 |
| 3 | 67.30 | 8.90 | 64.54 | 8.31 | 4.10 | 29.89 | 2.31 | 28.16 | 2.03 | 5.79 |
| 4 | 89.01 | 8.34 | 86.52 | 8.88 | 2.80 | 35.45 | 2.35 | 33.27 | 2.15 | 6.15 |
| 5 | 113.65 | 14.22 | 113.02 | 6.16 | 1.43 | 40.25 | 2.9 | 37.42 | 2.18 | 7.03 |

IP = image processing.
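The error column of Table 5 is consistent, for most grades, with the relative error of the image-processing mean against the manual mean (per-leaf averaging may explain the remaining differences). A minimal sketch of that computation:

```python
def percent_error(manual_mean, processed_mean):
    """Relative error of the image-processing mean vs. the manual mean (%)."""
    return abs(manual_mean - processed_mean) / manual_mean * 100

# Grade 1 area from Table 5: manual 15.25, image processing 14.45.
print(round(percent_error(15.25, 14.45), 2))  # 5.25
```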
Table 6. Training performance of the neural network.
| No | Feature | Epochs | Time | Performance |
|---|---|---|---|---|
| 1 | Area | 559 | 24 s | 0.008 |
| 2 | Perimeter | 745 | 35 s | 0.002 |
| 3 | Intensity | 675 | 1 min | 1.21 × 10⁻¹⁰ |
Table 7. Input weight for area and perimeter features.
| | Area: X1 | Area: X2 | Perimeter: X1 | Perimeter: X2 |
|---|---|---|---|---|
| Z1 | −2.11 | −0.57 | −21.05 | −0.11 |
| Z2 | 32.56 | 8.10 | 20.29 | 0.60 |
| Z3 | 2.14 | 0.57 | −10.09 | −9.54 |
| Z4 | 1.39 | 2.32 | 13.65 | 4.38 |
| Z5 | −1.24 | −0.25 | 13.67 | 4.37 |
Table 8. Layer weight for area and perimeter features.
| | Area: Z1 | Z2 | Z3 | Z4 | Z5 | Perimeter: Z1 | Z2 | Z3 | Z4 | Z5 |
|---|---|---|---|---|---|---|---|---|---|---|
| Y1 | −31.01 | 0.00 | −30.01 | 0.00 | 0.00 | −1.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| Y2 | 31.01 | 0.00 | 30.01 | 1.00 | 0.00 | 1.00 | 1.00 | 0.00 | 0.00 | 0.00 |
| Y3 | 31.90 | 0.97 | 30.93 | −1.00 | −1.00 | 1.00 | −1.00 | −1.00 | −78.47 | 79.47 |
Table 9. Bias for area and perimeter features.
| | BZ1 | BZ2 | BZ3 | BZ4 | BZ5 | BY1 | BY2 | BY3 |
|---|---|---|---|---|---|---|---|---|
| Area | 109.15 | −2313.96 | −110.73 | −48.83 | 37.82 | 31.01 | −31.01 | −29.90 |
| Perimeter | 349.04 | −198.12 | 255.58 | −341.94 | −342.30 | 1.00 | −1.00 | 1.00 |

BZ1–BZ5 are the hidden-layer biases; BY1–BY3 are the output-layer biases.
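The area-network parameters of Tables 7–9 can be wired into a forward pass as follows. The transfer functions and any input scaling are not given in this excerpt, so the tanh hidden units and the 0.5 output threshold are assumptions; the grade produced for a real leaf cannot be reproduced from this sketch alone.

```python
import numpy as np

# Area-network weights and biases transcribed from Tables 7-9.
W_in = np.array([[ -2.11, -0.57],
                 [ 32.56,  8.10],
                 [  2.14,  0.57],
                 [  1.39,  2.32],
                 [ -1.24, -0.25]])
b_hidden = np.array([109.15, -2313.96, -110.73, -48.83, 37.82])
W_layer = np.array([[-31.01, 0.00, -30.01,  0.00,  0.00],
                    [ 31.01, 0.00,  30.01,  1.00,  0.00],
                    [ 31.90, 0.97,  30.93, -1.00, -1.00]])
b_out = np.array([31.01, -31.01, -29.90])

def area_network(x1, x2):
    """Forward pass of the area ANN; returns a 3-bit code (see Table 4)."""
    z = np.tanh(W_in @ np.array([x1, x2]) + b_hidden)  # assumed activation
    y = W_layer @ z + b_out
    return tuple((y > 0.5).astype(int))                # assumed threshold

code = area_network(40.0, 25.0)  # x1, x2 are illustrative input values
print(code)
```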
Table 10. Input weight for intensity feature.
| | X1 | X2 | X3 | X4 | X5 | X6 | X7 | X8 | X9 | X10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Z1 | −0.32 | −0.07 | 0.11 | 0.09 | 0.04 | −0.32 | 0.21 | −0.33 | 0.52 | −0.56 |
| Z2 | −0.21 | 0.40 | 0.84 | −0.62 | 0.57 | −0.17 | 0.30 | −0.08 | −0.32 | 0.03 |
| Z3 | −0.26 | −0.45 | −0.39 | 0.11 | 0.27 | −0.13 | −0.59 | 0.01 | 0.31 | 0.02 |
| Z4 | 0.49 | 0.01 | −0.18 | 0.09 | 0.10 | 0.17 | 0.25 | −0.28 | 0.32 | −0.45 |
| Z5 | −0.03 | −0.45 | −0.13 | −0.06 | 0.39 | −0.60 | −0.85 | −0.18 | −0.58 | 1.02 |
| Z6 | 0.29 | 0.19 | 0.07 | −0.15 | 0.14 | 0.06 | −0.50 | −0.05 | 0.09 | −0.06 |
| Z7 | 0.36 | 0.18 | 0.24 | 0.00 | −0.57 | 0.16 | −0.23 | 0.46 | 0.04 | −0.27 |
| Z8 | 0.55 | 0.15 | −0.39 | −0.14 | 0.22 | 0.19 | 0.27 | −0.14 | 0.36 | −0.49 |
| Z9 | 0.60 | 0.15 | −0.41 | 0.61 | −0.38 | 0.48 | 0.07 | −0.04 | 0.26 | −0.55 |
| Z10 | −0.35 | −0.05 | 0.11 | −0.11 | 0.30 | −0.35 | −0.16 | 0.01 | 0.36 | −0.24 |
| Z11 | 0.23 | −0.31 | 1.42 | 0.08 | −0.45 | 0.19 | 0.13 | 1.40 | −0.25 | −0.21 |
| Z12 | 0.61 | 0.25 | −0.03 | 0.27 | 0.86 | −0.57 | 0.12 | −0.06 | −0.41 | 0.11 |
Table 11. Layer weight for intensity feature.
| | Z1 | Z2 | Z3 | Z4 | Z5 | Z6 | Z7 | Z8 | Z9 | Z10 | Z11 | Z12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Y1 | 0.55 | 0.34 | −0.34 | −0.01 | 0.34 | −9.17 × 10⁻⁵ | 8.00 × 10⁻⁵ | 0.01 | 0.34 | −0.66 | 1.36 × 10⁻⁴ | 0.05 |
| Y2 | −1.19 × 10⁻⁵ | −2.77 × 10⁻⁶ | 1.00 | −5.66 × 10⁻⁷ | −5.98 × 10⁻⁶ | 3.15 × 10⁻⁶ | −3.29 × 10⁻⁶ | 7.26 × 10⁻⁷ | −5.49 × 10⁻⁶ | −7.18 × 10⁻⁶ | 4.24 × 10⁻⁶ | −4.01 × 10⁻⁶ |
| Y3 | 8.22 × 10⁻⁴ | −1.42 × 10⁻³ | 1.45 × 10⁻³ | 3.46 × 10⁻⁵ | −1.00 | −1.06 × 10⁻⁵ | 4.47 × 10⁻⁶ | −4.85 × 10⁻⁵ | −1.42 × 10⁻³ | −1.44 × 10⁻³ | −1.00 | 0.78 |
Table 12. Bias for hidden and output layer for intensity feature.
| BZ1 | BZ2 | BZ3 | BZ4 | BZ5 | BZ6 | BZ7 | BZ8 | BZ9 | BZ10 | BZ11 | BZ12 | BY1 | BY2 | BY3 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| −0.03 | 0.88 | −0.28 | 0.21 | −0.34 | −2.39 | 0.62 | 1.11 | 1.41 | −1.54 | 1.17 | 0.10 | 0.27 | 7.89 × 10⁻⁶ | 1.22 |

BZ1–BZ12 are the hidden-layer biases; BY1–BY3 are the output-layer biases.
Table 13. System performance for area feature.
| True \ Predicted | Grade 1 | Grade 2 | Grade 3 | Grade 4 | Grade 5 |
|---|---|---|---|---|---|
| Grade 1 | 100 | 0 | 0 | 0 | 0 |
| Grade 2 | 0 | 95 | 5 | 0 | 0 |
| Grade 3 | 0 | 5 | 95 | 0 | 0 |
| Grade 4 | 0 | 0 | 11 | 89 | 0 |
| Grade 5 | 0 | 0 | 0 | 14 | 86 |
Table 14. System performance for perimeter feature.
| True \ Predicted | Grade 1 | Grade 2 | Grade 3 | Grade 4 | Grade 5 |
|---|---|---|---|---|---|
| Grade 1 | 100 | 0 | 0 | 0 | 0 |
| Grade 2 | 0 | 87 | 13 | 0 | 0 |
| Grade 3 | 0 | 5 | 95 | 0 | 0 |
| Grade 4 | 0 | 0 | 0 | 100 | 0 |
| Grade 5 | 0 | 0 | 0 | 0 | 100 |
Table 15. System performance for intensity feature.
| True \ Predicted | Grade 1 | Grade 2 | Grade 3 | Grade 4 | Grade 5 |
|---|---|---|---|---|---|
| Grade 1 | 100 | 0 | 0 | 0 | 0 |
| Grade 2 | 0 | 83 | 17 | 0 | 0 |
| Grade 3 | 0 | 0 | 100 | 0 | 0 |
| Grade 4 | 0 | 0 | 0 | 100 | 0 |
| Grade 5 | 0 | 0 | 0 | 0 | 100 |
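The per-feature accuracies reported in the abstract follow from the diagonals of the confusion matrices in Tables 13–15 (rows: true grade, columns: predicted grade; 100 test leaves per grade, 500 in total):

```python
import numpy as np

# Confusion matrices from Tables 13-15.
area = np.array([[100, 0, 0, 0, 0], [0, 95, 5, 0, 0], [0, 5, 95, 0, 0],
                 [0, 0, 11, 89, 0], [0, 0, 0, 14, 86]])
perimeter = np.array([[100, 0, 0, 0, 0], [0, 87, 13, 0, 0], [0, 5, 95, 0, 0],
                      [0, 0, 0, 100, 0], [0, 0, 0, 0, 100]])
intensity = np.array([[100, 0, 0, 0, 0], [0, 83, 17, 0, 0], [0, 0, 100, 0, 0],
                      [0, 0, 0, 100, 0], [0, 0, 0, 0, 100]])

def accuracy(cm):
    """Overall accuracy (%): correct predictions (diagonal) over all leaves."""
    return np.trace(cm) / cm.sum() * 100

for name, cm in [("area", area), ("perimeter", perimeter), ("intensity", intensity)]:
    print(name, round(accuracy(cm), 1))  # ~93.0, 96.4, 96.6
```

These match the ~93%, 96%, and 97% figures given for the three features.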
Table 16. Grading rules based on combination artificial neural network output.
| | Area Grade | Perimeter Grade | Intensity Grade | Class |
|---|---|---|---|---|
| Rule 1 | ≥4 | ≥4 | 5 | Recommended |
| Rule 2 | <4 | ≥4 | 5 | Available |
| Rule 3 | any | any | <5 | Forbidden to be picked |
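The rule combination of Table 16 can be sketched as follows. The case not covered by the table (perimeter grade below 4 with intensity grade 5) is assumed here to fall back to "Forbidden to be picked"; that fallback is an assumption, not stated in the excerpt.

```python
def classify(area_grade, perimeter_grade, intensity_grade):
    """Combine the three per-feature ANN grades using the rules of Table 16."""
    if intensity_grade < 5:
        return "Forbidden to be picked"   # Rule 3
    if area_grade >= 4 and perimeter_grade >= 4:
        return "Recommended"              # Rule 1
    if perimeter_grade >= 4:
        return "Available"                # Rule 2 (area grade < 4)
    return "Forbidden to be picked"       # uncovered case: assumed fallback

print(classify(5, 5, 5))  # Recommended
```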
Table 17. Results of comparison with the proposed method.
| Object | Feature | Method | Result |
|---|---|---|---|
| Flavia dataset (four sections) [25] | Color | Extreme Learning Classifier | 90.57% |
| Flavia dataset (four sections) [25] | Gray-Level Co-Occurrence Matrix | Extreme Learning Classifier | 78.68% |
| Flavia dataset (four sections) [25] | Fourier Descriptor | Extreme Learning Classifier | 92.10% |
| Flavia dataset [26] | Major Axis Length, Minor Axis Length, Area and Circumference | Random Subspace Method | 98.4% |
| Flavia dataset [23] | Gross Shape, Fourier Descriptor, Multiscale Distance | Quadratic discriminant analysis (QDA), support vector machine (SVM) and k-nearest neighbors (k-NN) | 94.62% |
| SLID dataset [23] | Gross Shape, Fourier Descriptor, Multiscale Distance | QDA, SVM and k-NN | 96.67% |
| Flavia dataset [27] | Contour | SVM | 97.69% |
| Flavia + Swedish + ICL + ImageCLEF datasets [22] | Hu Moments, Local Binary Pattern Histogram | SVM | 94.13% |
| Leafsnap dataset [26] | Major Axis Length, Minor Axis Length, Area and Circumference | SVM | 84.8% |
| Plantscan dataset [28] | Moment Invariant | Feed-Forward Neural Network | 95.5% |
| Plantscan dataset [28] | Centroid Radii | Feed-Forward Neural Network | 100% |
| Plantscan dataset [28] | Moment Invariant and Centroid Radii | Feed-Forward Neural Network | 100% |
| Leaf [29] | Leaf Contour | Convolutional Neural Network | 94% |
| Herbal Leaf [20] | Shape | Euclidean Distance | 86.67% |
| Herbal Leaf [20] | Shape | Canberra Distance | 72% |
| Gonzales dataset and Wood website [30] | Leaf Area, Circumference, Centroid, Major Axis Length, Minor Axis Length | Euclidean Distance | 100% |
| Gambier Leaf from Tarantang Village (Proposed) | Area | Backpropagation ANN | 93% |
| Gambier Leaf from Tarantang Village (Proposed) | Circumference | Backpropagation ANN | 96% |
| Gambier Leaf from Tarantang Village (Proposed) | Intensity | Backpropagation ANN | 97% |
| Gambier Leaf from Tarantang Village (Proposed) | Area, Circumference, Intensity | Combination of ANNs | 100% |

Rusydi, M.I.; Anandika, A.; Rahmadya, B.; Fahmy, K.; Rusydi, A. Implementation of Grading Method for Gambier Leaves Based on Combination of Area, Perimeter, and Image Intensity Using Backpropagation Artificial Neural Network. Electronics 2019, 8, 1308. https://doi.org/10.3390/electronics8111308
