*Article* **Vision-Based Sorting Systems for Transparent Plastic Granulate**

#### **Tadej Peršak <sup>1,\*</sup>, Branka Viltužnik <sup>2</sup>, Jernej Hernavs <sup>1</sup> and Simon Klančnik <sup>1</sup>**


Received: 23 April 2020; Accepted: 18 June 2020; Published: 22 June 2020

**Abstract:** Granulate material sorting is a mature and well-developed topic, due to its presence in various fields, such as the recycling, mining, and food industries. However, sorting can be improved, and artificial intelligence has been used for this purpose. This paper presents the development of an efficient sorting system for transparent polycarbonate plastic granulate, based on machine vision and air separation technology. The developed belt-type system is composed of a transparent conveyor with an integrated vision camera to detect defects in passing granulates. The vision system incorporates an industrial camera and backlight illumination. Individual particle localization and classification with the *k*-Nearest Neighbors algorithm were performed to determine the positions and conditions of each particle. Particles with defects are further separated pneumatically as they fall from the conveyor belt. Furthermore, an experiment was conducted whereby the combined performances of our sorting machine and classification method were evaluated. The results show that the developed system exhibits promising separation capabilities, despite numerous challenges accompanying the transparent granulate material.

**Keywords:** sorting; machine vision; *k*-NN algorithm; transparent plastic granulate; recycling; air nozzles

#### **1. Introduction**

Plastic waste is present almost everywhere on the planet and poses serious problems to living organisms and the environment. Such waste can take more than 100 years to decompose in the environment [1]; therefore, as little of it as possible should be introduced into the environment. Waste plastic decomposes into fragments under five millimeters in size. Such particles, known as microplastics, are ubiquitous in marine and terrestrial environments, and even in the water we drink [2]. They are also present in other organisms, particularly marine organisms, which is undesirable. Therefore, plastic waste must be managed correctly to prevent it from reaching the environment [3].

Recycling and reuse are effective ways to reduce the accumulation of plastic waste, including plastic products that have lost their functionality or have manufacturing defects. Such products may be of different colors or carry patterns, but they can also be transparent. When reusing transparent plastic products, all impurities must be removed, because impurities strongly affect the mechanical and visual properties of recycled products. To this end, sorting machines are used to remove impurities from the mix of used ground plastic [4]. The classification of plastics is essential in the recycling industry, because only in this way can different plastics be separated from one another [5,6]. Sorting machines come in many configurations and operate on different principles for the detection of recycled materials.

There are various techniques for identifying and sorting polymers. Some of these techniques include manual sorting, density separation, electrostatic processes, and various optical systems, including optical inspection using photodiodes or charge-coupled device (CCD) machine vision, near infrared (NIR), ultraviolet (UV), X-ray analysis, and fluorescent light or laser radiation [7].

The most basic, non-automated approach is manual sorting, which is prone to error, expensive, tedious, and can be unsafe [7].

Density separation systems are used to separate particles with higher densities than water from buoyant ones; the particle densities must differ significantly [7,8].

Electrostatic separation systems are used to separate mixtures of plastics that acquire different charges through triboelectrification. The method is not suitable for sorting complex mixtures, and the particles must be clean and dry [7,9].

Optical systems based on color-imaging (visible light, VIS) sensors can separate different plastics, primarily by color [10]. Problems arise only when the difference between colors is very small [11]; in that case, more advanced methods should be used.

Spectrometry and hyperspectral imaging are used at wavelengths above the visible range; a spectrometer captures a single point, while a hyperspectral camera captures an entire line [4]. The most used ranges are VNIR (visible and near-infrared light; 400–1000 nm) and NIR (near-infrared; 800–2500 nm) [10,12]. The purity of the reused material is related to the quality of the product. Much research has been done on separating various types of plastics, such as acrylonitrile butadiene styrene (ABS), polystyrene (PS), polypropylene (PP), polyethylene (PE), polyethylene terephthalate (PET), and polyvinyl chloride (PVC); these plastics can be separated from each other completely by spectroscopy [12,13]. Information on the composition of the material can also be obtained using spectrometry [14,15]. For better performance, NIR hyperspectral methods are integrated with artificial intelligence methods [16]. These methods are very effective at separating different materials from one another, but they are less accurate when looking for differences in the same material with different colors.

X-ray analysis is suitable for distinguishing polyvinyl chloride (PVC) from polyethylene terephthalate (PET), but it involves higher system complexity and potential health risks [10].

Laser techniques offer identification of plastics in under a microsecond, based on atomic emission spectroscopy. A laser releases excited ions and atoms from the material surface, which can then be identified through spectral analysis to determine the polymer type and the additives present [7,17,18].

Sorters based on CCD cameras and machine vision are mainly found in machines that sort discrete objects, with support vector machines and artificial neural networks as popular classifiers. One such sorter separates various objects, such as gears, coins, and connectors [19].

Machine vision methods such as deep learning are emerging in waste sorting. With such methods, it is possible to classify and detect various objects, such as glass bottles, paper boxes, paper cups, ceramic plates, and so on [20].

Most applications of machine vision in sorting occur in food production. Various defects on crops, such as tomatoes, oranges, lemons, eggs, seeds, and almonds, are inspected with deep learning [21–23], support vector machines [24], neural networks [25], and *k*-Nearest Neighbors [26].

These methods are also used in the processing of foods such as fish and chicken, where quality is checked or certain parts of the animal are identified [27,28].

There are also simpler methods for checking seed coloration by counting pixels of a particular color [29].

A review of sorting machines showed that different manipulators are used for the physical separation of particles: various pneumatic, electrical, and hydraulic manipulators sort larger objects, while ejection with compressed air through pneumatic nozzles is widely used for smaller objects [30–32]. There are also advanced algorithms for tracking and simulating the ejection of single particles. These algorithms can track objects on the conveyor belt and separate them from other objects with great precision [33–35].

In this research, the development of a real-time sorting machine for transparent plastic granulate with a variety of defects, such as black dots, blurs, and burns, is presented. Products with such defects are unusable and can instead be turned into base material for new plastics. In the recycling process, defects caused by product injection molding must be eliminated. For this purpose, a polycarbonate (PC) granulate sorting machine was developed, which removes particles with these defects.

The developed sorting machine performed optical inspection of the granules with a camera operating in the visible range of light. Machine vision located each individual granule in the image and classified it using the *k*-Nearest Neighbors (*k*-NN) method. This method was chosen based on the data type and application requirements: the images to be processed were small (40 × 40 pixels) compared to those used in conventional machine vision applications, and granule classification requires real-time processing, so a fast and efficient algorithm was needed. The physical sorting of the granules was performed with air nozzles.

Existing sorters offer sorting of particles by color and size, and some can even determine the composition of the particle material. The developed prototype sorter uses machine vision and artificial intelligence methods, with which each individual transparent particle can be classified according to the training database to determine whether it should be ejected. The sorting machine can check particle color and any irregularities that may appear in a particle.

#### **2. Materials and Methods**

#### *2.1. Materials*

#### 2.1.1. Hardware of the Sorting Machine

The development of the sorting machine began in the Solidworks 3D modeler. The 3D model evolved throughout development, and the final version is presented in Figure 1. The sorting machine hardware consisted of a mechanical part and a control part.

The main part of the mechanical assembly was a conveyor with a transparent conveyor belt, as shown in Figure 1. The belt was 3 mm thick and made of polyurethane; it was 140 mm wide and 600 mm long, and the diameter of the conveyor rollers was 50 mm. The conveyor was driven by a three-phase electric motor with a power of 0.37 kW and a speed of 1370 rpm. The motor speed could be adjusted with a Mitsubishi FR-S520SE-0.2K-EC (Mitsubishi Electric, Tokyo, Japan) inverter, and the belt speed could be adjusted from 0 to 0.55 m/s.

A feeder handled the dosing of granules onto the main conveyor. It consisted of a smaller conveyor on which the granule tank was mounted, as shown in Figure 1. This conveyor was powered by a three-phase electric motor with a power of 0.09 kW and a speed of 1360 rpm; a gearbox ensured the proper speed, and the motor speed could be adjusted with a second Mitsubishi FR-S520SE-0.2K-EC inverter. The quantity of material fed onto the conveyor varied with the speed of the feeder belt and the size of the tank opening.

The material sorting unit consisted of air nozzles 1 mm in diameter at 5 mm spacing, as shown in Figure 1. The air nozzles were connected by pneumatic tubes to a valve block, on which each nozzle had its own SMC SX11F-GH pneumatic valve (SMC Corporation, Tokyo, Japan). The air nozzles were located under the conveyor belt and required a compressed air source and a pressure regulator.

An Industrial Controller managed the entire sorting machine. A camera was mounted above the conveyor to capture images of the material lying on it, as shown in Figure 1. The illumination was positioned below the conveyor in a backlight configuration, as shown in Figure 1. An RLS RE22IC0410B30F2C00 encoder (RLS, Pod vrbami, Slovenia) with a resolution of 1024 signals per rotation was mounted on the conveyor for position measurement. Both 24 V and 5 V power supplies were required to power the components. A catcher for separated and ejected material, as shown in Figure 1, and adaptive circuits for switching the valves and motors were also needed.

**Figure 1.** Sorting machine prototype with marked components: 1. Feeder, 2. Camera, 3. Illumination, 4. Conveyor, 5. Air nozzles and valves block, 6. Catcher of ejected material, 7. Catcher of separated material.

Image processing was performed on a National Instruments IC-3173 Industrial Controller (IC) (National Instruments, Austin, TX, USA). The controller had an Intel Core i7-5650U processor (2.2 GHz) and 8 GB of RAM, as well as a Xilinx Kintex-7 XC7K160T Field-Programmable Gate Array (FPGA) for faster processing of lower-level algorithms. A Basler CMOS camera (Basler, Ahrensburg, Germany), model acA2500-14uc, was used for image capture. The camera was capable of recording at 14 frames per second at 2590 × 1942 pixel (5 MP) resolution, and was used with an Opto Engineering 5-megapixel 12 mm lens.

#### 2.1.2. Samples

The sorting material comprised transparent polycarbonate (PC) granules, as shown in Figure 2. The granules were 3 mm wide, 4 mm high, and 2 mm thick. Some granules were of different colors; other defects were spots representing burns that can form during the plastic injection process. Through sorting, all granules with impurities must be ejected. Only completely transparent material, as shown in Figure 2a, is acceptable for reuse.

**Figure 2.** Representatives of each granule class: (**a**) Clean; (**b**) Blur; (**c**) Black dots; (**d**) More black dots; (**e**) Dark; (**f**) Pink; (**g**) Green; (**h**) Yellow (**i**) Material mix.

An image database was created for training and testing the classifier. All nine classes are listed in Table 1. For each class, 1000 images of individual granules were made, as shown in Figure 2.


**Table 1.** List of used samples.

#### *2.2. Methods*

#### 2.2.1. Image Acquisition

The camera was triggered by a hardware trigger from the controller. The camera sent the captured image to the controller, where the machine vision was executed. Universal serial bus (USB) 3.0 was used for image transfer. The Region of Interest (ROI) size was 57 × 77 mm.

A key part of the machine vision system was the lighting; without consistent illumination, usable images could not be captured. A 48 W light-emitting diode (LED) light with a color temperature of 6000 K was used to illuminate the granules. Illuminating the granulate is challenging because of reflections on its surface; therefore, the illumination was placed under the conveyor belt in a backlight configuration. Two diffusers were used for more even light: one mounted directly on the light, the other below the conveyor belt, as shown in Figure 1.

#### 2.2.2. Image Processing

Before processing, each captured image was stored in a buffer, and images were processed one frame behind capture to save time: while one image was being captured, the previous one was processed. In classic processing, the image would be captured and processed immediately, so the time needed to capture a camera image and transfer it to the computer would be lost. This time is difficult to recover in real-time image processing applications.
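The buffered capture-and-process scheme can be sketched as a producer-consumer pair; this is a minimal illustration, where the frame strings and the sleep call are stand-ins for camera exposure plus USB transfer and for the machine vision work:

```python
import threading
import queue
import time

def capture_frames(frame_queue, n_frames):
    """Producer: simulates the camera trigger filling the image buffer."""
    for i in range(n_frames):
        time.sleep(0.01)          # stand-in for exposure + USB transfer
        frame_queue.put(f"frame-{i}")
    frame_queue.put(None)         # sentinel: capture finished

def process_frames(frame_queue, results):
    """Consumer: processes the previous frame while the next is captured."""
    while True:
        frame = frame_queue.get()
        if frame is None:
            break
        results.append(frame.upper())   # stand-in for machine vision

frames = queue.Queue(maxsize=2)   # small buffer decouples capture from processing
results = []
producer = threading.Thread(target=capture_frames, args=(frames, 5))
consumer = threading.Thread(target=process_frames, args=(frames, results))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(results)  # ['FRAME-0', ..., 'FRAME-4']
```

Because capture and processing overlap, the total cycle time approaches the longer of the two stages instead of their sum, which is the saving the paragraph above describes.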

The overall image processing was separated into two main parts. These were the localization and classification of the granules.

For localization, the captured images were first reduced to 37% of their original size, as shown in Figure 3a, to speed up image processing. Then the RGB green color plane was extracted and the brightness was adjusted, as shown in Figure 3b. A Modified Sauvola threshold [36,37] was then applied, producing a binary image containing the granules plus some small regions that do not represent granules, as shown in Figure 3c. The image was passed through a filter to remove these small regions, as shown in Figure 3d. Finally, the Particle Analysis method from National Instruments LabVIEW software was carried out, which returned the *x* and *y* coordinates of the granules in the image, i.e., the centers of mass of the objects. The Particle Analysis method takes some time to execute, but it is critical to the operation of the sorter.
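The localization steps can be sketched in Python with NumPy and SciPy. Note the assumptions: the paper's Modified Sauvola variant is approximated by the standard Sauvola formula, LabVIEW's Particle Analysis by SciPy's connected-component labeling, and the window size, `k`, `R`, and `min_area` values are illustrative, not the machine's actual parameters:

```python
import numpy as np
from scipy import ndimage

def sauvola_threshold(gray, window=15, k=0.2, R=128.0):
    """Standard Sauvola local threshold: T = m * (1 + k*((s/R) - 1))."""
    m = ndimage.uniform_filter(gray.astype(float), window)
    m2 = ndimage.uniform_filter(gray.astype(float) ** 2, window)
    s = np.sqrt(np.clip(m2 - m ** 2, 0, None))
    return gray < m * (1 + k * (s / R - 1))   # dark granules on a bright backlight

def localize_granules(image_rgb, min_area=20):
    """Green-plane extraction -> threshold -> small-region filter -> centroids."""
    green = image_rgb[:, :, 1]
    binary = sauvola_threshold(green)
    labels, n = ndimage.label(binary)
    # Drop regions smaller than min_area (noise, not granules)
    areas = ndimage.sum(binary, labels, index=range(1, n + 1))
    keep = [i + 1 for i, a in enumerate(areas) if a >= min_area]
    # Centers of mass (y, x) of the remaining regions
    return ndimage.center_of_mass(binary, labels, keep)

# Synthetic test image: bright backlight with one dark 10 x 10 granule
img = np.full((60, 60, 3), 230, dtype=np.uint8)
img[20:30, 25:35] = 40
centers = localize_granules(img)
print(centers)  # one centroid near (24.5, 29.5)
```

The backlight makes granules appear dark on a bright background, which is why the threshold keeps pixels *below* the local Sauvola value.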

At the conveyor's full speed, image processing took up to 100 ms; the actual time depended on the number of particles in the image. Image capture, granule localization, and classification were all performed during this time. When a new image capture was triggered, the previous image was processed in parallel with the capture. The image buffer and image minimization saved 92 ms of image processing time; without them, the total image processing time would have been almost 200 ms.

**Figure 3.** Image processing operations: (**a**) Original image; (**b**) RGB green color extraction and brightness settings; (**c**) Modified Sauvola threshold; (**d**) Particle filter. Blue color indicates marked pellets with color defects and red color indicates pellets with structural defects.

Only granules which were completely transparent and error-free were accepted; therefore, a color classifier was used, and every granule found in the image was examined by it. The granule location data from Particle Analysis were taken, and the granule image was cropped at that location and sent to the classifier. The cropped image size was 40 × 40 pixels. Each cropped image was converted to the HSL color space, and the hue, saturation, and luminance histograms of the color sample were calculated. The hue and saturation histograms each contained 256 values, while the luminance histogram was reduced to 8 values. Combined, the 520 hue, saturation, and luminance values produced a high-resolution color feature vector. Because very fast real-time processing is required, the dimensions of the feature vector were reduced using a dynamic mask. The reduced color feature vector contains 128 hue and saturation values and 8 luminance values (136 values in total) and represents the input to the classifier. By suppressing the luminance histogram to eight values, the algorithm accentuated the color information of the sample.
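The feature extraction can be sketched as follows. Since the paper does not detail the dynamic mask, a fixed subsampling that keeps every fourth hue and saturation bin (64 + 64 = 128 values) stands in for it; the `hsl_feature_vector` name and the test patch are illustrative:

```python
import colorsys
import numpy as np

def hsl_feature_vector(patch_rgb):
    """Build a 136-value color feature: 128 hue+saturation bins + 8 luminance bins.
    The paper's 'dynamic mask' is not specified; every fourth bin of the
    256-bin hue and saturation histograms is kept here as a stand-in."""
    h_hist = np.zeros(256); s_hist = np.zeros(256); l_hist = np.zeros(8)
    for r, g, b in patch_rgb.reshape(-1, 3) / 255.0:
        h, l, s = colorsys.rgb_to_hls(r, g, b)
        h_hist[min(int(h * 256), 255)] += 1
        s_hist[min(int(s * 256), 255)] += 1
        l_hist[min(int(l * 8), 7)] += 1      # luminance suppressed to 8 bins
    reduced = np.concatenate([h_hist[::4], s_hist[::4], l_hist])
    return reduced / reduced.sum()           # normalize to frequencies

patch = np.full((40, 40, 3), [200, 220, 255], dtype=np.uint8)  # pale-blue granule
vec = hsl_feature_vector(patch)
print(vec.shape)  # (136,)
```

Keeping all 256 hue and saturation bins but only 8 luminance bins weights the vector toward chromatic information, which is the effect the paragraph above attributes to the luminance suppression.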

The color feature vector of each granule image was sent to the color classifier, which was based on the *k*-NN algorithm, a statistical method for classification based on the nearest neighbors [38]. The method is characterized as a lazy learner, because it does not learn in advance but only when it receives a classification request: it compares the received data with those in its database and finds the closest neighbor, or the *k*-Nearest Neighbors, based on Euclidean distance [39]. The accuracy of *k*-NN classification changes with the number of neighbors and varies from case to case.
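The lazy-learning behavior described above can be sketched minimally; the toy two-dimensional database below stands in for the 136-value color feature vectors, and the class names are illustrative:

```python
import numpy as np

class KNNColorClassifier:
    """Minimal k-NN classifier over feature vectors (Euclidean distance).
    A lazy learner: 'training' just stores the database."""
    def __init__(self, k=1):
        self.k = k

    def fit(self, features, labels):
        self.features = np.asarray(features, dtype=float)
        self.labels = np.asarray(labels)
        return self

    def predict(self, x):
        # Distances from the query vector to every stored sample
        d = np.linalg.norm(self.features - np.asarray(x, dtype=float), axis=1)
        nearest = np.argsort(d)[: self.k]
        # Majority vote among the k nearest neighbours
        values, counts = np.unique(self.labels[nearest], return_counts=True)
        return values[np.argmax(counts)]

# Toy database: two classes with clearly separated feature vectors
db = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
labels = ["OK", "OK", "NOK", "NOK"]
clf = KNNColorClassifier(k=1).fit(db, labels)
print(clf.predict([0.85, 0.15]))  # OK
print(clf.predict([0.15, 0.85]))  # NOK
```

With *k* = 1, as used in this paper, the vote reduces to taking the label of the single closest database sample.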

The granules were ejected based on the classification results. If a granule was detected as defective, its coordinates were stored in the circular buffer for ejection.

#### 2.2.3. Evaluating the Classifier

A confusion matrix was used to demonstrate the effectiveness of the classification. Table 2 presents the confusion matrix for classification into several classes. The output of the classifier for a given particle may be one of the following cases. If a particle belongs to class C2 and is classified as class C2, the result is a True Positive. If the classifier predicts a C2 particle to be of some other class, the result is a False Negative, designated by β in Table 2. If the classifier predicts a non-C2 particle to be of class C2, the result is a False Positive (α) [40,41]. While making predictions on C2 particles, no data from other classes were provided, so each particle that was neither analyzed nor classified as C2 was a True Negative (γ).


**Table 2.** Representation of the confusion matrix.

Positives, P: True Positives (TP), designated in the table; False Positives (FP) = α. Negatives, N: True Negatives (TN) = γ; False Negatives (FN) = β.

The performance of the classifier was evaluated by three metrics: positive predictive value (precision, Equation (1)), hit rate (recall, Equation (2)), and accuracy (Equation (3)). The precision of a class tells us how many of the predictions the model considered positive were actually positive; it is calculated as the ratio between True Positives and all positive predictions for the class in question. An ideally precise model would never falsely classify other particles as the class in question (no False Positive predictions); however, even an ideally precise model can still fail to recognize that a particle belongs to a certain class. That is the reason for the recall metric, which tells us how many particles of class C were correctly classified; it is calculated as the ratio between True Positives and all samples of the class in question. A model with ideal recall would never falsely predict a class C particle to be of some other class (no False Negative predictions). Lastly, accuracy provides the overall prediction quality for each class; it is calculated as the ratio between all true predictions and the total population.

Precision:

$$precision = \frac{TP}{TP + FP} \tag{1}$$

Recall:

$$recall = \frac{TP}{TP + FN} \tag{2}$$

Accuracy for one class:

$$accuracy = \frac{TP + TN}{P + N} \tag{3}$$

Accuracy for the whole confusion matrix:

$$ACC = \frac{\sum True\ Positives + \sum True\ Negatives}{\sum Total\ population} \tag{4}$$
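Equations (1)–(4) can be checked numerically. The 2 × 2 matrix below is hypothetical, not the paper's data; for a confusion matrix with rows as true classes and columns as predictions, the correctly classified samples lie on the diagonal:

```python
import numpy as np

def per_class_metrics(cm, cls):
    """Precision (Eq. 1), recall (Eq. 2), and accuracy (Eq. 3) for one class.
    Rows of cm = true class, columns = predicted class."""
    cm = np.asarray(cm)
    tp = cm[cls, cls]
    fp = cm[:, cls].sum() - tp          # others predicted as this class
    fn = cm[cls, :].sum() - tp          # this class predicted as others
    tn = cm.sum() - tp - fp - fn
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / cm.sum()
    return precision, recall, accuracy

def overall_accuracy(cm):
    """Overall accuracy: correctly classified samples over the total population."""
    cm = np.asarray(cm)
    return np.trace(cm) / cm.sum()

# Hypothetical two-class matrix: 148/150 and 149/150 correct
cm = [[148, 2],
      [1, 149]]
p, r, a = per_class_metrics(cm, 0)
print(round(p, 3), round(r, 3), round(a, 3))  # 0.993 0.987 0.99
print(overall_accuracy(cm))  # 0.99
```

For a two-class matrix the per-class accuracy of Equation (3) and the overall accuracy coincide, since one class's True Negatives are the other class's True Positives.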

#### 2.2.4. Sorting Procedure

The sorting algorithm worked based on a circular buffer. The location information of the granules was obtained from the particle analysis. The coordinate *x* was used (by the width of the image) to determine which valve should activate and eject the granule. The *y* coordinate (image length) was used to determine the moment of opening of the pneumatic valve to eject the granule, as shown in Figure 4.

The distance between the granule in the image to be ejected and the nozzle was measured by an encoder. The encoder was mounted on a driven conveyor roller. The belt moves were determined by the encoder pulses. The *y* coordinate of the granulate, which represented the distance from the image edge, is *Yr*. *Yr* was converted from the pixel value to the number of encoder pulses and added to a constant *Ye*, representing the number of pulses from the image to the air nozzles, as shown in Figure 4. The value of pulses for ejecting *Y*s was written to the circular buffer running on the FPGA. The FPGA monitored the encoder value, and opened the air valve for the nozzle, according to the values in the circular buffer.
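The ejection scheduling above can be sketched as follows, assuming granules enter the buffer in encoder order (FIFO). The pulses-per-pixel calibration, the *Ye* offset, and the nozzle pitch are illustrative values, not the machine's actual constants:

```python
from collections import deque

PULSES_PER_PIXEL = 0.5   # assumed encoder/optics calibration
Y_E = 900                # assumed pulse count from the image edge to the nozzles

def schedule_ejection(buffer, y_r_pixels, x_pixels, encoder_now, nozzle_pitch_px=20):
    """Convert a granule's image position into an ejection entry:
    target encoder count Ys = pulses(Yr) + Ye, plus the nozzle index from x."""
    y_s = encoder_now + int(y_r_pixels * PULSES_PER_PIXEL) + Y_E
    nozzle = int(x_pixels // nozzle_pitch_px)    # which valve covers this x
    buffer.append((y_s, nozzle))

def fire_due_valves(buffer, encoder_now):
    """Open every valve whose target encoder count has been reached."""
    fired = []
    while buffer and buffer[0][0] <= encoder_now:
        fired.append(buffer.popleft()[1])
    return fired

buf = deque()
schedule_ejection(buf, y_r_pixels=100, x_pixels=45, encoder_now=0)
print(fire_due_valves(buf, encoder_now=949))  # []  (granule not at the nozzles yet)
print(fire_due_valves(buf, encoder_now=950))  # [2] (valve 2 opens)
```

On the real machine the `fire_due_valves` role is played by the FPGA, which compares the live encoder value against the circular buffer on every pulse.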

#### **3. Results and Discussion**

This chapter is divided into two parts. The first part presents the offline results of classifying the granules into nine classes and into two classes. The second part presents the results of physical sorting on the prototype sorting machine.

#### *3.1. Classification Results*

#### 3.1.1. Classification into Nine Classes

Classification was performed with the compiled image database. The granulate images were fed into the *k*-Nearest Neighbors (*k*-NN) color classifier. The classes were designated by the numbers 1–9; the class names are listed in Table 1. A set of 850 training and 150 testing images was used for each class. Only one nearest neighbor (*k* = 1) was used, since with the current image database the *k*-NN algorithm achieved the best results with one neighbor. The same applied to the classification into two classes.

Table 3 shows the results of the classification, represented in the confusion matrix. The columns for each class indicate how many granules were allocated to that class.


**Table 3.** Confusion matrix of sorting results into nine classes.

The classification accuracy for nine classes was calculated according to Equation (3). The calculated average accuracy (ACC) was 90.5%.

The results in Table 3 show that all the granules in class 1 (clean) were classified correctly. When testing the other classes, in only one case was a granule recognized as clean when it was not. In sorting, it is important to classify clean and defective granules precisely, because the sorting machine separates the material into exactly these two groups.

#### 3.1.2. Classification into Two Classes

The sorting algorithm sorts the material into clean and defective granules, so a classification into clean and defective granules was made. A set of 850 training and 150 testing images was used for each class. Only one nearest neighbor (*k* = 1) was used, as with the nine-class classification. Images of clean granules without defects were used for the clean (OK) class, and a mix of images with defective granules was used for the defective (NOK) class. Table 4 shows the results of the classification; all granules were classified correctly.


**Table 4.** Confusion matrix of results for two classes.

Calculation of the classification accuracy into two classes was performed using Equation (3). The calculated accuracy was 100%.

Table 4 shows the results of *k*-NN classification of clean (OK) and defective (NOK) granules. The classifier separated the granules with 100% accuracy; therefore, any errors that occurred during sorting were the result of other influences, namely the physical effects of granule adhesion to the conveyor and cohesion forces between granules.

#### *3.2. Sorting Results*

The classifier was tested on the prototype sorting machine. A test mixture of granules was prepared, containing clean and defective granules; defective granules represented 10% of the total mixture.

Testing was performed with five different settings for the feeder and the conveyor. The settings are given in Table 5. The parameters are explained in Table 6. Table 7 shows the sorting results.

**Table 5.** The settings of the feeder and the conveyor when testing the sorting efficiency.


OK: clean; NOK: defective.



The test was repeated three times for each sorting machine setting with the granule test mixture; Table 7 gives the averages of these three tests. The variable parameters were the speed of the conveyor and the speed of the feeder belt; adjusting these two parameters changed how densely the granules were arranged on the conveyor. The size of the opening on the feeder was constant.

Adjusting the conveyor speed also affected how many frames per second had to be captured, and in what arc the granules fell past the air nozzle unit below the conveyor. The faster the conveyor, the faster the image processing had to be. At the full speed of the conveyor, image processing took up to 100 ms, i.e., 10 frames per second, so the camera was fast enough for image capture.

A schematic presentation of the testing system is shown in Figure 5. The clean granules are represented by the clean granule class; defective granules are a mixture of all other classes. Figure 5 shows two boxes for sorted material: after sorting, Box 1 would ideally contain only clean material, while Box 0 contains the ejected defective granules.

**Figure 5.** Schematic presentation of the system prototype. The feeder, conveyor and air nozzles are shown. Below are the boxes to catch the material. A "separate" box for non-defective material (Box 1) and an "ejected" box for defective material (Box 0).


**Table 7.** Sorting results on sorting machine.

After examining Table 7, the following was determined:


#### **4. Conclusions**

A prototype sorting machine for the rejection of defective plastic granulate has been developed. Research started with capturing images of samples and preparing a training–testing database with nine classes. Each class had 850 images for training and 150 images for testing the *k*-NN classifier. The classification accuracy for the nine classes was 90.52%.

A classification into only two classes was also carried out: defective granules (NOK) and clean granules (OK). Only clear, transparent granules were in the OK class; the NOK class contained a mix of defective granules. The classification accuracy of the *k*-NN classifier on backlit optical images for the two classes was 100%. This means that the sorting machine was theoretically capable of separating the granules with 100% accuracy.

Particle localization was performed using the Modified Sauvola threshold algorithm. The granule location is important for the operation of the sorting machine, as it is used to send individual granules to the classifier and, if needed, to eject them with the air nozzles.

In the second part, the classifier was used on the sorting machine, and sorting accuracy was tested on the test samples. At best, the accepted (defect-free) material contained 99.81% pure material (contamination by defective material was 0.19%).

The OK/NOK classification worked with 100% accuracy, so the conclusion is that all sorting errors arose from other influences. These include inaccurate air-nozzle separation, errors in determining the granule location, granule migration on the conveyor while moving between the camera and the air nozzles, and possible software bugs.

The illumination could be made more even with better lighting, which could further improve the already good results. The lighting must be very intense so that the camera exposure time can be very short; with too dim lighting, or a camera with a rolling shutter, the conveyor speed affects the image quality.

Further work could improve the ejection of individual granules. As the results show, the classification works well, and all errors resulted from the physical manipulation of the granules. The ejection logic, software, and hardware could be improved to eject individual granules more accurately. To this end, the main sources of sorting errors, namely determining the granule location and transporting the granules from the feeder to the air nozzles, should be addressed.

Later, the classifier could be adapted to other materials of similar form. Regrind polycarbonate, which has a very undefined shape, could also be sorted; with this material, the color depends on the particle thickness, which further complicates classification. The quality of the captured image could also be improved through better illumination and a telecentric lens; the images would then be of better quality, and the granule locations could be determined more precisely. The classification of granules, which is already good, could also be improved.

Other artificial intelligence methods, such as neural networks and deep learning, could be used to classify the granules. Since a large database for training and testing has already been made, these methods could be of use, though major changes to the sorting machine software would be required.

The use of the sorting machine in an industrial environment would be possible, but the capacity would need to be increased greatly while maintaining the sorting efficiency. The increase in capacity should follow the example of larger industrial sorting machines, which scale sorting capacity with several cameras installed in parallel over the conveyor. A wider conveyor should therefore be used, with more cameras installed in parallel, and the air nozzles should be adjusted to the conveyor width.

Another way to increase the capacity of the sorting machine is to run the conveyor faster. However, the camera must then capture more images per second, which also need to be processed, so the speed of image processing must be increased as well.

**Author Contributions:** Conceptualization, T.P. and S.K.; software, T.P. and J.H.; writing—original draft preparation, T.P.; writing—review and editing, T.P., J.H., S.K., and B.V.; project administration, S.K. and B.V.; funding acquisition, S.K. and B.V. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research was funded by the European Regional Development Fund.

**Acknowledgments:** The authors acknowledge the financial support from the Slovenian Research Agency (Research Core Funding No. P2-0157).

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**


© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
