Article

Determination of Munsell Soil Colour Using Smartphones

1 School of Computing, Mathematics and Engineering, Charles Sturt University, Bathurst, NSW 2795, Australia
2 Centre for eResearch and Digital Innovation, Federation University, Mount Helen, VIC 3350, Australia
3 Global Centre for Environmental Remediation, The University of Newcastle, Callaghan, NSW 2308, Australia
4 School of Computing, Mathematics and Engineering, Charles Sturt University, Port Macquarie, NSW 2444, Australia
* Author to whom correspondence should be addressed.
Sensors 2023, 23(6), 3181; https://doi.org/10.3390/s23063181
Submission received: 10 February 2023 / Revised: 11 March 2023 / Accepted: 13 March 2023 / Published: 16 March 2023
(This article belongs to the Section Smart Agriculture)

Abstract

Soil colour is one of the most important factors in agriculture for monitoring soil health and determining soil properties. For this purpose, Munsell soil colour charts are widely used by archaeologists, scientists, and farmers; however, determining soil colour from the chart is subjective and error-prone. In this study, we used popular smartphones to capture the colours of chips in the Munsell Soil Colour Book (MSCB) and determine the colour digitally. These captured colours were then compared with the true colour determined using a commonly used sensor (Nix Pro 2). We observed colour reading discrepancies between smartphone-provided and Nix Pro-provided readings. To address this issue, we investigated different colour models and introduced a colour-intensity relationship between the images captured by the Nix Pro and smartphones by exploring different distance functions. Thus, the aim of this study is to determine Munsell soil colour accurately from the MSCB by adjusting the pixel intensity of smartphone-captured images. Without adjustment, the accuracy of individual Munsell soil colour determination is only 9% for the top 5 predictions; with the proposed method, accuracy rises to 74%, a substantial improvement.

1. Introduction

Soil colour is a key indicator of soil physical and chemical properties such as mineral composition, organic matter, and moisture content, and it is an important feature used in soil classification [1]. During field research of soil, scientists frequently carry a soil colour book, because colour is typically employed to detect morphological properties [2]. The Munsell Soil Colour Book (MSCB) has been used in US soil surveys to reduce personal idiosyncrasy since around 1949 [3]. It is the most commonly used book for soil colour identification, and for this research project, the 2009 edition of MSCB has been used. The Munsell system is described by three variables: hue, value and chroma. Hue indicates shade, value indicates lightness, and chroma indicates saturation [3]. Soil colour determination using the soil colour charts has been undertaken for over 50 years by experienced soil and land surveyors in Australia. These observations of hue, value and chroma of the matrix colour often occur in field conditions using the MSCB. The method and procedure are detailed in the Australian Soil and Land Survey Field Handbook [4]. The use of cameras and phones has been problematic in the past in field conditions, therefore assessments of colour have been undertaken by trained soil and land surveyors. Neurobiological research shows that the reflection spectra of colour chips in Munsell colour charts match the sensitivity of cells of human eyes [5]. Therefore, the MSCB has been used for many decades for colour specification to determine soil properties and soil organic materials.
The traditional method of soil colour determination involves mapping human perception to the colour chips of the MSCB [6]. Relying on human perception does not guarantee an accurate determination, as it varies from person to person and with external conditions such as lighting and time of day, as highlighted by [7,8,9,10,11].
Soil colour is commonly determined by soil scientists and archaeologists under natural lighting conditions, yet most studies in this area have been conducted in controlled environments with controlled illumination [12], or have required external sensors [13]. In recent years, smartphone cameras have improved dramatically; some are almost on par with professional cameras, and smartphones are increasingly used across many domains [14,15]. In [12], the authors compared smartphone-based and visual soil colour determination and showed that the smartphone performs better. That study highlights smartphones as one of the most efficient and cost-effective tools for soil colour classification.
However, almost all research in this area depends on external light or relevant instruments, and is conducted in a controlled environment. In a study conducted by [16], the authors have proposed a smartphone-based soil classification sensor, CMOS (Complementary Metal Oxide Semiconductor), which can be used as an attachment kit and includes a calibration card, shading bucket, and external lenses. In similar studies, [17,18], the authors have designed and presented a web application to recognize Munsell soil colour for archaeologists. They presented a demonstration application of their experiments based on their experimental results using a digital camera in controlled experimental settings. However, in field research, having a controlled experimental setup is not convenient, and procuring a digital camera can prove to be costlier than a smartphone itself.
As previously discussed, observing soil colour using the MSCB is highly influenced by observer expertise, colour vision, and experience. Earlier, in [19], the authors discussed the accuracy of visually determining Munsell soil colour under varying field and weather conditions. Many subsequent studies have therefore proposed alternative ways to measure soil colour accurately. One of the most promising and widely available technologies today is the smartphone, which contains a built-in high-quality camera sensor, Global Positioning System (GPS) sensor, ambient light sensor, etc. Newer models have increasingly capable camera sensors and imaging pipelines, which makes the device a promising option for colour specification. The authors of [20] reviewed a series of research cases that used smartphone cameras in agriculture, the environment, and food, and showed that there is huge potential for smartphone cameras in the agricultural industry. Table 1 lists recent research in the food, plant, water, and soil domains using smartphone cameras. Similarly, a study by [21] estimated soil structure, texture, and drainage, indicating a good approach for estimating soil health and fertility. Although a smartphone camera has high potential for colour detection, many confounding factors should be considered, such as different camera sensors, lighting conditions, weather, and colour calibration. For that reason, research has mostly been undertaken in controlled environments using digital cameras and external sensors. Research by [22] on topsoil in Scotland used a Fuji digital camera to build calibration models for several soil variables (carbon, pH, nitrogen, etc.); that study was also done in a controlled environment, maintaining constant distance, angle, and light levels.
In this research, our priority is to determine soil colours without external sensors, providing a cost-effective and convenient solution to stakeholders. In the initial phase, we used two commonly available smartphones, the Samsung S10 and Google Pixel 5, under indirect sunlight to capture the soil colours of the MSCB. We chose these two smartphones because, at the time of data collection, both scored very well for camera and image quality [23,24]. The primary camera of the Samsung S10 is a 12 MP telephoto lens (45°) [25], and that of the Google Pixel 5 is 12.2 MP [23]. In the next phase, we compared the images captured via smartphones with the true colour determined using a Nix Pro 2, a low-cost sensor used for colour-matching across various industries. The Nix colour sensor has been used for rapid quantitative prediction of soil colour: to define soil colour [6,13,26,27] and to predict soil organic properties [28,29]. As the current literature reports that the Nix Pro [30] colour sensor offers highly accurate colour discrimination, we adopted it as our source of ground-truth data and compared it with the images captured by a smartphone. The Nix Pro served only to validate the system during development; once the system was complete, it was no longer needed. The relationship between colours acquired from images of Munsell soil colour chips and Nix Pro-based determination has not been examined in previous studies.
In this research, we found a discrepancy between smartphone-captured images and the actual colour acquired from the Nix Pro. There is an almost constant weight difference between the two data sets and the difference varies for different smartphones. Figure 1 shows the colour discrepancies in the three colour components (Red, Green, Blue) of the RGB colour model. The focus of this research is to determine the relationship based on the discrepancies so that we can adjust the smartphone-captured colour readings to obtain a true colour match. Obviously, there are many attributes to consider, such as time of the day, lighting conditions, different colour models, different smartphones, different distance functions, and geographical location to obtain more accurate soil colour profiling using a smartphone. We have also implemented a ranking-based method to determine Munsell soil colour using two colour difference models. We have ranked the closest determined colours and compared them with the Nix Pro-generated colours. If a colour is matched at rank 1, it means the true colour has been determined on the first prediction. Thus, we have calculated the accuracy of our proposed colour determination method.
The overall objective of this study was to investigate the accuracy and prediction rate of soil colour classification under indirect sunlight in natural outdoor conditions from mobile captured images. The key contributions of this work are summarised as follows:
  • Analyse the colour discrepancy between images captured by smartphones and the Nix Pro colour sensor;
  • Propose a novel approach to accurately capture soil colour, irrespective of the capturing method for a specific geographic area; and
  • Find the most suitable colour model and corresponding distance function by investigating different colour models and colour-matching distance functions.
The rest of the article is organised as follows: Section 2 gives information on the materials and methods used in this study. Section 3 consists of all the results and analysis that have been done in this study, and lastly, we conclude this article with Section 4.

2. Materials and Methodology

2.1. Munsell Soil Colour Book

The most commonly used soil colour references, in use since 1949, are the Munsell soil colour charts [3]. This chart, or the MSCB, is widely used by professional soil scientists for soil judging. The book consists of coloured papers or chips mounted on hue cards, showing variations of value and chroma in the vertical and horizontal directions [10]. Users match the closest colour between their soil samples and the Munsell soil colour chips. We use the 2009 edition of the MSCB for this study, which consists of 443 colour chips and 13 hue cards: 5R, 7.5R, 10R, 2.5YR, 5YR, 7.5YR, 10YR, 2.5Y, 5Y, 10Y-5GY, GLEY 1, GLEY 2, and WHITE PAGE. For instance, the 5R hue card is shown in Figure 2.
The Munsell colour system is a colour space based on three properties: hue (basic colour), value (lightness), and chroma (colour intensity), also known as HVC, which can be represented cylindrically in three dimensions as shown in Figure 3. Hue is divided into five dominant colours: red, yellow, blue, green, and purple, each subdivided between 0 and 10, commonly in steps of 2.5. Value is represented along the vertical spine, indicating the lightness of colour on a scale of 0–10 (darkest to lightest) [47]. Chroma represents the purity of colour, with lower chroma being less pure, and is measured radially outward from the vertical axis.

2.2. Nix Pro Colour Sensor

Nix Pro has been used in many studies previously and has proven to be closely accurate to determine colours in various domains, such as soil and agriculture, paint and print, and the food industry [6,28]. Soil scientists also reported the usefulness of the Nix Pro colour sensor as it provides a reasonably accurate colour prediction [26]. Research [13] shows that the Nix Pro colour sensor provides true soil colour regardless of the moisture condition of the soil, enabling it to be used both on dry and wet soil. For this reason, we have employed the Nix Pro 2 to gather our ground-truth data to compare that with our image-generated data to determine the accuracy of our proposed method, as highlighted in Figure 4.
The overall approach is to determine the Munsell soil colour using a smartphone. To do that, we compared two different colour models to identify which is best to work with, computing the colour difference (Euclidean distance, CIE1976 and CIE2000) of each chip against the Nix Pro-generated data. Smartphones produce the standard red-green-blue (sRGB) colour space, in which these three primary colours are device-dependent. We compared the RGB colour space with the device-independent CIELAB colour space defined by the International Commission on Illumination (CIE). CIELAB is a three-dimensional colour space that covers the entire range of human colour perception, where L indicates lightness, A the position between red and green, and B the position between yellow and blue.
For the RGB colour space, we used Euclidean distance, the standard means of determining the distance between two colours (a straight line between two points in the colour space). For CIELAB-based colour difference, we used and analysed CIE1976 [48] and CIE2000 [49] to determine the closest colour. Several distance functions are available, such as CIE1976, CIE1994, CIE2000, CMC l:c (1984), and so on, but not all of them follow the simple Euclidean principle. CIE1976 was formulated to measure colour differences in CIELAB coordinates; it is essentially Euclidean distance applied to the CIELAB colour model. CIE2000, on the other hand, is a more recent scheme incorporating further corrections to the colour difference. The authors in [50,51] mention the following three corrections:
  • Hue rotation term to mitigate the problem of blue region;
  • Compensate for neutral colours;
  • Compensate for lightness, chroma, and hue.
These corrections adjust the computed difference for better results. Figure 5 shows the entire process followed to determine the closest Munsell soil colour. After converting each RGB value to LAB, we calculated the colour difference of each chip against all 443 Nix Pro-generated values; the lowest colour difference identifies the closest determined colour. In theory, the lowest-difference match should agree with the Nix Pro-generated match, but our results show that, on average, the true chip only appears around the 50th prediction. With our proposed method, the prediction accuracy increases to a great extent.
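The conversion and matching just described can be sketched in Python. This is a minimal illustration, not the authors' code: the sRGB linearisation, the sRGB-to-XYZ matrix, and the D65 white point are the standard published constants, and CIE1976 is plain Euclidean distance in LAB (the CIE2000 formula, with its hue-rotation and neutral-colour corrections, is considerably longer and omitted here).

```python
import math

def srgb_to_lab(rgb, white=(95.047, 100.0, 108.883)):
    """Convert an 8-bit sRGB triple to CIELAB under the D65 white point."""
    # Linearise each 0-255 sRGB channel
    lin = []
    for c in rgb:
        c = c / 255.0
        lin.append(c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = lin
    # Linear RGB -> XYZ (standard sRGB matrix, scaled to 0-100)
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) * 100
    y = (0.2126 * r + 0.7152 * g + 0.0722 * b) * 100
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) * 100
    # XYZ -> LAB
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = (f(v / w) for v, w in zip((x, y, z), white))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e_1976(lab1, lab2):
    """CIE1976 colour difference: Euclidean distance in CIELAB."""
    return math.dist(lab1, lab2)
```

A chip's average RGB would be passed through `srgb_to_lab` before computing its difference against each of the 443 Nix Pro LAB values.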

2.3. Data Collection and Pre-Processing

The data-collection process involves two steps. First, we acquired our ground-truth data as shown in Figure 4. The Nix Pro 2 generates data for different colour models, such as RGB, LAB, and CMYK; we took the RGB and CIELAB values from the Nix Pro for each of the 443 chips of the MSCB (2009 edition). In the next step, we captured images of each of the 13 pages of the MSCB using a Samsung S10 and a Google Pixel 5 under different environmental conditions. When capturing the images, we maintained an indirect sunlight condition by keeping the book in shadow to eliminate unnecessary reflection of sunlight. We followed a free-style image acquisition method, so the distance between the book and the smartphone was not constant. Generally, when end users take a photo, they use the auto settings of the camera app, so in the proposed method we used the auto camera setup available on the smartphones. Several data-processing steps run behind the camera software in auto mode, which automatically adjusts the image according to lighting conditions, focus, colour calibration, etc.; we did not analyse the sensitivity of this processing software. Using the Samsung S10, we collected one set of images each hour from 9 a.m. to 5 p.m. on 20 March 2022 (9 sets), and using the Google Pixel 5 we collected five sets of data from 10 a.m. to 3 p.m. on different days in January 2022. Because of the free-style data-collection process, pre-processing was needed before using the images: we used Photoshop to crop each of the 443 soil colour chips to 150 px × 150 px and saved them as separate data sets. We also measured ambient illuminance with the Samsung S10, using a lux meter app that reads the device’s ambient light sensor, to determine the best time of day for data collection.

2.4. Determining the Closest Munsell Soil Colour and Rank

First, we calculated the average RGB of a particular colour chip and converted it to the CIELAB colour space. The colour difference of each chip against all 443 values generated from the Nix Pro was then calculated using the CIE1976 and CIE2000 colour difference models. A lower colour distance means a closer colour, so the lowest-distance colour is the best prediction; however, this does not always match the true chip. To check colour determination, we ranked all the determined colours from lowest to highest distance and identified the rank at which the Nix Pro-determined chip is matched (Table 2 and Table 3). Rank 1 means the exact determination was found on the first prediction, and rank 443 is the worst prediction. With the Samsung S10, Euclidean distance on the RGB colour model gives the worst average ranking, while CIE1976 and CIE2000 on the CIELAB colour model achieve much better rankings. For that reason, we focused only on CIELAB for further analysis.
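The ranking step can be sketched as follows, assuming the 443 reference chips are stored as (name, LAB) pairs and that a colour-difference function is passed in; `rank_of_true_chip` is an illustrative name, not from the paper:

```python
def rank_of_true_chip(query_lab, reference, true_name, dist):
    """Sort the reference chips by colour difference to the query colour and
    return the 1-based rank at which the Nix Pro-determined chip appears."""
    ordered = sorted(reference, key=lambda item: dist(query_lab, item[1]))
    return 1 + [name for name, _ in ordered].index(true_name)
```

Averaging this rank over all 443 chips gives the per-phone, per-distance-function figures reported in Tables 2 and 3.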

2.5. Average Colour Difference

From our analysis, we saw a consistent pattern between the Nix Pro data and the image data across almost all chips. We calculated average weight differences for the CIELAB values for both the Samsung S10 and the Google Pixel 5. For the Samsung S10, the LAB weight differences are 12.89, 3.14, and 5.06, and for the Google Pixel 5 they are 8.06, 3.02, and 4.49, as shown in Table 4. We then added these weight differences to the same image data and predicted again, finding that the average rank improved from 49 to 13 for the Samsung S10 and from 29 to 16 for the Google Pixel 5, as shown in Table 2 and Table 3. This clearly highlights a better prediction rate. The phone’s image processing does affect the results, but in this study we did not examine the internal processing individually; we only used the smartphones to capture images. We noticed that the two smartphones render pixels differently, so different adjustments are needed to obtain better accuracy, as shown in Table 4.
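Applying the correction is then a per-channel addition of the Table 4 offsets to each smartphone-derived LAB value before re-ranking; a one-function sketch (illustrative name):

```python
def apply_weight(lab, offset):
    """Add the average per-channel LAB discrepancy (e.g. (12.89, 3.14, 5.06)
    for the Samsung S10 per Table 4) to a smartphone-derived LAB value."""
    return tuple(c + o for c, o in zip(lab, offset))
```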

2.6. Employ Location-Based Prediction

The MSCB has 13 different hues, but not all of them are prominent in every part of the world. Soil colour and characteristics depend on the environment, so the colour of soil differs from place to place according to organic matter, moisture content, and other physical and chemical properties; therefore, not all 13 hues are relevant or present everywhere. A study by [52] collating over 680,000 observations on Australian topsoil found that the most common hues include 5Y, 2.5Y, 10YR, 7.5YR, 5YR, 2.5YR, and 10R. For this study, we employed a location-based prediction and only predicted the colour chips of these 7 hues, as shown in Table 5 and Table 6.
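The location-based restriction can be sketched as a simple filter over the reference set before ranking, assuming chip names carry their hue page as a prefix (e.g. "5YR 4/6"); the constant and function names are illustrative:

```python
# The 7 hues most common in Australian topsoil, per [52]
AUSTRALIAN_HUES = {"5Y", "2.5Y", "10YR", "7.5YR", "5YR", "2.5YR", "10R"}

def filter_by_location(reference, hues=AUSTRALIAN_HUES):
    """Keep only chips whose hue page is expected for the region,
    shrinking the 443-chip search space before ranking."""
    return [(name, lab) for name, lab in reference if name.split()[0] in hues]
```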

3. Results

3.1. Better-Performing Colour Model

Measurement of soil colour is usually done using the MSCB, which follows the Munsell HVC system, but the system is not uniform and relies on user perception and comparison [53]. Other colour space models are available, such as RGB, CIELAB, CIEXYZ, and CIELUV. As smartphone cameras produce RGB images directly, we used this colour model as part of our comparison, and compared it with the device-independent CIELAB colour model. We used Euclidean distance to rank RGB-based predictions, and the CIE1976 and CIE2000 distance functions for the CIELAB colour model. From Table 2 and Table 3, the average rank of predicting Munsell soil colour using RGB is 89 for the Samsung S10 and 56 for the Google Pixel 5. With the CIELAB colour model, the prediction improved for both phones, with average prediction ranks of 49 and 29 (CIE2000), respectively. This demonstrates that CIELAB is the better-performing colour model for Munsell soil colour prediction.

3.2. Best Time of the Day to Capture Images Using a Smartphone Camera

To determine which time of day, or which sunlight condition, is best for capturing soil images with a smartphone camera, we gathered 9 sets of data using the Samsung S10 from 9 a.m. to 5 p.m., one each hour. Illuminance was measured both in the open and under indirect sunlight: the open-air illuminance indicates the weather conditions at different times of day, while the indirect-sunlight lux corresponds to the shadowed condition under which the soil images were taken, as shown in Table 7. The table clearly shows that from 10 a.m. to 4 p.m. the prediction rate is similar, with the rank worsening in the early morning (9 a.m.) and late afternoon (5 p.m.). That means an outside illuminance between 35,000 lx and 76,000 lx is preferable for capturing images of soil. Therefore, any time during the day except early morning and evening is a good time to capture images with a smartphone camera for soil colour prediction.

3.3. Added Weight and Location-Based Prediction

In Figure 1, we can see a consistent pattern between the Nix Pro-generated data and the image-generated data for both smartphone cameras. We calculated the average difference between the image-generated and Nix Pro-generated L, A, and B values of the CIELAB colour model in Table 4. Adding the calculated weight difference to our image-produced data improved the prediction rate significantly, as shown in Table 2 and Table 3: with the CIE2000 distance method, the average prediction rank went from 48 to 12 for the Samsung S10 and from 28.93 to 16 for the Google Pixel 5.
The location-based prediction model, in which we only considered the 7 hues prominent in Australian agricultural soil, achieved an even better result. Combining both the added weight and the location-based method, the average prediction rank is approximately 8 for the Samsung S10, as shown in Table 5, and approximately 7 for the Google Pixel 5, as shown in Table 6, for both CIE colour difference methods. For both smartphones, on average, the exact match is found within 10 predictions, considerably better than the unadjusted prediction, which is ranked greater than 30.
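The top-k accuracies reported in this study follow directly from the per-chip ranks; a small sketch (illustrative function name):

```python
def top_k_accuracy(ranks, k=5):
    """Fraction of chips whose true colour appears within the first k predictions."""
    return sum(1 for r in ranks if r <= k) / len(ranks)
```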

3.4. Distance Function

For this study, we used three distance functions for Munsell soil colour prediction: Euclidean distance for the RGB colour model, and CIE1976 and CIE2000 for the CIELAB colour model. As the RGB colour model did not perform as well as CIELAB, this left the CIE1976 and CIE2000 distance functions for evaluation. For the images from both smartphones, CIE1976 and CIE2000 performed similarly, as highlighted in Figure 6 and Figure 7, where the curves for the two distance functions overlap. The Google Pixel 5 gives the best prediction with added weight and the location-based method, with an average prediction rank of 6th using CIE1976, whereas the Samsung S10 gives the second-best prediction with an average rank of 8th using CIE1976. The difference between the two distance functions is negligible, as shown in Table 8.
Our proposed method determines Munsell soil colour chips from images of the book, which is closely related to actual soil colour, the ultimate goal of this study. At this initial stage, we verified our method using the Nix Pro, and identifying the Munsell soil colour chips from smartphone-captured images already shows promising results. Moving to actual soil presents some challenges: there are 443 Munsell soil colour chips in the book, but because of the various constituents of soil, soil can take almost any colour. In future studies, we aim to experiment with real soils and identify the closest Munsell soil colour. The key contributions are summarised as follows:
  • Between the RGB and CIELAB colour models, CIELAB performs much better.
  • The CIE1976 and CIE2000 distance functions perform similarly for both the Samsung S10 and the Google Pixel 5.
  • The best time to capture images to determine Munsell soil colour is between 10 a.m. and 4 p.m. (35,000 lx to 76,000 lx); the prediction rate decreased in the early morning and evening.
  • According to our data and analysis, between the Samsung S10 and the Google Pixel 5, the Google Pixel 5 performed better for Munsell soil colour prediction (Figure 8).
  • The best prediction rate was achieved with our proposed weight and location-based prediction model. After adding the weight and focusing on soil colours found in Australia, the exact match was found in the top 5 predictions 70% of the time with the Samsung S10 and 74% of the time with the Google Pixel 5 (Table 9). However, we performed our study on sunny days, and the results may vary on cloudy and rainy days; further research and investigation are necessary to achieve similar determinations under different weather conditions.

4. Conclusions

In this paper, we evaluated the performance of two widely used smartphone cameras for determining Munsell soil colour at different times of the day, using three distance functions. We also proposed a colour-intensity relationship between Nix Pro and smartphone images, and the proposed weight increased the Munsell soil colour prediction rate. Predicting the colours directly from the images does not give an accurate result, and the prediction rate is very low. Additionally, as different smartphones have different camera sensors, processing performance varies considerably, so several adjustments are needed for soil colour prediction. For that reason, we analysed which colour model and distance functions are most suitable to work with. Through experiments, we found a pattern between the captured images and the actual colour acquired from the Nix Pro: an almost constant weight difference between the two datasets, which varies between smartphones. The accuracy of predicting the exact colour chip within the top 5 predictions using the Samsung S10 and Google Pixel 5 is 7% and 9%, respectively, for CIE1976. This increased considerably when we balanced the weight difference, with the prediction rate rising to 52% for the Samsung S10 and 34% for the Google Pixel 5. The soil colour prediction rate is even higher when we focus only on soil colours found in Australia. Overall, the Google Pixel 5 gave the best result, with a 74% accuracy rate on the top 5 predictions after balancing the weight difference and focusing specifically on Australian soil colours. However, we performed our study on sunny days, and the results may vary under cloudy, rainy, or other conditions; further research and investigation are necessary to achieve similar determinations under different weather conditions.
In addition, we used the camera app’s auto settings in this study, and the results may vary with specific manual settings. The sensitivity and prediction error of the proposed model also need to be analysed. In future research, we aim to evaluate our method with real soil, with more smartphones and their different settings, and under various environmental conditions, to make identification robust and accurate.

Author Contributions

Conceptualization, S.S.N. and M.P.; methodology, S.S.N.; validation, S.S.N.; investigation, S.S.N.; resources, S.S.N., M.P., N.R., L.W. and S.u.R.; writing—original draft preparation, S.S.N.; writing—review and editing, M.P., N.R., L.W. and S.u.R.; supervision, M.P., N.R., L.W. and S.u.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Soil CRC Australia (No. 2.S.009 PhD Scholarship) and the APC was funded by Charles Sturt University and Soil CRC Australia.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors acknowledge the support received from Soil CRC Australia for this project.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Owens, P.; Rutledge, E. Morphology. In Encyclopedia of Soils in the Environment; Hillel, D., Ed.; Elsevier: Oxford, UK, 2005.
  2. Thompson, J.A.; Pollio, A.R.; Turk, P.J. Comparison of Munsell soil color charts and the GLOBE soil color book. Soil Sci. Soc. Am. J. 2013, 77, 2089–2093.
  3. Pendleton, R.L.; Nickerson, D. Soil colors and special Munsell soil color charts. Soil Sci. 1951, 71, 35–44.
  4. National Committee on Soil and Terrain (Australia). Australian Soil and Land Survey Field Handbook; Number 1; CSIRO Publishing: Collingwood, VIC, Australia, 2009.
  5. Conway, B.R.; Livingstone, M.S. A different point of hue. Proc. Natl. Acad. Sci. USA 2005, 102, 10761–10762.
  6. Mancini, M.; Weindorf, D.C.; Monteiro, M.E.C.; de Faria, Á.J.G.; dos Santos Teixeira, A.F.; de Lima, W.; de Lima, F.R.D.; Dijair, T.S.B.; Marques, F.D.; Ribeiro, D.; et al. From sensor data to Munsell color system: Machine learning algorithm applied to tropical soil color classification via Nix™ Pro sensor. Geoderma 2020, 375, 114471.
  7. Kirillova, N.P.; Grauer-Gray, J.; Hartemink, A.E.; Sileova, T.; Artemyeva, Z.S.; Burova, E. New perspectives to use Munsell color charts with electronic devices. Comput. Electron. Agric. 2018, 155, 378–385.
  8. Kirillova, N.; Sileva, T.; Ul’yanova, T.Y.; Smirnova, I.; Ul’yanova, A.; Burova, E. Color diagnostics of soil horizons (by the example of soils from Moscow region). Eurasian Soil Sci. 2018, 51, 1348–1356.
  9. Marqués-Mateu, Á.; Moreno-Ramón, H.; Balasch, S.; Ibáñez-Asensio, S. Quantifying the uncertainty of soil colour measurements with Munsell charts using a modified attribute agreement analysis. Catena 2018, 171, 44–53.
  10. Pegalajar, M.C.; Sánchez-Marañón, M.; Baca Ruíz, L.G.; Mansilla, L.; Delgado, M. Artificial neural networks and fuzzy logic for specifying the color of an image using Munsell soil-color charts. In Proceedings of the International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems, Cadiz, Spain, 11–15 June 2018; pp. 699–709.
  11. Sánchez-Marañón, M.; Huertas, R.; Melgosa, M. Colour variation in standard soil-colour charts. Soil Res. 2005, 43, 827–837.
  12. Gómez-Robledo, L.; López-Ruiz, N.; Melgosa, M.; Palma, A.J.; Capitán-Vallvey, L.F.; Sánchez-Marañón, M. Using the mobile phone as Munsell soil-colour sensor: An experiment under controlled illumination conditions. Comput. Electron. Agric. 2013, 99, 200–208.
  13. Stiglitz, R.; Mikhailova, E.; Post, C.; Schlautman, M.; Sharp, J. Evaluation of an inexpensive sensor to measure soil color. Comput. Electron. Agric. 2016, 121, 141–148.
  14. Hogarty, D.T.; Hogarty, J.P.; Hewitt, A.W. Smartphone use in ophthalmology: What is their place in clinical practice? Surv. Ophthalmol. 2020, 65, 250–262.
  15. Wang, N.; Deng, Z.; Wen, L.M.; Ding, Y.; He, G. Understanding the use of smartphone apps for health information among pregnant Chinese women: Mixed methods study. JMIR mHealth uHealth 2019, 7, e12631.
  15. Wang, N.; Deng, Z.; Wen, L.M.; Ding, Y.; He, G. Understanding the use of smartphone apps for health information among pregnant Chinese women: Mixed methods study. JMIR mHealth uHealth 2019, 7, e12631. [Google Scholar] [CrossRef] [PubMed]
  16. Han, P.; Dong, D.; Zhao, X.; Jiao, L.; Lang, Y. A smartphone-based soil color sensor: For soil type classification. Comput. Electron. Agric. 2016, 123, 232–241. [Google Scholar] [CrossRef]
  17. Milotta, F.L.; Stanco, F.; Tanasi, D.; Gueli, A.M. Munsell color specification using arca (automatic recognition of color for archaeology). J. Comput. Cult. Herit. JOCCH 2018, 11, 1–15. [Google Scholar] [CrossRef]
  18. Milotta, F.L.M.; Quattrocchi, C.; Stanco, F.; Tanasi, D.; Pasquale, S.; Gueli, A.M. ARCA 2.0: Automatic Recognition of Color for Archaeology through a Web-Application. In Proceedings of the 2018 Metrology for Archaeology and Cultural Heritage (MetroArchaeo), Cassino, Italy, 22–24 October 2018; pp. 466–470. [Google Scholar]
  19. Turk, J.K.; Young, R.A. Field conditions and the accuracy of visually determined Munsell soil color. Soil Sci. Soc. Am. J. 2020, 84, 163–169. [Google Scholar] [CrossRef]
  20. Kwon, O.; Park, T. Applications of smartphone cameras in agriculture, environment, and food: A review. J. Biosyst. Eng. 2017, 42, 330–338. [Google Scholar]
  21. Aitkenhead, M.; Coull, M.; Gwatkin, R.; Donnelly, D. Automated soil physical parameter assessment using Smartphone and digital camera imagery. J. Imaging 2016, 2, 35. [Google Scholar] [CrossRef] [Green Version]
  22. Aitkenhead, M.; Cameron, C.; Gaskin, G.; Choisy, B.; Coull, M.; Black, H. Digital RGB photography and visible-range spectroscopy for soil composition analysis. Geoderma 2018, 313, 265–275. [Google Scholar] [CrossRef]
  23. Google Pixel 5 Camera Test: Software Power. 2020. Available online: https://www.dxomark.com/google-pixel-5-camera-review-software-power/ (accessed on 3 March 2023).
  24. Updated: Samsung Galaxy S10 5G (Exynos) Camera Test. 2019. Available online: https://www.dxomark.com/samsung-galaxy-s10-5g-camera-review/ (accessed on 3 March 2023).
  25. Camera Specifications on the Samsung Galaxy S10. 2022. Available online: https://www.samsung.com/sg/support/mobile-devices/camera-specifications-on-the-galaxy-s10/ (accessed on 3 March 2023).
  26. Stiglitz, R.; Mikhailova, E.; Post, C.; Schlautman, M.; Sharp, J. Using an inexpensive color sensor for rapid assessment of soil organic carbon. Geoderma 2017, 286, 98–103. [Google Scholar] [CrossRef] [Green Version]
  27. Swetha, R.; Bende, P.; Singh, K.; Gorthi, S.; Biswas, A.; Li, B.; Weindorf, D.C.; Chakraborty, S. Predicting soil texture from smartphone-captured digital images and an application. Geoderma 2020, 376, 114562. [Google Scholar] [CrossRef]
  28. Jha, G.; Sihi, D.; Dari, B.; Kaur, H.; Nocco, M.A.; Ulery, A.; Lombard, K. Rapid and inexpensive assessment of soil total iron using Nix Pro color sensor. Agric. Environ. Lett. 2021, 6, e20050. [Google Scholar] [CrossRef]
  29. Mukhopadhyay, S.; Chakraborty, S.; Bhadoria, P.; Li, B.; Weindorf, D.C. Assessment of heavy metal and soil organic carbon by portable X-ray fluorescence spectrometry and NixPro™ sensor in landfill soils of India. Geoderma Reg. 2020, 20, e00249. [Google Scholar] [CrossRef]
  30. Nix Pro 2-Color Sensor. 2015. Available online: https://www.nixsensor.com/nix-pro/ (accessed on 18 January 2023).
  31. Li, Z.; Li, Z.; Zhao, D.; Wen, F.; Jiang, J.; Xu, D. Smartphone-based visualized microarray detection for multiplexed harmful substances in milk. Biosens. Bioelectron. 2017, 87, 874–880. [Google Scholar] [CrossRef] [PubMed]
  32. Yu, L.; Shi, Z.; Fang, C.; Zhang, Y.; Liu, Y.; Li, C. Disposable lateral flow-through strip for smartphone-camera to quantitatively detect alkaline phosphatase activity in milk. Biosens. Bioelectron. 2015, 69, 307–315. [Google Scholar] [CrossRef] [PubMed]
  33. de Oliveira Krambeck Franco, M.; Suarez, W.T.; Maia, M.V.; dos Santos, V.B. Smartphone application for methanol determination in sugar cane spirits employing digital image-based method. Food Anal. Methods 2017, 10, 2102–2109. [Google Scholar] [CrossRef]
  34. San Park, T.; Li, W.; McCracken, K.E.; Yoon, J.Y. Smartphone quantifies Salmonella from paper microfluidics. Lab Chip 2013, 13, 4832–4840. [Google Scholar] [CrossRef]
  35. Liang, P.S.; Park, T.S.; Yoon, J.Y. Rapid and reagentless detection of microbial contamination within meat utilizing a smartphone-based biosensor. Sci. Rep. 2014, 4, 1–8. [Google Scholar] [CrossRef] [Green Version]
  36. Cruz-Fernández, M.; Luque-Cobija, M.; Cervera, M.; Morales-Rubio, A.; De La Guardia, M. Smartphone determination of fat in cured meat products. Microchem. J. 2017, 132, 8–14. [Google Scholar] [CrossRef]
  37. Zhihong, M.; Yuhan, M.; Liang, G.; Chengliang, L. Smartphone-based visual measurement and portable instrumentation for crop seed phenotyping. IFAC-PapersOnLine 2016, 49, 259–264. [Google Scholar] [CrossRef]
  38. Machado, B.B.; Orue, J.P.; Arruda, M.S.; Santos, C.V.; Sarath, D.S.; Goncalves, W.N.; Silva, G.G.; Pistori, H.; Roel, A.R.; Rodrigues, J.F., Jr. BioLeaf: A professional mobile application to measure foliar damage caused by insect herbivory. Comput. Electron. Agric. 2016, 129, 44–55. [Google Scholar] [CrossRef] [Green Version]
  39. Vesali, F.; Omid, M.; Kaleita, A.; Mobli, H. Development of an android app to estimate chlorophyll content of corn leaves based on contact imaging. Comput. Electron. Agric. 2015, 116, 211–220. [Google Scholar] [CrossRef]
  40. Rahman, M.; Blackwell, B.; Banerjee, N.; Saraswat, D. Smartphone-based hierarchical crowdsourcing for weed identification. Comput. Electron. Agric. 2015, 113, 14–23. [Google Scholar] [CrossRef]
  41. Wang, Y.; Li, Y.; Bao, X.; Han, J.; Xia, J.; Tian, X.; Ni, L. A smartphone-based colorimetric reader coupled with a remote server for rapid on-site catechols analysis. Talanta 2016, 160, 194–204. [Google Scholar] [CrossRef]
  42. Fang, J.; Qiu, X.; Wan, Z.; Zou, Q.; Su, K.; Hu, N.; Wang, P. A sensing smartphone and its portable accessory for on-site rapid biochemical detection of marine toxins. Anal. Methods 2016, 8, 6895–6902. [Google Scholar] [CrossRef]
  43. Hussain, I.; Das, M.; Ahamad, K.U.; Nath, P. Water salinity detection using a smartphone. Sens. Actuators Chem. 2017, 239, 1042–1050. [Google Scholar] [CrossRef]
  44. Sumriddetchkajorn, S.; Chaitavon, K.; Intaravanne, Y. Mobile device-based self-referencing colorimeter for monitoring chlorine concentration in water. Sens. Actuators B Chem. 2013, 182, 592–597. [Google Scholar] [CrossRef]
  45. Dutta, S.; Sarma, D.; Nath, P. Ground and river water quality monitoring using a smartphone-based pH sensor. Aip Adv. 2015, 5, 057151. [Google Scholar] [CrossRef]
  46. Prosdocimi, M.; Burguet, M.; Di Prima, S.; Sofia, G.; Terol, E.; Comino, J.R.; Cerdà, A.; Tarolli, P. Rainfall simulation and Structure-from-Motion photogrammetry for the analysis of soil water erosion in Mediterranean vineyards. Sci. Total Environ. 2017, 574, 204–215. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  47. Bloch, L.C.; Hosen, J.D.; Kracht, E.C.; LeFebvre, M.J.; Lopez, C.J.; Woodcock, R.; Keegan, W.F. Is it better to be objectively wrong or subjectively right?: Testing the accuracy and consistency of the Munsell capsure spectrocolorimeter for archaeological applications. Adv. Archaeol. Pract. 2021, 9, 132–144. [Google Scholar] [CrossRef]
  48. Hunt, R.W.G.; Pointer, M.R. Measuring Colour; John Wiley & Sons: Hoboken, NJ, USA, 2011. [Google Scholar]
  49. Mokrzycki, W.; Tatol, M. Colour differenceΔ E-A survey. Mach. Graph. Vis. 2011, 20, 383–411. [Google Scholar]
  50. Sharma, G.; Wu, W.; Dalal, E.N. The CIEDE2000 color-difference formula: Implementation notes, supplementary test data, and mathematical observations. Color Res. Appl. 2005, 30, 21–30. [Google Scholar] [CrossRef]
  51. Lindbloom, B.J. Delta E (CIE 2000). 2017. Available online: http://www.brucelindbloom.com/index.html?EqnDeltaECIE2000.html (accessed on 18 January 2023).
  52. Searle, R. The Australian site data collation to support the GlobalSoilMap. In GlobalSoilMap: Basis of the Global Spatial Soil Information System; CRC Press: London, UK, 2014; p. 127. [Google Scholar]
  53. Rossel, R.V.; Minasny, B.; Roudier, P.; Mcbratney, A.B. Colour space models for soil science. Geoderma 2006, 133, 320–337. [Google Scholar] [CrossRef]
Figure 1. Colour discrepancies between the pixel intensities provided by the Nix Pro and the Samsung smartphone in the RGB colour model. The colour components are (a) red, (b) green, and (c) blue.
Figure 2. Example of the 5R card of Munsell Soil Colour Book.
Figure 3. Munsell colour system.
Figure 4. Data collection of ground-truth data using Nix Pro.
Figure 5. Flow diagram of the process.
Figure 6. Average prediction rank using the Samsung S10; colour models used: RGB and CIELAB; distance functions employed: Euclidean distance, CIE1976, CIE2000; LB—Australian Location-Based, AW—added weight.
Figure 7. Average prediction rank using the Google Pixel 5; colour models used: RGB and CIELAB; distance functions employed: Euclidean distance, CIE1976, CIE2000; LB—Australian Location-Based, AW—added weight.
Figure 8. Accuracy curve for top 5 prediction accuracy for the Samsung S10 and Google Pixel 5. Colour models used: RGB and CIELAB; distance functions employed: Euclidean distance, CIE1976, CIE2000; LB—Australian Location-Based, AW—added weight.
Table 1. List of research using smartphone cameras.
| No. | Application Area | Publication |
|-----|------------------|-------------|
| 1 | Detect harmful substances in milk | [31] |
| 2 | Detect alkaline phosphatase | [32] |
| 3 | Methanol determination in sugar cane spirits | [33] |
| 4 | Quantify Salmonella | [34] |
| 5 | Detect microbial contamination | [35] |
| 6 | Determine fat | [36] |
| 7 | Crop seed phenotyping | [37] |
| 8 | Measure foliar damage | [38] |
| 9 | Estimate chlorophyll | [39] |
| 10 | Identify weeds | [40] |
| 11 | Colorimetric reader | [41] |
| 12 | Detect biochemicals | [42] |
| 13 | Measure water salinity | [43] |
| 14 | Chlorine monitoring | [44] |
| 15 | Water quality monitoring | [45] |
| 16 | Soil type classification | [16] |
| 17 | Soil water erosion analysis | [46] |
Table 2. Samsung Galaxy S10 average prediction rank using the RGB and CIELAB colour models and three distance functions—Euclidean distance, CIE1976, and CIE2000. AW—proposed added weight difference from Table 4.
| Set | Time | RGB-Euclidean Distance | CIELAB-CIE1976 | CIELAB-CIE2000 | CIELAB-AW-CIE1976 | CIELAB-AW-CIE2000 |
|-------|------------|--------|-------|-------|-------|-------|
| Set 2 | 10:00 a.m. | 88.59 | 43.01 | 43.44 | 15.50 | 15.02 |
| Set 3 | 11:00 a.m. | 89.66 | 53.03 | 50.58 | 12.02 | 12.08 |
| Set 4 | 12:00 p.m. | 101.16 | 58.12 | 57.80 | 12.37 | 12.83 |
| Set 5 | 1:00 p.m. | 83.81 | 48.76 | 46.66 | 11.86 | 11.30 |
| Set 7 | 3:00 p.m. | 82.81 | 48.16 | 45.92 | 11.77 | 11.42 |
| Average | | 89.21 | 50.22 | 48.88 | 12.70 | 12.53 |
Table 3. Google Pixel average prediction rank using RGB and CIELAB colour model and three distance functions—Euclidean distance, CIE1976, and CIE2000. AW—Proposed added weight difference from Table 4.
| Set | Time | RGB-Euclidean Distance | CIELAB-CIE1976 | CIELAB-CIE2000 | CIELAB-AW-CIE1976 | CIELAB-AW-CIE2000 |
|-------|------------|-------|-------|-------|-------|-------|
| Set 1 | 10:00 a.m. | 62.53 | 34.47 | 30.81 | 13.49 | 14.60 |
| Set 2 | 11:00 a.m. | 40.62 | 20.89 | 18.29 | 13.67 | 13.85 |
| Set 3 | 12:00 p.m. | 46.29 | 26.32 | 21.63 | 13.28 | 13.88 |
| Set 4 | 1:00 p.m. | 51.56 | 37.12 | 29.66 | 16.45 | 15.74 |
| Set 5 | 3:00 p.m. | 80.67 | 44.86 | 44.25 | 21.34 | 22.64 |
| Average | | 56.33 | 32.73 | 28.93 | 15.65 | 16.14 |
Table 4. Average weight difference between Nix Pro data and image data.
| Device | L | A | B |
|--------|------|------|------|
| Samsung S10 | 12.89 | 3.14 | 5.06 |
| Google Pixel 5 | 8.06 | 3.02 | 4.49 |
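The added-weight (AW) adjustment reported in Table 4 amounts to shifting a smartphone CIELAB reading by the average per-channel (L, a, b) offset before computing a colour distance to each Munsell reference chip. A minimal Python sketch follows, assuming the offset is added to the smartphone reading to approximate the Nix Pro value; the sign convention and the chip coordinates below are illustrative assumptions, not the authors' implementation:

```python
import math

# Average per-channel (L, a, b) offsets between Nix Pro readings and
# smartphone image data (Table 4). Adding them to the smartphone reading
# is an assumed sign convention for this sketch.
AW_OFFSETS = {
    "samsung_s10": (12.89, 3.14, 5.06),
    "google_pixel5": (8.06, 3.02, 4.49),
}

def delta_e_cie1976(lab1, lab2):
    """CIE1976 colour difference: Euclidean distance in CIELAB."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

def rank_munsell_chips(phone_lab, chip_labs, device="samsung_s10"):
    """Rank reference chips by ΔE*ab after the added-weight adjustment."""
    dl, da, db = AW_OFFSETS[device]
    adjusted = (phone_lab[0] + dl, phone_lab[1] + da, phone_lab[2] + db)
    return sorted(chip_labs, key=lambda chip: delta_e_cie1976(adjusted, chip[1]))

# Hypothetical reference chips (name, (L, a, b)) for illustration only.
chips = [("10YR 5/4", (52.0, 8.0, 25.0)), ("5R 4/6", (42.0, 32.0, 18.0))]
ranking = rank_munsell_chips((40.0, 4.5, 19.5), chips)
```

Swapping `delta_e_cie1976` for a CIEDE2000 implementation [50,51] yields the AW-CIE2000 variant reported in the tables.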
Table 5. Samsung S10 average prediction rank using the CIELAB colour model and the CIE1976 and CIE2000 distance functions. Location-Based—prediction using only prominent soil colours in Australian topsoils. AW—proposed weight difference from Table 4.
| Set | Time | Location-Based Focused Hue-CIE1976 | Location-Based CIE2000 | Location-Based AW-CIE1976 | Location-Based AW-CIE2000 |
|-------|------------|-------|-------|------|------|
| Set 2 | 10:00 a.m. | 28.08 | 33.41 | 9.92 | 9.58 |
| Set 3 | 11:00 a.m. | 35.55 | 38.95 | 7.06 | 7.37 |
| Set 4 | 12:00 p.m. | 39.42 | 44.57 | 8.04 | 8.64 |
| Set 5 | 1:00 p.m. | 30.47 | 33.10 | 6.12 | 6.62 |
| Set 7 | 3:00 p.m. | 29.82 | 32.70 | 6.78 | 7.18 |
| Average | | 32.76 | 36.55 | 7.58 | 7.88 |
Table 6. Google Pixel average prediction rank using CIELAB colour model and CIE1976 and CIE2000 distance functions. Location-Based—Prediction using only prominent soil colours in Australian topsoils. AW—Proposed weight difference from Table 4.
| Set | Time | Location-Based Focused Hue-CIE1976 | Location-Based CIE2000 | Location-Based AW-CIE1976 | Location-Based AW-CIE2000 |
|-------|------------|-------|-------|------|------|
| Set 1 | 10:00 a.m. | 17.86 | 17.08 | 5.13 | 5.80 |
| Set 2 | 11:00 a.m. | 13.83 | 13.58 | 5.40 | 5.84 |
| Set 3 | 12:00 p.m. | 13.65 | 11.97 | 5.85 | 6.52 |
| Set 4 | 1:00 p.m. | 25.03 | 20.66 | 7.24 | 6.89 |
| Set 5 | 3:00 p.m. | 28.04 | 30.12 | 8.03 | 9.81 |
| Average | | 19.68 | 18.68 | 6.33 | 6.97 |
Table 7. Luminous intensity captured from 9 a.m. to 5 p.m. using the Samsung S10 smartphone under direct sunlight and indirect sunlight (shadow on the book), together with the average prediction rank using the CIELAB colour model and the CIE1976 and CIE2000 distance functions. Location-Based—prediction using only prominent soil colours in Australian topsoils. AW—proposed weight difference from Table 4.
| Set | Time | Direct Sunlight Intensity (lx) | Indirect Sunlight Intensity (lx) | CIELAB-AW-CIE1976 | CIELAB-AW-CIE2000 | Location-Based AW-CIE1976 | Location-Based AW-CIE2000 |
|-------|------------|--------|------|-------|-------|-------|-------|
| Set 1 | 9:00 a.m. | 11,982 | 3098 | 24.43 | 21.16 | 19.43 | 15.39 |
| Set 2 | 10:00 a.m. | 35,510 | 2673 | 15.50 | 15.02 | 9.92 | 9.58 |
| Set 3 | 11:00 a.m. | 46,329 | 3432 | 12.02 | 12.08 | 7.06 | 7.37 |
| Set 4 | 12:00 p.m. | 75,952 | 3380 | 12.37 | 12.83 | 8.04 | 8.64 |
| Set 5 | 1:00 p.m. | 74,220 | 3054 | 11.86 | 11.30 | 6.12 | 6.62 |
| Set 6 | 2:00 p.m. | 75,856 | 2993 | 11.09 | 10.73 | 6.05 | 6.73 |
| Set 7 | 3:00 p.m. | 59,037 | 3300 | 11.77 | 11.42 | 6.78 | 7.18 |
| Set 8 | 4:00 p.m. | 38,204 | 3180 | 11.63 | 11.22 | 7.51 | 7.43 |
| Set 9 | 5:00 p.m. | 13,715 | 1675 | 22.51 | 23.68 | 14.41 | 15.12 |
Table 8. Summary of average prediction rank for both the Samsung S10 and Google Pixel 5 smartphones using the CIELAB colour model and two distance functions (CIE1976 and CIE2000). Location-Based—prediction using only prominent soil colours in Australian topsoils. AW—proposed weight difference from Table 4.
| Colour Difference | Samsung S10 | Google Pixel 5 | Samsung S10 Location-Based | Google Pixel 5 Location-Based |
|------------|-------|-------|-------|-------|
| CIE1976 | 50.22 | 32.73 | 32.76 | 19.68 |
| CIE2000 | 48.88 | 28.93 | 36.55 | 18.68 |
| AW-CIE1976 | 12.70 | 15.65 | 7.58 | 6.33 |
| AW-CIE2000 | 12.53 | 16.14 | 7.88 | 6.97 |
Table 9. Accuracy in percentage in top 5 prediction for both the Samsung S10 and Google Pixel 5 smartphones using the CIELAB colour model and two distance functions (CIE1976 and CIE2000). Location-Based—prediction using only prominent soil colours in Australian topsoils. AW—proposed weight difference from Table 4.
| Colour Difference | Samsung S10 Top 5 Prediction Accuracy (%) | Samsung S10 Location-Based Top 5 Prediction Accuracy (%) | Google Pixel 5 Top 5 Prediction Accuracy (%) | Google Pixel 5 Location-Based Top 5 Prediction Accuracy (%) |
|------------|-------|-------|-------|-------|
| CIE1976 | 7.18 | 5.71 | 9.07 | 18.99 |
| CIE2000 | 6.32 | 4.96 | 12.19 | 24.54 |
| AW-CIE1976 | 51.74 | 66.81 | 34.13 | 73.70 |
| AW-CIE2000 | 52.19 | 69.75 | 34.40 | 71.51 |
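The two summary measures used throughout Tables 2 to 9, average prediction rank and top-5 prediction accuracy, can both be derived from the per-sample rank of the true Munsell chip in the candidate list. A short sketch with hypothetical rank values (made up for illustration, not data from the study):

```python
def top_k_accuracy(ranks, k=5):
    """Percentage of samples whose true Munsell chip falls within the
    top-k candidates of the colour-difference ranking (Table 9 metric)."""
    if not ranks:
        raise ValueError("no ranks given")
    return 100.0 * sum(1 for r in ranks if r <= k) / len(ranks)

def average_rank(ranks):
    """Average prediction rank, as reported in Tables 2, 3, 5, 6, and 8."""
    return sum(ranks) / len(ranks)

# Hypothetical per-sample ranks (1 = correct chip ranked first).
ranks = [1, 3, 7, 2, 12, 4, 5, 1, 30, 2]
acc = top_k_accuracy(ranks)   # 70.0
avg = average_rank(ranks)     # 6.7
```

A lower average rank and a higher top-5 accuracy both indicate that the correct chip sits nearer the front of the ranked candidate list.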
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Nodi, S.S.; Paul, M.; Robinson, N.; Wang, L.; Rehman, S.u. Determination of Munsell Soil Colour Using Smartphones. Sensors 2023, 23, 3181. https://doi.org/10.3390/s23063181
