Toward a Comprehensive Domestic Dirt Dataset Curation for Cleaning Auditing Applications
Abstract
1. Introduction
- Although proven AI models can deliver high classification accuracy, there is no comprehensive dataset that experts can use to train such models for dirt analysis.
- Dirt particles are typically detected through conventional computer-vision analysis; however, capturing the finer features of the dirt is essential for AI-based analysis.
- Domestic dirt classes share closely similar visual features, which makes classification challenging for AI models and demands a high-quality dataset that highlights the visually distinctive features of the dirt particles.
- Gathering magnified images of domestic dirt particles using adhesive dirt lifting;
- Analyzing the usability of the acquired dataset for training standard classification models to classify domestic dirt, using a cross-validation technique;
- Analyzing the performance of the proposed scheme in a real-time scenario by deploying the trained classification model for real-time dirt composition estimation on an in-house-developed audit robot (a minimal deployment sketch follows this list).
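To make the deployment step concrete, the sketch below shows one way a trained classifier could be rolled out for on-robot dirt composition estimation. The robot's actual software stack is not described here, so the camera index, the exported model file `dirt_classifier.pt`, the class ordering, and the preprocessing pipeline are illustrative assumptions rather than the authors' implementation.

```python
import collections

import cv2
import torch
from torchvision import transforms

# Nine dirt classes from the curated dataset (ordering is an assumption).
CLASSES = ["ash", "hair", "sand", "soil", "paper", "paint", "food", "fibre", "no-dirt"]

# Hypothetical TorchScript export of the trained classifier.
model = torch.jit.load("dirt_classifier.pt").eval()

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

counts = collections.Counter()
camera = cv2.VideoCapture(0)  # audit-sensor camera index is an assumption
while True:
    ok, frame = camera.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        logits = model(preprocess(rgb).unsqueeze(0))
    counts[CLASSES[int(logits.argmax())]] += 1
    # Dirt composition = relative frequency of each predicted class so far.
    total = sum(counts.values())
    composition = {cls: n / total for cls, n in counts.items()}
```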
2. Related Works
3. Methodology
3.1. Robot Architecture
3.2. Sample Audit Sensor
4. Experiments and Analysis
4.1. Dataset Curation
4.2. Dirt Class Identification
- Frequent occurrence during sample collection;
- Presence in every location;
4.3. Dataset Preparation and Training
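Since the training code itself is not published with the paper, the following is a minimal transfer-learning sketch, assuming PyTorch/torchvision and an `ImageFolder`-style layout of the curated dataset, for fine-tuning one of the evaluated backbones (ResNet-50) on the nine dirt classes. The dataset path, batch size, learning rate, and epoch count are illustrative placeholders.

```python
import torch
from torch import nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 9  # ash, hair, sand, soil, paper, paint, food, fibre, no-dirt

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical directory layout: one sub-folder per dirt class.
train_set = datasets.ImageFolder("dirt_dataset/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from ImageNet weights and replace the classification head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```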
4.4. Dirt Dataset Validation
4.4.1. K-Fold Cross-Validation Method
- Select the number of folds, k.
- Split the dataset into k groups, also known as folds.
- Select k-1 folds for training the model and one fold for testing.
- For every iteration, a new model is trained independently of the previous iterations.
- Repeat the training and validation k times; in each iteration, a different fold serves as the test set.
- After the k-th iteration, the overall accuracy is determined as the average over all iterations (a minimal code sketch follows this list).
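The steps above can be summarized in a short sketch. The authors' implementation is not specified, so scikit-learn's `KFold` and the hypothetical `build_classifier` factory are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import KFold

def k_fold_accuracy(X, y, build_classifier, k=5):
    """Train k independent models; report mean and std of the fold accuracies."""
    kfold = KFold(n_splits=k, shuffle=True, random_state=0)
    accuracies = []
    for train_idx, test_idx in kfold.split(X):
        model = build_classifier()               # a fresh model every iteration
        model.fit(X[train_idx], y[train_idx])    # train on the k-1 folds
        accuracies.append(model.score(X[test_idx], y[test_idx]))  # test on the held-out fold
    return np.mean(accuracies), np.std(accuracies)
```

The returned mean and standard deviation correspond to the average accuracy and k-fold standard deviation reported per model in the results table below.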
4.4.2. Dirt Dataset Validation through Statistical Measure
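A minimal sketch, assuming scikit-learn (the paper does not state its tooling), of how the per-class precision, recall, and F1-score reported in the results table can be computed from ground-truth and predicted labels:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def per_class_metrics(y_true, y_pred, class_names):
    # Per-class precision, recall, and F1 (average=None keeps one value per class).
    precision, recall, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=list(range(len(class_names))), zero_division=0)
    for i, name in enumerate(class_names):
        print(f"{name:8s} P={precision[i]:.4f} R={recall[i]:.4f} F1={f1[i]:.4f}")
    print(f"overall accuracy = {accuracy_score(y_true, y_pred):.4f}")
```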
4.5. Real-Time Robot-Aided Cleaning Inspection
Comparison with Offline Test Results
5. Discussion
- Overlapping of dirt specks from multiple classes in a single sample image (shown in Figure 7e);
- Shaking of the adhesive tape during sensor actuation may produce blurred images that lead to misclassification;
- Dirt specks with very close visual resemblance can be indistinguishable to the model.
6. Conclusions and Future Works
- Combining microbial and chemical analysis in the process of sample auditing;
- Incorporating novel autonomous algorithms toward dirt exploration;
- A comprehensive study comparing the different algorithms with respect to cleaning auditing;
- Exploring the usability of the current dataset for instance segmentation of dirt particles;
- Improving the current dataset by expanding the number of dirt classes and open-sourcing the dataset.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Table: dirt classes with sample images (Ash, Hair, Sand, Soil, Paper, Paint, Food, Fibre, No-dirt).
Model | Average Accuracy (%) | K-Fold Standard Deviation |
---|---|---|
VGG-11 | 94.77 | 5.82 |
VGG-16 | 92.77 | 3.90 |
MobileNetV2 | 94.65 | 3.44 |
ResNet50 | 96.58 | 2.05 |
ResNet101 | 96.54 | 1.89 |
Darknet53 | 95.80 | 3.06
Model | Class | Precision | Recall | F1 | Accuracy (%)
---|---|---|---|---|---
VGG-16 | Ash | 0.9803 | 0.9970 | 0.9886 | 99.70
VGG-16 | Food | 0.9918 | 0.9675 | 0.9795 | 96.65
VGG-16 | Fibre | 0.9559 | 0.9644 | 0.9601 | 96.44
VGG-16 | Paper | 0.8621 | 0.9336 | 0.8964 | 89.64
VGG-16 | Paint | 0.9689 | 0.9291 | 0.9485 | 92.91
VGG-16 | Soil | 0.9993 | 0.9970 | 0.9981 | 99.70
VGG-16 | Sand | 0.9878 | 0.9943 | 0.9910 | 99.43
VGG-16 | Hair | 0.9897 | 0.9998 | 0.9948 | 99.99
VGG-16 | No-dirt | 0.9971 | 0.9941 | 0.9956 | 99.41
VGG-11 | Ash | 0.9900 | 0.9960 | 0.9930 | 99.60
VGG-11 | Food | 0.9709 | 0.9787 | 0.9748 | 97.87
VGG-11 | Fibre | 0.9847 | 0.9555 | 0.9699 | 95.55
VGG-11 | Paper | 0.8482 | 0.9458 | 0.8944 | 94.58
VGG-11 | Paint | 0.9914 | 0.9110 | 0.9495 | 91.10
VGG-11 | Soil | 0.9983 | 0.9985 | 0.9982 | 99.85
VGG-11 | Sand | 0.9906 | 0.9820 | 0.9863 | 98.20
VGG-11 | Hair | 0.9622 | 0.9977 | 0.9796 | 99.77
VGG-11 | No-dirt | 0.9798 | 0.9985 | 0.9891 | 99.85
MobileNetV2 | Ash | 0.9950 | 0.9920 | 0.9935 | 99.20
MobileNetV2 | Food | 0.9907 | 0.9638 | 0.9770 | 96.38
MobileNetV2 | Fibre | 0.9700 | 0.9585 | 0.9642 | 95.85
MobileNetV2 | Paper | 0.9108 | 0.9038 | 0.9073 | 90.38
MobileNetV2 | Paint | 0.9707 | 0.9516 | 0.9610 | 95.16
MobileNetV2 | Soil | 0.9963 | 0.9993 | 0.9978 | 99.93
MobileNetV2 | Sand | 0.9843 | 0.9970 | 0.9906 | 99.70
MobileNetV2 | Hair | 0.9589 | 0.9954 | 0.9768 | 99.54
MobileNetV2 | No-dirt | 0.9812 | 0.9956 | 0.9883 | 99.56
ResNet50 | Ash | 0.9930 | 0.9880 | 0.9905 | 99.30
ResNet50 | Food | 0.9956 | 0.9606 | 0.9778 | 96.06
ResNet50 | Fibre | 0.9761 | 0.9703 | 0.9732 | 97.03
ResNet50 | Paper | 0.9454 | 0.9392 | 0.9423 | 93.92
ResNet50 | Paint | 0.9660 | 0.9758 | 0.9709 | 97.58
ResNet50 | Soil | 0.9991 | 0.9985 | 0.9993 | 99.85
ResNet50 | Sand | 0.9825 | 0.9973 | 0.9899 | 99.73
ResNet50 | Hair | 0.9852 | 0.9965 | 0.9908 | 99.65
ResNet50 | No-dirt | 0.9963 | 0.9985 | 0.9974 | 99.85
ResNet101 | Ash | 0.9990 | 0.9970 | 0.9980 | 99.70
ResNet101 | Food | 0.9919 | 0.9755 | 0.9836 | 97.55
ResNet101 | Fibre | 0.9910 | 0.9822 | 0.9866 | 98.22
ResNet101 | Paper | 0.9016 | 0.9624 | 0.9310 | 96.24
ResNet101 | Paint | 0.9889 | 0.9499 | 0.9690 | 94.99
ResNet101 | Soil | 0.9997 | 0.9985 | 0.9993 | 99.85
ResNet101 | Sand | 0.9899 | 0.9970 | 0.9934 | 99.70
ResNet101 | Hair | 0.9730 | 0.9965 | 0.9846 | 99.65
ResNet101 | No-dirt | 0.9971 | 0.9985 | 0.9978 | 99.85
Darknet53 | Ash | 0.9979 | 0.9699 | 0.9837 | 96.99
Darknet53 | Food | 0.9382 | 0.9307 | 0.9344 | 93.07
Darknet53 | Fibre | 0.9939 | 0.9614 | 0.9774 | 93.07
Darknet53 | Paper | 0.9001 | 0.6681 | 0.7570 | 66.81
Darknet53 | Paint | 0.9861 | 0.9583 | 0.9720 | 95.83
Darknet53 | Soil | 0.9901 | 0.9993 | 0.9996 | 99.93
Darknet53 | Sand | 0.9568 | 0.9111 | 0.9334 | 91.11
Darknet53 | Hair | 0.6506 | 0.9965 | 0.7872 | 99.65
Darknet53 | No-dirt | 0.9728 | 0.9985 | 0.9855 | 99.65