Proceeding Paper

Efficient Battery Management and Workflow Optimization in Warehouse Robotics Through Advanced Localization and Communication Systems †

by Shakeel Dhanushka 1,‡, Chamoda Hasaranga 1,‡, Nipun Shantha Kahatapitiya 2, Ruchire Eranga Wijesinghe 3,4,* and Akila Wijethunge 1,*
1	Department of Materials and Mechanical Technology, Faculty of Technology, University of Sri Jayewardenepura, Pitipana 10200, Sri Lanka
2	Department of Computer Engineering, Faculty of Engineering, University of Sri Jayewardenepura, Nugegoda 10250, Sri Lanka
3	Department of Electrical and Electronic Engineering, Faculty of Engineering, Sri Lanka Institute of Information Technology, Malabe 10115, Sri Lanka
4	Center for Excellence in Informatics, Electronics & Transmission (CIET), Sri Lanka Institute of Information Technology, Malabe 10115, Sri Lanka
*	Authors to whom correspondence should be addressed.
†	Presented at the 11th International Electronic Conference on Sensors and Applications (ECSA-11), 26–28 November 2024; Available online: https://sciforum.net/event/ecsa-11.
‡	These authors contributed equally to this work.
Eng. Proc. 2024, 82(1), 50; https://doi.org/10.3390/ecsa-11-20416
Published: 25 November 2024

Abstract: This study presents a Warehouse Robot Localization and Communication System prototype to optimize battery management and workflow in warehouses. Autonomous mobile robots equipped with advanced localization and wireless communication technologies coordinate to prevent downtime. When the battery level of a robot drops below a certain threshold, it communicates with the main computer to request assistance. Another robot then takes over its task, allowing the low-battery robot to reach a charging station. Using an overhead camera module and an A* algorithm for optimal pathfinding, robots navigate efficiently. A Python-based user interface enables monitoring and control. This prototype system has the potential for industrial applications with future enhancements.

1. Introduction

In the context of Industry 4.0, Autonomous Guided Vehicles (AGVs) have emerged as self-operating robotic systems, primarily used for material handling and transportation within controlled environments such as warehouses, factories, and distribution centers [1]. These systems navigate autonomously, without direct human intervention. AGVs have been widely adopted across various industries due to their versatility in loading options, quiet operation, ease of installation, and cost-efficiency; their implementation reduces labor costs, increases throughput, and improves operational safety. Integrating AGVs into smart manufacturing systems enhances production efficiency by automating logistics workflows, improving response times, and providing greater adaptability to dynamic production demands.
A key technology enabling AGV functionality is color detection, which allows robots to accurately identify and differentiate items, navigate designated paths, and perform quality checks [2]. This capability is achieved through mathematical models known as color spaces, with common models including RGB (Red, Green, and Blue), HSV (Hue, Saturation, and Value), and YCbCr (Luminance, Blue-Difference, and Red-Difference Chrominance) used for image segmentation [3,4]. Image segmentation is crucial for distinguishing objects from their surroundings and can be effectively performed using thresholding in the HSV color space [4,5]. However, optimal HSV values must be adjusted for different situations to ensure accurate detection. OpenCV Python is utilized for shape and color detection due to its accessibility and ease of use [6].
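To make the thresholding idea concrete, the sketch below classifies a single pixel against HSV ranges. The specific H/S/V bounds are illustrative assumptions, not calibrated values; as noted above, optimal thresholds must be tuned per environment. The standard-library `colorsys` module is used here so the example stays self-contained, though OpenCV would be used in practice.

```python
import colorsys

# Illustrative HSV ranges in the OpenCV convention (H in 0-179, S and V in 0-255).
# These bounds are assumptions for the sketch; real deployments calibrate them.
COLOR_RANGES = {
    "blue": ((100, 150, 50), (140, 255, 255)),
    "red":  ((0, 150, 50), (10, 255, 255)),
}

def rgb_to_opencv_hsv(r, g, b):
    """Convert an 8-bit RGB pixel to OpenCV-style HSV values."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return (int(h * 179), int(s * 255), int(v * 255))

def classify_pixel(r, g, b):
    """Return the first color tag whose HSV range contains the pixel, or None."""
    h, s, v = rgb_to_opencv_hsv(r, g, b)
    for name, ((h0, s0, v0), (h1, s1, v1)) in COLOR_RANGES.items():
        if h0 <= h <= h1 and s0 <= s <= s1 and v0 <= v <= v1:
            return name
    return None

# classify_pixel(0, 0, 255) -> "blue"; a white pixel falls outside both ranges.
```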
For Unmanned Ground Vehicles (UGVs) in warehouses, localization and mapping are vital for autonomous navigation and obstacle avoidance, contributing to a safer and more efficient environment. Positioning methods are categorized into relative and absolute measurements. Relative positioning employs encoders and gyroscopes, while absolute positioning uses Global Positioning System (GPS) or landmarks [7]. Although GPS is commonly used for AGVs, it struggles with indoor applications due to signal issues; thus, ZigBee has gained popularity for localization due to its higher accuracy. Ceiling lights can serve as effective landmarks for relative positioning, but may introduce inaccuracies from unwanted light sources if not filtered out properly. Increasing the number of mobile robots can be costly due to the need for additional camera modules; therefore, a more cost-effective localization method is necessary [7]. UGV systems leverage image processing algorithms to detect obstacles, using images from a fixed overhead camera, which is more suitable than geotags in dynamic environments [8].
Path planning algorithms are essential for identifying collision-free routes in environments with obstacles. While static environments require pre-planned solutions, dynamic environments necessitate frequent re-planning to adapt to changing conditions. Most algorithms use grid-based models for path representation, balancing map accuracy with planning time. The A* algorithm is particularly effective in finding optimal paths, by minimizing cost while avoiding collisions [9,10]. In mobile robot applications, wireless communication protocols are critical for maintaining productivity and mobility. Wireless Fidelity (Wi-Fi)-based systems offer the high-speed communication necessary for the effective control of UGVs.
Recent advancements in sensor-based technologies in robotics have optimized workflow management and enhanced operational efficiency [11]. Xia Xu et al. conducted a study on the warehouse logistics intelligent positioning information system [12]. It utilizes Zigbee technology and demonstrates significant improvements in data reliability and operational efficiency, while minimizing human errors and optimizing workflow and battery management in warehouse robotics. This highlights the need for implementing advanced technologies to enhance system efficiency and data reliability. Similarly, Zhi Li et al. proposed a warehouse management system based on intelligent robots that optimizes workflow through efficient localization, communication, and dynamic sorting, effectively reducing inventory costs and enhancing automation efficiency [13]. However, their work indicates a lack of discussion on the specific challenges faced during implementation, suggesting the need for further exploration of potential improvements in robotic automation within warehouse management systems.
P. Ganesan et al. introduced a microprocessor-based mobile robotic approach for warehouse management that enhances efficiency through advanced localization and communication systems [14]. Yet, their study lacks an in-depth discussion of the challenges encountered during implementation, indicating a gap in understanding the complexities involved in enhancing robot control for efficient warehouse management. Ahmad et al. presented a low-cost indoor localization system using Wi-Fi, Inertial Measurement Unit (IMU) sensors, and wheel encoders to improve inventory management efficiency without incurring additional labor costs [15]. Despite its advantages, the study acknowledges difficulties in accurately tracking warehouse robots and the challenges associated with deploying Wi-Fi-based indoor localization systems, emphasizing the need to enhance Wi-Fi accuracy and sensor fusion techniques for better tracking performance.
Likhouzova et al. proposed an EVIN-based solution that integrates robotics and neural networks for optimizing robot paths in warehouse management systems [16]. Their approach aims to reduce collisions and enhance navigation efficiency while cutting production maintenance costs. However, there is a call for further exploration of EVIN technology integration to maximize its potential benefits within warehouse management systems.
While advanced technologies have significantly improved workflow and operational efficiency in warehouse management, a critical gap remains in the integration of automated battery management and real-time task reassignment for AGVs. Conventional manual processes disrupt workflow, increasing downtime and reducing system efficiency. There is a need for an automated solution capable of real-time battery monitoring and intelligent task allocation to minimize operational interruptions and enhance the overall efficiency and reliability of AGV operations in warehouse environments. Addressing this gap could lead to improved productivity and cost reduction.
The objectives of this study are to develop an advanced localization system for precise AGV tracking, design a reliable wireless communication network for real-time data exchange, and implement an integrated battery management system to minimize downtime and enhance productivity. Therefore, this study introduces a prototype of the warehouse robot localization and communication system designed to optimize battery management and maintain an uninterrupted workflow in warehouse environments. The system includes autonomous mobile robots equipped with advanced localization and wireless communication technologies. When a robot currently assigned a task has its battery level drop below a predefined threshold, it communicates with the main computer via Wi-Fi to request assistance. An available robot then adjusts its task, navigating the shortest path to the low-battery robot’s location, guided by a webcam system. Then, the low-battery robot proceeds to a charging station after transferring its task to the arrived assisting robot. System operation can be monitored via a custom-designed Graphical User Interface (GUI).
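The handover workflow just described can be sketched as a small coordinator routine on the main computer. The robot-state dictionary, the function name, and the 20% threshold are all illustrative assumptions, not the paper's implementation.

```python
BATTERY_THRESHOLD = 20  # percent; illustrative value, the actual threshold is not stated

def on_battery_report(robots, robot_id, level):
    """Sketch of the main computer's handover logic: when a working robot's
    battery falls below the threshold, hand its task to an idle robot and
    send the low-battery robot to charge. Returns the helper's id, or None."""
    robot = robots[robot_id]
    robot["battery"] = level
    if level >= BATTERY_THRESHOLD or robot["state"] != "working":
        return None
    for rid, other in robots.items():
        if rid != robot_id and other["state"] == "idle":
            # Transfer the task first, then release the low-battery robot.
            other["state"], other["task"] = "working", robot["task"]
            robot["state"], robot["task"] = "charging", None
            return rid
    return None
```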

2. Materials and Methods

2.1. System Functionalities

2.1.1. Main System Functionalities

The voltage divider principle was used to measure the UGV's battery voltage; when the battery level reaches the threshold value, a signal is sent to the transmitter. A button was designed on the GUI to manually send low-battery signals to the control algorithm for testing purposes. The A* algorithm was implemented to find the shortest route in the working environment, combining the advantages of Dijkstra's algorithm and Best-First Search. It uses a heuristic function to estimate the cost of reaching the destination from each node, guiding the search toward the most promising path while minimizing the number of nodes visited. The optimal path is determined by accounting for obstacles and other constraints when guiding an AGV or optimizing a route. When the UGV's battery level falls below the threshold, it navigates to the predefined charging point to recharge and is replaced by another available AGV so that the workflow continues. Each idle AGV waits at the parking point for a replacement request from the currently operational AGV, ensuring efficient operation and energy conservation.
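A minimal grid-based A* of the kind described above might look as follows, using a Manhattan-distance heuristic on a 4-connected grid. The grid convention (0 = obstacle, 1 = free) follows the mapping description in Section 2.1.2; this is a sketch, not the authors' implementation.

```python
import heapq

def astar(grid, start, goal):
    """Shortest path on a 4-connected grid via A*.
    Grid convention matches the mapping step: 0 = obstacle, 1 = free.
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: admissible (never overestimates) on a 4-connected grid.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), start)]          # entries are (f = g + h, cell)
    came_from = {start: None}
    g_cost = {start: 0}
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == goal:
            path = []
            while cur is not None:           # walk parent links back to the start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 1:
                ng = g_cost[cur] + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt), nxt))
    return None
```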

2.1.2. Mapping Functionality

An overhead webcam captures video at 30 FPS, a rate chosen to support real-time location identification while limiting Random Access Memory (RAM) usage, rather than high-speed motion capture. To manage the large matrix produced by the video input, each 10 × 10 pixel cell is converted into a single data point: if more than 50% of the cell's pixels are zeros, the cell is designated 0; otherwise, it is designated 1, creating a simplified matrix for further analysis. The edges of identified objects are expanded by two positions around areas marked as zeros, so that the UGV treats these inflated areas as obstacles during route planning and can navigate without collisions. The HSV color segmentation method is employed with calibrated threshold values for specific colors, using filters to isolate relevant colors and contours to identify shapes, thereby determining the direction and location of the UGVs. Route coordinates are then generated by the A* algorithm over this grid. The letters F, R, S, and E denote forward motion, 45° clockwise rotation, 45° counterclockwise rotation, and stop, respectively, indicating UGV movements and ensuring the efficient navigation and operation of the UGVs within their environment, as shown in Figure 1.
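The cell-aggregation and obstacle-expansion steps can be sketched in plain Python as follows. The majority rule and the safety margin follow the description above (0 marks obstacle pixels, 1 marks free space); the helper names are illustrative.

```python
def downsample(mask, cell=10):
    """Collapse each cell x cell block of a binary mask into one grid value.
    Majority rule: a block that is mostly zeros (obstacle pixels) becomes 0,
    otherwise 1 (free), yielding a simplified matrix for route planning."""
    rows, cols = len(mask) // cell, len(mask[0]) // cell
    grid = []
    for gr in range(rows):
        row = []
        for gc in range(cols):
            zeros = sum(
                1
                for r in range(gr * cell, (gr + 1) * cell)
                for c in range(gc * cell, (gc + 1) * cell)
                if mask[r][c] == 0
            )
            row.append(0 if zeros > (cell * cell) // 2 else 1)
        grid.append(row)
    return grid

def inflate_obstacles(grid, margin=2):
    """Mark every cell within `margin` cells of an obstacle (0) as an obstacle,
    so planned routes keep a safety clearance around detected objects."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 0:
                for dr in range(-margin, margin + 1):
                    for dc in range(-margin, margin + 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            out[rr][cc] = 0
    return out
```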
Figure 2 illustrates the geometric relationship between the corresponding floor area captured in the image and the overhead camera module’s Field of View (FOV). The camera module, mounted at a height of 2.5 m, captures an area on the floor which is 2 m wide in the horizontal direction (Horizontal Field of View, HFOV) and 2.2 m in the vertical direction (Vertical Field of View, VFOV). These dimensions correspond to the pixel resolution of the camera module, which is 1920 pixels horizontally and 1080 pixels vertically. By analyzing the camera’s HFOV and VFOV, the angles α and β can be determined. The floor area can then be mapped onto pixel values based on the camera module’s resolution. For instance, each square meter of the floor can be converted into an equivalent number of pixels in both dimensions, allowing for the precise detection and analysis of objects within the camera’s view. This conversion is critical for accurately tracking movements and events detected by the camera module in a defined space.
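Under the stated geometry, the floor-to-pixel scale factors and the view angles α and β follow directly from simple trigonometry. The sketch below computes them from the numbers given above; the function name is illustrative, and the mapping assumes the view's top-left corner as the floor origin.

```python
import math

CAMERA_HEIGHT_M = 2.5              # mounting height of the overhead camera
FLOOR_W_M, FLOOR_H_M = 2.0, 2.2    # floor area covered (HFOV and VFOV directions)
RES_W_PX, RES_H_PX = 1920, 1080    # camera resolution

# Scale factors for mapping floor coordinates to pixel coordinates:
# 960 px per metre horizontally, roughly 491 px per metre vertically.
px_per_m_x = RES_W_PX / FLOOR_W_M
px_per_m_y = RES_H_PX / FLOOR_H_M

# Full view angles from the covered floor width/height and the mounting height.
alpha = 2 * math.degrees(math.atan((FLOOR_W_M / 2) / CAMERA_HEIGHT_M))  # horizontal
beta = 2 * math.degrees(math.atan((FLOOR_H_M / 2) / CAMERA_HEIGHT_M))   # vertical

def floor_to_pixel(x_m, y_m):
    """Map a floor coordinate (metres from the view's top-left corner)
    to an image pixel coordinate."""
    return (round(x_m * px_per_m_x), round(y_m * px_per_m_y))
```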

2.1.3. Communication Functionalities

The main ESP32 board, equipped with a Tensilica Xtensa 32-bit LX6 microprocessor (San Jose, CA, USA), continuously monitors for incoming signals. Upon receiving a signal, it promptly detects and processes it. The board communicates route letters to all receiver boards using Wi-Fi, facilitating efficient data transmission across the system. Although data are forwarded to all receivers from the main ESP32 board, only the relevant UGV responds to the specific commands associated with those data, ensuring that operations are streamlined and targeted.
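The selective-response scheme can be illustrated with a short Python sketch. The message format ("<ugv_id>:<route letters>") is a hypothetical example for illustration; the actual firmware runs on the ESP32 boards and its wire format is not specified here.

```python
# Hypothetical broadcast format: "<ugv_id>:<route letters>", e.g. "UGV2:FFRSE".
# Every receiver board sees every broadcast, but only the addressed UGV acts.

def parse_message(message):
    """Split a broadcast message into its target ID and route string."""
    target, _, route = message.partition(":")
    return target, route

def handle_broadcast(message, my_id):
    """Receiver-side filter: return the route if the message addresses this
    UGV, otherwise ignore it (mirroring the selective response above)."""
    target, route = parse_message(message)
    return route if target == my_id else None
```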

3. Results and Discussion

The threshold values for each color were determined using trackbars created in the GUI, which are essential for real-time object and contour detection through color segmentation algorithms, as shown in Figure 3. By utilizing lower and upper HSV values, a specific range of colors can be isolated from the acquired images. Pixels falling below the lower HSV value or above the upper HSV value are excluded from the segmentation process. While fine-tuning these HSV values can yield more accurate results, limitations arise from using a general-purpose webcam for color detection. The specified colors can be filtered out using a generated mask, which is further refined through morphological operations, as illustrated in Figure 3. It is important to note that color calibration is significantly influenced by the lighting conditions in the testing environment; natural lighting can alter the perceived color of objects. Therefore, implementing a controlled artificial lighting method is crucial for achieving accurate color representation. Additionally, various noise factors such as shadows, reflections, and illumination variations, as depicted in Figure 4, must be addressed to ensure clear and usable images.
The accuracy of color detection in this project is significantly influenced by the camera module settings, including white balance, exposure, and saturation. Using a general-purpose webcam often results in color shifts and inaccuracies in the acquired images, making it essential to adjust and optimize the settings of a professional-grade camera module for accurate color representation. Additionally, the selected color space plays a crucial role; thus, the system must be periodically calibrated to account for environmental factors, such as natural light, to determine optimal threshold values. In this project, the HSV color space is employed for color segmentation tasks, although alternative representations like RGB require different thresholding values. The texture and reflectivity of the mobile robots' chassis also affect color detection, necessitating that color tags be placed on white surfaces for enhanced visibility. Moreover, shadows, occlusions, and lighting directions can further complicate detection accuracy. To improve the FOV, a 180-degree wide-angle lens is proposed; however, barrel and pincushion distortion may affect image analysis. Proper camera module mounting is critical for maximizing FOV and ensuring precise localization, thus requiring careful consideration of the camera's angle. In summary, employing a higher-resolution imaging device would facilitate the easier detection and tracking of mobile robots by providing finer detail.

4. Conclusions

This study developed a Warehouse Robot Localization and Communication System to optimize battery management and enhance workflow efficiency in warehouse environments. The system integrated advanced localization techniques, the A* pathfinding algorithm, and a reliable wireless communication network to ensure the continuous and efficient operation of AGVs. By enabling real-time task reassignment and automated battery monitoring, the prototype minimized downtime and reduced the need for manual intervention. Experimental results demonstrated the effectiveness of the system in maintaining uninterrupted operations and improving productivity. Regular monitoring and the adjustment of calibration processes are vital for achieving reliable results in dynamic warehouse environments. With further enhancements, such as incorporating higher resolution camera modules and more robust communication protocols, this system has the potential for practical industrial applications, contributing to the goals of Industry 4.0.

Author Contributions

Conceptualization, S.D. and C.H.; methodology, S.D. and C.H.; software, S.D. and C.H.; validation, C.H.; formal analysis, S.D.; investigation, C.H.; resources, S.D. and C.H.; data curation, S.D.; writing—original draft preparation, S.D. and N.S.K.; writing—review and editing, N.S.K. and S.D.; visualization, S.D. and N.S.K.; supervision, R.E.W. and A.W.; project administration, S.D. and C.H.; funding acquisition, S.D. and C.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

AGV	Autonomous Guided Vehicle
UGV	Unmanned Ground Vehicle
GPS	Global Positioning System
Wi-Fi	Wireless Fidelity
IMU	Inertial Measurement Unit
GUI	Graphical User Interface
RAM	Random Access Memory
FOV	Field of View
VFOV	Vertical Field of View

References

  1. Xie, W.; Peng, X.; Liu, Y.; Zeng, J.; Li, L.; Eisaka, T. Conflict-Free Coordination Planning for Multiple Automated Guided Vehicles in an Intelligent Warehousing System. Simul. Model. Pract. Theory 2024, 134, 102945. [Google Scholar] [CrossRef]
  2. Raguraman, P.; Meghana, A.; Navya, Y.; Karishma, S.; Iswarya, S. Color Detection of RGB Images Using Python and OpenCv. Int. J. Sci. Res. Comput. Sci. Eng. Inf. Technol. 2021, 7, 109–112. [Google Scholar] [CrossRef]
  3. Mohd Ali, N.; Md Rashid, N.K.A.; Mustafah, Y.M. Performance Comparison between RGB and HSV Color Segmentations for Road Signs Detection. Appl. Mech. Mater. 2013, 393, 550–555. [Google Scholar] [CrossRef]
  4. Hema, D.; Kannan, D.S. Interactive Color Image Segmentation Using HSV Color Space. Sci. Technol. J. 2019, 7, 37–41. [Google Scholar] [CrossRef]
  5. Simbolon, S.; Jumadi, J.; Khairil, K.; Yupianti, Y.; Yulianti, L.; Supiyandi, S.; Windarto, A.P.; Wahyuni, S. Image Segmentation Using Color Value of the Hue in CT Scan Result. J. Phys. Conf. Ser. 2022, 2394, 012017. [Google Scholar] [CrossRef]
  6. Hossen, M.K.; Bari, S.M.; Barman, P.P.; Roy, R.; Das, P.K. Application of Python-OpenCV to Detect Contour of Shapes and Colour of a Real Image. Int. J. Nov. Res. Comput. Sci. Softw. Eng. 2022, 9, 20–25. [Google Scholar] [CrossRef]
  7. Wang, H.; Yu, K.; Yu, H. Mobile Robot Localisation Using ZigBee Wireless Sensor Networks and a Vision Sensor. Int. J. Model. Identif. Control. 2010, 10, 184. [Google Scholar] [CrossRef]
  8. Atali, G.; Garip, Z.; Karayel, D.; Ozkan, S.S. Localization of Mobile Robot Using Odometry, Camera Images and Extended Kalman Filter. Acta Phys. Pol. A 2018, 134, 204–207. [Google Scholar] [CrossRef]
  9. Rachmawati, D.; Gustin, L. Analysis of Dijkstra’s Algorithm and A* Algorithm in Shortest Path Problem. J. Phys. Conf. Ser. 2020, 1566, 012061. [Google Scholar] [CrossRef]
  10. Batik Garip, Z.; Karayel, D.; Ozkan, S.S.; Atali, G. Path Planning for Multiple Mobile Robots Using A* Algorithm. Acta Phys. Pol. A 2017, 132, 685–688. [Google Scholar] [CrossRef]
  11. Abdhul Rahuman, M.A.; Kahatapitiya, N.S.; Amarakoon, V.N.; Wijenayake, U.; Silva, B.N.; Jeon, M.; Kim, J.; Ravichandran, N.K.; Wijesinghe, R.E. Recent Technological Progress of Fiber-Optical Sensors for Bio-Mechatronics Applications. Technologies 2023, 11, 157. [Google Scholar] [CrossRef]
  12. Xu, X.; Yao, W.W.; Gong, L. Information Technology in Intelligent Warehouse Management System Based on ZigBee. Adv. Mater. Res. 2014, 977, 468–471. [Google Scholar] [CrossRef]
  13. Li, Z.; Barenji, A.V.; Jiang, J.; Zhong, R.Y.; Xu, G. A Mechanism for Scheduling Multi Robot Intelligent Warehouse System Face with Dynamic Demand. J. Intell. Manuf. 2020, 31, 469–480. [Google Scholar] [CrossRef]
  14. Ganesan, P.; Sajiv, G.; Leo, L.M. Warehouse Management System Using Microprocessor Based Mobile Robotic Approach. In Proceedings of the 2017 Third International Conference on Science Technology Engineering & Management (ICONSTEM), Chennai, India, 23–24 March 2017; pp. 868–872. [Google Scholar]
  15. Ahmad, U.; Poon, K.; Altayyari, A.M.; Almazrouei, M.R. A Low-Cost Localization System for Warehouse Inventory Management. In Proceedings of the 2019 International Conference on Electrical and Computing Technologies and Applications (ICECTA), Ras Al Khaimah, United Arab Emirates, 19–21 November 2019; pp. 1–5. [Google Scholar]
  16. Likhouzova, T.; Demianova, Y. Robot Path Optimization in Warehouse Management System. Evol. Intel. 2022, 15, 2589–2595. [Google Scholar] [CrossRef]
Figure 1. Navigational path representation using F (Forward), R (45° Clockwise Rotation), and S (45° Counterclockwise Rotation) commands.
Figure 2. The relationship between the horizontal field of view (HFOV) and vertical field of view (VFOV) of the camera to the detection floor area.
Figure 3. Color calibration for system tuning in the developed GUI. (a) Generated mask for filtering out blue color from the captured image. (b) Threshold values in the range of 0–255 for different colors are found by changing the hue, saturation, value (HSV) values on the designed track bar for autonomous guided vehicle (AGV) identification. (c) Blue rectangle detected by the generated mask to identify the AGV.
Figure 4. (a) Original image captured by the overhead camera module when lighting conditions are not controlled; (b) improper lighting conditions in the testing environment highly affect the color calibration.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
