Article

Research on Indoor Positioning Systems and Autonomous Mechatronic Systems for Surveillance of Intrabuilding Zones

Faculty of Mechanical Engineering and Mechatronics, National University of Science and Technology POLITEHNICA Bucharest, 313 Splaiul Independentei, 060042 Bucharest, Romania
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(2), 918; https://doi.org/10.3390/app15020918
Submission received: 22 November 2024 / Revised: 24 December 2024 / Accepted: 14 January 2025 / Published: 17 January 2025

Abstract

Given increasingly complex threats, adapting surveillance approaches to meet the necessary security levels is essential. The aim of this paper is to develop a surveillance architecture based on autonomous mechatronic systems (mobile robots and drones) designed to secure areas of strategic interest, covering a predefined space while providing enhanced security at minimal cost. Designing such a system is challenging due to the need for continuous monitoring, which raises issues of autonomy as well as the design of the navigation and command-and-control systems. This paper presents key surveillance systems, demonstrating their efficiency and further development potential. It discusses the steps taken to enable a team of autonomous mobile robots to monitor a known indoor location by using a specialized positioning system. The steps needed to deploy, implement, and configure this indoor global positioning system (GPS) are described. Among them, a study is performed by attaching one of the mobile beacons to a linear axis and moving the axis to known points; each position is read by both the axis and the positioning system, and the results are compared.

1. Introduction

An unmanned ground or aerial system includes a set of subsystems, such as the vehicle or aircraft, payloads, control stations (often with additional remote stations), launch and recovery elements for aircraft, support subsystems, communication subsystems, transportation subsystems, etc. It must also be considered as part of a land or air environment, capable of operating over restricted or extended areas under specific rules and regulations.
Unmanned systems typically have similar elements to manned vehicle or aircraft systems, but in this case, they are designed for operation without an onboard crew. The crew (as a subsystem), with its interface with vehicle/aircraft controls and position, is replaced with an electronic information and control subsystem.
Therefore, a robot is considered a complex, computer-programmable system equipped with microprocessors, sensors, actuators, and mechanical structures, capable of action, perception, decision making, and communication, and able to operate in complex and variable environments.
Initially, the term remotely piloted vehicle (RPV) was used for unmanned aircraft, but as systems began to incorporate terrestrial or underwater vehicles, other acronyms emerged to distinguish aerial vehicle systems. Currently, unmanned aerial vehicle (UAV) is the general term for an aircraft (aerial vehicle) in an unmanned aerial system (UAS) [1,2]. While useful, such systems are generally better suited to integration in outdoor surveillance systems.
The purpose of this work is to create an architecture using unmanned ground and aerial vehicles for the surveillance of objectives. Designing such a system is challenging due to the continuous monitoring required, which raises issues of autonomy as well as the design of navigation, command, and control systems. In addition, our work targets the monitoring of large indoor objectives, such as warehouses, public institutions, stadiums, and concert halls.
Combining mobile robots with drones ensures increased system autonomy, lower costs for creation and operation, and easier maintenance over extended periods [3]. Mobile robots and drones can replace human security patrols, reducing costs and eliminating repetitive and monotonous tasks. These systems must possess key features such as autonomy, intelligence, flexibility, scalability, and precision. Their implementation requires meticulous planning, selecting appropriate equipment, and complying with local regulations. Such systems have been discussed in detail in papers such as [4,5].
This work proposes the development of an autonomous mechatronic system for monitoring strategic objectives, aiming to cut costs and enhance security. It emphasizes the importance of miniaturization, modularity, energy efficiency, and eco-friendly solutions in the design of such systems. Securing strategic objectives is a fundamental concern for modern states [6,7], with profound implications for economic stability, national security, and citizen protection. These objectives include critical infrastructures such as power plants, transportation networks, military units, and government institutions, all playing essential roles in societal functions. With global interdependence and emerging threats, protecting these resources becomes a strategic priority.
Regarding indoor positioning, most papers focus on the development and challenges of autonomous indoor surveillance systems using robots, addressing critical aspects such as navigation, sensing, and cooperative strategies. As such, paper [8] provides an overview of the opportunities and challenges in deploying autonomous robots for indoor surveillance. The paper identifies navigation, real-time decision making, and advanced sensing as key areas requiring innovation. It emphasizes the importance of robust algorithms and sensor technologies to enable robots to operate effectively in dynamic environments. Further, paper [9] describes the design and implementation of an autonomous indoor surveillance robot. The system uses a combination of sensors, such as ultrasonic and infrared, along with mapping and obstacle-avoidance algorithms to enable robots to patrol predefined indoor spaces. The paper highlights the importance of efficient navigation and real-time environmental awareness. Paper [10] proposes a hybrid vision-based system that integrates optical cameras and depth sensors. This approach improves target detection and environmental monitoring accuracy, enabling robots to adapt to complex indoor settings. The hybrid system enhances surveillance capabilities by combining the strengths of multiple sensing technologies. Further on, paper [11] focuses on a multi-sensor fusion approach for autonomous surveillance. The system integrates data from LiDAR, ultrasonic sensors, and cameras to improve environmental perception and obstacle detection. By combining sensor inputs, the robots achieve higher navigation precision, ensuring reliable surveillance in dynamic and cluttered environments. Paper [12] introduces a multi-agent system for indoor surveillance, where multiple robots collaborate to optimize task allocation and area coverage. The study highlights cooperative strategies and communication protocols that allow robots to patrol efficiently while minimizing overlap and maximizing coverage. Collectively, these papers demonstrate innovative approaches to indoor surveillance using autonomous robots. They emphasize the importance of integrating sensors, advanced algorithms, and multi-robot cooperation to address challenges such as navigation, obstacle avoidance, and real-time monitoring in dynamic environments; indoor positioning remains a topic of great interest among researchers, with many different approaches to indoor positioning systems proposed [13,14,15,16,17,18,19,20,21,22,23,24].
The present study proposes an autonomous monitoring system using mobile robots and drones to secure strategic objectives. This system integrates advanced sensors and software algorithms to ensure efficient and secure surveillance of predefined areas, meeting modern security requirements. It offers benefits such as reduced operational costs, elimination of repetitive human tasks, and an eco-friendly, energy-efficient infrastructure. Unmanned terrestrial and aerial systems are explored, highlighting their applications in security, emergency response, environmental monitoring, agriculture, and infrastructure protection. These systems leverage artificial intelligence and advanced technologies for monitoring, rapid response, and safeguarding assets against evolving threats.
Mobile robots are equipped with technologies such as AI and sensors, enabling autonomous navigation, route planning, mapping, and obstacle avoidance. Various locomotion mechanisms—wheels, tracks, and legs—are utilized, catering to environments such as industrial inspections, search and rescue, and exploration. Aerial mechatronic systems combine mechanics, electronics, and control engineering, emphasizing automation, miniaturization, safety, and predictive maintenance. The study also classifies drones based on design: fixed-wing for long distances, rotary-wing for stability, hybrids, single-rotor for heavy payloads, ornithopters, nano/micro drones for confined spaces, and submersibles for air and underwater use. Applications range from smart city surveillance to ocean monitoring, addressing disaster detection and wide-area oversight.
Collaborative systems of aerial and terrestrial robots are also discussed, capable of executing tasks such as strategic surveillance and disaster response. The challenges of using robots in diverse environments are addressed, emphasizing the need for safe human–robot interaction.
The proposed system includes mobile robots patrolling indoor areas. Theoretical and experimental tests validated the design, including a mobile flying robot with calculations for thrust, motion, and battery life. Marvelmind beacons ensured accurate indoor localization (±2 cm), tested through various stages, including mobile robots and quadcopters. Implementation details include tracked robots and quadrupeds with advanced features such as visual AI and inverse kinematics for precise movements. Two drone prototypes, one teleoperated and another autonomous, demonstrated reliable performance through rigorous testing. Indoor positioning tests confirmed system accuracy, and a synchronization algorithm was proposed for multi-stage surveillance: area mapping, robot design, trajectory optimization, and sensor integration. Robots and drones collaborate autonomously to detect and report anomalies, with drones providing additional data. The system is customizable and operates efficiently to meet specific needs.

2. Establishing the Monitoring Scenario and Imposed Restrictions

In the new paradigm of large-area geographic and interior space monitoring, using autonomous mobile robots amid current security challenges presents an area where current studies do not necessarily offer a clear solution [25]. A critical aspect to consider is ensuring cybersecurity for these systems. Autonomous robots rely on wireless communications and sensors, making them vulnerable to cyber attacks such as data interception, spoofing, and malware [26,27]. Successful attacks could compromise robot functionality, leading to system failures or even physical harm. Security assurance requires encryption, secure communication protocols, and continuous monitoring against threats. Additionally, AI-driven robots must be trained to autonomously recognize and mitigate potential cybersecurity risks [28,29,30]. As robotics evolves, proactive cybersecurity measures will be essential to maintain trust and safety in these systems.
Beyond cybersecurity, it is crucial to recognize that the diverse configurations of geographical areas or building interiors navigated by these robots may require custom configurations for each scenario. Additionally, special attention must be given to the coexistence of humans and robots, ensuring that each operates without disrupting the other. Effective communication, trust, and clear safety protocols are vital for seamless interaction.
Indoor surveillance with robots is an evolving technology that uses autonomous robots equipped with cameras, sensors, and advanced algorithms to monitor and secure indoor spaces. These robots are designed to autonomously navigate environments such as offices, warehouses, hospitals, or homes, providing real-time surveillance without requiring human operators.
In this system, a team of autonomous robots will patrol an indoor building area. Robot trajectories will be both random and predefined based on specific rules developed for each situation.
The system will follow key stages, obtaining relevant data on events within the robots’ operational area. The algorithm will allow a variable number of mobile robots to patrol an area, equipped with a minimal set of sensors necessary only for movement, obstacle avoidance, and basic detection of disturbances in the area of interest. In this case, the robots follow either a predetermined or random trajectory, while a flying robot, designed to serve the entire system and equipped with high-performance, costly sensors, will intervene only in the event of a detected incident.
Robots patrol according to a predefined schedule until one detects a potential threat, at which point it sends an alarm signal to the drone to initiate an intervention, as depicted schematically in Figure 1. By initiating an automated intervention sequence, the drone enters an alert state, powers up its engines, and prepares to take off and inspect the area of interest. It quickly heads to the location indicated by the mobile robot, covering the distance in a much shorter time than another mobile robot or human intervention agent could. Simultaneously, all its equipment (sensors and camera) is activated.
The drone begins capturing images and videos from the air using its high-resolution cameras. Additionally, modern drones can utilize thermal cameras to identify heat sources at night or in low-visibility conditions, an option that could be implemented in future projects or viewed as an enhancement to the current project.
The data captured by the drone are transmitted in real time to the control center, where human operators or AI algorithms assess whether a genuine threat exists.
The drone maintains aerial surveillance, providing additional information for mobile robots or other units available for response. Based on observation results, the drone will either continue to monitor the area or collaborate with the mobile robot to track the target. If a physical intervention is needed, it can be carried out by other specialized units, supported by detailed information provided by the drone. Thus, the drone functions as an aerial observer within the system, offering continuous surveillance when required and enabling a rapid response to incidents detected by the mobile robot.
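To make the intervention sequence above concrete, a minimal Python sketch of the drone's alert logic is given below. The state names and the SurveillanceDrone class are illustrative assumptions, not the controller used in this work; the real sequence involves flight control and sensor activation omitted here.

```python
from enum import Enum, auto

class DroneState(Enum):
    IDLE = auto()
    ALERT = auto()        # engines powered, sensors and camera activated
    EN_ROUTE = auto()     # flying to the reported coordinates
    OBSERVING = auto()    # streaming imagery to the control center

class SurveillanceDrone:
    """Minimal sketch of the intervention sequence described above."""

    def __init__(self):
        self.state = DroneState.IDLE
        self.target = None

    def on_alarm(self, robot_xy):
        # A patrolling robot reported an anomaly at robot_xy.
        self.target = robot_xy
        self.state = DroneState.ALERT      # power up, enable sensors
        self.state = DroneState.EN_ROUTE   # take off toward the target

    def on_arrival(self):
        self.state = DroneState.OBSERVING  # begin capturing and streaming data

drone = SurveillanceDrone()
drone.on_alarm((3.2, 1.5))    # coordinates in metres, from the mobile robot
drone.on_arrival()
print(drone.state)            # DroneState.OBSERVING
```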

3. Implementation and Testing of an Indoor GPS System

The Marvelmind navigation system is an off-the-shelf indoor navigation system designed to provide precise location data (±2 cm) for autonomous robots and vehicles (AGV—automated guided vehicles). It can also track moving objects through attached mobile beacons. This navigation system includes fixed ultrasonic beacons connected via radio, one or more mobile beacons on objects being tracked, and a router providing access to a computer or peripheral device for monitoring. The mobile beacon’s location is calculated based on the ultrasonic time of flight (TOF) delay between stationary and mobile beacons using a trilateration algorithm.
The system operates by measuring the distances between beacons and mobile devices using ultrasonic signals and trilateration algorithms, achieving centimeter-level accuracy. Marvelmind systems can cover distances of up to 50 m between beacons, depending on environmental conditions such as obstacles or interference. The total coverage area can be extended by adding more beacons, forming a scalable grid; by strategically placing beacons, the system can track objects or robots across extensive indoor spaces such as warehouses, factories, and office buildings, making it suitable for robotics and industrial applications.
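For illustration, a minimal 2D trilateration sketch is given below. It uses the generic textbook linearization of the range-circle equations solved by least squares; this is not necessarily Marvelmind's proprietary algorithm, and the beacon layout and range values are invented for the example.

```python
import numpy as np

def trilaterate_2d(anchors, distances):
    """Least-squares 2D trilateration from >= 3 fixed beacons.

    anchors:   (n, 2) array of known beacon positions (metres)
    distances: (n,) array of ranges derived from ultrasonic TOF
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Subtract the first beacon's circle equation from the others to
    # linearise: 2(x_i - x_0)x + 2(y_i - y_0)y = d_0^2 - d_i^2 + r_i^2 - r_0^2
    A = 2.0 * (anchors[1:] - anchors[0])
    r2 = np.sum(anchors**2, axis=1)
    b = d[0]**2 - d[1:]**2 + r2[1:] - r2[0]
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

beacons = [(0.0, 0.0), (5.0, 0.0), (0.0, 4.0), (5.0, 4.0)]
ranges = [2.5, 3.9, 3.2, 4.4]   # e.g. TOF * speed of sound (~343 m/s)
print(trilaterate_2d(beacons, ranges))
```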
The system is composed of stationary and mobile beacons as well as a modem needed to communicate both with the beacons and with an appropriate application.
The stationary beacons (Figure 2) are usually mounted on walls or ceilings above the robot, with ultrasonic sensors facing downwards to achieve optimal signal coverage. For automatic landing and indoor navigation of copters, it is recommended to install a mobile beacon on the bottom of the flying system, oriented downwards. The placement and orientation of the beacons should be configured to ensure maximum coverage of the ultrasonic signal, as the system's effectiveness largely depends on the quality of the signal received by the stationary beacons. During the initial mapping configuration, stationary beacons both emit and receive ultrasonic signals. Switching a beacon between roles (stationary and mobile) is accomplished via the application software and does not require any hardware modifications.
As stated before, an important part of the system is the router, which acts as the central data aggregation point for the system. It must remain powered while the navigation system is operational. The router also enables configuration, monitoring, and dashboard interaction. It can be placed anywhere within radio coverage, typically up to 100 m with antennas from the starter kit. The router is represented in Figure 3.
The indoor positioning system is accompanied by an application for the initial programming and configuration of the beacons. The application’s front-end dashboard is shown in Figure 4.
To obtain preliminary results and perform a general functionality test, the system was set up in a laboratory, where initial measurements were conducted. Figure 5 shows the equipment setup used for configuring the workspace, consisting of five beacons and a modem.
For the initial testing of the Marvelmind indoor positioning system, a configuration with two fixed and three mobile beacons (2D setup) was chosen. In this configuration, two stationary beacons were mounted at a height of 1 m from the perimeter of the test area. This setup is demonstrated in Figure 5. By utilizing this arrangement, the system was able to map and track positions within the designated test space, allowing for real-time monitoring and testing of the system’s capabilities under controlled conditions.
The mobile beacons will be introduced into the designated work perimeter one by one, manually. After each beacon’s introduction, the system’s reported coordinates will be verified for accuracy, and results will be documented. Figure 5 illustrates the placement of all beacons along with the output of the Marvelmind application.
Figure 6a–h are screenshots from the Marvelmind dashboard for different scenarios, including a single robot moving in a straight line in different directions, a curvilinear path for one, two, or three robots, etc.
Figure 7a–h are graphs plotted in the horizontal plane based on the (x, y) coordinates provided by the mobile beacons. These were generated using a Python 3 program and show trajectories that resemble those in Figure 6a–h, though not identical.
As expected, the two types are similar. The first set of images provides a qualitative view of the process, while the second set, based on real-time coordinates, is more precise and quantitatively valuable.
The objective of this test was to develop an application that not only displays beacon positions quantitatively and visually, but also provides coordinates in a format easily processed and used for the control of mobile robots and the entire system.
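A minimal sketch of such an application is shown below, assuming the coordinates have been logged to a CSV file with one "beacon_id,x,y" row per sample; the actual Marvelmind streaming format differs and is not reproduced here.

```python
import csv
from collections import defaultdict
import matplotlib.pyplot as plt

# Assumed log format: one row per sample, "beacon_id,x,y" in metres.
tracks = defaultdict(lambda: ([], []))
with open("beacon_log.csv", newline="") as f:
    for beacon_id, x, y in csv.reader(f):
        xs, ys = tracks[beacon_id]
        xs.append(float(x))
        ys.append(float(y))

for beacon_id, (xs, ys) in tracks.items():
    plt.plot(xs, ys, label=f"beacon {beacon_id}")  # one colour per beacon
plt.xlabel("x [m]")
plt.ylabel("y [m]")
plt.legend()
plt.axis("equal")
plt.show()
```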

4. Theoretical Considerations on the Positions Obtained from the Indoor Positioning System

In order to establish a mathematical means of confirming the positions returned by the indoor positioning system, a simple geometrical model was used. In the following, we consider four fixed beacons arranged as in Figure 8, placed halfway along the sides of the considered work volume and all at the same height. In this situation, the dimensions of the work volume Lx, Ly, and Lz are known. We attach to this volume a Cartesian coordinate system Oxyz with the origin at the center of the base surface. The robot moves on the base surface, which is considered flat, and its current coordinates are (x*, y*). The Marvelmind system used allows the real-time measurement of the distances between the fixed beacons BF1, BF2, BF3, and BF4 and the beacon mounted on the robot R, i.e., the distances d1, d2, d3, and d4 (Figure 8). Next, the robot coordinates will be determined from the geometry considered above and the measured distances.
  • BF1, BF2, BF3, BF4—fixed beacons;
  • R—mobile beacon;
  • Lx, Ly, Lz—dimensions of workspace;
  • d1, d2, d3, d4—distances to fixed beacons, respectively;
  • x*, y*—robot current coordinates.
In the right triangle ABR, the Pythagorean relation can be written:

$AR = \sqrt{x^{*2} + \left(y^* + \frac{L_y}{2}\right)^2},$

and in the right triangle $B_{F1}AR$:

$d_1 = \sqrt{L_z^2 + x^{*2} + \left(y^* + \frac{L_y}{2}\right)^2},$

and finally:

$x^{*2} + \left(y^* + \frac{L_y}{2}\right)^2 = d_1^2 - L_z^2. \quad (1)$
Finally, the equations to determine the position of the robot relative to the fixed beacons are as follows:
Beacon | Equation
BF1 | $x^{*2} + \left(\frac{L_y}{2} + y^*\right)^2 = d_1^2 - L_z^2$
BF2 | $y^{*2} + \left(\frac{L_x}{2} + x^*\right)^2 = d_2^2 - L_z^2$
BF3 | $x^{*2} + \left(\frac{L_y}{2} - y^*\right)^2 = d_3^2 - L_z^2$
BF4 | $y^{*2} + \left(\frac{L_x}{2} - x^*\right)^2 = d_4^2 - L_z^2$
Note: in the considered situation, since the robot moves on a flat surface, two fixed beacons are sufficient to determine the current coordinates of the robot.
Next, the two coordinates will be determined based on the information received from beacons BF1 and BF3.
Once the distances d1 and d3 are determined, the current coordinates of the robot, x* and y*, can be computed. Subtracting the BF3 equation from the BF1 equation gives:

$\left(y^* + \frac{L_y}{2}\right)^2 - \left(\frac{L_y}{2} - y^*\right)^2 = d_1^2 - d_3^2,$

and the y* coordinate follows as:

$y^* = \frac{d_1^2 - d_3^2}{2 L_y}.$

By substituting this expression into relation (1), we obtain:

$x^* = \sqrt{d_1^2 - L_z^2 - \left(\frac{d_1^2 - d_3^2}{2 L_y} + \frac{L_y}{2}\right)^2}.$
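These closed-form relations can be checked numerically. The sketch below, with arbitrary workspace dimensions, synthesizes the ranges d1 and d3 for a known robot position and recovers that position; note that the square root leaves the sign of x* ambiguous, which in practice is resolved using another beacon pair.

```python
from math import sqrt

def robot_xy(d1, d3, Ly, Lz):
    """Robot coordinates from the ranges to beacons BF1 and BF3,
    following the closed-form relations derived above (values in metres)."""
    y = (d1**2 - d3**2) / (2.0 * Ly)
    x = sqrt(d1**2 - Lz**2 - (y + Ly / 2.0)**2)   # sign of x must be resolved
    return x, y

# Example: work volume 4 m x 3 m x 2 m, robot actually at (1.0, 0.5)
Lx, Ly, Lz = 4.0, 3.0, 2.0
d1 = sqrt(Lz**2 + 1.0**2 + (0.5 + Ly / 2)**2)   # synthesized range to BF1
d3 = sqrt(Lz**2 + 1.0**2 + (Ly / 2 - 0.5)**2)   # synthesized range to BF3
print(robot_xy(d1, d3, Ly, Lz))                 # ~(1.0, 0.5)
```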
In the same way, the coordinates of x* and y* can be determined by considering the information from other pairs of fixed beacons. In this situation, the coordinates can be determined with very good precision, each of them representing the arithmetic mean of the coordinates calculated for each individual case.
Following this model, a second variant was considered, which allows the positions of the mobile beacons to be determined using their coordinates relative to the fixed beacons. To build this second model, the working area was divided into squares, each with a side of 750 mm. Figure 9 depicts a schematic representation of the work area, where point A, with XY coordinates (0, 140 cm), and point B, with coordinates (0, 280 cm), were marked. Alongside this division, a position calculation model for the beacons was established based on their coordinates. The model will be used to verify the data provided by the indoor positioning system. To perform this verification, the positions of the fixed beacons BF1 and BF2, as well as the initial points in the plane where the mobile beacons BM1, BM2, and BM3 will be placed, must be determined.
Once these positions are established, the distances d1, d2, and d3 (Figure 10) between the fixed beacon BF1 and the mobile beacons BM1, BM2, and BM3, respectively, can be calculated, as well as the distances D1, D2, and D3 (Figure 11) between the fixed beacon BF2 and the mobile beacons BM1, BM2, and BM3, respectively.
Mobile beacon coordinates are as follows:

$B_{M1}\left(\frac{L_x}{4}, \frac{L_y}{2}, 0\right), \quad B_{M2}\left(\frac{L_x}{2}, \frac{L_y}{2}, 0\right), \quad B_{M3}\left(\frac{3L_x}{4}, \frac{L_y}{2}, 0\right).$

The right triangles $\Delta ABB_{M1}$, $\Delta ABB_{M2}$, and $\Delta ABB_{M3}$ allow determining:

$a_1 = \sqrt{\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}, \quad a_2 = \sqrt{4\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}, \quad a_3 = \sqrt{9\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}.$

Additionally, the right triangles $\Delta BB_{F1}B_{M1}$, $\Delta BB_{F1}B_{M2}$, and $\Delta BB_{F1}B_{M3}$ reveal:

$d_1 = \sqrt{L_z^2 + \left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}, \quad d_2 = \sqrt{L_z^2 + 4\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}, \quad d_3 = \sqrt{L_z^2 + 9\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}.$

The right triangles $\Delta CDB_{M1}$, $\Delta CDB_{M2}$, and $\Delta CDB_{M3}$ give:

$b_1 = \sqrt{9\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}, \quad b_2 = \sqrt{4\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}, \quad b_3 = \sqrt{\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2},$

and from the right triangles $\Delta BB_{F2}B_{M1}$, $\Delta BB_{F2}B_{M2}$, and $\Delta BB_{F2}B_{M3}$:

$D_1 = \sqrt{L_z^2 + 9\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}, \quad D_2 = \sqrt{L_z^2 + 4\left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}, \quad D_3 = \sqrt{L_z^2 + \left(\frac{L_x}{4}\right)^2 + \left(\frac{L_y}{2}\right)^2}.$
Finally, this model allowed us to compare the results obtained from the system to the ones calculated.
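As a sketch of this verification, the snippet below computes the predicted ranges d1–d3 and D1–D3 from the relations above and compares them against system readings; the workspace dimensions and the measured values are placeholders, not the experimental data.

```python
from math import sqrt

def expected_ranges(Lx, Ly, Lz):
    """Predicted ranges d_k (to BF1) and D_k (to BF2) for the three mobile
    beacons placed at k*Lx/4 along the mid-line, per the relations above."""
    d = [sqrt(Lz**2 + (k * Lx / 4)**2 + (Ly / 2)**2) for k in (1, 2, 3)]
    D = [sqrt(Lz**2 + ((4 - k) * Lx / 4)**2 + (Ly / 2)**2) for k in (1, 2, 3)]
    return d, D

d, D = expected_ranges(Lx=3.0, Ly=3.0, Lz=1.0)   # dimensions assumed
measured_d = [1.6, 2.1, 2.8]                     # placeholder system readings
errors = [m - e for m, e in zip(measured_d, d)]
print([f"{e * 1000:+.0f} mm" for e in errors])
```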

5. Testing of Indoor Positioning System

In order to test the indoor positioning system from both a precision and a functional standpoint, a test bench was set up consisting of an electrically driven translation axis on which the mobile beacon is mounted. Since the functional characteristics of the axis (length, speed, and positioning accuracy) are known in advance, the positions reached by moving the mobile slider of the axis can be compared with the coordinates given by the indoor GPS system.
The translation axis is driven by a DC motor equipped with a reduction gear and an incremental transducer. The rotational movement of the shaft at the output of the gear is transmitted to a mobile sled by means of a toothed belt.
The motor drives a toothed belt through a toothed pulley (D = 16 mm, 18 teeth, GT2 profile); a translation sled is fixed to the belt and guided by ball bushings on a guide shaft with a diameter of 8 mm. The axis is presented in Figure 12 and Figure 13.
To obtain fixed reference points in the analyzed workspace, a matrix composed of squares was constructed, each with a side of approximately 750 mm. A schematic of the surface is presented in Figure 14. Figure 15 shows the positioning of the axis over one of the points of the matrix, allowing for a somewhat precise positioning relative to the working area.
The functionality of the incremental encoder was tested by programming the microcontroller to process the signals received from the position transducer. These signals were interpreted using a PID positioning algorithm to achieve accurate positioning of the mobile sled. Motor control is accomplished through a dedicated driver, which requires two digital control signals; one of these is a PWM signal used to regulate motor speed, where the speed is directly proportional to the duty cycle of the control signal.
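A minimal sketch of such a loop is given below: a textbook PID controller whose output is clamped to a PWM duty cycle, driving a crude stand-in for the motor and sled. The gains and the plant model are illustrative assumptions, not the values used on the test bench.

```python
class PID:
    """Textbook PID controller, as a sketch of the positioning loop above."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=2.0, ki=0.0, kd=0.05, dt=0.01)   # gains are purely illustrative
target_mm, sled_mm = 300.0, 0.0               # sled position from the encoder
for _ in range(1000):
    u = pid.update(target_mm, sled_mm)
    duty = max(-1.0, min(1.0, u / 100.0))     # clamp to the valid PWM range
    # On the real driver: direction pin = sign(duty), PWM duty = abs(duty).
    sled_mm += 300.0 * duty * pid.dt          # stand-in plant: 300 mm/s at full duty
print(round(sled_mm, 1))                      # settles near the 300 mm setpoint
```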
The closed-loop axis positioning test used the PID algorithm to drive the axis to specific recorded positions, which were then verified using the beacon system. The axis achieved a positioning accuracy of ±1 mm. Although this level of error might otherwise be considered significant, it was deemed acceptable since the axis is intended to test a positioning system with an accuracy of ±20 mm. For this test, two fixed beacons were installed at a height of 1 m above the test surface. Their placement is illustrated in Figure 16.
Figure 17 shows the position of the mobile beacon, fixed on the mobile sled of the translation axis. It will move according to the commands received from the microcontroller.
In the Marvelmind application, the workspace was created, in which the system origin (the (0, 0) coordinate point) was defined, as well as the positions of the fixed beacons, marked in Figure 18 with the numbers 4 and 8. In the same figure, the mobile beacon is marked with the number 6.
Following these stages, the functionality of both the translation axis and the Marvelmind indoor positioning system was successfully demonstrated, enabling controlled positioning and accurate position determination. Subsequently, a test program was developed in Python to evaluate the two systems simultaneously, with the aim of validating the coordinates provided by the indoor GPS system against the translation axis. For this purpose, an algorithm was implemented to monitor the state of the end-of-stroke sensor and drive the electric motor until the sensor is triggered; at this point, the encoder counter is reset, defining the zero position of the translation axis. Starting from this reference point, the axis is moved across its entire working stroke based on the encoder data. Simultaneously, signals from the mobile beacon are collected, and the commands issued to the axis are synchronized and compared with the positions reported by the indoor positioning system.
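The following sketch mirrors the structure of that test program, with the hardware calls replaced by stand-ins: the homing routine is omitted, and the axis and beacon readings are simulated with the tolerances stated in this paper (±1 mm and ±20 mm), so the harness runs as-is.

```python
import random
import time

AXIS_STOPS_MM = [0, 100, 200, 300, 400, 500, 600]

def move_axis_to(pos_mm):
    """Stand-in for the PID-controlled move; the real axis settles to +/-1 mm."""
    return pos_mm + random.uniform(-1.0, 1.0)

def read_beacon_y_mm(true_mm):
    """Stand-in for the mobile-beacon stream; the spec accuracy is +/-20 mm."""
    return true_mm + random.uniform(-20.0, 20.0)

log = []
for target in AXIS_STOPS_MM:              # sweep the whole working stroke
    axis_mm = move_axis_to(target)
    time.sleep(0.1)                       # dwell; the real test paused longer
    log.append((target, axis_mm, read_beacon_y_mm(axis_mm)))

for target, axis_mm, beacon_mm in log:
    print(f"{target:4d} mm  axis={axis_mm:6.1f}  beacon={beacon_mm:6.1f}  "
          f"err={beacon_mm - axis_mm:+6.1f} mm")
print("within spec:", max(abs(b - a) for _, a, b in log) <= 20.0)
```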
Results are presented in the graph in Figure 19, which shows the evolution of the Y coordinate of the mobile beacon over time. Plateaus can be observed where the axis stops at the positions 0, 100, 200, 300, 400, 500, and 600 mm, as well as the transitions of the mobile sled between these positions, marked with colored areas in the graph.
Figure 20 shows the comparative evolution of the axis position (orange line) and the position recorded by the mobile beacon (blue line). The resulting error (the difference between the two measurements) is presented graphically in Figure 21, where it can be seen that the manufacturer-specified value of ±20 mm is not exceeded.

6. Discussion

After testing the indoor positioning system, a small team of mobile robots was put together to better understand how the system might behave in certain scenarios.
Figure 22a shows the structure of a tracked robot. For this type of robot, a commercially available chassis model was used; the chosen chassis is equipped with two DC motors with gearboxes, one for each track. In order to navigate the environment, the robots have been equipped with four ultrasonic sensors, as depicted in Figure 22b. The tracked robots are programmed using a very simple algorithm that allows them to switch between random movement and heading straight along the longest free distance indicated by the ultrasonic sensors, as sketched below. More broadly, 3D printing and autonomous robots are transforming industries by enabling rapid, customizable manufacturing and precise automation [31]. All inter-robot communication is performed over Wi-Fi, using the module included in either the microcontrollers or SBCs in their construction; this also holds for the drone included in the system.
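A minimal sketch of this patrol heuristic follows; the sensor names and the probability of choosing a random direction are illustrative assumptions rather than the deployed firmware.

```python
import random

SENSORS = ["front", "right", "back", "left"]   # four ultrasonic sensors

def pick_heading(distances_cm, explore_prob=0.2):
    """Heuristic used by the tracked robots: usually head where the most
    free space is, occasionally pick a random direction to vary the patrol."""
    if random.random() < explore_prob:
        return random.choice(SENSORS)
    return max(distances_cm, key=distances_cm.get)

readings = {"front": 220.0, "right": 45.0, "back": 180.0, "left": 90.0}
print(pick_heading(readings))   # usually "front": the longest clear distance
```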
In order to diversify the team, two four-legged robots were included. A commercially available variant was chosen: a robotic dog equipped with visual artificial intelligence, featuring 12 degrees of freedom. It is built using six servomotors, aluminum alloy brackets, and a video camera. The robot can perform a variety of human-like actions and move in any direction, controlling its pose in six degrees of freedom: position along the X, Y, and Z axes and orientation in pitch, roll, and yaw. Equipped with IMU sensors and angle sensors for the servomotors, DOGZILLA provides real-time feedback on its position and joint angles, and, using inverse kinematics algorithms, the robot can execute various types of movements. A Raspberry Pi serves as the main controller, complemented by additional equipment such as a Lidar and a voice module. Through programming in Python, the robot can perform diverse functions, including AI-based visual recognition, navigation using Lidar maps, and voice control. An image of the robot is shown in Figure 23.
For enhanced efficiency and remote monitoring, the surveillance system has been upgraded by introducing a drone equipped with the same type of beacon for indoor testing. This will assist with positioning the drone in the environment. In this stage, only 2D positioning of the device is needed. Figure 24 shows a view of all the robots to be used.
Following this stage, after the system is installed and activated, it operates through a series of coordinated steps to ensure efficient monitoring and response. First, the Marvelmind measurement systems continuously track and read the position of each team member in real time, ensuring precise localization at any moment. Next, the system identifies the location of an event or anomaly using data collected by the sensors integrated into the mobile robots. These sensors are designed to detect unusual activities, environmental changes, or critical conditions in the monitored area.
If an anomaly is detected, the system immediately dispatches a drone to the exact coordinates of the mobile robot that initially reported the event. The drone, equipped with advanced sensors and real-time monitoring capabilities, positions itself close to the critical location to gather supplementary information. This allows the system to refine its understanding of the situation, ensuring more accurate assessments.
The drone transmits detailed data back to the command systems, including visual, positional, or environmental feedback. With this additional input, the command system analyzes the event and determines the most appropriate course of action based on the nature and severity of the anomaly. This step ensures that responses are efficient, targeted, and tailored to the specific circumstances, ultimately improving the overall reliability and effectiveness of the surveillance and intervention system.
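The following sketch condenses this detect, dispatch, and assess cycle into a single coordination step; all class and function names are hypothetical stand-ins for the real subsystems.

```python
from dataclasses import dataclass

@dataclass
class Anomaly:
    kind: str
    position: tuple   # (x, y) from the Marvelmind beacon on the robot

def coordination_step(robot_reports, dispatch_drone, assess):
    """One pass of the loop described above: robots report, the drone is sent
    to the first anomaly, and the command system decides the response."""
    for report in robot_reports:
        if report is None:
            continue                      # nothing detected by this robot
        evidence = dispatch_drone(report.position)   # fly out, gather data
        return assess(report, evidence)   # monitor, track, or escalate
    return "continue_patrol"

# Hypothetical stand-ins for the real subsystems:
reports = [None, Anomaly("motion", (3.2, 1.5)), None]
decision = coordination_step(
    robot_reports=reports,
    dispatch_drone=lambda xy: {"video": "...", "position": xy},
    assess=lambda rep, ev: f"track {rep.kind} at {ev['position']}",
)
print(decision)   # track motion at (3.2, 1.5)
```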

7. Conclusions

The paper describes the steps taken to install, deploy, and test an indoor positioning system that will be used to synchronize the movements of mobile autonomous robots in order to implement a new surveillance algorithm. The surveillance of strategic targets, such as critical infrastructures, military areas, borders, transport networks, or sensitive industrial spaces, poses significant technological and security challenges. These challenges are generated by the increasing complexity of threats, from cyber attacks and physical sabotage to industrial security incidents or terrorism.
The involvement of humans in the surveillance of these strategic targets may remain important, but it has a number of limitations and challenges. Long-term surveillance can lead to fatigue and decreased attention, which can compromise human efficiency, especially in the case of monitoring complex systems or physical patrols. Human personnel can be a direct target for attackers, and their physical security is a concern in itself. Security personnel require ongoing training, salaries, and the provision of a safe working environment, which can involve significant costs, especially in areas where constant surveillance is required. In hard-to-reach places or in extreme environmental conditions, people may be physically limited, and the use of robots and drones becomes more efficient and safer.
Following the work presented in this paper, a synchronization algorithm coordinates the movements of all robots to avoid overlapping tasks and maximize target coverage. The robots communicate with each other by radio to update each other's positions and states in real time. Each robot has an optimized path, avoiding collisions and improving reaction time. Using proximity sensors, each robot detects activities or anomalies in its area of action and, in case of danger, transmits the data to the drone. The synchronization algorithm manages these data in real time and transmits commands to the drone.
Further testing of the system will be performed in a 3D environment, allowing all the autonomous robots (terrestrial and flying) to be used concurrently without limiting operation to a single plane. Following the analysis of the research results and the experimental determinations presented in this paper, several further development directions have been identified: the implementation of the proposed system for various beneficiaries, the development of an external monitoring system for strategic objectives, the integration of artificial intelligence to develop new control and command algorithms for the system, an economic study of the developed system, and the implementation of other types of robots and drones capable of operating in outdoor conditions. Additionally, it is necessary to include in the control algorithm conditions related to integrating human partners into the system.

Author Contributions

Conceptualization, V.C. and A.V.; methodology, V.C., A.V. and E.M.; software, V.C. and A.V.; validation, M.A. and E.M.; formal analysis, M.A. and E.M.; investigation, V.C., E.M. and A.V.; resources; data curation, V.C.; writing—original draft preparation, E.M. and V.C.; writing—review and editing, A.V., M.A. and E.M.; visualization, V.C. and E.M.; supervision, E.M. and M.A.; project administration, V.C.; funding acquisition, V.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Austin, R. Unmanned Aircraft Systems: UAVS Design, Development and Deployment; John Wiley & Sons: Hoboken, NJ, USA, 2011.
  2. Raj, R.; Kos, A. A comprehensive study of mobile robot: History, developments, applications, and future research perspectives. Appl. Sci. 2022, 12, 6951.
  3. Sharma, N.; Pandey, J.K.; Mondal, S. A review of mobile robots: Applications and future prospect. Int. J. Precis. Eng. Manuf. 2023, 24, 1695–1706.
  4. Chun, W.H.; Papanikolopoulos, N. Robot surveillance and security. In Springer Handbook of Robotics; Springer: Cham, Switzerland, 2016; pp. 1605–1626.
  5. Chang, H.C.; Hsu, Y.L.; Hsiao, C.Y.; Chen, Y.F. Design and implementation of an intelligent autonomous surveillance system for indoor environments. IEEE Sens. J. 2021, 21, 17335–17349.
  6. Nitzan, D. Development of intelligent robots: Achievements and issues. IEEE J. Robot. Autom. 1985, 1, 3–13.
  7. Rojas, I.; Joya, G.; Catala, A. (Eds.) Advances in Computational Intelligence. In Proceedings of the 16th International Work-Conference on Artificial Neural Networks, IWANN 2021, Virtual Event, 16–18 June 2021; Proceedings, Part I; Springer Nature: Berlin/Heidelberg, Germany, 2021; Volume 12861.
  8. Di Paola, D.; Milella, A.; Cicirelli, G.; Distante, A. An autonomous mobile robotic system for surveillance of indoor environments. Int. J. Adv. Robot. Syst. 2010, 7, 8.
  9. Dharmasena, T.; Abeygunawardhana, P. Design and implementation of an autonomous indoor surveillance robot based on Raspberry Pi. In Proceedings of the 2019 International Conference on Advancements in Computing (ICAC), Malabe, Sri Lanka, 5–6 December 2019; pp. 244–248.
  10. De Cristóforis, P.; Nitsche, M.; Krajník, T.; Pire, T.; Mejail, M. Hybrid vision-based navigation for mobile robots in mixed indoor/outdoor environments. Pattern Recognit. Lett. 2015, 53, 118–128.
  11. Ciuffreda, I.; Casaccia, S.; Revel, G.M. A multi-sensor fusion approach based on PIR and ultrasonic sensors installed on a robot to localise people in indoor environments. Sensors 2023, 23, 6963.
  12. Chiperi, M.; Trascau, M.; Mocanu, I.; Florea, A.M. Data fusion in a multi agent system for person detection and tracking in an intelligent room. In Intelligent Distributed Computing VIII; Springer International Publishing: New York, NY, USA, 2015; pp. 385–394.
  13. Huang, J.; Junginger, S.; Liu, H.; Thurow, K. Indoor positioning systems of mobile robots: A review. Robotics 2023, 12, 47.
  14. Sandamini, C.; Maduranga, M.W.P.; Tilwari, V.; Yahaya, J.; Qamar, F.; Nguyen, Q.N.; Ibrahim, S.R.A. A review of indoor positioning systems for UAV localization with machine learning algorithms. Electronics 2023, 12, 1533.
  15. Che, F.; Ahmed, Q.Z.; Lazaridis, P.I.; Sureephong, P.; Alade, T. Indoor positioning system (IPS) using ultra-wide bandwidth (UWB)—For industrial Internet of Things (IIoT). Sensors 2023, 23, 5710.
  16. Albraheem, L.; Alawad, S. A hybrid indoor positioning system based on visible light communication and Bluetooth RSS trilateration. Sensors 2023, 23, 7199.
  17. Wan, Q.; Wu, T.; Zhang, K.; Liu, X.; Cheng, K.; Liu, J.; Zhu, J. A high precision indoor positioning system of BLE AOA based on ISSS algorithm. Measurement 2024, 224, 113801.
  18. Pascacio, P.; Casteleyn, S.; Torres-Sospedra, J.; Lohan, E.S.; Nurmi, J. Collaborative indoor positioning systems: A systematic review. Sensors 2021, 21, 1002.
  19. Subedi, S.; Pyun, J.Y. A survey of smartphone-based indoor positioning system using RF-based wireless technologies. Sensors 2020, 20, 7230.
  20. Lin, P.T.; Liao, C.A.; Liang, S.H. Probabilistic indoor positioning and navigation (PIPN) of autonomous ground vehicle (AGV) based on wireless measurements. IEEE Access 2021, 9, 25200–25207.
  21. Kunhoth, J.; Karkar, A.; Al-Maadeed, S.; Al-Ali, A. Indoor positioning and wayfinding systems: A survey. Hum.-Centric Comput. Inf. Sci. 2020, 10, 18.
  22. Kim Geok, T.; Zar Aung, K.; Sandar Aung, M.; Thu Soe, M.; Abdaziz, A.; Pao Liew, C.; Hossain, F.; Tso, C.P.; Yong, W.H. Review of indoor positioning: Radio wave technology. Appl. Sci. 2020, 11, 279.
  23. Khan, M.N.; Jamil, M.; Gilani, S.O.; Ahmad, I.; Uzair, M.; Omer, H. Photo detector-based indoor positioning systems variants: A new look. Comput. Electr. Eng. 2020, 83, 106607.
  24. Ridolfi, M.; Kaya, A.; Berkvens, R.; Weyn, M.; Joseph, W.; Poorter, E.D. Self-calibration and collaborative localization for UWB positioning systems: A survey and future research directions. ACM Comput. Surv. 2021, 54, 1–27.
  25. Siegwart, R.; Nourbakhsh, I.R.; Scaramuzza, D. Introduction to Autonomous Mobile Robots; MIT Press: Cambridge, MA, USA, 2011.
  26. Available online: https://www.esa.int/Enabling_Support/Space_Engineering_Technology/Automation_and_Robotics/Nanokhod (accessed on 3 November 2024).
  27. Ezra, N.; Cohen, A.; Zarrouk, D. Modeling, simulation, and experiments of a flexible track robot over rigid horizontal and inclined surfaces. Mech. Mach. Theory 2024, 199, 105689.
  28. Rubio, F.; Valero, F.; Llopis-Albert, C. A review of mobile robots: Concepts, methods, theoretical framework, and applications. Int. J. Adv. Robot. Syst. 2019, 16, 1729881419839596.
  29. Tanaka, K.; Okamoto, Y.; Ishii, H.; Kuroiwa, D.; Mitsuzuka, J.; Yokoyama, H.; Yokoyama, H.; Inoue, S.; Shi, Q.; Okabayashi, S.; et al. Hardware and control design considerations for a monitoring system of autonomous mobile robots in extreme environment. In Proceedings of the 2017 IEEE International Conference on Advanced Intelligent Mechatronics (AIM), Munich, Germany, 3–7 July 2017.
  30. Svasta, P.M.; Hapenciuc, I.-A. Autonomous vehicle for the investigation of dangerous enviroments. Univ. Politeh. Buchar. Sci. Bull. Ser. C Electr. Eng. 2011, 73, 221–240.
  31. Besnea, D.; Rizescu, D.; Rizescu, C.; Dinu, E.; Constantin, V.; Moraru, E. Additive technologies and materials for realization of elastic elements. In Proceedings of the International Conference of Mechatronics and Cyber-MixMechatronics; Springer International Publishing: Cham, Switzerland, 2019; pp. 62–70.
Figure 1. Basic algorithm—steps.
Figure 2. Stationary/mobile beacon.
Figure 3. Router.
Figure 4. Marvelmind dashboard.
Figure 5. Using three beacons.
Figure 6. Marvelmind application tests (a–h).
Figure 7. Proposed application results (a–h). Different color plots are for different beacons.
Figure 8. Coordinates of fixed beacon BF1.
Figure 9. Coordinates of fixed beacon BF1 in matrix.
Figure 10. Determining d1, d2, d3.
Figure 11. Determining D1, D2, D3.
Figure 12. Translation axis.
Figure 13. Motor structure and gearbox detail.
Figure 14. Reference point matrix diagram.
Figure 15. Translation axis mounting diagram.
Figure 16. Position of the mobile beacon and fixed beacon.
Figure 17. Mobile beacon installation detail.
Figure 18. Defining the Marvelmind application work area.
Figure 19. The position in time of the mobile beacon.
Figure 20. Comparative evolution of indoor GPS system.
Figure 21. Error of mobile beacon against linear axis.
Figure 22. (a) Tracked robot. (b) Ultrasonic sensors mounting.
Figure 23. Robotic dogs.
Figure 24. Team of robots.