Article

Local Positioning System Using Flickering Infrared LEDs

1 Aix Marseille University, CNRS, ISM, Marseille 13009, France
2 Temasek Labs, National University of Singapore, Singapore 117411, Singapore
* Author to whom correspondence should be addressed.
Sensors 2017, 17(11), 2518; https://doi.org/10.3390/s17112518
Submission received: 29 September 2017 / Revised: 25 October 2017 / Accepted: 26 October 2017 / Published: 3 November 2017

Abstract

A minimalistic optical sensing device for indoor localization is proposed to estimate the relative position between the sensor and active markers using amplitude-modulated infrared light. The innovative insect-based sensor can measure azimuth and elevation angles with respect to two small and cheap active infrared light emitting diodes (LEDs) flickering at two different frequencies. In comparison to a previous lensless visual sensor that we proposed for proximal localization (less than 30 cm), we implemented: (i) a minimalistic sensor in terms of small size (10 cm³), light weight (6 g) and low power consumption (0.4 W); (ii) an Arduino-compatible demodulator for fast analog signal processing requiring low computational resources; and (iii) an indoor positioning system for a mobile robotic application. Our results confirmed that the proposed sensor was able to estimate the position at a distance of 2 m with an accuracy as small as 2 cm at a sampling frequency of 100 Hz. Our sensor is also suitable for implementation in a position feedback loop for indoor robotic applications in GPS-denied environments.

1. Introduction

In the absence of a global positioning system (GPS), an indoor positioning system must be used to provide the local position of an autonomous robot in a constrained environment with dust, smoke and various lighting conditions (darkness, half-light, and flickering light).
There are numerous existing technical solutions for indoor localization using: (i) infrastructure such as ultrasonic signals [1], ultra-wideband technology [2] and fingerprinting approaches with wireless sensor networks [3]; and (ii) onboard sensors such as monocular cameras, stereo imaging and Light Detection and Ranging (LIDAR). For industrial applications inside warehouses [4], automated guided vehicles (AGVs) localize themselves by triangulation based on reflector landmarks detected by laser scanners. In [5], visual odometry for localization aims at estimating the pose of a vehicle through examination of the changes that motion induces on the images acquired by onboard cameras. In [6], in conjunction with signals produced by inertial sensors and wheel encoders, a map of the magnetic field was used to precisely localize an indoor robot without any additional infrastructure. Moreover, inertial sensors such as accelerometers and rate gyros combined with magnetometers can be used to estimate the angular position, velocity or acceleration of a mobile robot. However, the effect of noise on the integrated signals strongly affects the position estimation and makes it difficult to obtain high-precision information over long periods of time. In [7], in the context of visible light communication, an indoor positioning system using multiple optical receivers composed of photodiodes (PDs) provides the coordinates and orientation of the mobile receiver with an achievable position error of less than 0.1 m. In [8], a linear positioning system based on an infrared (IR) beacon aims at localizing pedestrians indoors using a trigonometrical survey. The IR beacon is attached to the shopping bag. The receiver, a PD array installed in the ceiling at a height of 2.3 m, measures the angle of incidence of the beacon ray. The indoor position of the IR beacon is then calculated and an identifier signal is sent to a computer for processing via wireless communication. In [9], using visible light communication, a novel indoor localization system is presented, where LED beacons determine the position of the target sensor, which includes a camera, an inclinometer, and a magnetometer. The localization is performed using geometric- and consensus-based techniques adapted to a high number of beacons and outliers. The tests presented show that the accuracy of the system is in the low decimeter range.
This paper proposes the development of a very different kind of indoor localization technique. A novel local positioning system is developed based on a bio-inspired optical sensor that is minimalistic in terms of mass, size, cost and computational resources, using photodiodes and flickering IR LEDs. As presented in [10], the comparison between charge-coupled device (CCD) image sensors, complementary metal oxide semiconductor (CMOS) image sensors and PDs pointed out the advantages of PDs in terms of speed, sensitivity, energy consumption and system complexity. Moreover, PDs are easy to fabricate and have low production costs. As proposed in [11], LEDs offer advantageous properties such as reliability, low power consumption and long lifetime, and they can also be used as communication devices. For optical communications in free space under fog and smoke conditions, Ijaz et al. showed that near-infrared light sources are the wavelengths most robust to link failure [12]. In [13], a lensless sensor was prototyped to estimate the position of active IR LEDs for proximal localization (up to 30 cm). To increase the operating range (≥1.5 m), this paper addresses a brand new design for indoor localization in 2D. Using PDs and IR LEDs, our indoor positioning system embeds an innovative optical sensor robust to lighting conditions.
Section 2 introduces the computer-aided design of the sensor in 3D. The fabrication of a new tiny optical sensing device with a custom-made signal processing board mounted as a shield on an Arduino board is presented in Section 3. Section 4 gives a short description of the bio-inspired optical sensor modeling and the principle underlying the signal processing algorithm. In Section 5, indoor localization is performed for the estimation of position in 2D. The localization of a mobile robot in 2D was also tested with the new sensor device implemented in the feedback control loop for trajectory tracking purposes. Section 6 concludes the paper.

2. Sensor Design

The optical sensor device, called HyperCube, designed and developed in this study, is equipped with three photodiodes (Figure 1). Each photodiode is mounted on a face of a tetrahedron. As depicted in Figure 1A,B, the optical axes of the photodiodes are separated by an inter-receptor angle Δφ = 60°, which defines the spatial acuity of the visual system [14].

3. Fabrication

The prototype of the sensor was obtained by 3D printing, as shown in Figure 2A. It is equipped with three photodiodes (Vishay Semiconductors BPV22F). Each photodiode has a maximum absorption at a wavelength of 950 nm, which corresponds to the maximum emissive power of the SF4249 infrared LEDs. As presented in Figure 2B, a custom-made electronic board for the frequency modulation of the flickering IR LEDs can produce two separate signals at 11 kHz and 17 kHz, respectively. The same electronic board, composed of two analog demodulation circuits, performs the acquisition and demodulation. The demodulation board is mounted as a shield on the Arduino board, which performs the visual signal processing and provides an estimation of HyperCube's angular position (i.e., azimuth and elevation angles) with respect to the infrared LEDs.

4. Modeling

Each photodiode features an angular sensitivity which is defined by the angle of acceptance denoted Δ ρ , i.e., the full width at half maximum of the angular sensitivity. A bell-shaped sensitivity function models the angular sensitivity of each photodiode. It was inspired by the Gaussian angular sensitivity function of flies’ photoreceptors as described in [15].

4.1. Angular Sensitivity of the Photosensors

As presented in Figure 3, the angular sensitivity of each photodiode (solid line) is compared to the cosine-like angular sensitivity (dotted line). One can see that the cosine-like angular sensitivity of the model fits the experimental data well and, in fact, fits better than a Gaussian function. Moreover, the angle of acceptance Δρ is equal to 120° and the inter-receptor angle Δφ between Ph_l and Ph_r is equal to 60°.
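To make the sensitivity model concrete, the following Python sketch (our illustration, not code from the original study) compares a truncated cosine with a Gaussian of identical full width at half maximum; since cos(60°) = 0.5, a plain cosine directly yields the acceptance angle Δρ = 120° reported above.

import numpy as np

def cosine_sensitivity(theta_deg):
    # Cosine-like angular sensitivity: cos(60 deg) = 0.5, so the
    # full width at half maximum (acceptance angle) is 120 deg.
    return np.clip(np.cos(np.radians(theta_deg)), 0.0, None)

def gaussian_sensitivity(theta_deg, delta_rho_deg=120.0):
    # Gaussian angular sensitivity with the same FWHM, for comparison.
    sigma = delta_rho_deg / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.exp(-0.5 * (theta_deg / sigma) ** 2)

for angle in (0.0, 30.0, 60.0, 90.0):
    print(f"{angle:5.1f} deg  cosine {cosine_sensitivity(angle):.3f}  "
          f"gaussian {gaussian_sensitivity(angle):.3f}")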

4.2. Principle of the Sensor

Two demodulated photodiode output signals are processed by an Arduino microcontroller. The analog demodulation steps are achieved by our custom-made shield board connected to the Arduino (see Figure 4). As shown in Figure 4A,B, the digital processing computes the relative difference over the sum of two adjacent demodulated photosensor output signals in order to assess the angular measurements (azimuth and elevation) [16]. The demodulation relies on classical lock-in detection. However, the lock-in amplifier requires a reference signal, which is provided here by a phase-locked-loop (PLL) circuit, because there is no physical link between the source (LED) and the receiver (photodiode).
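As an illustration of the lock-in principle described above, the digital sketch below (our own analogue of the analog circuit, not its actual implementation) mixes the photodiode signal with in-phase and quadrature references at the flicker frequency and low-pass filters the products; taking the magnitude makes the result insensitive to the unknown carrier phase, which is the role played by the PLL in the analog board.

import numpy as np
from scipy.signal import butter, filtfilt

def lockin_amplitude(signal, fs, f_carrier, f_cut=500.0):
    # I/Q lock-in detection: mix with quadrature references at the
    # flicker frequency, low-pass filter, and take the envelope.
    t = np.arange(len(signal)) / fs
    i_mix = signal * np.cos(2 * np.pi * f_carrier * t)
    q_mix = signal * np.sin(2 * np.pi * f_carrier * t)
    b, a = butter(2, f_cut / (fs / 2))
    return 2.0 * np.hypot(filtfilt(b, a, i_mix), filtfilt(b, a, q_mix))

# synthetic test: an 11 kHz flicker buried in 100 Hz ambient light and noise
fs = 200_000.0
t = np.arange(0, 0.05, 1 / fs)
s = 0.3 * np.sin(2 * np.pi * 11_000 * t) + 0.5 * np.sin(2 * np.pi * 100 * t)
s = s + 0.05 * np.random.randn(len(t))
print("recovered 11 kHz amplitude:", lockin_amplitude(s, fs, 11_000)[len(t) // 2])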
As depicted in Figure 4, the digital processing operated in the microcontroller returns an output signal $S_{\varphi}$ for the azimuth $\varphi$, with $S_{\varphi}=\frac{S_{Ph_{r}}-S_{Ph_{l}}}{S_{Ph_{r}}+S_{Ph_{l}}}$, and an output signal $S_{\psi}$ for the elevation $\psi$, with $S_{\psi}=\frac{S_{Ph_{m}}-S_{Ph_{virt}}}{S_{Ph_{m}}+S_{Ph_{virt}}}$, where $S_{Ph_{virt}}=\frac{S_{Ph_{l}}+S_{Ph_{r}}}{2}$. According to the visual sensor model mentioned in [13], $S_{\varphi}\propto\tan\varphi$ and $S_{\psi}\propto\tan\psi$. Therefore, the relative position $(\hat{X},\hat{Y})$ of the sensor with respect to the IR LED can be estimated with $\hat{X}=\tan(\varphi)\,\hat{Z}$ and $\hat{Y}=\tan(\psi)\,\hat{Z}$, where $\hat{Z}$ is the a priori known fixed height, as shown in Figure 5.
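The difference-over-sum computation is simple enough to sketch; the snippet below is a minimal illustration of the processing described above (the variable names are ours), assuming the three demodulated amplitudes are already available.

def position_from_amplitudes(s_phl, s_phr, s_phm, z_height):
    # Virtual photosensor: mean of the left and right demodulated signals.
    s_virt = 0.5 * (s_phl + s_phr)
    # Difference over sum, proportional to tan(azimuth) and tan(elevation).
    s_phi = (s_phr - s_phl) / (s_phr + s_phl)
    s_psi = (s_phm - s_virt) / (s_phm + s_virt)
    # Planar position relative to the LED at the known fixed height Z.
    return s_phi * z_height, s_psi * z_height

x_hat, y_hat = position_from_amplitudes(0.42, 0.58, 0.55, z_height=1.5)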

5. Experimental Results

5.1. Position Estimation in 2D

A calibration, which consists of adjusting the sensor outputs $S_{\varphi}$ and $S_{\psi}$ to the ratios $X/Z$ and $Y/Z$, is performed using the Vicon system:
$$\begin{aligned}
\text{minimize}\quad & \frac{1}{n}\sum_{k=1}^{n}\left[(X_{ref}-\hat{X})^{2}+(Y_{ref}-\hat{Y})^{2}\right]\\
\text{subject to}\quad & \frac{X_{ref}}{\hat{Z}} = a_{\varphi 1} S_{\varphi}^{2} + a_{\varphi 2} S_{\varphi} + a_{\varphi 3} S_{\psi}^{2} + a_{\varphi 4} S_{\psi} + b_{\varphi}\\
& \frac{Y_{ref}}{\hat{Z}} = a_{\psi 1} S_{\varphi}^{2} + a_{\psi 2} S_{\varphi} + a_{\psi 3} S_{\psi}^{2} + a_{\psi 4} S_{\psi} + b_{\psi}
\end{aligned} \tag{1}$$
The coefficients $a_{\varphi i}$, $a_{\psi i}$, $b_{\varphi}$ and $b_{\psi}$ are determined using Matlab®. The optimization Criterion (1) minimizes the mean square error between the reference values and the actual data values provided by the sensor. The Matlab® function fminunc is used to compute the coefficients. The localization was tested indoors under several lighting conditions. The optical sensing device was moved in X and Y at a fixed height, as presented in Figure 5. After the calibration, the indoor localization is performed with the sole use of the optical sensor device connected to the demodulation board and the Arduino board.
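For readers who do not use Matlab, the sketch below reproduces the spirit of the fminunc step with SciPy; the quadratic model follows Criterion (1), while the data arrays and the initial guess are placeholders of our own.

import numpy as np
from scipy.optimize import minimize

def quadratic_model(c, s_phi, s_psi, z_hat):
    # Criterion (1): X/Z (or Y/Z) modeled as a quadratic form of the outputs.
    a1, a2, a3, a4, b = c
    return z_hat * (a1 * s_phi**2 + a2 * s_phi + a3 * s_psi**2 + a4 * s_psi + b)

def fit_axis(ref, s_phi, s_psi, z_hat):
    # Least-squares fit of one axis against the Vicon reference positions.
    cost = lambda c: np.mean((ref - quadratic_model(c, s_phi, s_psi, z_hat)) ** 2)
    return minimize(cost, np.zeros(5)).x

# x_ref, y_ref: Vicon references; s_phi, s_psi: recorded sensor outputs
# coeffs_x = fit_axis(x_ref, s_phi, s_psi, z_hat=1.5)
# coeffs_y = fit_axis(y_ref, s_phi, s_psi, z_hat=1.5)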
Figure 6A gives a description of the experimental setup. One can see the location of the modulated infrared emitter and the coverage of the system. In this experiment, HyperCube is fixed to an XY table and moved by hand. One can note from the indoor localization results (Figure 6B) that the position estimation at the fixed height of 150 cm is accurate. The precision obtained features a standard deviation of about 1 cm for the X measurements and below 2 cm for the Y measurements. Experiments over a longer distance are presented in the next section.

5.2. Localization of a Mobile Robot in 2D

In this section, we present the results obtained using HyperCube for the indoor localization in 2D of a mobile robot. A new custom-made electronic board for the frequency modulation of the flickering IR LEDs was built. It can produce two separate signals at 5 kHz and 11 kHz. The mobile robot is equipped with a new custom-made electronic board composed of two analog demodulation circuits in charge of acquisition and demodulation. The experimental setup is presented in Figure 7. The IR LEDs are placed at a height of H = 2 m above the ground and the localization coverage area is 2 m × 2 m.

5.2.1. Kinematics and Dynamics Modeling of the Mobile Robot

Complete kinematics and dynamics modeling of the omni-directional robot with mecanum wheels is detailed in [17,18]. Figure 8 shows the disposition of the wheels related to the frames $\Sigma_{0}$, $\Sigma_{i\omega}$ (i = 1, 2, 3, 4). We define $V_{i\omega}$ (i = 1, 2, 3, 4) as the velocity corresponding to the revolution of wheel i, where $V_{i\omega}=R_{w}\,\omega_{i}$, $R_{w}$ is the radius of the wheel and $\omega_{i}$ is the revolution velocity of the wheel. $V_{ir}$ (i = 1, 2, 3, 4) is the tangential velocity of the free roller touching the floor, and $V_{0}=[\dot{x}_{m}\ \dot{y}_{m}\ \dot{\psi}_{m}]^{T}$ is the velocity vector in the local frame $(X_{m}, Y_{m}, Z_{m})$.
The state vector $X=[x\ y\ \psi]^{T}$ is composed of the positions x, y and the heading $\psi$ in the global frame $(X_{G}, Y_{G}, Z_{G})$. The kinematics equation describing the relationship between $V_{\omega}$ and $V_{0}$ is given by:
$$V_{\omega}=J_{0}\cdot V_{0} \tag{2}$$
$$J_{0}=\begin{bmatrix}1 & -1 & -(l+L)\\ 1 & 1 & (l+L)\\ 1 & 1 & -(l+L)\\ 1 & -1 & (l+L)\end{bmatrix}\in\mathbb{R}^{4\times 3}$$ is a transformation matrix and $V_{\omega}=[V_{1\omega}\ V_{2\omega}\ V_{3\omega}\ V_{4\omega}]^{T}$ is the vector of wheel velocities corresponding to the angular velocities. Conversely, the mobile robot velocity can be derived from the wheel velocities using a pseudo-inverse matrix as in (3):
$$V_{0}=J_{0}^{+}\cdot V_{\omega} \tag{3}$$
where $J_{0}^{+}=\left(J_{0}^{T}J_{0}\right)^{-1}J_{0}^{T}$. As a result, each element of $V_{0}$ is given by the following equations:
$$\dot{x}_{m}=\frac{R_{w}}{4}\left(\omega_{1}+\omega_{2}+\omega_{3}+\omega_{4}\right) \tag{4}$$
$$\dot{y}_{m}=\frac{R_{w}}{4}\left(-\omega_{1}+\omega_{2}+\omega_{3}-\omega_{4}\right) \tag{5}$$
$$\dot{\psi}_{m}=\frac{R_{w}}{4(L+l)}\left(-\omega_{1}+\omega_{2}-\omega_{3}+\omega_{4}\right) \tag{6}$$
The velocity $\dot{X}$ in the global frame $(X_{G}, Y_{G}, Z_{G})$ is expressed in (7):
$$\begin{bmatrix}\dot{x}\\ \dot{y}\\ \dot{\psi}\end{bmatrix}=\begin{bmatrix}\cos(\psi) & -\sin(\psi) & 0\\ \sin(\psi) & \cos(\psi) & 0\\ 0 & 0 & 1\end{bmatrix}\cdot\begin{bmatrix}\dot{x}_{m}\\ \dot{y}_{m}\\ \dot{\psi}_{m}\end{bmatrix} \tag{7}$$
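A compact numerical version of the forward kinematics (4)-(6) and of the frame change (7) is sketched below; the wheel numbering and signs follow the equations as reconstructed above and should be checked against the actual wheel arrangement of Figure 8.

import numpy as np

def body_velocity(omega, r_w, L, l):
    # Forward kinematics (4)-(6): wheel angular speeds -> body-frame velocity.
    w1, w2, w3, w4 = omega
    x_dot_m = r_w / 4.0 * (w1 + w2 + w3 + w4)
    y_dot_m = r_w / 4.0 * (-w1 + w2 + w3 - w4)
    psi_dot_m = r_w / (4.0 * (L + l)) * (-w1 + w2 - w3 + w4)
    return np.array([x_dot_m, y_dot_m, psi_dot_m])

def global_velocity(v_body, psi):
    # Rotation (7) from the body frame to the global frame.
    c, s = np.cos(psi), np.sin(psi)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return R @ v_body

# identical wheel speeds give a pure forward translation in the body frame
v_global = global_velocity(body_velocity([5, 5, 5, 5], r_w=0.05, L=0.2, l=0.15), psi=0.3)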
As presented in [18,19], the vehicle dynamics is given by (8):
$$\begin{bmatrix}\dot{\theta}\\ \ddot{\theta}\end{bmatrix}=\begin{bmatrix}0_{4\times 4} & I_{4\times 4}\\ 0_{4\times 4} & -M^{-1}D_{\theta}\end{bmatrix}\begin{bmatrix}\theta\\ \dot{\theta}\end{bmatrix}+\begin{bmatrix}0_{4\times 4}\\ M^{-1}\end{bmatrix}\tau \tag{8}$$
$\theta=[\theta_{1}\ \theta_{2}\ \theta_{3}\ \theta_{4}]^{T}$ is the vector of the angular positions of the wheels. $\tau=[\tau_{1}\ \tau_{2}\ \tau_{3}\ \tau_{4}]^{T}$ is the control input vector composed of the torque applied to each wheel, with:
$$M=\begin{bmatrix}A+B+I_{w} & -B & B & A-B\\ -B & A+B+I_{w} & A-B & B\\ B & A-B & A+B+I_{w} & -B\\ A-B & B & -B & A+B+I_{w}\end{bmatrix},\qquad A=\frac{m R_{w}^{2}}{8},\qquad B=\frac{I_{z} R_{w}^{2}}{16(L+l)^{2}}$$
and $D_{\theta}$ is the coefficient of the wheels' viscous friction. $I_{z}$ is the vehicle moment of inertia around the Z axis. $I_{w}$ is the wheel moment of inertia around its axis of revolution.
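The wheel-space dynamics (8) can be simulated with a few lines; the sketch below uses the inertia matrix in the sign pattern reconstructed above (an assumption that should be verified against [18]) and illustrative numerical parameters of our own choosing.

import numpy as np

def inertia_matrix(m, I_z, I_w, r_w, L, l):
    # Inertia matrix M of (8); A and B follow the definitions in the text.
    A = m * r_w**2 / 8.0
    B = I_z * r_w**2 / (16.0 * (L + l) ** 2)
    d, e = A + B + I_w, A - B
    return np.array([[d, -B, B, e],
                     [-B, d, e, B],
                     [B, e, d, -B],
                     [e, B, -B, d]])

def step_dynamics(theta, theta_dot, tau, M, D_theta, dt):
    # One explicit Euler step of the wheel dynamics (8).
    theta_ddot = np.linalg.solve(M, tau - D_theta * theta_dot)
    return theta + dt * theta_dot, theta_dot + dt * theta_ddot

M = inertia_matrix(m=10.0, I_z=0.5, I_w=0.01, r_w=0.05, L=0.2, l=0.15)
theta, theta_dot = np.zeros(4), np.zeros(4)
for _ in range(1000):
    theta, theta_dot = step_dynamics(theta, theta_dot, np.full(4, 0.1), M, D_theta=0.05, dt=1e-3)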

5.2.2. Design of Position Control

The standard robot motion control has been designed using a sliding mode dynamic controller to track a desired trajectory, as detailed in [19]. Let us consider the state vector $X=[x_{11}, x_{12}, x_{21}, x_{22}, x_{31}, x_{32}]^{T}=[x, \dot{x}, y, \dot{y}, \psi, \dot{\psi}]^{T}$ and the control input vector $u=[u_{1}\ u_{2}\ u_{3}]^{T}=[\dot{x}_{m}\ \dot{y}_{m}\ \dot{\psi}_{m}]^{T}$. From (7), the state space representation of the system is given by:
$$\begin{cases}\dot{x}_{11}=x_{12}\\ \dot{x}_{12}=u_{1}\cos(x_{32})-u_{2}\sin(x_{32})\\ \dot{x}_{21}=x_{22}\\ \dot{x}_{22}=u_{1}\sin(x_{32})+u_{2}\cos(x_{32})\\ \dot{x}_{31}=x_{32}\\ \dot{x}_{32}=u_{3}\end{cases} \tag{9}$$
The control input vector $r=[r_{1}\ r_{2}\ r_{3}]^{T}$ is defined to compensate for the nonlinear terms in (9):
$$\begin{cases}u_{1}=\cos(x_{32})\,r_{1}+\sin(x_{32})\,r_{2}\\ u_{2}=-\sin(x_{32})\,r_{1}+\cos(x_{32})\,r_{2}\\ u_{3}=r_{3}\end{cases} \tag{10}$$
Using (10) in the state space representation (9), the system of equations is written as follows:
$$\Sigma_{1}:\begin{cases}\dot{x}_{11}=x_{12}\\ \dot{x}_{12}=r_{1}\end{cases} \tag{11a}$$
$$\Sigma_{2}:\begin{cases}\dot{x}_{21}=x_{22}\\ \dot{x}_{22}=r_{2}\end{cases} \tag{11b}$$
$$\Sigma_{3}:\begin{cases}\dot{x}_{31}=x_{32}\\ \dot{x}_{32}=r_{3}\end{cases} \tag{11c}$$
Equations (11a) and (11b) stand for the equations of translation and Equation (11c) describes the movement of rotation. As defined in [19], the reference positions $\xi_{1d}$ and $\xi_{2d}$ in the inertial frame are introduced. The orientation reference in the same frame is denoted $\xi_{3d}$. Therefore, the following subsystem of equations is written:
$$\Sigma_{ie}:\begin{cases}\dot{z}_{i1}=z_{i2}\\ \dot{z}_{i2}=r_{i}-\dot{\xi}_{id}\end{cases}\quad i\in\{1,2,3\},\qquad \dot{\phi}_{id}=\xi_{id} \tag{12}$$
with $z_{i1}=x_{i1}-\phi_{id}$, $z_{i2}=x_{i2}-\xi_{id}$, $i\in\{1,2,3\}$. The saturation function denoted $\sigma_{M}:\mathbb{R}\rightarrow\mathbb{R}$ is defined as:
$$\sigma_{M}(S)=\begin{cases}S & \text{if } |S|<M\\ \mathrm{sign}(S)\times M & \text{otherwise}\end{cases} \tag{13}$$
The sliding surfaces S i 1 and S i 2 are defined for each axis X and Y such that:
$$S_{i1}=a_{i1}a_{i2}z_{i1}+a_{i2}z_{i2},\qquad S_{i2}=a_{i1}z_{i2} \tag{14}$$
where the coefficients a i 1 and a i 2 are chosen to ensure the attractiveness of the sliding surfaces. Therefore, we propose the candidate Lyapunov functions V i 1 and V i 2 for each sliding surface:
$$V_{i1}=S_{i1}^{2},\qquad V_{i2}=S_{i2}^{2} \tag{15}$$
As explained in [19], the exponential stability of the system is ensured and the control input signal r i can be written as:
$$r_{i}=\sigma_{M_{i3}}\left(\dot{\xi}_{id}-\sigma_{M_{i2}}\left(a_{i1}z_{i2}+\sigma_{M_{i1}}\left(a_{i2}z_{i2}+a_{i1}a_{i2}z_{i1}\right)\right)\right),\quad i\in\{1,2,3\}. \tag{16}$$
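To make the nested-saturation structure of (13)-(16) explicit, the following sketch implements the saturation function, the bounded control input r_i for one axis, and the change of input (10); the internal signs were lost in the published layout and are taken here from the reconstruction above, so they should be treated as an assumption and checked against [19].

import numpy as np

def sat(s, m):
    # Saturation function (13): identity inside [-m, m], clipped elsewhere.
    return float(np.clip(s, -m, m))

def bounded_control(z1, z2, xi_dot_d, a1, a2, M1, M2, M3):
    # Nested-saturation control input r_i of (16) for one axis.
    inner = sat(a2 * z2 + a1 * a2 * z1, M1)
    return sat(xi_dot_d - sat(a1 * z2 + inner, M2), M3)

def body_inputs(r1, r2, r3, psi):
    # Change of input (10) compensating the rotation nonlinearity of (9).
    c, s = np.cos(psi), np.sin(psi)
    return c * r1 + s * r2, -s * r1 + c * r2, r3

# one control step with illustrative gains and saturation bounds
r_x = bounded_control(z1=0.1, z2=-0.05, xi_dot_d=0.2, a1=2.0, a2=1.5, M1=0.3, M2=0.5, M3=0.8)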
Using (10) and the expression provided by (16), the control input signals $u_{1}$, $u_{2}$ and $u_{3}$ are bounded such that:
$$|u_{1}|=\max\left(M_{13}, M_{23}, 0.707(M_{13}+M_{23})\right),\qquad |u_{2}|=\max\left(M_{13}, M_{23}, 0.707(M_{13}+M_{23})\right),\qquad |u_{3}|=M_{33} \tag{17}$$
Therefore, the angular speed $\omega_{i}$ of each wheel of the mobile robot is bounded such that:
$$\begin{aligned}
|\omega_{1}| &= \frac{1}{R_{w}}\left((L+l)\,M_{33}\right)\\
|\omega_{2}| &= \frac{1}{R_{w}}\left(2\max\left(M_{13}, M_{23}, 0.707(M_{13}+M_{23})\right)+(L+l)\,M_{33}\right)\\
|\omega_{3}| &= \frac{1}{R_{w}}\left(2\max\left(M_{13}, M_{23}, 0.707(M_{13}+M_{23})\right)+(L+l)\,M_{33}\right)\\
|\omega_{4}| &= \frac{1}{R_{w}}\left((L+l)\,M_{33}\right)
\end{aligned} \tag{18}$$
where $M_{ij}$, $i,j\in\{1,2,3\}$, are the saturation parameters as defined in (13). Figure 9 presents the block diagram of the system. The position and heading of the mobile robot are controlled in closed loop. It is worth noting that:
  • The nonlinear control block computes each angular reference speed ω i * for the wheel i. The control law minimizes the error between the reference position X * and the position estimate X ^ .
  • The angular speed of each wheel is controlled in closed loop using a local proportional-integral (PI) controller (a minimal sketch of such a loop is given after this list).
  • The estimated position and heading of the mobile robot, collected in the vector $\hat{X}=(\hat{x}, \hat{y}, \hat{\psi})$, are provided by HyperCube or by the Vicon motion capture system.
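As announced in the list above, a minimal discrete PI loop for one wheel could look as follows; the gains, sample time and saturation are illustrative values of our own, not those of the Arduino 380 firmware.

class WheelPI:
    # Discrete proportional-integral controller regulating one wheel speed.
    def __init__(self, kp, ki, dt, u_max):
        self.kp, self.ki, self.dt, self.u_max = kp, ki, dt, u_max
        self.integral = 0.0

    def update(self, omega_ref, omega_meas):
        error = omega_ref - omega_meas
        self.integral += error * self.dt
        command = self.kp * error + self.ki * self.integral
        return max(-self.u_max, min(self.u_max, command))

wheel_pi = WheelPI(kp=0.8, ki=4.0, dt=0.01, u_max=1.0)
command = wheel_pi.update(omega_ref=5.0, omega_meas=4.2)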

5.2.3. Implementation of the Indoor Localization for the Mobile Robot

To evaluate the performances of the indoor localization, the setup detailed in Figure 10 is composed of the following:
  • The Vicon motion capture system, featuring sub-millimetric accuracy, provides the localization estimation of the mobile robot. The motion capture data are used for comparison purposes.
  • The ground station connected to the Vicon system runs the Matlab/Simulink® and QUARC® software programs. The nonlinear control law of the mobile robot presented in Section 5.2.2 is designed with Matlab/Simulink® and compiled. The program is transferred via a WiFi radio link to the Gumstix board embedded on the mobile robot. The control algorithm runs onboard the robot.
The mobile robot aims at tracking a desired trajectory using HyperCube in a feedback control loop. An Arduino 380 board controls the angular velocity of each wheel in closed loop. An Arduino Mega 2560 board is connected to the custom-made demodulation board of HyperCube, which provides analog signals. The analog processing of the photodiodes' output signals is depicted in Figure 4. The Gumstix board uses the estimation of the robot position $\hat{X}=(\hat{x}, \hat{y}, \hat{\psi})$ for control purposes.
Figure 11A gives a schematic of the hardware embedded on the mobile robot for indoor localization using HyperCube. Figure 11B shows the hardware implementation and HyperCube mounted onboard the mecanum wheeled omni-directional robot. Four reflective markers make the mobile robot visible to the Vicon system.

5.2.4. Application to the 2D Localization

The aim of this section is twofold: (i) using HyperCube in open loop, to reconstruct the trajectory (x, y) of the mobile robot in the coverage area presented in Figure 7; and (ii) using HyperCube in closed loop, to track a desired trajectory. For that purpose, the following experiments were performed.

5.2.5. Validation of the Nonlinear Control Law for Trajectory Tracking

The Vicon cameras provide the nonlinear controller detailed in Section 5.2.2 with the accurate localization and orientation of the robot in real time. The mobile robot moves on a circular path of diameter 1 m. At the same time, the desired orientation ψ follows a sinusoidal reference. The results are given in Figure 12, where Figure 12A–C shows in red the positions (x, y) and the heading ψ of the mobile robot versus time, compared to the reference in green. Figure 12D plots y versus x. Using accurate estimations of position and heading, the mobile robot follows the desired trajectory precisely and without overshoot. This result validates the good performance of the sliding mode nonlinear control law.

5.2.6. Trajectory Reconstruction Using HyperCube

While keeping the circular path of diameter 1 m as the reference, the desired orientation ψ is regulated at zero. The mecanum wheeled omni-directional robot maintains its orientation along the path. In this experiment, the trajectory is reconstructed using the measurements provided by HyperCube and only one IR LED, flickering at 11 kHz. However, the estimated positions provided by HyperCube were not used to control the robot in closed loop. Figure 13 shows the results of the reconstruction. In Figure 13A,B, the position estimations in X and Y are plotted versus time and compared to the ground truth provided by the Vicon system. Figure 13C presents the heading ψ, which is maintained at zero. There is no curve for HyperCube since HyperCube is not used to estimate the heading. Figure 13D depicts the XY graph. The satisfactory trajectory reconstruction can be verified by comparing the actual path of the mobile robot to the estimate given by HyperCube.
One can see in Figure 14A,B the plots of the autocorrelation functions of the residuals. The noise for each axis has nearly white characteristics.
Moreover, given H = 2 m, the precision obtained reaches a standard deviation as small as 1.86 cm for X and 1.37 cm for Y (see Figure 15). Given the precision of the reconstruction, the mobile robot was then controlled by means of HyperCube.

5.2.7. Robot Closed-Loop Control Based on HyperCube

In this section, the goal was to use the measurements provided by HyperCube to control the linear positions (x, y) of the robot in closed loop. The current version of the sensor can only estimate the positions but not the rotations. For a robotic application using aerial robots, for instance, HyperCube could be associated with a stabilizing gimbal system to reduce the effects of variations in the roll and pitch angles, which could affect the measurements. In the present application, the orientation ψ is given by the Vicon cameras. If we denote by $x_{sensor}$ and $y_{sensor}$ the coordinates of the mobile robot in the local frame, the estimates of position $\hat{x}$ and $\hat{y}$ in the global frame are obtained by applying the rotation matrix as follows:
$$\begin{bmatrix}\hat{x}\\ \hat{y}\\ \hat{\psi}\end{bmatrix}=\begin{bmatrix}\cos(\hat{\psi}) & -\sin(\hat{\psi}) & 0\\ \sin(\hat{\psi}) & \cos(\hat{\psi}) & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}x_{sensor}\\ y_{sensor}\\ \psi_{vicon}\end{bmatrix} \tag{19}$$
where $\psi_{vicon}$ is the estimation of the orientation given by the Vicon cameras. Figure 16 shows the results while the sensor is used in closed loop. The orientation ψ is estimated with the motion capture system. Figure 16A,B superimposes the positions x and y measured by HyperCube (blue curve) and by the Vicon system (red curve). Figure 16C presents the regulation of the heading versus time. Since the orientation is given by the Vicon cameras and not by HyperCube, there is no curve related to HyperCube in Figure 16C. The experimental results show that HyperCube is suitable for making the mobile robot follow the reference trajectory faithfully. Therefore, the performance of the HyperCube sensor has been validated indoors using only one flickering IR LED. A limitation of this solution is the need for the orientation ψ provided by the Vicon system. However, for this kind of application, onboard sensors such as a low-cost inertial measurement unit (IMU) [20], a magnetometer [21] or odometry could be used to obtain the heading in real time.
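The frame change of (19) reduces to a single planar rotation; the short sketch below shows how the HyperCube body-frame estimate and the Vicon heading could be combined (the function and variable names are ours).

import numpy as np

def global_position(x_sensor, y_sensor, psi_vicon):
    # Rotate the body-frame HyperCube estimate into the global frame (19),
    # using the heading provided by the Vicon motion capture system.
    c, s = np.cos(psi_vicon), np.sin(psi_vicon)
    return c * x_sensor - s * y_sensor, s * x_sensor + c * y_sensor, psi_vicon

x_hat, y_hat, psi_hat = global_position(0.35, -0.10, np.radians(15.0))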

6. Conclusions

In this paper, we proposed a novel positioning system for indoor applications which can measure the angular position of a moving optical sensor, HyperCube. The latter was coupled to an Arduino-compatible demodulator through a custom-made shield board. Based on infrared light emission, the proposed solution for indoor localization highlights the following features: (i) a minimalistic sensor in terms of small size (10 cm³), light weight (6 g) and low power consumption (0.4 W for the sensor and the analog demodulation board); (ii) fast analog signal processing and digital processing implemented on an Arduino microcontroller (difference over the sum); and (iii) an online, accurate position estimation in 2D.
We have shown that the proposed sensor was able to estimate the position in 2D at a distance of 1.5 m from the LEDs with an accuracy as small as 1 cm at a sampling frequency of 100 Hz using only one IR LED flickering at 17 kHz. We also proposed a robotic application which consisted of localizing a moving mecanum wheeled omnidirectional robot indoors. For trajectory reconstruction purposes, the position estimation reached a standard deviation as small as 1.86 cm for X and 1.37 cm for Y with only one flickering IR LED placed 2 m above the robot. To show the performance of HyperCube, the sensor was implemented in the position feedback control loop to make the mobile robot track a reference trajectory (a circle of 1 m in diameter). It turned out that the mobile robot followed the desired path faithfully, validating that the indoor local positioning system using flickering IR LEDs is a reliable and efficient solution.
To further improve the performance of the proposed system, the measurements provided by HyperCube will be fused with other sensors, such as an IMU, to improve the position estimation. The height could also be estimated using two IR LEDs flickering at two different frequencies. Moreover, one could investigate the estimation of the heading ψ using several IR LEDs at the same time.

Supplementary Materials

The following are available online at https://www.mdpi.com/1424-8220/17/11/2518/s1, Video S1: Robot closed-loop control based on HyperCube.

Acknowledgments

This work was supported by CNRS Institutes (Life Science; Information Science; Engineering Science and Technology; and Humanities and Social Science) and Aix-Marseille University.

Author Contributions

Thibaut Raharijaona contributed to implementing the algorithm for position estimation and signal processing, to guiding the experiments and to manuscript writing. Rodolphe Mawonou performed all the experiments related to the localization of the mobile robot in 2D. Thanh Vu Nguyen performed all the experiments related to the position estimation in 2D using the shielded demodulation board. Fabien Colonnier was involved in the implementation of the nonlinear position control law. Marc Boyron designed and built the analog demodulation board. Julien Diperi designed and built the sensor. Stéphane Viollet contributed to guiding the experiments and to manuscript writing.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AGV    Automated Guided Vehicle
CCD    Charge-Coupled Device
CMOS   Complementary Metal Oxide Semiconductor
GPS    Global Positioning System
IMU    Inertial Measurement Unit
IR     Infrared
LED    Light Emitting Diode
LIDAR  Light Detection and Ranging
PD     Photodiode
WIFI   Wireless Fidelity

References

  1. Bahl, P.; Padmanabhan, V.N. RADAR: An in-building RF-based user location and tracking system. In Proceedings of the INFOCOM 2000 Nineteenth Annual Joint Conference of the IEEE Computer and Communications Societies, Tel Aviv, Israel, 26–30 March 2000; Volume 2, pp. 775–784.
  2. Raja, A.K.; Pang, Z. High accuracy indoor localization for robot-based fine-grain inspection of smart buildings. In Proceedings of the 2016 IEEE International Conference on Industrial Technology (ICIT), Taipei, Taiwan, 14–17 March 2016; pp. 2010–2015.
  3. Kjærgaard, M.B. A taxonomy for radio location fingerprinting. In Proceedings of the International Symposium on Location- and Context-Awareness, Oberpfaffenhofen, Germany, 20–21 September 2007; Springer: Berlin, Germany, 2007; pp. 139–156.
  4. Reinke, C.; Beinschob, P. Strategies for contour-based self-localization in large-scale modern warehouses. In Proceedings of the 2013 IEEE 9th International Conference on Intelligent Computer Communication and Processing (ICCP), Cluj-Napoca, Romania, 5–7 September 2013; pp. 223–227.
  5. Scaramuzza, D.; Fraundorfer, F. Visual Odometry [Tutorial]. IEEE Robot. Autom. Mag. 2011, 18, 80–92.
  6. Frassl, M.; Angermann, M.; Lichtenstern, M.; Robertson, P.; Julian, B.J.; Doniec, M. Magnetic maps of indoor environments for precise localization of legged and non-legged locomotion. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 913–920.
  7. Yasir, M.; Ho, S.W.; Vellambi, B.N. Indoor Position Tracking Using Multiple Optical Receivers. J. Lightw. Technol. 2016, 34, 1166–1176.
  8. Sakai, N.; Zempo, K.; Mizutani, K.; Wakatsuki, N. Linear Positioning System based on IR Beacon and Angular Detection Photodiode Array. In Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN), Alcalá de Henares, Spain, 4–7 October 2016; pp. 4–7.
  9. Simon, G.; Zachár, G.; Vakulya, G. Lookup: Robust and Accurate Indoor Localization Using Visible Light Communication. IEEE Trans. Instrum. Meas. 2017, 66, 2337–2348.
  10. Arnon, S. Visible Light Communication; Cambridge University Press: Cambridge, UK, 2015.
  11. Komine, T.; Nakagawa, M. Fundamental analysis for visible-light communication system using LED lights. IEEE Trans. Consum. Electron. 2004, 50, 100–107.
  12. Ijaz, F.; Yang, H.K.; Ahmad, A.; Lee, C. Indoor positioning: A review of indoor ultrasonic positioning systems. In Proceedings of the 2013 15th International Conference on Advanced Communication Technology (ICACT), PyeongChang, Korea, 27–30 January 2013; pp. 1146–1150.
  13. Raharijaona, T.; Mignon, P.; Juston, R.; Kerhuel, L.; Viollet, S. HyperCube: A Small Lensless Position Sensing Device for the Tracking of Flickering Infrared LEDs. Sensors 2015, 15, 16484–16502.
  14. Land, M.F. Visual acuity in insects. Annu. Rev. Entomol. 1997, 42, 147–177.
  15. Stavenga, D. Angular and spectral sensitivity of fly photoreceptors. II. Dependence on facet lens F-number and rhabdomere type in Drosophila. J. Comp. Physiol. A 2003, 189, 189–202.
  16. Kerhuel, L.; Viollet, S.; Franceschini, N. The VODKA Sensor: A Bio-Inspired Hyperacute Optical Position Sensing Device. IEEE Sens. J. 2012, 12, 315–324.
  17. Viboonchaicheep, P.; Shimada, A.; Kosaka, Y. Position rectification control for Mecanum wheeled omni-directional vehicles. In Proceedings of the IECON'03 29th Annual Conference of the IEEE Industrial Electronics Society, Roanoke, VA, USA, 2–6 November 2003; IEEE: Piscataway, NJ, USA, 2003; Volume 1, pp. 854–859.
  18. Lin, L.C.; Shih, H.Y. Modeling and adaptive control of an omni-mecanum-wheeled robot. Intell. Control Autom. 2013, 4, 166–179.
  19. Guerrero-Castellanos, J.; Villarreal-Cervantes, M.; Sanchez-Santana, J.; Ramirez-Martinez, S. Seguimiento de trayectorias de un robot móvil (3,0) mediante control acotado. Revista Iberoamericana de Automática e Informática Industrial RIAI 2014, 11, 426–434.
  20. Euston, M.; Coote, P.; Mahony, R.; Kim, J.; Hamel, T. A complementary filter for attitude estimation of a fixed-wing UAV. In Proceedings of the IROS 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, 22–26 September 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 340–345.
  21. Kim, H.S.; Seo, W.; Baek, K.R. Indoor Positioning System Using Magnetic Field Map Navigation and an Encoder System. Sensors 2017, 17, 651.
Figure 1. Computer-Aided Design of the optical sensor called HyperCube. Each side of the sensor consists of one photodiode soldered to a small printed circuit board (Ph_l, Ph_r and Ph_m, respectively): (A) top view; and (B) side view.
Figure 2. Description of the HyperCube hardware: (A) each photodiode is connected to an analog amplifier for the conversion of the photodiode current into an output voltage; and (B) the signal provided by the photodiodes is processed by a demodulation board shielded on an Arduino board.
Figure 3. Sensor characterization and comparison between the theoretical (dotted lines) and measured angular sensitivities (continuous lines) of the photodiodes plotted here in polar coordinates.
Figure 4. Sketch diagram of the signal processing algorithm. (A) Top view: the sensor measures the azimuth φ. The left part shows the IR LED modulated at a frequency noted $F_{demod}$ (11 kHz or 17 kHz). In this view, HyperCube is composed of two photosensors Ph_l and Ph_r with their respective cosine-like angular sensitivities (see Figure 3B). (B) Side view: the same signal processing is applied to the signals provided by the photosensor Ph_m of HyperCube and a virtual photosensor, where $S_{Ph_{virt}}=\frac{S_{Ph_{l}}+S_{Ph_{r}}}{2}$.
Figure 5. The principle of the indoor positioning solution at a fixed height consists of two steps: (i) the calibration procedure aims at finding the parameters which minimize the quadratic error metrics between the reference position given by the Vicon system and the data provided by the sensor; and (ii) given the calibration parameters, the sole use of the sensor device connected to the shielded demodulation board allows to estimate the positions X and Y at a fixed height.
Figure 6. 2D localization of HyperCube moved by hand using an XY table with respect to a fixed IR LED flickering at 17 kHz placed ahead of the sensor at a distance of 150 cm. The position estimation is based on the parameters obtained in the calibration procedure: (A) experimental setup; and (B) experimental results. The standard deviation for the X estimation is only 1 cm and the standard deviation for the Y estimation is 1.65 cm.
Figure 7. Picture of the experimental setup inside the motion capture system. The aim is to localize the mobile robot in 2D using HyperCube. The ground truth is given by the Vicon cameras. The mecanum wheeled omni-directional robot is equipped with HyperCube; the localization coverage area is 2 m × 2 m. Two IR LEDs flickering at 5 kHz and 11 kHz were fixed on a horizontal bar placed above the robot at a height of 2 m.
Figure 8. Disposition of the mecanum wheels and the frames.
Figure 9. Block diagram of the robot autopilot. The position and heading are controlled in closed loop. The mobile robot is equipped with HyperCube.
Figure 10. Sketch diagram of the implementation of the indoor localization system for mobile robots. The Arduino Mega 2560 gets the analog signals provided by the demodulation board (Arduino shield, see Figure 2) and the output voltage of each photodiode Ph_i of HyperCube. The Arduino 380 board is devoted to controlling the angular velocity of each wheel.
Figure 11. Top view of the mobile robot: (A) schematic of the embedded hardware; and (B) picture of the mobile robot used for the experiments.
Figure 12. (A–C) Plots of the position and orientation of the mobile robot. Here, the sliding mode nonlinear controller is fed only with the accurate measurements of the robot's positions (x, y) and heading provided by the Vicon system. The path is compared to the desired trajectory.
Figure 13. Plots of the position and orientation of the mobile robot. The accurate measurements provided by the Vicon system feed the sliding mode nonlinear controller. The trajectory is reconstructed using HyperCube and the IR LED that flickers at 11 kHz. The closed-loop control of the robot’s positions and heading is only based here on the measurements provided by the Vicon system.
Figure 14. Plots of the autocorrelation functions of the residuals for each axis X and Y.
Figure 15. Histograms of the localization error in 2D using HyperCube. The standard deviations for the X and Y axis are σ X = 1.86 cm and σ Y = 1.37 cm for H = 2 m. Only one IR LED that flickers at 11 kHz is used.
Figure 16. Plots of the position and orientation of the mobile robot while the measurements provided by HyperCube are used in closed loop and the heading is given by the Vicon cameras. Only one IR LED that flickers at 11 kHz is used and H = 2 m.
