Article

A Sensor Fusion Framework for Indoor Localization Using Smartphone Sensors and Wi-Fi RSSI Measurements

School of Electronics Engineering, Kyungpook National University, 80 Daehak-ro, Buk-gu, Daegu 41566, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(20), 4379; https://doi.org/10.3390/app9204379
Submission received: 7 September 2019 / Revised: 8 October 2019 / Accepted: 14 October 2019 / Published: 16 October 2019
(This article belongs to the Special Issue Recent Advances in Indoor Localization Systems and Technologies)

Abstract

Sensor fusion frameworks for indoor localization are developed with the specific goal of reducing positioning errors. Although many conventional localization frameworks without fusion have been improved to reduce positioning error, sensor fusion frameworks generally provide a further improvement in positioning accuracy. In this paper, we propose a sensor fusion framework for indoor localization using the smartphone inertial measurement unit (IMU) sensor data and Wi-Fi received signal strength indication (RSSI) measurements. The proposed sensor fusion framework uses location fingerprinting and trilateration for Wi-Fi positioning. Additionally, a pedestrian dead reckoning (PDR) algorithm is used for position estimation in indoor scenarios. The proposed framework achieves a maximum of 1.17 m localization error for the rectangular motion of a pedestrian and a maximum of 0.44 m localization error for linear motion.

Graphical Abstract

1. Introduction

Accurate positioning in indoor or outdoor scenarios requires that a positioning system’s displacement error be minimized. For locating a user’s position in outdoor environments, localization systems such as the global positioning system (GPS) [1] and base transceiver station (BTS) [2] based approaches exist. Both GPS and BTS face significant challenges when used for indoor localization, including signal interference and limited spatial coverage. BTS-based cell phone positioning does not achieve accurate results indoors because of the constrained signal coverage of the target area and dense urban environments characterized by high-rise buildings. To overcome these challenges, it is necessary to develop alternative systems for indoor localization. Existing indoor localization systems are based on ultra-wideband (UWB) technology [3], pedestrian dead reckoning (PDR) [4], radio frequency identification (RFID) [5], Bluetooth [6], visible light communication (VLC) [7], Zigbee [8] and Wi-Fi systems [9]. Each of these techniques achieves high position accuracy for indoor localization; however, combined systems perform better than the individual systems. PDR systems use data from smartphone accelerometers, magnetometers, and gyroscopes for localization.
Received signal strength indication (RSSI) measurements from Wi-Fi access points (APs) can be used for indoor localization. Many studies [10,11,12] have used smartphones equipped with a Wi-Fi receiving module for RSSI data collection, and the fusion of smartphone sensor data with Wi-Fi data has become prominent in indoor localization studies [13,14,15]. This paper proposes a sensor data fusion framework that combines PDR and Wi-Fi systems. The proposed framework combines the location fingerprinting and trilateration algorithms for Wi-Fi indoor positioning and fuses the result with the PDR position estimates. The experiment results demonstrate that the proposed fusion framework reduces localization errors when compared with conventional localization approaches.
The conventional approaches used for sensor fusion are the PDR+Wi-Fi trilateration and PDR+Wi-Fi fingerprint algorithms. The most popular and simplest conventional approach for indoor localization is the PDR+Wi-Fi trilateration algorithm. However, this algorithm does not provide very good accuracy (average error of 1.5–2 m) [16]. Multipath effects and non-line-of-sight (NLOS) conditions in the experiment areas degrade the trilateration algorithm’s performance. In the trilateration algorithm, a free-space propagation model is used to estimate the user’s distance from each Wi-Fi AP. The estimated distance depends on the RSSI measurements; however, the measured RSSI tends to fluctuate according to the indoor environment’s physical characteristics and the objects it contains. Wi-Fi fingerprint algorithms can mitigate multipath fading by creating a fingerprint map, but changes in the indoor environment affect the fingerprint maps and require updates to the fingerprint database. To overcome the challenges faced by conventional approaches, we present a sensor fusion framework for indoor localization that utilizes both the PDR and Wi-Fi positioning results.
A recent trend in indoor localization is to use a hybrid localization system for locating the user’s position with minimum error. Hybrid localization systems perform better than individual localization systems; they combine multiple localization technologies with the help of sensor fusion frameworks. The most common hybrid localization system combines PDR with Wi-Fi. In this paper, we propose a PDR with Wi-Fi localization system with high position accuracy for indoor localization. The proposed system uses a sensor fusion framework for combining the PDR results with the Wi-Fi localization results. For the PDR approach, we use our previous model proposed in [17], which reduces the smartphone sensor errors and provides accurate localization results. For Wi-Fi localization, we propose a fusion algorithm that uses the results from the Wi-Fi trilateration and Wi-Fi fingerprint algorithms to enhance the position accuracy. Finally, we propose a sensor fusion framework that combines the Wi-Fi fusion algorithm and PDR for indoor localization using the Kalman filter. Through extensive experiments, we demonstrate that higher indoor positioning accuracy is achievable with the proposed framework.
The rest of the paper is organized as follows: Related work on sensor fusion for PDR and Wi-Fi localization systems is discussed in Section 2. The proposed sensor fusion framework model is presented in Section 3. In Section 4, the experiment results and analysis are discussed. Finally, the conclusions and future work are presented in Section 5.

2. Related Works

Sensor data fusion frameworks have been studied in the past for PDR and Wi-Fi localization systems. In this section, we discuss the related work on sensor fusion frameworks. The proposed sensor fusion framework model depends on three research areas: PDR, Wi-Fi localization systems, and sensor data fusion frameworks for combined PDR and Wi-Fi localization systems. The performance of a PDR system depends on the smartphone’s sensor errors. As the pedestrian walking distance increases, a PDR system drifts, and this error reduces the location accuracy. PDR algorithms depend on step detection [18], heading estimation [19] and position estimation [20]. The localization accuracy of a Wi-Fi system depends on RSSI fluctuation. In Wi-Fi localization systems, the distance between the user and the APs is estimated from a free-space path loss model; Wi-Fi signal fluctuation affects this distance estimation and degrades system performance. To overcome the problems related to PDR and Wi-Fi localization systems, a more accurate sensor data fusion framework for those systems is proposed in this paper.
The first sensor data fusion framework for Wi-Fi and PDR was introduced by Evennou and Marx [21]. In [21], Evennou and Marx proposed a sensor fusion framework which uses a Kalman filter and a particle filter. The benefits of the Evennou and Marx architecture were evaluated and compared with pure Wi-Fi localization systems and inertial navigation system (INS) positioning systems. A sensor fusion framework using a particle filter is explained in [22,23,24]; the real-world experiment results from [22,23,24] indicate remarkable performance improvements for indoor localization. An effective implementation of the extended Kalman filter (EKF) for sensor fusion is explained in [25,26,27], where the experiment results show that the proposed sensor fusion frameworks achieve high localization accuracy when compared to individual localization systems. A complementary extended Kalman filter for the sensor fusion framework is explained by Leppäkoski et al. [28]; the results from [28] show that both map information and wireless local area network (WLAN) signals can be used to improve the PDR system accuracy. The idea of using an unscented Kalman filter (UKF) algorithm for a sensor fusion framework is introduced by Chen et al. [29]. In [29], an integrated technique for merging a Wi-Fi localization system, PDR and smartphone sensor data using a UKF algorithm for 3D indoor localization is proposed. In [30,31], detailed analyses of sensor fusion frameworks combining Wi-Fi fingerprinting with PDR systems are discussed, and the results indicate that a hybrid localization system performs better than the individual localization systems. An adaptive and robust filter for indoor localization using Wi-Fi and PDR is proposed by Li et al. [32]; the results from [32] show that the sensor fusion framework reduces the gross errors in the Wi-Fi localization system and gives accurate position results for indoor localization.
So far, we have discussed different types of sensor fusion frameworks used to fuse Wi-Fi and PDR localization systems. These sensor fusion frameworks reduce localization errors; however, further improvement in position accuracy is required for indoor localization. The proposed sensor fusion framework of this paper fuses the position results from the Wi-Fi localization and PDR systems with the help of a linear Kalman filter (LKF). For the Wi-Fi localization system, the proposed model uses a fusion algorithm to combine the trilateration and fingerprint algorithms. The fusion algorithm compensates for the line-of-sight (LOS) problem by taking advantage of fingerprinting to enhance the Wi-Fi system’s performance. The trilateration algorithm achieves good results when applied in different environments and does not require a calibration phase, whereas Wi-Fi fingerprinting is useful under non-line-of-sight (NLOS) conditions when the path between the APs and the user is obstructed by other objects. Combining these two technologies offers better performance than either individual system. The PDR algorithm in the proposed model uses two sensor fusion techniques to reduce the smartphone sensor errors. The accelerometer and gyroscope sensors in the smartphone are used for pitch and roll estimation; fusing them reduces the accumulated and drift errors of the individual sensors and gives better pitch and roll estimates. For heading estimation, the proposed model uses another sensor fusion that combines the magnetometer and gyroscope headings for a better heading estimate. The step length from the pitch-based step detector and the heading from the heading estimator are used for position estimation; the resulting PDR positions are largely free from smartphone sensor errors and are more accurate than those of conventional localization systems.

3. Proposed Sensor Fusion Framework Model Using PDR and Wi-Fi Localization Systems

The proposed sensor fusion framework model uses the Wi-Fi fusion algorithm results together with PDR to enhance the position accuracy for indoor localization. The Wi-Fi fusion model in the proposed algorithm utilizes the effective features of two classical localization algorithms, trilateration and fingerprinting, for user position estimation. The Wi-Fi fusion results are combined with the PDR results, and the output of the proposed model is largely free from smartphone sensor errors and Wi-Fi RSSI fluctuation problems. Figure 1 shows the proposed sensor fusion framework model.
The PDR algorithm in the proposed model utilizes smartphone sensors such as the accelerometer, gyroscope and magnetometer for position estimation. The accelerometer and gyroscope data are used for pitch and roll estimation, and the step detector uses the pitch values for step detection. The pitch and roll values are then used, together with the magnetometer data, for heading estimation. The gyroscope data also provide a heading estimate, and an LKF combines the gyroscope and magnetometer headings for better performance. The step length is estimated from the step detector, and the position estimation algorithm uses the step length and heading together to identify the current user position. The results from the PDR algorithm are then used in the sensor fusion framework for further position improvement. The Wi-Fi fusion algorithm in the proposed model uses the results from the trilateration and fingerprint algorithms together for user position estimation, and its output is combined with the PDR results in the sensor fusion framework. The linear Kalman filter explained in [33] is used for the sensor fusion framework implementation.

3.1. PDR Positioning

The PDR positioning in the proposed model follows our previous work [17] for user position estimation. In [17], we proposed a sensor fusion technique for pitch and roll estimation, a pitch-based step detector algorithm for step detection, a step length estimator driven by the step detector, a sensor fusion technique for heading estimation and a position estimator. The pitch values from the proposed sensor fusion technique are used for user step detection: the pitch-based step detector identifies a step when the pitch amplitude crosses a threshold level. The step length estimator uses the results from the step detector and follows the model presented in [34]. The sensor fusion technique for heading estimation uses the heading results from the magnetometer and gyroscope sensors. The individual heading results of the magnetometer and gyroscope are not free from smartphone sensor errors, and these heading results alone are not sufficient for position estimation. To compensate for the magnetometer and gyroscope sensor errors, the PDR model combines the gyroscope heading with the magnetometer heading, and the combined result is more accurate than either individual heading; the LKF used in [33] is applied for the heading sensor fusion. The position estimation algorithm proposed in [35] is used for current user position estimation: it takes the user’s step length and heading values and estimates the current user position, as sketched below. For more details of the PDR implementation, refer to our previous work in [17].
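As an illustration of this PDR update, the following Python sketch fuses the gyroscope-propagated heading with the magnetometer heading using a one-state linear Kalman filter and then advances the position by one detected step. The noise values, step length and sensor readings are illustrative assumptions, not the parameters used in [17].

```python
# Minimal PDR sketch: scalar Kalman fusion of gyro and magnetometer headings,
# followed by a step-based position update. All numeric values are assumptions.
import numpy as np

def fuse_heading(theta_prev, p_prev, gyro_rate, dt, mag_heading, q=0.01, r=0.1):
    """One predict/correct cycle for the fused heading (radians).
    Angle wrap-around is ignored in this sketch."""
    # Predict: propagate the previous heading with the gyroscope rate.
    theta_pred = theta_prev + gyro_rate * dt
    p_pred = p_prev + q
    # Correct: blend in the magnetometer heading via the Kalman gain.
    k = p_pred / (p_pred + r)
    theta = theta_pred + k * (mag_heading - theta_pred)
    p = (1.0 - k) * p_pred
    return theta, p

def pdr_update(x, y, step_length, heading):
    """Advance the position by one detected step along the fused heading
    (heading measured clockwise from the y-axis, i.e., from north)."""
    return x + step_length * np.sin(heading), y + step_length * np.cos(heading)

# Example: one detected step of 0.7 m with a fused heading near 45 degrees.
theta, p = fuse_heading(theta_prev=0.76, p_prev=0.05,
                        gyro_rate=0.02, dt=0.5, mag_heading=0.80)
x, y = pdr_update(0.0, 0.0, step_length=0.7, heading=theta)
print(x, y)
```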

3.2. Wi-Fi Positioning

The classical localization approaches for Wi-Fi positioning are Wi-Fi trilateration and Wi-Fi fingerprinting. Of these, Wi-Fi fingerprinting is the most popular and does not require the coordinates of the Wi-Fi APs in the experiment area. However, the fingerprint approach has two drawbacks in practical applications. First, it needs a priori knowledge of the local area and thus requires a lot of time for the location survey and manpower to generate the fingerprints. The second drawback is the position accuracy: unlike other localization approaches, the accuracy of the fingerprint approach depends on the amount of fingerprint data generated for a specific coverage area, and if the amount of data in the fingerprint maps is not sufficient, it is difficult to estimate the user position with high accuracy. The computation time for estimating the user position with the fingerprint algorithm is also very high compared to other Wi-Fi localization approaches. In the Wi-Fi trilateration algorithm, the distance from each Wi-Fi AP to the user is estimated using a free-space path loss model [36], and the localization algorithm uses these distance values to estimate the current user position. The performance of the trilateration algorithm depends on the channel conditions between the APs and the user; for position estimation, the user should be within a limited Wi-Fi signal coverage area with LOS conditions. In order to improve the localization accuracy of Wi-Fi systems and to compensate for the position errors of the classical Wi-Fi localization approaches, we propose a Wi-Fi fusion algorithm for indoor localization. The proposed Wi-Fi fusion algorithm improves the indoor position accuracy by utilizing the advantages of both the fingerprint and trilateration algorithms.

3.2.1. Wi-Fi Trilateration Algorithm

The model presented in our previous work [16] is used for trilateration algorithm implementation. For accurate position estimation using trilateration algorithm, the experiment area should contain at least three APs. In our experiment we use four APs and these APs are placed at four corners of the experiment area. A model for Wi-Fi localization using trilateration is shown in Figure 2.
A free-space path loss, $F$, is used to estimate the user’s distance from the APs and is expressed as [37]:

$$F\,[\mathrm{dBm}] = 20\log_{10}(f) - 22.55 + 20\log_{10}(d)$$

where $d$ is the user’s distance from the AP in meters and $f$ is the signal frequency in megahertz. The trilateration algorithm follows the models explained in [38,39]. The distance $d_i$ between the smartphone at $(x_p, y_p)$ and the APs $A_i(x_i, y_i)$ is expressed as:
$$d_i^2 = (x_i - x_p)^2 + (y_i - y_p)^2$$
The expanded form of the equation is:

$$d_i^2 = x_i^2 + x_p^2 - 2x_i x_p + y_i^2 + y_p^2 - 2y_i y_p$$
Similarly, for an AP with index $k$, the equation can be written as:

$$d_k^2 = x_k^2 + x_p^2 - 2x_k x_p + y_k^2 + y_p^2 - 2y_k y_p$$
Subtracting the equation for index $k$ from the equation for index $i$, the result can be expressed as:

$$d_i^2 - d_k^2 + x_k^2 + y_k^2 - x_i^2 - y_i^2 = 2(x_k - x_i)x_p + 2(y_k - y_i)y_p$$
With $i = 1$ and by varying the index $k = 2, 3, 4$, we obtain the following system:

$$\begin{bmatrix} 2(x_2 - x_1) & 2(y_2 - y_1) \\ 2(x_3 - x_1) & 2(y_3 - y_1) \\ 2(x_4 - x_1) & 2(y_4 - y_1) \end{bmatrix} \begin{bmatrix} x_p \\ y_p \end{bmatrix} = \begin{bmatrix} d_1^2 - d_2^2 + x_2^2 + y_2^2 - x_1^2 - y_1^2 \\ d_1^2 - d_3^2 + x_3^2 + y_3^2 - x_1^2 - y_1^2 \\ d_1^2 - d_4^2 + x_4^2 + y_4^2 - x_1^2 - y_1^2 \end{bmatrix}$$
The coordinates $(x_p, y_p)$ of the smartphone are obtained from this system of equations, which we express in the linear form $A\mathbf{x} = \mathbf{b}$, where

$$A = \begin{bmatrix} 2(x_2 - x_1) & 2(y_2 - y_1) \\ 2(x_3 - x_1) & 2(y_3 - y_1) \\ 2(x_4 - x_1) & 2(y_4 - y_1) \end{bmatrix}, \quad \mathbf{x} = \begin{bmatrix} x_p \\ y_p \end{bmatrix}, \quad \mathbf{b} = \begin{bmatrix} d_1^2 - d_2^2 + x_2^2 + y_2^2 - x_1^2 - y_1^2 \\ d_1^2 - d_3^2 + x_3^2 + y_3^2 - x_1^2 - y_1^2 \\ d_1^2 - d_4^2 + x_4^2 + y_4^2 - x_1^2 - y_1^2 \end{bmatrix}$$
The solution is the vector $\mathbf{x}^* = \begin{bmatrix} x_p^* \\ y_p^* \end{bmatrix}$ that minimizes $\delta$, defined by the following:

$$\delta = (A\mathbf{x}^* - \mathbf{b})^T (A\mathbf{x}^* - \mathbf{b})$$
Applying the minimum mean square error (MMSE) method to the above expression, the user position $\mathbf{x}^*$ is expressed in the following form:

$$\mathbf{x}^* = (A^T A)^{-1} A^T \mathbf{b}$$
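As a concrete illustration, the following Python sketch implements the closed-form least-squares solution derived above. The AP coordinates and range values are hypothetical placeholders; in practice the ranges would come from applying the path loss model to the measured RSSI.

```python
# Trilateration sketch: build the linear system Ax = b from four APs and
# solve it with (A^T A)^{-1} A^T b, as in the derivation above.
import numpy as np

def trilaterate(ap_positions, distances):
    """Least-squares position estimate from AP coordinates and ranges."""
    (x1, y1), d1 = ap_positions[0], distances[0]
    rows, rhs = [], []
    for (xk, yk), dk in zip(ap_positions[1:], distances[1:]):
        rows.append([2.0 * (xk - x1), 2.0 * (yk - y1)])
        rhs.append(d1**2 - dk**2 + xk**2 + yk**2 - x1**2 - y1**2)
    A, b = np.array(rows), np.array(rhs)
    return np.linalg.inv(A.T @ A) @ A.T @ b   # closed-form least-squares solution

aps = [(0.0, 0.0), (45.0, 0.0), (45.0, 37.0), (0.0, 37.0)]  # hypothetical AP corners
dists = [20.0, 30.0, 35.0, 26.0]                            # hypothetical path-loss ranges
print(trilaterate(aps, dists))                              # estimated (x_p, y_p)
```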

3.2.2. Wi-Fi Fingerprint Algorithm

The basic idea of creating fingerprint maps for Wi-Fi positioning is explained in [40,41]. In this localization approach, we use a grid-based representation of the indoor environment. The fingerprint algorithm consists of two stages. The creation of fingerprint maps of the localization area from the sampled RSSI values is the first stage of the fingerprint algorithm. To accomplish this, we divide the location area into different zones and collect RSSI samples at all the grid points. In the second stage, we estimate the position of the receiving module from the fingerprint maps using the nearest neighbor method. Other methods used in position estimation are the support vector machine (SVM) and the hidden Markov model. Figure 3 shows the Wi-Fi fingerprint flow chart.
The first stage of Wi-Fi fingerprint positioning is explained in Figure 3. The localization area is divided into grid points and RSSI samples are collected at each grid point. The fingerprint gathering utility is used for data collection at each access point, and the fingerprint parsing utility takes the RSSI samples as input and generates the data necessary to build the RSSI probability distribution for each reference point. The location estimation procedure for the Wi-Fi fingerprint algorithm is shown in Figure 4. The location estimation procedure applies the localization algorithm to the fingerprint maps of the four APs and the RSSI samples received on-the-fly. The nearest neighbor method is used to determine the user location based on the fingerprint data: it calculates the Euclidean distance between the live RSSI sample and each reference point fingerprint, and the reference point with the minimum Euclidean distance is taken as the approximate user location.
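For illustration, the sketch below shows one way the offline stage could be implemented: RSSI samples collected at each grid point are averaged per AP to form the fingerprint database. The dictionary layout and sample values are assumptions for this sketch, not the format produced by the fingerprint-gathering utility used in the experiments.

```python
# Offline stage sketch: average the RSSI samples per AP at each grid point.
import numpy as np

def build_fingerprint_db(samples):
    """samples: {grid_point: list of RSSI vectors (one value per AP, in dBm)}."""
    return {gp: np.mean(np.array(vectors), axis=0) for gp, vectors in samples.items()}

raw = {
    (0, 0): [[-48, -61, -70, -55], [-50, -60, -72, -57]],   # hypothetical samples
    (0, 1): [[-52, -58, -68, -59], [-53, -59, -69, -58]],
}
fingerprint_db = build_fingerprint_db(raw)   # mean RSSI per AP at each grid point
```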
Let the fingerprint stored at grid point $i$ of the fingerprint database be the vector of RSSI samples from the $N$ Wi-Fi APs, $\boldsymbol{\rho}_i = (\rho_{i,1}, \rho_{i,2}, \ldots, \rho_{i,N}) \in \mathbb{R}^N$, with $i = 1, 2, \ldots, M$ indexing the grid points. The RSSI measured by the user from the $N$ Wi-Fi APs at a particular location during the experiment is expressed as the vector $\boldsymbol{\delta} = (\delta_1, \delta_2, \ldots, \delta_N) \in \mathbb{R}^N$. The Euclidean distance $D_i$ between the fingerprint at grid point $i$ and the measured vector can be expressed as in [42]:

$$D_i = \lVert \boldsymbol{\rho}_i - \boldsymbol{\delta} \rVert = \sqrt{\sum_{j=1}^{N} (\rho_{i,j} - \delta_j)^2}$$
From the resulting distances $D_i$, the minimum value gives the approximate user location:

$$(x, y) = \arg\min_i D_i$$
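A minimal sketch of this nearest-neighbor matching step is given below; the fingerprint database and the live RSSI vector are hypothetical values used only to illustrate the computation of $D_i$ and the arg min.

```python
# Online stage sketch: return the grid point whose stored fingerprint is
# closest (in Euclidean distance) to the live RSSI vector.
import numpy as np

# Hypothetical fingerprint database: grid point -> mean RSSI from four APs (dBm).
fingerprint_db = {
    (0, 0): np.array([-48.0, -61.0, -70.0, -55.0]),
    (0, 1): np.array([-52.0, -58.0, -68.0, -59.0]),
    (1, 0): np.array([-55.0, -54.0, -73.0, -50.0]),
}

def nearest_neighbor(db, live_rssi):
    """(x, y) = argmin_i D_i over the stored fingerprints."""
    live = np.array(live_rssi, dtype=float)
    return min(db, key=lambda gp: np.linalg.norm(db[gp] - live))

print(nearest_neighbor(fingerprint_db, [-49, -60, -71, -56]))  # -> (0, 0)
```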

3.2.3. Proposed Wi-Fi Fusion Algorithm

The idea of combining the Wi-Fi fingerprint algorithm with the trilateration algorithm is explained in [43,44]. The position results from the Wi-Fi fusion algorithm show better position accuracy than those of the individual Wi-Fi position estimation algorithms. A state dynamic model and a measurement model are used for the Wi-Fi fusion process. The Wi-Fi fusion algorithm uses a simple linear model and combines the position results for better user position estimation. The Wi-Fi fusion algorithm utilizes the features of the LKF, and its implementation is shown in Figure 5.
The characteristics of the LKF yield Wi-Fi fusion results with reduced localization errors and improved user position accuracy. The variables used for the LKF implementation are shown in Table 1.

3.3. Proposed Sensor Fusion Framework Algorithm Using Wi-Fi Fusion and PDR Position Results

The proposed sensor fusion framework also uses the advantages of the LKF for combining the Wi-Fi fusion and PDR position results. Compared to other sensor fusion frameworks such as the EKF, complementary EKF, UKF and Markov model [45], the LKF is simple and easy to use for the fusion process. In our experiment scenarios, the problem is formulated from a linear perspective, and the LKF is therefore an appropriate choice for combining the user position results. The proposed sensor fusion framework uses the same LKF implementation explained in Figure 5 for the Wi-Fi and PDR system fusion. The LKF is a recursive filter that estimates the state of a linear system. The estimation in the LKF is repeated to minimize errors from inaccurate measurements and observations, and optimum performance is achieved when the noise is Gaussian. The LKF implementation consists of two phases, prediction and correction, as illustrated in Figure 5.
The position results from the Wi-Fi and PDR systems are fused by the LKF, and the fused results show better performance than the individual systems. In the LKF prediction phase, the state variable $\hat{x}_k$ and the error covariance matrix $P_k$ are predicted using the state transition matrix $A$, starting from the corrected state $\hat{x}_{k-1}$ and error covariance $P_{k-1}$ of the previous time step. The process noise covariance $Q$ is used in the error covariance prediction. In the correction phase, the Kalman gain represents the relative reliability of the predicted value and the measured value: the larger the Kalman gain, the more the measured value is trusted, and the smaller the Kalman gain, the more the predicted value is trusted. The final corrected state estimate of the LKF is determined according to the Kalman gain, as sketched in the example below.
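The sketch below illustrates one predict/correct cycle of such an LKF, assuming the PDR displacement drives the prediction and the Wi-Fi fusion position serves as the measurement $z_k$; the covariances $Q$ and $R$ are illustrative guesses rather than the tuned values used in the experiments.

```python
# Minimal LKF predict/correct sketch for fusing PDR and Wi-Fi positions.
import numpy as np

A = np.eye(2)                 # state transition matrix (position carried over)
H = np.eye(2)                 # measurement maps directly to position
Q = 0.05 * np.eye(2)          # process noise covariance (PDR drift, assumed)
R = 1.00 * np.eye(2)          # measurement noise covariance (Wi-Fi noise, assumed)

def lkf_step(x_prev, P_prev, pdr_displacement, wifi_position):
    # Prediction: propagate the previous corrected state with the PDR step.
    x_pred = A @ x_prev + pdr_displacement
    P_pred = A @ P_prev @ A.T + Q
    # Correction: weigh the Wi-Fi measurement by the Kalman gain.
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_corr = x_pred + K @ (wifi_position - H @ x_pred)
    P_corr = (np.eye(2) - K @ H) @ P_pred
    return x_corr, P_corr

x, P = np.zeros(2), np.eye(2)
x, P = lkf_step(x, P, pdr_displacement=np.array([0.5, 0.5]),
                wifi_position=np.array([0.7, 0.4]))
print(x)
```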

4. Experiment and Result Analysis

The performance of the proposed sensor fusion framework was evaluated in two experiment scenarios based on the user’s motion: rectangular and linear motion along predefined paths. For collecting data, a user of age 27 and height 172 cm walked along the reference path in the experiment areas with a smartphone in his hand. Figure 6 shows the experiment scenarios; the red lines indicate the reference paths used for the experiments.
For collecting the smartphone sensor and RSSI data, we used a Samsung Galaxy Note 8 smartphone with a Snapdragon 835 processor and 6 GB of RAM. The experiment area uses LG U+ Wi-Fi APs to provide the RSSI signals. In the first experiment, the user followed a rectangular path in a test area of 45 m × 37 m. The test area was divided into 4 × 20 grid points and the Wi-Fi RSSI at each of the points was recorded. To collect the Wi-Fi RSSI, we used a fingerprint Android application that stores the RSSI measurements as comma-separated values (CSV). This application provides an option for selecting the APs required for the experiment; four APs were chosen for localization based on their signal strength in the experiment area. For the second experiment, a linear motion of the user was considered in a 75 m × 3 m corridor, as shown in Figure 6. The corridor was divided into 2 × 60 grid points and the user walked along the reference path. For PDR data collection, an Android application that streams the inertial measurement unit and global positioning system sensor data was used. The application can access the accelerometer, gyroscope, and magnetometer sensors in the smartphone; the user can select any of the sensors and analyze their current values, and the sensor sampling frequency can be adjusted based on the experiment requirements.
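As a concrete illustration of how such logs can be consumed, the sketch below parses a CSV file into per-grid-point RSSI sample lists. The column layout (grid coordinates followed by one RSSI column per AP) and the file name are assumptions for this sketch, not the exact format produced by the application used in the experiments.

```python
# Hypothetical CSV layout: grid_x, grid_y, rssi_ap1, rssi_ap2, rssi_ap3, rssi_ap4
import csv
from collections import defaultdict

def load_rssi_log(path):
    """Group RSSI sample vectors by grid point from a CSV log."""
    samples = defaultdict(list)
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        for row in reader:
            grid_point = (int(row[0]), int(row[1]))
            samples[grid_point].append([float(v) for v in row[2:]])
    return samples

# samples = load_rssi_log("fingerprint_log.csv")  # feeds the fingerprint builder above
```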
The performance of the proposed sensor fusion framework was analyzed using the data from paths 1 and 2. Figure 7 and Figure 8 show the results from the proposed sensor fusion framework and the conventional sensor fusion frameworks for paths 1 and 2. From Figure 7 and Figure 8, it can be seen that the proposed sensor fusion framework achieves lower localization errors than the conventional sensor fusion approaches. In these experiments, the starting position was the origin and the experiments were carried out strictly along the predefined paths. The red line in Figure 7 shows the reference path for the rectangular pedestrian motion and the dashed red line in Figure 8 shows the reference path for the linear user motion. The accuracy of the proposed sensor fusion framework was evaluated by computing the average localization error and the probability distribution of the localization errors. The average localization errors for the proposed sensor fusion framework and the conventional sensor fusion frameworks for paths 1 and 2 are shown in Figure 9 and Figure 10.
From Figure 9 and Figure 10, the proposed sensor fusion framework has lower position errors and gives better performance than the conventional sensor fusion approaches; its average localization error is lower than that of the conventional sensor fusion frameworks. The PDR+Wi-Fi fingerprint approach shows a nearly constant average localization error for most of the experiment time, which is due to the fingerprint map approximation between the online data and the offline data. The PDR+Wi-Fi trilateration approach shows the worst performance among the compared sensor fusion approaches. Its localization accuracy is degraded by multipath effects, and the NLOS conditions and the free-space path loss model yield larger position errors than the other sensor fusion approaches. The probability distribution of the localization errors for the sensor fusion frameworks in both experiment scenarios is shown in Figure 11 and Figure 12.
From the probability distributions of the localization error, it is clear that the proposed sensor fusion framework outperforms the other methods and gives high position accuracy for indoor localization. At the start of the experiment, the proposed sensor fusion framework and the PDR+Wi-Fi fingerprint approach show similar localization errors. As the experiment time increases, the proposed sensor fusion framework shows the highest position accuracy among the compared sensor fusion frameworks. At some points in the experiment, the PDR+Wi-Fi fingerprint and PDR+Wi-Fi trilateration approaches show the same localization errors. Table 2 and Table 3 summarize the performance of the proposed sensor fusion framework versus the conventional sensor fusion frameworks in terms of mean error, maximum error, minimum error, and standard deviation of the error (STD).
From Table 2 and Table 3, the proposed sensor fusion framework has a lower mean error and maximum error than the conventional sensor fusion approaches. The mean error results show that the PDR+Wi-Fi trilateration and PDR+Wi-Fi fingerprint approaches yield similar mean errors, both slightly higher than that of the proposed sensor fusion approach. However, the maximum error of the PDR+Wi-Fi trilateration approach is much higher than those of the other two sensor fusion approaches. The minimum error results indicate that the PDR+Wi-Fi trilateration approach has the smallest minimum error of the three approaches in the rectangular-motion experiment. The standard deviation of the error further validates the advantage of the proposed sensor fusion framework over the conventional sensor fusion frameworks. From all the experiment results and analysis, we conclude that the proposed sensor fusion approach significantly outperforms the conventional sensor fusion approaches and achieves high position accuracy for indoor localization.

5. Conclusions and Future Work

This paper proposed a sensor fusion framework for indoor localization using Wi-Fi RSSI signals and smartphone sensors. In the proposed sensor fusion framework model, we used a PDR algorithm that reduces the smartphone sensor errors and gives accurate position results for indoor localization. For the Wi-Fi localization system, we proposed a Wi-Fi fusion algorithm that achieves better performance than conventional Wi-Fi localization systems. To combine the position results from the PDR and Wi-Fi systems, we proposed a sensor fusion framework model that shows better indoor localization accuracy than conventional Wi-Fi localization systems. The results from the rectangular motion experiment show that the proposed sensor fusion framework has a maximum error of 1.17 m with respect to the predefined path; for linear motion, the maximum error is 0.44 m. From the experiments and results, the proposed sensor fusion framework showed reasonable localization accuracy for indoor localization with fewer IMU sensor errors and mitigated the Wi-Fi RSSI signal fluctuation problems. In future work, we will mainly focus on multiple-pedestrian localization and will carry out experiments with multiple users in indoor scenarios.

Author Contributions

Writing—original draft, A.P. and J.K.; Writing—review & editing, D.S.H.

Funding

This work was supported by Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (2016-0-00564, Development of Intelligent Interaction Technology Based on Context Awareness and Human Intention Understanding).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Xu, R.; Chen, W.; Xu, Y.; Ji, S. A new indoor positioning system architecture using GPS signals. Sensors 2015, 15, 10074–10087. [Google Scholar] [CrossRef]
  2. Oussar, Y.; Ahriz, I.; Denby, B.; Dreyfus, G. Indoor localization based on cellular telephony RSSI fingerprints containing very large numbers of carriers. EURASIP J. Wirel. Commun. Network. 2011, 2011, 81. [Google Scholar] [CrossRef]
  3. Alarifi, A.; Al-Salman, A.; Alsaleh, M.; Alnafessah, A.; Al-Hadhrami, S.; Al-Ammar, M.; Al-Khalifa, H. Ultra wideband indoor positioning technologies: Analysis and recent advances. Sensors 2016, 16, 707. [Google Scholar] [CrossRef]
  4. Kuang, J.; Niu, X.; Zhang, P.; Chen, X. Indoor positioning based on pedestrian dead reckoning and magnetic field matching for smartphones. Sensors 2018, 18, 4142. [Google Scholar] [CrossRef]
  5. Saab, S.S.; Nakad, Z.S. A standalone RFID indoor positioning system using passive tags. IEEE Trans. Ind. Electron. 2011, 58, 1961–1970. [Google Scholar] [CrossRef]
  6. Zhuang, Y.; Yang, J.; Li, Y.; Qi, L.; El-Sheimy, N. Smartphone-based indoor localization with bluetooth low energy beacons. Sensors 2016, 16, 596. [Google Scholar] [CrossRef] [PubMed]
  7. Guo, X.; Shao, S.; Ansari, N.; Khreishah, A. Indoor localization using visible light via fusion of multiple classifiers. IEEE Photon. J. 2017, 9, 1–16. [Google Scholar] [CrossRef]
  8. Uradzinski, M.; Guo, H.; Liu, X.; Yu, M. Advanced indoor positioning using Zigbee wireless technology. Wirel. Pers. Commun. 2017, 97, 6509–6518. [Google Scholar] [CrossRef]
  9. Husen, M.; Lee, S. Indoor location sensing with invariant Wi-Fi received signal strength fingerprinting. Sensors 2016, 16, 1898. [Google Scholar] [CrossRef]
  10. Jiang, P.; Zhang, Y.; Fu, W.; Liu, H.; Su, X. Indoor mobile localization based on Wi-Fi fingerprint’s important access point. Int. J. Distrib. Sens. Netw. 2015, 11, 429104. [Google Scholar] [CrossRef]
  11. Liu, H.; Yang, J.; Sidhom, S.; Wang, Y.; Chen, Y.; Ye, F. Accurate WiFi based localization for smartphones using peer assistance. IEEE Trans. Mob. Comput. 2014, 13, 2199–2214. [Google Scholar] [CrossRef]
  12. Kim, Y.; Chon, Y.; Cha, H. Smartphone-based collaborative and autonomous radio fingerprinting. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2012, 42, 112–122. [Google Scholar] [CrossRef]
  13. Xiao, W.; Ni, W.; Toh, Y.K. Integrated Wi-Fi fingerprinting and inertial sensing for indoor positioning. In Proceedings of the 2011 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Guimaraes, Portugal, 21–23 September 2011; pp. 1–6. [Google Scholar]
  14. Tian, Z.; Fang, X.; Zhou, M.; Li, L. Smartphone-based indoor integrated WiFi/MEMS positioning algorithm in a multi-floor environment. Micromachines 2015, 6, 347–363. [Google Scholar] [CrossRef]
  15. Tarrío, P.; Besada, J.A.; Casar, J.R. Fusion of RSS and inertial measurements for calibration-free indoor pedestrian tracking. In Proceedings of the 16th International Conference on Information Fusion, Istanbul, Turkey, 9–12 July 2013; pp. 1458–1464. [Google Scholar]
  16. Poulose, A.; Eyobu, O.S.; Han, D.S. A Combined PDR and Wi-Fi Trilateration Algorithm for Indoor Localization. In Proceedings of the 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), Okinawa, Japan, 11–13 February 2019; pp. 72–77. [Google Scholar]
  17. Poulose, A.; Eyobu, O.S.; Han, D.S. An Indoor Position-Estimation Algorithm Using Smartphone IMU Sensor Data. IEEE Access 2019, 7, 11165–11177. [Google Scholar] [CrossRef]
  18. Ho, N.-H.; Truong, P.; Jeong, G.-M. Step-detection and adaptive step-length estimation for pedestrian dead-reckoning at various walking speeds using a smartphone. Sensors 2016, 16, 1423. [Google Scholar] [CrossRef] [PubMed]
  19. Yuan, X.; Yu, S.; Zhang, S.; Wang, G.; Liu, S. Quaternion-based unscented Kalman filter for accurate indoor heading estimation using wearable multi-sensor system. Sensors 2015, 15, 10872–10890. [Google Scholar] [CrossRef] [PubMed]
  20. Kang, W.; Han, Y. SmartPDR: Smartphone-based pedestrian dead reckoning for indoor localization. IEEE Sens. J. 2015, 15, 2906–2916. [Google Scholar] [CrossRef]
  21. Evennou, F.; Marx, F. Advanced integration of WiFi and inertial navigation systems for indoor mobile positioning. Eur. J. Appl. Signal Process. 2006, 2006, 164. [Google Scholar] [CrossRef]
  22. Wang, H.; Lenz, H.; Szabo, A.; Bamberger, J.; Hanebeck, U.D. WLAN-based pedestrian tracking using particle filters and low-cost MEMS sensors. In Proceedings of the 2007 4th Workshop on Positioning, Navigation and Communication (WPNC), Hannover, Germany, 22 March 2007; pp. 1–7. [Google Scholar]
  23. Carrera, J.L.; Zhao, Z.; Braun, T.; Li, Z. A real-time indoor tracking system by fusing inertial sensor, radio signal and floor plan. In Proceedings of the 2016 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Alcala de Henares, Spain, 4–7 October 2016; pp. 1–8. [Google Scholar]
  24. Knauth, S. Smartphone PDR positioning in large environments employing WiFi, particle filter, and backward optimization. In Proceedings of the 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sapporo, Japan, 18–21 September 2017; pp. 1–6. [Google Scholar]
  25. Frank, K.; Krach, B.; Catterall, N.; Robertson, P. Development and evaluation of a combined WLAN & inertial indoor pedestrian positioning system. In Proceedings of the ION GNSS, Savannah, Georgia, 22–25 September 2009; pp. 1–9. [Google Scholar]
  26. Zhuang, Y.; El-Sheimy, N. Tightly-coupled integration of WiFi and MEMS sensors on handheld devices for indoor pedestrian navigation. IEEE Sens. J. 2016, 16, 224–234. [Google Scholar] [CrossRef]
  27. Deng, Z.-A.; Hu, Y.; Yu, J.; Na, Z. Extended Kalman filter for real time indoor localization by fusing WiFi and smartphone inertial sensors. Micromachines 2015, 6, 523–543. [Google Scholar] [CrossRef]
  28. Leppäkoski, H.; Collin, J.; Takala, J. Pedestrian navigation based on inertial sensors, indoor map, and WLAN signals. J. Signal Process. Syst. 2013, 71, 287–296. [Google Scholar] [CrossRef]
  29. Chen, G.; Meng, X.; Wang, Y.; Zhang, Y.; Tian, P.; Yang, H. Integrated WiFi/PDR/Smartphone using an unscented kalman filter algorithm for 3D indoor localization. Sensors 2015, 15, 24595–24614. [Google Scholar] [CrossRef] [PubMed]
  30. Waqar, W.; Chen, Y.; Vardy, A. Incorporating user motion information for indoor smartphone positioning in sparse Wi-Fi environments. In Proceedings of the 17th ACM international conference on Modeling, Analysis and Simulation of Wireless and Mobile Systems (ACM), Montreal, QC, Canada, 21–26 September 2014; pp. 267–274. [Google Scholar]
  31. Li, Y.; Zhuang, Y.; Lan, H.; Zhou, Q.; Niu, X.; El-Sheimy, N. A hybrid WiFi/magnetic matching/PDR approach for indoor navigation with smartphone sensors. IEEE Commun. Lett. 2016, 20, 169–172. [Google Scholar] [CrossRef]
  32. Li, Z.; Liu, C.; Gao, J.; Li, X. An improved WiFi/PDR integrated system using an adaptive and robust filter for indoor localization. ISPRS Int. J. Geo-Inf. 2016, 5, 224. [Google Scholar] [CrossRef]
  33. Kim, P. Kalman Filter for Beginners: With MATLAB Examples; CreateSpace: Scotts Valley, CA, USA, 2011. [Google Scholar]
  34. Diaz, E.M.; Gonzalez, A.L.M. Step detector and step length estimator for an inertial pocket navigation system. In Proceedings of the 2014 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Busan, Korea, 27–30 October 2014; pp. 105–110. [Google Scholar]
  35. Zhang, M.; Wen, Y.; Chen, J.; Yang, X.; Gao, R.; Zhao, H. Pedestrian dead-reckoning indoor localization based on OS-ELM. IEEE Access 2018, 6, 6116–6129. [Google Scholar] [CrossRef]
  36. Shchekotov, M. Indoor localization method based on wi-fi trilateration technique. In Proceedings of the 16th Conference of Fruct Association, Oulu, Finland, 27–31 October 2014; pp. 177–179. [Google Scholar]
  37. Ilci, V.; Alkan, R.M.; Gülal, V.E.; Cizmeci, H. Trilateration technique for WiFi-based indoor localization. In Proceedings of the 11th International Conference on Wireless and Mobile Communications (ICWMC), St. Julians, Malta, 11–16 October 2015; p. 36. [Google Scholar]
  38. Yim, J.; Jeong, S.; Gwon, K.; Joo, J. Improvement of Kalman filters for WLAN based indoor tracking. Expert Syst. Appl. 2010, 37, 426–433. [Google Scholar] [CrossRef]
  39. Yim, J. Development of Web Services for WLAN-based Indoor Positioning and Floor Map Repositories. Int. J. Control Autom. 2014, 7, 63–74. [Google Scholar] [CrossRef]
  40. Navarro, E.; Peuker, B.; Quan, M.; Clark, C.; Jipson, J. Wi-Fi Localization Using RSSI Fingerprinting; Citeseer, 2010. Available online: http://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=1007&context=cpesp (accessed on 18 March 2019).
  41. Zegeye, W.K.; Amsalu, S.B.; Astatke, Y.; Moazzami, F. WiFi RSS fingerprinting indoor localization for mobile devices. In Proceedings of the 2016 IEEE 7th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), New York, NY, USA, 20–22 October 2016; pp. 1–6. [Google Scholar]
  42. Khan, M.; Kai, Y.D.; Gul, H.U. Indoor Wi-Fi positioning algorithm based on combination of location fingerprint and unscented Kalman filter. In Proceedings of the 2017 14th International Bhurban Conference on Applied Sciences and Technology (IBCAST), Islamabad, Pakistan, 10–14 January 2017; pp. 693–698. [Google Scholar]
  43. Retscher, G. Fusion of location fingerprinting and trilateration based on the example of differential wi-fi positioning. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 4, 377–384. [Google Scholar] [CrossRef]
  44. Chan, S.; Sohn, G. Indoor localization using wi-fi based fingerprinting and trilateration techiques for lbs applications. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 38, C26. [Google Scholar] [CrossRef]
  45. Elliott, R.J.; Aggoun, L.; Moore, J.B. Hidden Markov Models: Estimation And Control; Springer Science & Business Media: New York, NY, USA, 28 July 2008; Volume 29. [Google Scholar]
Figure 1. Proposed sensor fusion framework for PDR and Wi-Fi.
Figure 2. Wi-Fi localization system using trilateration.
Figure 3. Wi-Fi fingerprint flow chart [41].
Figure 4. Location estimation procedure.
Figure 5. LKF implementation.
Figure 6. Experiment scenarios.
Figure 7. Trajectories of ground truth, proposed sensor fusion framework, and conventional sensor fusion frameworks from rectangular motion.
Figure 8. Trajectories of ground truth, proposed sensor fusion framework, and conventional sensor fusion frameworks from linear motion.
Figure 9. Average localization error from rectangular motion.
Figure 10. Average localization error from linear motion.
Figure 11. Probability distribution of the localization error for rectangular motion. (a) PDR+Wi-Fi trilateration. (b) PDR+Wi-Fi fingerprint. (c) Proposed sensor fusion framework.
Figure 12. Probability distribution of the localization error for linear motion. (a) PDR+Wi-Fi trilateration. (b) PDR+Wi-Fi fingerprint. (c) Proposed sensor fusion framework.
Table 1. Variables used in the LKF implementation.

$\hat{x}_k$: state variable
$P_k$: error covariance matrix
$A$: state transition matrix
$k$: index for the recursive execution of the Kalman filter
$\hat{x}_{k-1}$, $P_{k-1}$: previous state estimate and error covariance, used for internal computation
$Q$: noise covariance of the process
$K_k$: Kalman gain
$z_k$: measurement
$H$: matrix for expressing the predicted value in the form of a measured value
$R$: covariance of the measurement noise
Table 2. Performance of different sensor fusion frameworks for path 1 rectangular motion.

Position Estimation Systems | Mean Error (m) | Max. Error (m) | Min. Error (m) | Standard Deviation of Error (m)
PDR+Wi-Fi Trilateration | 0.54 | 1.7 | 0.01 | 0.45
PDR+Wi-Fi Fingerprint | 0.50 | 1.38 | 0.5 | 0.39
Proposed sensor fusion framework | 0.42 | 1.17 | 0.02 | 0.25
Table 3. Performance of different sensor fusion frameworks for path 2 straight-line motion.

Position Estimation Systems | Mean Error (m) | Max. Error (m) | Min. Error (m) | Standard Deviation of Error (m)
PDR+Wi-Fi Trilateration | 0.35 | 0.86 | 0.04 | 0.23
PDR+Wi-Fi Fingerprint | 0.30 | 0.78 | 0.008 | 0.21
Proposed sensor fusion framework | 0.22 | 0.44 | 0.007 | 0.12
