Collaborative SLAM Based on WiFi Fingerprint Similarity and Motion Information

Simultaneous localization and mapping (SLAM) has been extensively researched in recent years, particularly with range-based or vision-based sensors. Rather than deploying dedicated devices that rely on visual features, it is more pragmatic to exploit radio features for this task, given their ubiquitous nature and the widespread deployment of Wi-Fi networks. This article presents a novel approach for collaborative simultaneous localization and radio fingerprint mapping (C-SLAM-RF) in large unknown indoor environments. The proposed system uses received signal strengths (RSS) from Wi-Fi access points (APs) in the existing infrastructure and pedestrian dead reckoning (PDR) from a smartphone, without prior knowledge of the map or the distribution of APs in the environment. A loop closure is detected based on the similarity of two radio fingerprints. To further improve performance, we incorporate turning motion and assign a small uncertainty value to a loop closure if a matched turning is identified. Experiments were conducted in a 130 m by 70 m area, and the results show that the proposed system estimates the tracks of four users with an accuracy of 0.6 m with Tango-based PDR and 4.76 m with step-counter-based PDR.
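The abstract describes detecting loop closures from the similarity of two radio fingerprints. The paper's exact similarity metric is not given here, but the idea can be illustrated with a minimal sketch: treat each fingerprint as a map from AP identifier to RSS (in dBm) and compute a cosine-style similarity over the union of APs, penalising APs heard in only one fingerprint with a weak default reading. The function name, the default reading of -100 dBm, and the exact weighting are assumptions for illustration, not the authors' method.

```python
import math

def fingerprint_similarity(fp_a, fp_b, missing_rss=-100.0):
    """Cosine-style similarity between two Wi-Fi fingerprints.

    Each fingerprint maps an AP identifier (e.g. MAC address) to an
    RSS value in dBm. APs observed in only one fingerprint fall back
    to a weak default reading, so non-overlapping APs reduce the score.
    """
    aps = set(fp_a) | set(fp_b)
    # Shift readings so stronger signals map to larger positive weights.
    a = [fp_a.get(ap, missing_rss) - missing_rss for ap in aps]
    b = [fp_b.get(ap, missing_rss) - missing_rss for ap in aps]
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# A fingerprint compared with itself scores 1.0; fingerprints with no
# common APs score 0.0. A loop closure would be claimed when the score
# exceeds some threshold.
```

In a system like the one described, such a score would feed a pose-graph optimizer as a loop-closure constraint, with the matched-turning check used to tighten the constraint's uncertainty.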

Authors:
Liu Ran, Marakkalage Sumudu Hasala, Padmal Madhushanka, Shaganan Thiruketheeswaran, Yuen Chau, Guan Yong Liang

Publication type:
A1 Journal article – refereed

Keywords:
Radio navigation, radio propagation, sensor fusion, simultaneous localization and mapping, trajectory optimization

Published:
March 2020
Full citation:
R. Liu et al., “Collaborative SLAM Based on WiFi Fingerprint Similarity and Motion Information,” in IEEE Internet of Things Journal, vol. 7, no. 3, pp. 1826-1840, March 2020, doi: 10.1109/JIOT.2019.2957293

DOI:
https://doi.org/10.1109/JIOT.2019.2957293

Read the publication here:
http://urn.fi/urn:nbn:fi-fe2020060440594