Easily establishing pairing between Internet-of-Things (IoT) devices is important for fast deployment in many smart home scenarios. Traditional pairing methods, including passkeys, QR codes, and RFID, often require specific user interfaces, surface shapes/materials, or additional tags/readers. The growing number of low-resource IoT devices without an interface may not meet these requirements, which makes their pairing a challenge. On the other hand, these devices often already have sensors embedded for sensing tasks, such as inertial sensors. These sensors can be used for limited user interaction with the devices, but are not suitable for pairing on their own. In this paper, we present UniverSense, an alternative pairing method between low-resource IoT devices with an inertial sensor and a more powerful networked device equipped with a camera. To establish pairing between them, the user moves the low-resource IoT device in front of the camera. Both the camera and the on-device sensors capture the physical motion of the low-resource device. UniverSense converts these signals into a common state space to generate fingerprints for pairing. We conduct real-world experiments to evaluate UniverSense; it achieves an F1 score of 99.9% in experiments carried out by five participants.
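The abstract's core idea of projecting the camera track and the inertial readings into a common state space and comparing fingerprints can be illustrated with a minimal sketch. This is not the paper's implementation: the choice of acceleration as the shared state space, the normalized cross-correlation comparison, and all function names (`camera_to_accel`, `fingerprint_similarity`) are assumptions for illustration only.

```python
import numpy as np

def camera_to_accel(positions, dt):
    """Differentiate camera-tracked positions twice so the camera signal
    lives in the same state space (acceleration) as the inertial sensor."""
    velocity = np.gradient(positions, dt)
    return np.gradient(velocity, dt)

def normalize(sig):
    """Zero-mean, unit-norm signal so amplitude and bias do not matter."""
    sig = sig - sig.mean()
    norm = np.linalg.norm(sig)
    return sig / norm if norm > 0 else sig

def fingerprint_similarity(cam_accel, imu_accel):
    """Peak of the normalized cross-correlation between the two traces;
    values near 1 suggest both sensors observed the same motion."""
    a, b = normalize(cam_accel), normalize(imu_accel)
    return float(np.max(np.correlate(a, b, mode="full")))

# Toy demo: one underlying motion observed by both sensors (IMU is noisy).
dt = 0.01
t = np.arange(0.0, 2.0, dt)
motion = np.sin(2 * np.pi * 1.5 * t)              # device trajectory
cam = camera_to_accel(motion, dt)                 # camera-derived acceleration
imu = np.gradient(np.gradient(motion, dt), dt) \
      + 0.05 * np.random.randn(len(t))            # on-device accelerometer
print(fingerprint_similarity(cam, imu))           # high for matching motion
```

A real system would additionally have to synchronize clocks, resample the two streams to a common rate, and pick a pairing threshold; those steps are omitted here.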
|Title of host publication||HotMobile 2018 - Proceedings of the 19th International Workshop on Mobile Computing Systems and Applications|
|Publisher||Association for Computing Machinery, Inc|
|Number of pages||6|
|Publication status||Published - 2018 Feb 12|
|Event||19th International Workshop on Mobile Computing Systems and Applications, HotMobile 2018 - Tempe, United States|
Duration: 2018 Feb 12 → 2018 Feb 13
|Name||HotMobile 2018 - Proceedings of the 19th International Workshop on Mobile Computing Systems and Applications|
|Conference||19th International Workshop on Mobile Computing Systems and Applications, HotMobile 2018|
|Period||18/2/12 → 18/2/13|
Bibliographical note
Funding Information:
This research was supported in part by the National Science Foundation (under grants CNS-1149611, CMMI-1653550 and CNS-1645759), Intel and Google. The views and conclusions contained here are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either express or implied, of CMU, NSF, or the U.S. Government or any of its agencies.
© 2018 Association for Computing Machinery.
All Science Journal Classification (ASJC) codes
- Human-Computer Interaction
- Computer Science Applications
- Computer Networks and Communications