Real-Time Visual-Inertial SLAM Based on Adaptive Keyframe Selection for Mobile AR Applications

Jin Chun Piao, Shin Dug Kim

Research output: Contribution to journal › Article

Abstract

Simultaneous localization and mapping (SLAM) technology is used in many applications, such as augmented reality (AR)/virtual reality, robots, drones, and self-driving vehicles. In AR applications, rapid camera motion estimation, actual size, and scale are important issues. In this research, we introduce a real-time visual-inertial SLAM based on an adaptive keyframe selection for mobile AR applications. Specifically, the SLAM system is designed based on the adaptive keyframe selection visual-inertial odometry method that includes the adaptive keyframe selection method and the lightweight visual-inertial odometry method. The inertial measurement unit data are used to predict the motion state of the current frame and it is judged whether or not the current frame is a keyframe by an adaptive selection method based on learning and automatic setting. Relatively unimportant frames (not a keyframe) are processed using a lightweight visual-inertial odometry method for efficiency and real-time performance. We simulate it in a PC environment and compare it with state-of-the-art methods. The experimental results demonstrate that the mean translation root-mean-square error of the keyframe trajectory is 0.067 m without the ground-truth scale matching, and the scale error is 0.58% with the EuRoC dataset. Moreover, the experimental results of the mobile device show that the performance is improved by 34.5%-53.8% using the proposed method.
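The pipeline the abstract describes (IMU data predicts the motion state of each incoming frame; an adaptive rule decides keyframe vs. non-keyframe; non-keyframes get a lightweight visual-inertial odometry pass) is not spelled out in code here. A minimal illustrative sketch follows; every name, the motion-score formula, and the self-adjusting threshold rule are hypothetical stand-ins for the paper's learning/automatic-setting method, not its actual algorithm:

```python
def predict_motion(imu_samples, dt):
    """Crude motion score from IMU data: integrated gyro and accel
    magnitudes over the inter-frame interval (hypothetical model)."""
    gyro = sum(abs(g) for g, _ in imu_samples) * dt
    accel = sum(abs(a) for _, a in imu_samples) * dt
    return gyro + accel

class AdaptiveKeyframeSelector:
    """Keyframe decision with a threshold that adapts to recent motion.
    Stands in for the paper's learned/automatically set criterion."""

    def __init__(self, init_threshold=0.5, alpha=0.9):
        self.threshold = init_threshold
        self.alpha = alpha  # smoothing factor for threshold adaptation

    def is_keyframe(self, motion_score):
        keyframe = motion_score > self.threshold
        # Exponential moving average keeps the threshold tracking
        # typical motion levels, so fast motion yields more keyframes.
        self.threshold = self.alpha * self.threshold + (1 - self.alpha) * motion_score
        return keyframe

def process_frame(imu_samples, selector, dt=0.005):
    """Route each frame: keyframes get the full VIO pipeline,
    the rest get the cheap lightweight VIO update."""
    score = predict_motion(imu_samples, dt)
    if selector.is_keyframe(score):
        return "full_vio"        # keyframe: full tracking + mapping
    return "lightweight_vio"     # non-keyframe: pose-only update
```

With these made-up numbers, a high-motion frame is routed to full VIO and a near-static frame to the lightweight path, which is the efficiency argument the abstract makes.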

Original language: English
Article number: 8698793
Pages (from-to): 2827-2836
Number of pages: 10
Journal: IEEE Transactions on Multimedia
Volume: 21
Issue number: 11
DOI: 10.1109/TMM.2019.2913324
Publication status: Published - 2019 Nov

Fingerprint

  • Augmented reality
  • Units of measurement
  • Motion estimation
  • Mobile devices
  • Mean square error
  • Virtual reality
  • Cameras
  • Trajectories
  • Robots

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Media Technology
  • Computer Science Applications
  • Electrical and Electronic Engineering

Cite this

@article{14a9bc3d7068456ea8c89d7d8c18d7e2,
title = "Real-Time Visual-Inertial SLAM Based on Adaptive Keyframe Selection for Mobile AR Applications",
abstract = "Simultaneous localization and mapping (SLAM) technology is used in many applications, such as augmented reality (AR)/virtual reality, robots, drones, and self-driving vehicles. In AR applications, rapid camera motion estimation, actual size, and scale are important issues. In this research, we introduce a real-time visual-inertial SLAM based on an adaptive keyframe selection for mobile AR applications. Specifically, the SLAM system is designed based on the adaptive keyframe selection visual-inertial odometry method that includes the adaptive keyframe selection method and the lightweight visual-inertial odometry method. The inertial measurement unit data are used to predict the motion state of the current frame and it is judged whether or not the current frame is a keyframe by an adaptive selection method based on learning and automatic setting. Relatively unimportant frames (not a keyframe) are processed using a lightweight visual-inertial odometry method for efficiency and real-time performance. We simulate it in a PC environment and compare it with state-of-the-art methods. The experimental results demonstrate that the mean translation root-mean-square error of the keyframe trajectory is 0.067 m without the ground-truth scale matching, and the scale error is 0.58{\%} with the EuRoC dataset. Moreover, the experimental results of the mobile device show that the performance is improved by 34.5{\%}-53.8{\%} using the proposed method.",
author = "Piao, {Jin Chun} and Kim, {Shin Dug}",
year = "2019",
month = nov,
doi = "10.1109/TMM.2019.2913324",
language = "English",
volume = "21",
pages = "2827--2836",
journal = "IEEE Transactions on Multimedia",
issn = "1520-9210",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "11",
}

Real-Time Visual-Inertial SLAM Based on Adaptive Keyframe Selection for Mobile AR Applications. / Piao, Jin Chun; Kim, Shin Dug.

In: IEEE Transactions on Multimedia, Vol. 21, No. 11, 8698793, 11.2019, p. 2827-2836.

Research output: Contribution to journal › Article

TY  - JOUR
T1  - Real-Time Visual-Inertial SLAM Based on Adaptive Keyframe Selection for Mobile AR Applications
AU  - Piao, Jin Chun
AU  - Kim, Shin Dug
PY  - 2019/11
Y1  - 2019/11
N2  - Simultaneous localization and mapping (SLAM) technology is used in many applications, such as augmented reality (AR)/virtual reality, robots, drones, and self-driving vehicles. In AR applications, rapid camera motion estimation, actual size, and scale are important issues. In this research, we introduce a real-time visual-inertial SLAM based on an adaptive keyframe selection for mobile AR applications. Specifically, the SLAM system is designed based on the adaptive keyframe selection visual-inertial odometry method that includes the adaptive keyframe selection method and the lightweight visual-inertial odometry method. The inertial measurement unit data are used to predict the motion state of the current frame and it is judged whether or not the current frame is a keyframe by an adaptive selection method based on learning and automatic setting. Relatively unimportant frames (not a keyframe) are processed using a lightweight visual-inertial odometry method for efficiency and real-time performance. We simulate it in a PC environment and compare it with state-of-the-art methods. The experimental results demonstrate that the mean translation root-mean-square error of the keyframe trajectory is 0.067 m without the ground-truth scale matching, and the scale error is 0.58% with the EuRoC dataset. Moreover, the experimental results of the mobile device show that the performance is improved by 34.5%-53.8% using the proposed method.
AB  - Simultaneous localization and mapping (SLAM) technology is used in many applications, such as augmented reality (AR)/virtual reality, robots, drones, and self-driving vehicles. In AR applications, rapid camera motion estimation, actual size, and scale are important issues. In this research, we introduce a real-time visual-inertial SLAM based on an adaptive keyframe selection for mobile AR applications. Specifically, the SLAM system is designed based on the adaptive keyframe selection visual-inertial odometry method that includes the adaptive keyframe selection method and the lightweight visual-inertial odometry method. The inertial measurement unit data are used to predict the motion state of the current frame and it is judged whether or not the current frame is a keyframe by an adaptive selection method based on learning and automatic setting. Relatively unimportant frames (not a keyframe) are processed using a lightweight visual-inertial odometry method for efficiency and real-time performance. We simulate it in a PC environment and compare it with state-of-the-art methods. The experimental results demonstrate that the mean translation root-mean-square error of the keyframe trajectory is 0.067 m without the ground-truth scale matching, and the scale error is 0.58% with the EuRoC dataset. Moreover, the experimental results of the mobile device show that the performance is improved by 34.5%-53.8% using the proposed method.
UR  - http://www.scopus.com/inward/record.url?scp=85074572984&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=85074572984&partnerID=8YFLogxK
U2  - 10.1109/TMM.2019.2913324
DO  - 10.1109/TMM.2019.2913324
M3  - Article
AN  - SCOPUS:85074572984
VL  - 21
SP  - 2827
EP  - 2836
JO  - IEEE Transactions on Multimedia
JF  - IEEE Transactions on Multimedia
SN  - 1520-9210
IS  - 11
M1  - 8698793
ER  -