Landmark-based homing navigation using omnidirectional depth information

Changmin Lee, Seung Eun Yu, DaeEun Kim

Research output: Contribution to journal › Article

10 Citations (Scopus)

Abstract

A number of landmark-based navigation algorithms have been studied using feature extraction over visual information. In this paper, we apply distance information about the surrounding environment to a landmark navigation model. We mount a depth sensor on a mobile robot to obtain omnidirectional distance information. The surrounding environment is represented as a circular arrangement of landmark vectors, which forms a snapshot. Inspired by the snapshot model, the depth snapshots at the current position and the target position are compared to determine the homing direction. Here, we suggest a holistic view of panoramic depth information for homing navigation in which each sample point is taken as a landmark. The results are shown as a map of homing vectors. The performance of the suggested method is evaluated in terms of angular error and homing success rate. Omnidirectional depth information about the surrounding environment can be a promising source for landmark-based homing navigation. Our results demonstrate that a holistic approach with omnidirectional depth information achieves effective homing navigation.
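The comparison of depth snapshots described above can be illustrated with a minimal sketch. This is not the authors' algorithm, only an average-landmark-vector (ALV) style reading of the abstract: each depth sample at a known bearing is treated as a landmark vector, the vectors are averaged into one summary vector per snapshot, and the difference between the current and target summaries gives a homing direction. All function names and the toy circular-room environment are hypothetical.

```python
import numpy as np

def homing_vector(depth_current, depth_target):
    """Homing direction from two omnidirectional depth snapshots.

    Each snapshot is a 1-D array of depths sampled at equally spaced
    bearings; every sample point is treated as a landmark, following
    the holistic landmark-vector idea sketched in the abstract.
    """
    n = len(depth_current)
    angles = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    bearings = np.stack([np.cos(angles), np.sin(angles)], axis=1)

    def alv(depths):
        # Average landmark vector: mean of depth-scaled bearing vectors.
        return (depths[:, None] * bearings).mean(axis=0)

    # Moving so that the current ALV matches the target ALV drives the
    # robot back toward the target position.
    v = alv(depth_current) - alv(depth_target)
    return v / np.linalg.norm(v)

# Toy demo: circular room of radius 5 centred at the origin, target at
# the origin (all depths equal 5), robot displaced to (1, 0).
n = 360
angles = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
p = np.array([1.0, 0.0])
proj = p[0] * np.cos(angles) + p[1] * np.sin(angles)
depth_current = -proj + np.sqrt(proj**2 - p @ p + 5.0**2)
depth_target = np.full(n, 5.0)
v = homing_vector(depth_current, depth_target)
# v points from (1, 0) back toward the origin, i.e. roughly (-1, 0).
```

This assumes a shared compass reference between the two snapshots; the paper's actual method, which reports angular errors and homing success rates over a vector map, will differ in detail.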

Original language: English
Article number: 1928
Journal: Sensors (Switzerland)
Volume: 17
Issue number: 8
DOIs: 10.3390/s17081928
Publication status: Published - 2017 Aug 22

Fingerprint

homing
landmarks
navigation
mobile robots
feature extraction
pattern recognition
sensors

All Science Journal Classification (ASJC) codes

  • Analytical Chemistry
  • Atomic and Molecular Physics, and Optics
  • Biochemistry
  • Instrumentation
  • Electrical and Electronic Engineering

Cite this

@article{85b48c6c03a44c269d559f9587fb12a0,
title = "Landmark-based homing navigation using omnidirectional depth information",
abstract = "A number of landmark-based navigation algorithms have been studied using feature extraction over visual information. In this paper, we apply distance information about the surrounding environment to a landmark navigation model. We mount a depth sensor on a mobile robot to obtain omnidirectional distance information. The surrounding environment is represented as a circular arrangement of landmark vectors, which forms a snapshot. Inspired by the snapshot model, the depth snapshots at the current position and the target position are compared to determine the homing direction. Here, we suggest a holistic view of panoramic depth information for homing navigation in which each sample point is taken as a landmark. The results are shown as a map of homing vectors. The performance of the suggested method is evaluated in terms of angular error and homing success rate. Omnidirectional depth information about the surrounding environment can be a promising source for landmark-based homing navigation. Our results demonstrate that a holistic approach with omnidirectional depth information achieves effective homing navigation.",
author = "Changmin Lee and Yu, {Seung Eun} and DaeEun Kim",
year = "2017",
month = "8",
day = "22",
doi = "10.3390/s17081928",
language = "English",
volume = "17",
pages = "1928",
journal = "Sensors",
issn = "1424-3210",
publisher = "Multidisciplinary Digital Publishing Institute (MDPI)",
number = "8",

}

Landmark-based homing navigation using omnidirectional depth information. / Lee, Changmin; Yu, Seung Eun; Kim, DaeEun.

In: Sensors (Switzerland), Vol. 17, No. 8, 1928, 22.08.2017.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Landmark-based homing navigation using omnidirectional depth information

AU - Lee, Changmin

AU - Yu, Seung Eun

AU - Kim, DaeEun

PY - 2017/8/22

Y1 - 2017/8/22

N2 - A number of landmark-based navigation algorithms have been studied using feature extraction over visual information. In this paper, we apply distance information about the surrounding environment to a landmark navigation model. We mount a depth sensor on a mobile robot to obtain omnidirectional distance information. The surrounding environment is represented as a circular arrangement of landmark vectors, which forms a snapshot. Inspired by the snapshot model, the depth snapshots at the current position and the target position are compared to determine the homing direction. Here, we suggest a holistic view of panoramic depth information for homing navigation in which each sample point is taken as a landmark. The results are shown as a map of homing vectors. The performance of the suggested method is evaluated in terms of angular error and homing success rate. Omnidirectional depth information about the surrounding environment can be a promising source for landmark-based homing navigation. Our results demonstrate that a holistic approach with omnidirectional depth information achieves effective homing navigation.

AB - A number of landmark-based navigation algorithms have been studied using feature extraction over visual information. In this paper, we apply distance information about the surrounding environment to a landmark navigation model. We mount a depth sensor on a mobile robot to obtain omnidirectional distance information. The surrounding environment is represented as a circular arrangement of landmark vectors, which forms a snapshot. Inspired by the snapshot model, the depth snapshots at the current position and the target position are compared to determine the homing direction. Here, we suggest a holistic view of panoramic depth information for homing navigation in which each sample point is taken as a landmark. The results are shown as a map of homing vectors. The performance of the suggested method is evaluated in terms of angular error and homing success rate. Omnidirectional depth information about the surrounding environment can be a promising source for landmark-based homing navigation. Our results demonstrate that a holistic approach with omnidirectional depth information achieves effective homing navigation.

UR - http://www.scopus.com/inward/record.url?scp=85028335754&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85028335754&partnerID=8YFLogxK

U2 - 10.3390/s17081928

DO - 10.3390/s17081928

M3 - Article

VL - 17

JO - Sensors

JF - Sensors

SN - 1424-3210

IS - 8

M1 - 1928

ER -