Visual navigation is a challenging problem in robotics, and many navigation approaches rely on localizing a mobile robot in its environment. The snapshot model is a biologically inspired model of insect homing behaviour: instead of a complex localization process, it uses a simple algorithm that compares snapshot images taken at the current position and at the destination. Here, we propose a new homing navigation method based on a moment measure that efficiently characterizes the snapshot image. The method uses range values or pixel values of surrounding landmarks to define a moment measure of the environmental features, i.e. the landmark distribution, and this measure forms a convex landscape over robot positions in the environment. By descending this landscape, the mobile robot can return home successfully. Either range sensors or image sensors can provide sufficient information to build the landscape. Our experimental results demonstrate that the method is effective even in real environments.
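The abstract does not give the exact form of the moment measure, so the following is only a minimal sketch of the general idea: a convex measure over landmark ranges whose landscape the robot descends to reach home. The measure used here (mean squared range to the landmarks) and the toy environment, in which home coincides with the landscape minimum at the landmark centroid, are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def moment_measure(pos, landmarks):
    """Illustrative moment measure: mean squared range from the robot
    position to all landmarks. A stand-in for the paper's measure; it is
    convex in the robot position, with its minimum at the landmark centroid."""
    return np.mean(np.sum((landmarks - pos) ** 2, axis=1))

def home_by_descent(start, landmarks, step=0.1, iters=500, eps=1e-4):
    """Descend the moment-measure landscape by numerical gradient descent."""
    pos = np.asarray(start, dtype=float)
    for _ in range(iters):
        # Central-difference gradient of the measure at the current position.
        grad = np.array([
            (moment_measure(pos + eps * e, landmarks)
             - moment_measure(pos - eps * e, landmarks)) / (2 * eps)
            for e in np.eye(2)
        ])
        pos -= step * grad
    return pos

# Toy environment: four landmarks; home is placed at their centroid,
# where this particular measure attains its minimum.
landmarks = np.array([[0.0, 2.0], [2.0, 0.0], [-2.0, 0.0], [0.0, -2.0]])
home = landmarks.mean(axis=0)
reached = home_by_descent([1.5, -1.2], landmarks)
```

Because the landscape is convex, the descent converges to home from any starting position in this toy setup; the paper's contribution is a measure whose landscape has this property around the actual home location.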
|Title of host publication||From Animals to Animats - 14th International Conference on Simulation of Adaptive Behavior, SAB 2016, Proceedings|
|Editors||John Hallam, Elio Tuci, Alexandros Giagkos, Myra Wilson|
|Number of pages||12|
|Publication status||Published - 2016|
|Event||14th International Conference on Simulation of Adaptive Behavior, SAB 2016 - Aberystwyth, United Kingdom|
Duration: 2016 Aug 23 → 2016 Aug 26
|Name||Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)|
|Other||14th International Conference on Simulation of Adaptive Behavior, SAB 2016|
|Period||16/8/23 → 16/8/26|
|Bibliographical note|
Funding Information:
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MEST) (No. 2014R1A2A1A11053839).
© Springer International Publishing Switzerland 2016.
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Computer Science (all)