Animals and insects navigate home in various ways, and vision is one of the most widely used senses: many species memorize a snapshot image of the home location and use it to return home from arbitrary locations. Inspired by such behaviours, many homing algorithms have been applied to mobile robots. These methods choose the moving direction by comparing the current landmark view with the snapshot taken at the nest. In this paper, we propose a new image-based navigation method called landmark navigation with distance estimation. The method estimates the distance to each landmark using only the visual information from an omnidirectional camera. The estimated distances are then used to localize the robot in an environmental map. As a result, the new method returns home from an arbitrary location more reliably than previous methods. The corresponding robotic experiments are demonstrated.
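As an illustration of the localization step described above, the following sketch shows one standard way that estimated distances to landmarks with known map positions could be turned into a robot position, using least-squares trilateration. This is not the paper's algorithm; the function name and the assumption of a known landmark map are purely illustrative.

```python
import numpy as np

def localize(landmarks, distances):
    """Estimate the robot's (x, y) position from known landmark
    positions and estimated landmark distances.

    Subtracting the first range equation |p - l_0|^2 = d_0^2 from
    each of the others linearizes the system:
        2 (l_i - l_0) . p = |l_i|^2 - |l_0|^2 - (d_i^2 - d_0^2),
    which is solved in the least-squares sense.
    """
    L = np.asarray(landmarks, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (L[1:] - L[0])
    b = (np.sum(L[1:] ** 2, axis=1) - np.sum(L[0] ** 2)
         - (d[1:] ** 2 - d[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

Once the robot's position is estimated this way, the homing direction is simply the vector from that estimate to the stored home position, rather than a direction inferred from image comparison alone.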