Many insects return home by using environmental landmarks. They remember the image seen at the nest and determine the homeward direction by comparing it with the currently perceived image. Robotic studies have modeled this landmark navigation, focusing on how the image-matching process can lead an agent back to the nest from an arbitrary starting point. In Franz's navigation algorithm, an agent estimates how the image would change under each of its possible movements and evaluates which movement direction produces the image pattern most similar to the snapshot taken at the nest; it then chooses the best image-matching direction. Based on this idea, we suggest a new navigation approach in which the image is divided into several sectors and sector-based image matching is applied, checking the occupancy and the distance variation of each sector. As a result, the proposed method shows better performance than Franz's algorithm.
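The sector-based matching idea can be illustrated with a minimal Python sketch. It assumes a 1-D panoramic intensity profile, a simple mean-threshold landmark detector, and a matching cost based on sector occupancy only (the distance-variation term is omitted for brevity); all function names and parameters here are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

def sector_occupancy(image, n_sectors=8):
    """Split a 1-D panoramic intensity profile into sectors and return
    the occupancy (fraction of 'landmark' pixels) of each sector.
    Thresholding at the mean intensity is an assumed, simplified
    landmark detector."""
    image = np.asarray(image, dtype=float)
    threshold = image.mean()
    sectors = np.array_split(image, n_sectors)
    return np.array([np.mean(s > threshold) for s in sectors])

def best_homing_direction(snapshot, current, candidate_shifts):
    """Choose the candidate rotation (in sectors) whose shifted current
    view best matches the nest snapshot's sector occupancy, i.e. the
    movement that minimizes the sector-wise matching cost."""
    snap = sector_occupancy(snapshot)
    best_shift, best_cost = None, np.inf
    for shift in candidate_shifts:
        curr = np.roll(sector_occupancy(current), shift)
        cost = np.sum((snap - curr) ** 2)  # squared occupancy difference
        if cost < best_cost:
            best_shift, best_cost = shift, cost
    return best_shift
```

For example, if the current panoramic view is the nest snapshot shifted by one sector, `best_homing_direction` recovers the shift that re-aligns the two occupancy patterns.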