Insects have a navigation ability to estimate the direction to their habitat using visual information after finding food. Many insects, including ants, are known to use a visual snapshot of their surroundings for homing navigation. Inspired by this ability, many navigation algorithms have been suggested. One of them is the average landmark vector (ALV) algorithm, which calculates the direction to the target location relative to the current location. This algorithm is based on the observation of landmarks in the visual input, and observing and identifying landmarks in a real environment is a challenging problem. For the snapshot model, feature extraction from the visual image plays an important role, and segmentation or clustering over color pixels may not provide a robust way to find landmarks. In this paper, we suggest that vertical edge features combined with neighboring pixel colors can be an efficient and effective solution for identifying landmarks. These vertical edge features are not warped by camera movement, and they remain stable as the robot moves. We test a new algorithm that detects these vertical edge features as landmarks and finds the correspondence between the landmarks observed at the nest and at the current location. As a result, the algorithm easily determines the homing direction.
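The core ALV computation mentioned above can be sketched briefly. The following is a minimal illustration, not the paper's implementation: it assumes each landmark is reduced to a bearing angle (in radians) from the agent, averages the corresponding unit vectors, and takes the difference of the two averages as the home vector. The sign convention (current minus nest) is an assumption; the actual landmark detection and correspondence steps described in the paper are not shown.

```python
import math

def average_landmark_vector(bearings):
    # Average of unit vectors pointing toward each landmark bearing (radians).
    n = len(bearings)
    x = sum(math.cos(b) for b in bearings) / n
    y = sum(math.sin(b) for b in bearings) / n
    return (x, y)

def homing_direction(current_bearings, nest_bearings):
    # Home vector as the difference of the two ALVs
    # (assumed convention: ALV at current location minus ALV at nest).
    cx, cy = average_landmark_vector(current_bearings)
    nx, ny = average_landmark_vector(nest_bearings)
    return (cx - nx, cy - ny)
```

For example, with landmarks at (2, 0) and (0, 2), a nest at the origin, and the agent at (1, 1), the nest bearings are 0 and pi/2 and the current bearings are -pi/4 and 3*pi/4; the resulting home vector (-0.5, -0.5) points from the agent back toward the nest.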