Touch screens are rapidly penetrating our daily lives. Even general users enjoy various applications, such as drawing sketches and playing games, on their mobile phones. In particular, improved graphics capabilities and large display screens now allow easy access at any location to 3D environments, including 3D games, animation, and augmented reality applications, that were formerly limited to desktop settings. Touch screen devices have distinct features compared with earlier mobile phones: the screen is larger, and the device is operated mainly through touch input. As a natural consequence, users often want to be able to feel the objects they touch.

From the viewpoint of haptics on surfaces, a variety of haptic rendering and interaction techniques have been proposed to make interactions on surfaces richer and more natural. One approach to adding tactility to applications is to provide frictional feedback on touch surfaces [Bau and Poupyrev 2012; Kim et al. 2013; Winfield et al. 2007]. These technologies plausibly reproduce fine textures, such as craters on the surface of the Moon or the lines on the palm of a hand. However, they are not suited to reproducing surfaces whose contours extend beyond the fingertip, such as the large-scale geometry of the Moon or the palm itself. To address this issue, a recent work proposed a lateral haptic display that provides two-dimensional force feedback slightly above the screen [Saga and Deguchi 2012]. The authors exploited the phenomenon that people tend to perceive that they are touching 3D objects when they receive force feedback along or against their direction of movement on a 2D surface. A similar technique was applied to image-based haptic interaction in another recent work [Kim and Kwon 2013].
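To illustrate the underlying idea of movement-dependent lateral force feedback, the following sketch shows one common formulation (not necessarily the cited authors' exact method): the 2D force is taken proportional to the negative gradient of a virtual height map, so that dragging a finger across the flat screen produces resistance on uphill slopes, which users tend to interpret as 3D relief. The function name, gain parameter, and height-map representation are illustrative assumptions.

```python
import numpy as np

def lateral_force(height_map, x, y, gain=1.0):
    """Return an illustrative 2D lateral force vector at pixel (x, y).

    height_map : 2D array of virtual surface heights (e.g. derived from an image).
    gain       : assumed scaling factor mapping slope to force magnitude.

    The force is proportional to the negative height gradient, so it
    opposes motion up a rising slope, mimicking contact with 3D geometry.
    """
    # np.gradient returns per-axis gradients: rows (y) first, then columns (x).
    gy, gx = np.gradient(height_map.astype(float))
    return np.array([-gain * gx[y, x], -gain * gy[y, x]])

# Example: a ramp rising along +x yields a force pointing in -x,
# i.e. resistance against a finger sliding uphill.
ramp = np.tile(np.arange(8, dtype=float), (8, 1))
f = lateral_force(ramp, 4, 4)
```

The same principle extends directly to image-based rendering, where the height map is estimated from image intensity or depth.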
Such sensations can be made more convincing by considering one important feature of haptics: collocation, which is often ignored in surface interaction because the interaction space is limited to a 2D plane. This is a substantially different setup from conventional haptic rendering, which generally collocates haptic information with the real interaction space. A recent study addressed this issue by proposing a one-dimensional translational robot system [Sinclair et al. 2013]. Their robotic touch display allows users to explore virtual 3D content, such as volumetric medical images, by moving the display surface along the z-axis perpendicular to the screen. This work is significant as the first to take collocation into consideration in the context of surface haptics. Another recent study addressed the issue by proposing a haptic stylus with a variable tip length [Withana et al. 2010]. The proposed system is designed to give users the illusion that part of the stylus is immersed in the virtual space, enabling direct touch of virtual objects displayed on the flat surface.