Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch

Research output: Contribution to journal › Article

11 Citations (Scopus)

Abstract

Background/aims: The diagnosis of skin conditions depends on the assessment of skin surface properties that are better conveyed by tactile properties such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases or disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch. Methods: Our development consists of two stages: converting a single image to a 3D haptic surface and rendering the generated haptic surface in real time. The conversion from 2D single images to 3D surfaces was implemented using human perception data collected in a psychophysical experiment that measured human visual and haptic sensitivity to 3D skin surface changes. For the second stage, we utilized real skin biomechanical properties reported in prior studies. Our tactile rendering system is a standalone system that can be used with any single camera and haptic feedback device. Results: We evaluated the performance of our system by conducting an identification experiment with three different skin images and five subjects. The participants had to identify one of the three skin surfaces using a haptic device (Falcon) only; no visual cue was provided. The results indicate that our system renders tactile feedback that makes different skin surfaces discernible. Conclusion: Our system uses only a single skin image and automatically generates a 3D haptic surface based on human haptic perception. Realistic skin interactions can be provided in real time for skin diagnosis, simulation, or training. Our system can also be used in other applications such as virtual reality and cosmetics.
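The abstract does not give implementation details, but the two-stage pipeline it describes (single image → 3D haptic surface → real-time force feedback) can be sketched roughly as follows. This is a minimal illustration only: the linear intensity-to-depth mapping, the `depth_gain`, `px_size`, and `stiffness` values, and the penalty-based spring force are common haptic-rendering assumptions, not the authors' perception-calibrated model.

```python
import numpy as np

def image_to_heightmap(gray, depth_gain=0.5e-3):
    """Map pixel intensity (0-255) to surface height in meters.

    A simple linear model; the paper instead derives its mapping from
    psychophysical perception data.
    """
    return (gray.astype(np.float64) / 255.0) * depth_gain

def contact_force(probe_pos, heightmap, px_size=0.1e-3, stiffness=300.0):
    """Penalty-based haptic force: a spring pushes the probe out of the surface.

    probe_pos: (x, y, z) in meters; stiffness in N/m is an illustrative value.
    """
    x, y, z = probe_pos
    i = int(round(y / px_size))       # nearest heightmap row
    j = int(round(x / px_size))       # nearest heightmap column
    penetration = heightmap[i, j] - z  # > 0 when the probe is below the surface
    if penetration <= 0.0:
        return np.zeros(3)            # no contact, no force
    return np.array([0.0, 0.0, stiffness * penetration])
```

In a real loop, a haptic device such as the Falcon would supply `probe_pos` at ~1 kHz and receive the returned force vector each cycle.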

Original language: English
Pages (from-to): 164-174
Number of pages: 11
Journal: Skin Research and Technology
Volume: 21
Issue number: 2
DOI: 10.1111/srt.12173
Publication status: Published - 1 May 2015


All Science Journal Classification (ASJC) codes

  • Dermatology

Cite this

@article{65cb840843ed4f9ab7d19c3384ad86cc,
title = "Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch",
abstract = "Background/aims: The diagnosis of skin conditions depends on the assessment of skin surface properties that are better conveyed by tactile properties such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases or disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch. Methods: Our development consists of two stages: converting a single image to a 3D haptic surface and rendering the generated haptic surface in real time. The conversion from 2D single images to 3D surfaces was implemented using human perception data collected in a psychophysical experiment that measured human visual and haptic sensitivity to 3D skin surface changes. For the second stage, we utilized real skin biomechanical properties reported in prior studies. Our tactile rendering system is a standalone system that can be used with any single camera and haptic feedback device. Results: We evaluated the performance of our system by conducting an identification experiment with three different skin images and five subjects. The participants had to identify one of the three skin surfaces using a haptic device (Falcon) only; no visual cue was provided. The results indicate that our system renders tactile feedback that makes different skin surfaces discernible. Conclusion: Our system uses only a single skin image and automatically generates a 3D haptic surface based on human haptic perception. Realistic skin interactions can be provided in real time for skin diagnosis, simulation, or training. Our system can also be used in other applications such as virtual reality and cosmetics.",
author = "Kim, K. and Lee, {Sang Youn}",
year = "2015",
month = "5",
day = "1",
doi = "10.1111/srt.12173",
language = "English",
volume = "21",
pages = "164--174",
journal = "Skin Research and Technology",
issn = "0909-752X",
publisher = "Wiley-Blackwell",
number = "2",
}

Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch. / Kim, K.; Lee, Sang Youn.

In: Skin Research and Technology, Vol. 21, No. 2, 01.05.2015, p. 164-174.


TY - JOUR

T1 - Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch

AU - Kim, K.

AU - Lee, Sang Youn

PY - 2015/5/1

Y1 - 2015/5/1

AB - Background/aims: The diagnosis of skin conditions depends on the assessment of skin surface properties that are better conveyed by tactile properties such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases or disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch. Methods: Our development consists of two stages: converting a single image to a 3D haptic surface and rendering the generated haptic surface in real time. The conversion from 2D single images to 3D surfaces was implemented using human perception data collected in a psychophysical experiment that measured human visual and haptic sensitivity to 3D skin surface changes. For the second stage, we utilized real skin biomechanical properties reported in prior studies. Our tactile rendering system is a standalone system that can be used with any single camera and haptic feedback device. Results: We evaluated the performance of our system by conducting an identification experiment with three different skin images and five subjects. The participants had to identify one of the three skin surfaces using a haptic device (Falcon) only; no visual cue was provided. The results indicate that our system renders tactile feedback that makes different skin surfaces discernible. Conclusion: Our system uses only a single skin image and automatically generates a 3D haptic surface based on human haptic perception. Realistic skin interactions can be provided in real time for skin diagnosis, simulation, or training. Our system can also be used in other applications such as virtual reality and cosmetics.

UR - http://www.scopus.com/inward/record.url?scp=84926406719&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84926406719&partnerID=8YFLogxK

U2 - 10.1111/srt.12173

DO - 10.1111/srt.12173

M3 - Article

C2 - 25087469

AN - SCOPUS:84926406719

VL - 21

SP - 164

EP - 174

JO - Skin Research and Technology

JF - Skin Research and Technology

SN - 0909-752X

IS - 2

ER -