Intraoperative margin assessment of human breast tissue in optical coherence tomography images using deep neural networks

Amal Rannen Triki, Matthew B. Blaschko, Yoon Mo Jung, Seungri Song, Hyun Ju Han, Seung Il Kim, Chulmin Joo

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

Assessing the surgical margin during breast lumpectomy operations can avoid the need for additional surgery. Optical coherence tomography (OCT) is an imaging technique that has proven effective for this purpose. However, to avoid overloading the surgeon during the operation, automatic cancer detection at the surface of the removed tissue is needed. This work explores automated margin assessment on a sample of patient data collected at the Pathology Department, Severance Hospital (Seoul, South Korea). Methods based on the spatial statistics of the images have been developed, but their results remain far from human performance. In this work, we investigate the use of deep neural networks (DNNs) for real-time margin assessment, demonstrating performance significantly better than the reported literature and close to the level of a human expert. Since the goal is to detect the presence of cancer, a patch-based classification method is proposed: it is sufficient for detection, and requires training data that is easier and cheaper to collect than for other approaches such as segmentation. For that purpose, we train a DNN architecture that has proved effective for small images on patches extracted from images containing only cancer or only normal tissue, as determined by pathologists in a university hospital. As the number of available images in all such studies is by necessity small relative to other deep network applications such as ImageNet, a good regularization method is needed. We propose to use a recently introduced function norm regularization that attempts to directly control the function complexity, in contrast to classical approaches such as weight decay and dropout. As neither the code nor the data of previous studies are publicly available, our results are compared with results reported in the literature, for a conservative comparison.
Moreover, our method is applied to locally collected data in several data configurations, and the reported results are averaged over the different trials. The experiments show that DNNs yield significantly better results than other techniques when evaluated in terms of sensitivity, specificity, F1 score, G-mean and Matthews correlation coefficient. Function norm regularization yielded higher and more robust scores than competing regularization methods. We have demonstrated a system that shows high promise for (partially) automated margin assessment of human breast tissue: the equal error rate (EER) is reduced from approximately 12% (the lowest reported in the literature) to 5%, a 58% reduction. The method is computationally feasible for intraoperative application (less than 2 s per image), at the sole cost of a longer offline training time.
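The patch-based detection strategy the abstract describes can be sketched as follows: slide a window over the OCT image, score each patch with a classifier, and flag the image if enough patches look cancerous. This is a minimal illustrative sketch, not the paper's implementation; the patch size, stride, voting rule, and the `patch_classifier` callable (a small DNN in the paper) are all assumptions chosen for clarity.

```python
import numpy as np

def extract_patches(image, patch_size, stride):
    """Slide a square window over a 2-D image and stack the patches."""
    h, w = image.shape
    patches = []
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            patches.append(image[y:y + patch_size, x:x + patch_size])
    return np.stack(patches)

def assess_margin(image, patch_classifier, patch_size=32, stride=16,
                  vote_threshold=0.5):
    """Flag an image as positive if enough patches are classified cancerous.

    `patch_classifier` maps a batch of patches to per-patch cancer
    probabilities; any callable with that shape contract will do here.
    """
    patches = extract_patches(image, patch_size, stride)
    probs = np.asarray(patch_classifier(patches))
    # Simple majority-style vote over patch-level decisions.
    return float(np.mean(probs > 0.5)) >= vote_threshold
```

A toy usage: with a stand-in classifier that scores patches by mean intensity, a uniformly bright image is flagged and a dark one is not.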
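The function norm regularization mentioned above penalizes the network's outputs rather than its weights, in contrast to weight decay. A generic Monte-Carlo sketch of such a penalty, the expected squared output norm over sampled inputs, is shown below; the exact estimator and weighting used in the paper may differ, and `f` stands for any model callable.

```python
import numpy as np

def function_norm_penalty(f, sample_inputs):
    """Monte-Carlo estimate of the squared L2 function norm, E_x[||f(x)||^2].

    Unlike weight decay, this penalty depends on what the function *does*
    on data, which is the idea behind function-norm regularization. The
    total training objective would then be data_loss + lam * penalty.
    """
    outputs = np.asarray(f(sample_inputs))
    return float(np.mean(np.sum(outputs ** 2, axis=-1)))
```

For instance, for the identity map on inputs [1, 0] and [0, 2], the squared norms are 1 and 4, so the estimate is 2.5.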
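The evaluation metrics listed in the abstract (sensitivity, specificity, F1, G-mean, Matthews correlation coefficient) all derive from the binary confusion matrix. A straightforward reference computation, using the standard textbook formulas rather than anything specific to the paper:

```python
import math

def binary_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics for a binary (cancer/normal) task."""
    sensitivity = tp / (tp + fn)   # true positive rate (recall on cancer)
    specificity = tn / (tn + fp)   # true negative rate (recall on normal)
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    g_mean = math.sqrt(sensitivity * specificity)  # balances the two rates
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return {"sensitivity": sensitivity, "specificity": specificity,
            "f1": f1, "g_mean": g_mean, "mcc": mcc}
```

The equal error rate (EER) reported in the abstract is different in kind: it is the error rate at the decision threshold where the false positive and false negative rates coincide, so it is read off a ROC curve rather than a single confusion matrix.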

Original language: English
Pages (from-to): 21-32
Number of pages: 12
Journal: Computerized Medical Imaging and Graphics
Volume: 69
DOI: 10.1016/j.compmedimag.2018.06.002
Publication status: Published - 2018 Nov 1

Fingerprint

Optical tomography
Optical Coherence Tomography
Breast
Tissue
Pathology
Network architecture
Surgery
Hospital Pathology Department
Automation
Statistics
Imaging techniques
Neoplasms
Republic of Korea
Segmental Mastectomy
Deep neural networks
Costs
Weights and Measures
Costs and Cost Analysis
Sensitivity and Specificity

All Science Journal Classification (ASJC) codes

  • Radiological and Ultrasound Technology
  • Radiology, Nuclear Medicine and Imaging
  • Computer Vision and Pattern Recognition
  • Health Informatics
  • Computer Graphics and Computer-Aided Design

Cite this

Rannen Triki, Amal ; Blaschko, Matthew B. ; Jung, Yoon Mo ; Song, Seungri ; Han, Hyun Ju ; Kim, Seung Il ; Joo, Chulmin. / Intraoperative margin assessment of human breast tissue in optical coherence tomography images using deep neural networks. In: Computerized Medical Imaging and Graphics. 2018 ; Vol. 69. pp. 21-32.
@article{39e43acc439d426ba8280b514e779e63,
title = "Intraoperative margin assessment of human breast tissue in optical coherence tomography images using deep neural networks",
abstract = "Assessing the surgical margin during breast lumpectomy operations can avoid the need for additional surgery. Optical coherence tomography (OCT) is an imaging technique that has proven effective for this purpose. However, to avoid overloading the surgeon during the operation, automatic cancer detection at the surface of the removed tissue is needed. This work explores automated margin assessment on a sample of patient data collected at the Pathology Department, Severance Hospital (Seoul, South Korea). Methods based on the spatial statistics of the images have been developed, but their results remain far from human performance. In this work, we investigate the use of deep neural networks (DNNs) for real-time margin assessment, demonstrating performance significantly better than the reported literature and close to the level of a human expert. Since the goal is to detect the presence of cancer, a patch-based classification method is proposed: it is sufficient for detection, and requires training data that is easier and cheaper to collect than for other approaches such as segmentation. For that purpose, we train a DNN architecture that has proved effective for small images on patches extracted from images containing only cancer or only normal tissue, as determined by pathologists in a university hospital. As the number of available images in all such studies is by necessity small relative to other deep network applications such as ImageNet, a good regularization method is needed. We propose to use a recently introduced function norm regularization that attempts to directly control the function complexity, in contrast to classical approaches such as weight decay and dropout. As neither the code nor the data of previous studies are publicly available, our results are compared with results reported in the literature, for a conservative comparison.
Moreover, our method is applied to locally collected data in several data configurations, and the reported results are averaged over the different trials. The experiments show that DNNs yield significantly better results than other techniques when evaluated in terms of sensitivity, specificity, F1 score, G-mean and Matthews correlation coefficient. Function norm regularization yielded higher and more robust scores than competing regularization methods. We have demonstrated a system that shows high promise for (partially) automated margin assessment of human breast tissue: the equal error rate (EER) is reduced from approximately 12{\%} (the lowest reported in the literature) to 5{\%}, a 58{\%} reduction. The method is computationally feasible for intraoperative application (less than 2 s per image), at the sole cost of a longer offline training time.",
author = "{Rannen Triki}, Amal and Blaschko, {Matthew B.} and Jung, {Yoon Mo} and Seungri Song and Han, {Hyun Ju} and Kim, {Seung Il} and Chulmin Joo",
year = "2018",
month = "11",
day = "1",
doi = "10.1016/j.compmedimag.2018.06.002",
language = "English",
volume = "69",
pages = "21--32",
journal = "Computerized Medical Imaging and Graphics",
issn = "0895-6111",
publisher = "Elsevier Limited",

}

Intraoperative margin assessment of human breast tissue in optical coherence tomography images using deep neural networks. / Rannen Triki, Amal; Blaschko, Matthew B.; Jung, Yoon Mo; Song, Seungri; Han, Hyun Ju; Kim, Seung Il; Joo, Chulmin.

In: Computerized Medical Imaging and Graphics, Vol. 69, 01.11.2018, p. 21-32.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Intraoperative margin assessment of human breast tissue in optical coherence tomography images using deep neural networks

AU - Rannen Triki, Amal

AU - Blaschko, Matthew B.

AU - Jung, Yoon Mo

AU - Song, Seungri

AU - Han, Hyun Ju

AU - Kim, Seung Il

AU - Joo, Chulmin

PY - 2018/11/1

Y1 - 2018/11/1

N2 - Assessing the surgical margin during breast lumpectomy operations can avoid the need for additional surgery. Optical coherence tomography (OCT) is an imaging technique that has proven effective for this purpose. However, to avoid overloading the surgeon during the operation, automatic cancer detection at the surface of the removed tissue is needed. This work explores automated margin assessment on a sample of patient data collected at the Pathology Department, Severance Hospital (Seoul, South Korea). Methods based on the spatial statistics of the images have been developed, but their results remain far from human performance. In this work, we investigate the use of deep neural networks (DNNs) for real-time margin assessment, demonstrating performance significantly better than the reported literature and close to the level of a human expert. Since the goal is to detect the presence of cancer, a patch-based classification method is proposed: it is sufficient for detection, and requires training data that is easier and cheaper to collect than for other approaches such as segmentation. For that purpose, we train a DNN architecture that has proved effective for small images on patches extracted from images containing only cancer or only normal tissue, as determined by pathologists in a university hospital. As the number of available images in all such studies is by necessity small relative to other deep network applications such as ImageNet, a good regularization method is needed. We propose to use a recently introduced function norm regularization that attempts to directly control the function complexity, in contrast to classical approaches such as weight decay and dropout. As neither the code nor the data of previous studies are publicly available, our results are compared with results reported in the literature, for a conservative comparison.
Moreover, our method is applied to locally collected data in several data configurations, and the reported results are averaged over the different trials. The experiments show that DNNs yield significantly better results than other techniques when evaluated in terms of sensitivity, specificity, F1 score, G-mean and Matthews correlation coefficient. Function norm regularization yielded higher and more robust scores than competing regularization methods. We have demonstrated a system that shows high promise for (partially) automated margin assessment of human breast tissue: the equal error rate (EER) is reduced from approximately 12% (the lowest reported in the literature) to 5%, a 58% reduction. The method is computationally feasible for intraoperative application (less than 2 s per image), at the sole cost of a longer offline training time.

AB - Assessing the surgical margin during breast lumpectomy operations can avoid the need for additional surgery. Optical coherence tomography (OCT) is an imaging technique that has proven effective for this purpose. However, to avoid overloading the surgeon during the operation, automatic cancer detection at the surface of the removed tissue is needed. This work explores automated margin assessment on a sample of patient data collected at the Pathology Department, Severance Hospital (Seoul, South Korea). Methods based on the spatial statistics of the images have been developed, but their results remain far from human performance. In this work, we investigate the use of deep neural networks (DNNs) for real-time margin assessment, demonstrating performance significantly better than the reported literature and close to the level of a human expert. Since the goal is to detect the presence of cancer, a patch-based classification method is proposed: it is sufficient for detection, and requires training data that is easier and cheaper to collect than for other approaches such as segmentation. For that purpose, we train a DNN architecture that has proved effective for small images on patches extracted from images containing only cancer or only normal tissue, as determined by pathologists in a university hospital. As the number of available images in all such studies is by necessity small relative to other deep network applications such as ImageNet, a good regularization method is needed. We propose to use a recently introduced function norm regularization that attempts to directly control the function complexity, in contrast to classical approaches such as weight decay and dropout. As neither the code nor the data of previous studies are publicly available, our results are compared with results reported in the literature, for a conservative comparison.
Moreover, our method is applied to locally collected data in several data configurations, and the reported results are averaged over the different trials. The experiments show that DNNs yield significantly better results than other techniques when evaluated in terms of sensitivity, specificity, F1 score, G-mean and Matthews correlation coefficient. Function norm regularization yielded higher and more robust scores than competing regularization methods. We have demonstrated a system that shows high promise for (partially) automated margin assessment of human breast tissue: the equal error rate (EER) is reduced from approximately 12% (the lowest reported in the literature) to 5%, a 58% reduction. The method is computationally feasible for intraoperative application (less than 2 s per image), at the sole cost of a longer offline training time.

UR - http://www.scopus.com/inward/record.url?scp=85052468834&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85052468834&partnerID=8YFLogxK

U2 - 10.1016/j.compmedimag.2018.06.002

DO - 10.1016/j.compmedimag.2018.06.002

M3 - Article

C2 - 30172090

AN - SCOPUS:85052468834

VL - 69

SP - 21

EP - 32

JO - Computerized Medical Imaging and Graphics

JF - Computerized Medical Imaging and Graphics

SN - 0895-6111

ER -