Automatic characterization and segmentation of human skin using three-dimensional optical coherence tomography

Abstract

A set of fully automated algorithms specialized for analyzing three-dimensional optical coherence tomography (OCT) volumes of human skin is reported. The algorithm set first detects the skin surface in the OCT volume, and a depth-oriented algorithm then provides the mean epidermal thickness, a thickness distribution map of the epidermis, and a segmented volume of the epidermis. Subsequently, an en face shadowgram is produced to visualize the infundibula in the skin with high contrast. The population and occupation ratio of the infundibula are obtained by a histogram-based thresholding algorithm and a distance-mapping algorithm. En face OCT slices at constant depths below the sample surface are extracted, and the histogram-based thresholding algorithm is applied again to these slices, yielding a three-dimensional segmented volume of the infundibula. The dermal attenuation coefficient is also calculated from the OCT volume in order to evaluate the skin texture. The algorithm set was applied to swept-source OCT volumes of the skin of several volunteers, and the results demonstrate the high stability, portability, and reproducibility of the algorithms.

This paper was published in Optics Express and is made available as an electronic reprint with the permission of OSA. The paper can be found at the following URL on the OSA website: http://www.opticsinfobase.org/abstract.cfm?URI=oe-14-5-1862.
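The infundibulum segmentation described above hinges on a histogram-based thresholding step applied to en face slices. A minimal sketch of that idea, using Otsu's between-class-variance criterion on a synthetic slice (the function, bin count, and synthetic data are illustrative assumptions, not the authors' implementation, and the distance-mapping step is omitted):

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    """Intensity threshold maximizing between-class variance (Otsu's method)."""
    counts, bin_edges = np.histogram(image.ravel(), bins=nbins)
    centers = (bin_edges[:-1] + bin_edges[1:]) / 2
    weights = counts / counts.sum()

    w0 = np.cumsum(weights)                  # probability of the dark class
    w1 = 1.0 - w0                            # probability of the bright class
    cum_mean = np.cumsum(weights * centers)
    mu0 = cum_mean / np.where(w0 > 0, w0, 1)                   # dark-class mean
    mu1 = (cum_mean[-1] - cum_mean) / np.where(w1 > 0, w1, 1)  # bright-class mean

    sigma_b = w0 * w1 * (mu0 - mu1) ** 2     # between-class variance per split
    return centers[np.argmax(sigma_b)]

# Synthetic en face slice: dark infundibula on a brighter dermal background.
rng = np.random.default_rng(0)
slice_ = rng.normal(0.7, 0.05, size=(128, 128))         # background tissue
slice_[30:60, 30:60] = rng.normal(0.2, 0.05, (30, 30))  # dark follicular region
t = otsu_threshold(slice_)
mask = slice_ < t                   # binary segmentation of dark structures
occupation_ratio = mask.mean()      # area fraction occupied by infundibula
```

Because Otsu's criterion adapts the threshold to each slice's histogram, the same routine can be reused at every depth when building the three-dimensional segmented volume.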

Journal

  • Optics Express

    Optics Express 14(5), 1862-1877, 2006-03

    Optical Society of America

Cited by: 1

Codes

  • NII Article ID (NAID)
    120002836934
  • Text Lang
    ENG
  • Article Type
    Journal Article
  • ISSN
    1094-4087
  • Data Source
    CJPref, IR