Reflectance confocal microscopy (RCM) images skin noninvasively, with optical sectioning and nuclear-level resolution comparable with that of pathology. On the basis of the assessment of the dermal–epidermal junction (DEJ) and morphologic features in its vicinity, skin cancer can be diagnosed in vivo with high sensitivity and specificity. However, the current visual, qualitative approach for reading images leads to subjective variability in diagnosis. We hypothesize that machine learning–based algorithms may enable a more quantitative, objective approach. Testing and validation were performed with two algorithms that automatically delineate the DEJ in RCM stacks of normal human skin. The test set was composed of 15 fair-skin and 15 dark-skin stacks (30 subjects) with expert labelings. In dark skin, in which contrast is high owing to melanin, the algorithm produced an average error of $7.9 \pm 6.4\,\mu\mathrm{m}$. In fair skin, the algorithm delineated the DEJ as a transition zone, with an average error of $8.3 \pm 5.8\,\mu\mathrm{m}$ for the epidermis-to-transition zone boundary and $7.6 \pm 5.6\,\mu\mathrm{m}$ for the transition zone-to-dermis boundary. Our results suggest that automated algorithms may quantitatively guide delineation of the DEJ and thus assist in objective reading of RCM images. Further development of such algorithms may guide assessment of abnormal morphologic features at the DEJ.
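The reported errors are depth differences between the automated and expert DEJ boundaries, averaged over lateral positions and stacks. The following is a minimal sketch of how such a metric could be computed, assuming the DEJ boundary is represented as a depth map over lateral positions; the function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

def dej_delineation_error(pred_depth_um, expert_depth_um):
    """Mean and standard deviation of the absolute axial (depth) difference,
    in micrometers, between an algorithm's DEJ boundary surface and an
    expert labeling. Both inputs are 2D arrays giving, for each lateral
    (x, y) position in the RCM stack, the depth of the DEJ boundary.
    (Absolute difference is one plausible choice of error; the paper's
    exact definition may differ.)"""
    err = np.abs(np.asarray(pred_depth_um) - np.asarray(expert_depth_um))
    return float(err.mean()), float(err.std())

# Hypothetical usage with simulated depth maps (values in micrometers).
rng = np.random.default_rng(0)
expert = 50 + 5 * rng.standard_normal((512, 512))    # expert-labeled DEJ depths
pred = expert + rng.normal(8, 6, size=expert.shape)  # simulated algorithm output
mean_err, std_err = dej_delineation_error(pred, expert)
print(f"average error: {mean_err:.1f} ± {std_err:.1f} µm")
```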