Using super-resolution generative adversarial network models and transfer learning to obtain high resolution digital periapical radiographs.

2021 
Abstract Periapical radiographs are commonly used to detect several anomalies, such as caries, periodontal disease, and periapical disease. Although the digital imaging systems used nowadays tend to provide high-quality images, external factors or system limitations can still result in a vast number of radiographic images with low quality and resolution. Commercial solutions offer tools based on interpolation methods to increase image resolution. However, previous literature shows that these methods may introduce undesirable effects into the images, affecting diagnostic accuracy. One alternative is to use deep learning-based super-resolution methods to obtain better high-resolution images. Nevertheless, the amount of data available for training such models is limited, demanding transfer learning approaches. In this work, we propose the use of super-resolution generative adversarial network (SRGAN) models and transfer learning to obtain periapical images with higher quality and resolution. Moreover, we evaluate how the transfer learning approach and the datasets selected for it influence the final generated images. To this end, we performed an experiment comparing the performance of the SRGAN models (with and without transfer learning) against other super-resolution methods. Considering Mean Squared Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Index (SSIM), and Mean Opinion Score (MOS), the SRGAN models using transfer learning achieved better results on average. This superiority was also verified statistically with the Wilcoxon paired test. In the visual analysis, the higher quality achieved by the SRGAN models is generally visible, resulting in more defined edge details and fewer blur effects.
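The evaluation described above relies on standard full-reference quality metrics (MSE, PSNR, SSIM) computed per image and a paired Wilcoxon test over the resulting scores. The following minimal sketch, not taken from the paper, illustrates how such an evaluation could be assembled with scikit-image and SciPy; the variables `hr`, `sr_gan`, and `sr_other` are hypothetical lists of ground-truth and upscaled grayscale images assumed to share the same value range.

```python
# Hedged sketch of the reported evaluation pipeline (assumed, not the authors' code).
import numpy as np
from scipy.stats import wilcoxon
from skimage.metrics import (mean_squared_error,
                             peak_signal_noise_ratio,
                             structural_similarity)

def quality_metrics(reference, reconstructed):
    """Return (MSE, PSNR, SSIM) between a reference HR image and a reconstruction."""
    data_range = reference.max() - reference.min()
    mse = mean_squared_error(reference, reconstructed)
    psnr = peak_signal_noise_ratio(reference, reconstructed, data_range=data_range)
    ssim = structural_similarity(reference, reconstructed, data_range=data_range)
    return mse, psnr, ssim

# hr, sr_gan, sr_other: hypothetical lists of 2-D numpy arrays (same shapes).
psnr_gan = [quality_metrics(h, s)[1] for h, s in zip(hr, sr_gan)]
psnr_other = [quality_metrics(h, s)[1] for h, s in zip(hr, sr_other)]

# Wilcoxon paired (signed-rank) test on the per-image differences between methods.
statistic, p_value = wilcoxon(psnr_gan, psnr_other)
print(f"Wilcoxon statistic={statistic:.3f}, p={p_value:.4f}")
```

The same per-image loop would be repeated for MSE and SSIM; MOS, being a subjective rating, is collected from human evaluators rather than computed.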