Central Library

Browsing by Author "Jennyfer Susan M B"

Now showing 1 - 1 of 1
  • Item
    Specular Reflection Removal in Smart Colposcopy Images Using Deep Learning Models for Enhanced Grading of Cervical Cancer
    (Avinashilingam, 2024-04) Jennyfer Susan M B; Dr. P. Subashini
    Cervical cancer is a significant global health concern, ranking as the fourth most common cancer among women and primarily caused by Human Papillomavirus (HPV) infection of the cervix, the lower part of the uterus. Despite preventive measures such as HPV vaccination and screening programs, many women hesitate to undergo screening because of its invasiveness. Smart colposcopy, an advanced non-invasive approach, captures images of the cervix for examination. However, white specular reflections caused by body moisture hinder accurate analysis and can lead to misclassification of dysplasia regions. This research aims to improve cervical cancer grading by identifying and removing specular reflection from smart colposcopy images. The first phase focuses on specular reflection identification, comparing the RGB and XYZ color spaces for detection. The proposed intensity-based threshold method accurately identifies specular reflection in the XYZ color space, overcoming challenges posed by vaginal discharge and acetowhite regions. In the second phase, pixel-wise segmentation models such as the Fully Convolutional Network (FCN), SegNet, and UNet are employed. In a comparative analysis of these models, UNet demonstrates the highest accuracy; however, its Intersection over Union (IoU) falls short because of overlapping segmentation regions. To address this limitation, different versions of the UNet architecture are compared, and UNet++ emerges as the most promising, exhibiting a higher IoU. The UNet++ model is then fine-tuned to optimize its performance in segmenting reflection regions. After the reflection regions are segmented and removed, the resulting empty regions must be filled from neighboring pixels to improve image quality. A novel Bilateral-based Convolutional Inpainting model fills these regions and outperforms traditional methods, particularly in medical image applications, demonstrating efficacy across different masking ratios. The enhanced images, with specular reflection removed, are then graded using DenseNet121, VGG19, and EfficientNet. Classification models trained on both enhanced and non-enhanced images achieve significantly higher prediction accuracy on the enhanced images, underscoring the enhancement technique's impact on cancer-stage classification. This research offers valuable insights into medical image analysis, presenting an integrated approach for cervical cancer grading. The proposed methodologies exhibit promising results, laying the groundwork for further advancements in women's health and cancer diagnosis.
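
    A rough sketch of the reflection-detection and filling steps described in the abstract is given below: it thresholds the luminance (Y) channel after converting the image to the XYZ color space and fills the masked highlights from neighboring pixels. The threshold value, the choice of the Y channel, the use of OpenCV's Telea inpainting as a stand-in for the thesis' Bilateral-based Convolutional Inpainting model, and the file names are illustrative assumptions rather than the thesis' actual parameters.

    # Illustrative sketch only: threshold-based specular highlight detection in
    # the XYZ color space, followed by a simple neighbor-based fill. Parameter
    # values and the inpainting method are assumptions, not the thesis' own.
    import cv2
    import numpy as np

    def detect_specular_mask(bgr_image: np.ndarray, threshold: int = 230) -> np.ndarray:
        """Return a binary mask of candidate specular-reflection pixels."""
        xyz = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2XYZ)
        luminance = xyz[:, :, 1]  # Y channel of the XYZ image
        # Pixels brighter than the threshold are treated as specular highlights.
        mask = (luminance > threshold).astype(np.uint8) * 255
        # Dilate slightly so the mask covers the full highlight region.
        return cv2.dilate(mask, np.ones((3, 3), np.uint8), iterations=1)

    def fill_specular_regions(bgr_image: np.ndarray, mask: np.ndarray) -> np.ndarray:
        """Fill masked regions from neighboring pixels (Telea inpainting stand-in)."""
        return cv2.inpaint(bgr_image, mask, 3, cv2.INPAINT_TELEA)

    if __name__ == "__main__":
        image = cv2.imread("colposcopy_sample.png")  # hypothetical input file
        mask = detect_specular_mask(image)
        cv2.imwrite("colposcopy_filled.png", fill_specular_regions(image, mask))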

Help Desk: library@avinuty.ac.in

DSpace software copyright © 2002-2025 LYRASIS

Installed and maintained by Greenbooks Imaging Services LLP