Evaluating Explainable AI Methods in Mammogram Analysis

Our team has contributed to a pivotal study, "Quantitative Evaluation of Saliency-Based Explainable Artificial Intelligence (XAI) Methods in Deep Learning-Based Mammogram Analysis," which assesses the effectiveness of XAI techniques in breast cancer detection.

 

Overview:  

Explainable AI (XAI) is becoming crucial in deciphering the decisions of deep learning models, particularly in medical imaging. This study focuses on the quantitative evaluation of popular saliency-based XAI methods like Gradient-weighted Class Activation Mapping (Grad-CAM), Grad-CAM++, and Eigen-CAM.
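To make the comparison concrete, here is a minimal, illustrative sketch of how a Grad-CAM saliency map can be computed with plain PyTorch forward and backward hooks. The ResNet-50 backbone, the choice of target layer, and the random input tensor are assumptions for illustration only; they are not the model or data used in the study.

import torch
import torch.nn.functional as F
from torchvision import models

def grad_cam(model, image, target_layer, class_idx=None):
    # Collect the target layer's feature maps and their gradients via hooks.
    activations, gradients = [], []
    fwd = target_layer.register_forward_hook(lambda m, i, o: activations.append(o))
    bwd = target_layer.register_full_backward_hook(lambda m, gi, go: gradients.append(go[0]))

    logits = model(image)                                    # image: [1, 3, H, W]
    if class_idx is None:
        class_idx = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, class_idx].backward()                          # gradient of the target class score
    fwd.remove(); bwd.remove()

    acts, grads = activations[0], gradients[0]               # both [1, C, h, w]
    weights = grads.mean(dim=(2, 3), keepdim=True)           # channel weights: pooled gradients
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))  # weighted sum of feature maps
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8) # normalise to [0, 1]
    return cam[0, 0].detach()

# Purely illustrative usage with a generic ImageNet backbone and a placeholder image.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
image = torch.rand(1, 3, 224, 224)
saliency = grad_cam(model, image, target_layer=model.layer4[-1])

Grad-CAM++ and Eigen-CAM follow the same pattern but differ in how the channel weights are derived from the gradients and activations.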

 

Methods and Results:  

The study used a balanced, three-center dataset of 1,496 mammograms, on which three radiologists outlined ground-truth regions indicating cancer presence. A modified, pre-trained deep learning model performed the detection, and the alignment of its saliency maps with the radiologist-drawn boundaries was assessed with the Pointing Game metric. The resulting Pointing Game scores were 0.41 for Grad-CAM, 0.30 for Grad-CAM++, and 0.35 for Eigen-CAM, indicating moderate success in accurately localizing cancerous lesions.
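For readers unfamiliar with the Pointing Game, the sketch below shows the core idea: a saliency map scores a hit when its single most salient pixel falls inside the annotated lesion mask, and the overall score is the hit rate across all cases. The function and variable names are illustrative, not the study's code.

import numpy as np

def pointing_game_score(saliency_maps, gt_masks):
    # saliency_maps: iterable of [H, W] float arrays, one per mammogram
    # gt_masks:      iterable of [H, W] boolean arrays, True inside the annotated lesion
    hits, total = 0, 0
    for sal, mask in zip(saliency_maps, gt_masks):
        y, x = np.unravel_index(np.argmax(sal), sal.shape)  # location of the most salient pixel
        hits += bool(mask[y, x])                            # hit if it lies inside the lesion
        total += 1
    return hits / total

# Toy usage: one hit and one miss give a score of 0.5.
mask = np.zeros((4, 4), dtype=bool); mask[:2, :2] = True
hit_map  = np.zeros((4, 4)); hit_map[1, 1]  = 1.0
miss_map = np.zeros((4, 4)); miss_map[3, 3] = 1.0
print(pointing_game_score([hit_map, miss_map], [mask, mask]))  # 0.5

Read this way, the reported scores of 0.41, 0.30, and 0.35 mean the most salient point landed inside the annotated lesion in roughly a third to two fifths of cases.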

 

Conclusion:  

Although saliency-based XAI methods offer some level of interpretability, they often do not fully clarify how decisions are made within deep learning models. The study underscores the need for further refinement in XAI methods to enhance their utility and reliability in clinical settings.

 

For those in the field of medical imaging and AI, this study presents significant insights into the current capabilities and limitations of XAI methods. 

 

Full text: https://www.ejradiology.com/article/S0720-048X(24)00072-X/fulltext
