Title
Estimating Uncertainty and Interpretability in Deep Learning for Coronavirus (COVID-19) Detection
Authors
Abstract
Deep Learning has achieved state-of-the-art performance in medical imaging. However, these disease-detection methods focus exclusively on improving classification or prediction accuracy without quantifying the uncertainty in a decision. Knowing how much confidence there is in a computer-based medical diagnosis is essential for gaining clinicians' trust in the technology and thereby improving treatment. Today, 2019 Coronavirus (SARS-CoV-2) infections are a major healthcare challenge around the world. Detecting COVID-19 in X-ray images is crucial for diagnosis, assessment, and treatment. However, reporting diagnostic uncertainty is a challenging yet inevitable task for radiologists. In this paper, we investigate how drop-weights-based Bayesian Convolutional Neural Networks (BCNNs) can estimate uncertainty in a Deep Learning solution to improve the diagnostic performance of the human-machine team, using a publicly available COVID-19 chest X-ray dataset, and show that the uncertainty in a prediction correlates highly with its accuracy. We believe that the availability of uncertainty-aware Deep Learning solutions will enable wider adoption of Artificial Intelligence (AI) in clinical settings.
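The drop-weights-based uncertainty estimation the abstract describes can be illustrated with a minimal sketch: keep weight dropout active at test time, run many stochastic forward passes, and report the predictive entropy of the averaged output as the uncertainty. This is a toy NumPy illustration of the general Monte Carlo dropout idea, not the paper's actual model; the two-layer network, its random weights, and the drop rate are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy weights standing in for a trained 2-class X-ray classifier
W1 = rng.normal(size=(10, 32))
W2 = rng.normal(size=(32, 2))

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def stochastic_forward(x, p_drop=0.5):
    # Drop-weights: randomly zero individual weights on EVERY forward pass,
    # including at test time (Monte Carlo dropout-style inference).
    mask1 = rng.random(W1.shape) > p_drop
    mask2 = rng.random(W2.shape) > p_drop
    h = np.maximum(x @ (W1 * mask1) / (1 - p_drop), 0.0)
    return softmax(h @ (W2 * mask2) / (1 - p_drop))

def predict_with_uncertainty(x, T=100):
    # Average T stochastic passes; the predictive entropy of the mean
    # distribution serves as the uncertainty estimate.
    probs = np.stack([stochastic_forward(x) for _ in range(T)])
    mean = probs.mean(axis=0)
    entropy = -(mean * np.log(mean + 1e-12)).sum(axis=-1)
    return mean, entropy

x = rng.normal(size=(1, 10))
mean, entropy = predict_with_uncertainty(x)
```

In a clinical workflow, predictions whose entropy exceeds some threshold would be flagged for radiologist review, which is how an uncertainty-aware model supports the human-machine team the abstract refers to.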