Paper Title

Dual-domain Attention-based Deep Network for Sparse-view CT Artifact Reduction

Paper Authors

Gao, Xiang; Su, Ting; Zhu, Jiongtao; Yang, Jiecheng; Zhang, Yunxin; Mi, Donghua; Zheng, Hairong; Long, Xiaojing; Liang, Dong; Ge, Yongshuai

Paper Abstract

Due to the wide application of X-ray computed tomography (CT) in medical imaging, radiation exposure has become a major public health concern. Sparse-view CT is a promising approach to reducing the radiation dose by down-sampling the total number of acquired projections. However, CT images reconstructed from such sparse-view acquisitions suffer from severe streaking artifacts and loss of structural information. In this work, an end-to-end dual-domain attention-based deep network (DDANet) is proposed to solve this ill-posed CT image reconstruction problem. The image-domain CT image and the projection-domain sinogram are fed into two parallel sub-networks of the DDANet to independently extract distinct high-level feature maps. In addition, a dedicated attention module is introduced to fuse these dual-domain feature maps, allowing complementary optimization of streaking-artifact removal and structure preservation. Numerical simulations, anthropomorphic thorax phantom studies, and in vivo preclinical experiments were conducted to verify the sparse-view CT imaging performance of the DDANet. Results demonstrate that the proposed approach robustly removes streaking artifacts while preserving fine structures. As a result, the DDANet provides a promising solution for high-quality sparse-view CT imaging.
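Since the abstract only describes the architecture at a high level (two parallel domain-specific sub-networks whose feature maps are fused by an attention module), the PyTorch sketch below illustrates that general pattern, not the authors' implementation. The branch depths, channel counts, the channel-attention gating in `AttentionFusion`, the residual output head, and the assumption that the sinogram-domain features have already been mapped back onto the image grid (e.g., by a differentiable filtered-backprojection layer, not shown) are all hypothetical.

```python
import torch
import torch.nn as nn


class ConvBranch(nn.Module):
    """Small convolutional encoder for one domain (image or sinogram); depth is hypothetical."""

    def __init__(self, channels: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.body(x)


class AttentionFusion(nn.Module):
    """Fuse the two domain feature maps with a channel-attention gate (hypothetical design)."""

    def __init__(self, channels: int = 64):
        super().__init__()
        # Squeeze-and-excitation-style gate over the concatenated features.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, channels, 1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 2 * channels, 1), nn.Sigmoid(),
        )
        self.merge = nn.Conv2d(2 * channels, channels, 1)

    def forward(self, f_img, f_proj):
        f = torch.cat([f_img, f_proj], dim=1)      # stack the dual-domain features
        return self.merge(f * self.gate(f))        # reweight channels, then merge


class DualDomainNet(nn.Module):
    """Sketch of the dual-domain pattern: parallel branches + attention fusion.

    Assumes the sinogram-domain features have already been brought onto the
    image grid (e.g., by a differentiable FBP layer, not shown here).
    """

    def __init__(self, channels: int = 64):
        super().__init__()
        self.image_branch = ConvBranch(channels)
        self.proj_branch = ConvBranch(channels)
        self.fusion = AttentionFusion(channels)
        self.head = nn.Conv2d(channels, 1, 3, padding=1)

    def forward(self, ct_image, backprojected_sino):
        f_img = self.image_branch(ct_image)
        f_proj = self.proj_branch(backprojected_sino)
        # Residual prediction: estimate the artifact component and subtract it implicitly.
        return ct_image + self.head(self.fusion(f_img, f_proj))
```

Under these assumptions, a forward pass such as `DualDomainNet()(torch.randn(1, 1, 256, 256), torch.randn(1, 1, 256, 256))` returns a corrected image of the same shape as the input CT image; the attention gate decides, per channel, how much each domain's features contribute to the fused representation.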
