Paper Title
ReconFormer: Accelerated MRI Reconstruction Using Recurrent Transformer
Paper Authors
Paper Abstract
Accelerating the magnetic resonance imaging (MRI) reconstruction process is a challenging ill-posed inverse problem due to the aggressive under-sampling operation in k-space. In this paper, we propose a recurrent transformer model, namely ReconFormer, for MRI reconstruction, which can iteratively reconstruct high-fidelity magnetic resonance images from highly under-sampled k-space data. In particular, the proposed architecture is built upon Recurrent Pyramid Transformer Layers (RPTL), which jointly exploit intrinsic multi-scale information at every architecture unit as well as dependencies among deep feature correlations through recurrent states. Moreover, the proposed ReconFormer is lightweight, since its recurrent structure makes it parameter-efficient. We validate the effectiveness of ReconFormer on multiple datasets with different magnetic resonance sequences and show that it achieves significant improvements over state-of-the-art methods with better parameter efficiency. Implementation code will be available at https://github.com/guopengf/ReconFormer.
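The abstract does not give implementation details, but the general loop it describes — refine an image estimate over several recurrent iterations while staying consistent with the sampled k-space measurements — can be sketched as below. This is an illustrative toy, not the authors' ReconFormer: the `refine` function is a hypothetical stand-in for the learned Recurrent Pyramid Transformer Layers (here just a smoothing prior with a carried recurrent state), and the phantom, mask, and iteration count are made up for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Smooth synthetic "anatomy" so the toy smoothing prior is a sensible refiner.
x = np.linspace(-1, 1, 64)
xx, yy = np.meshgrid(x, x)
truth = np.exp(-(xx**2 + yy**2) / 0.2)

# Random k-space under-sampling mask; keep some low frequencies, which sit at
# the array corners for an unshifted FFT.
mask = rng.random((64, 64)) < 0.3
mask[:4, :4] = mask[:4, -4:] = mask[-4:, :4] = mask[-4:, -4:] = True

measured = np.fft.fft2(truth) * mask  # under-sampled k-space measurements


def data_consistency(image, measured_k, mask):
    """Replace predicted k-space samples with the measured ones."""
    k = np.fft.fft2(image)
    return np.fft.ifft2(np.where(mask, measured_k, k))


def refine(image, state):
    """Hypothetical stand-in for the learned RPTL update: a local smoothing
    prior blended into a recurrent state carried across iterations."""
    smoothed = 0.25 * (np.roll(image, 1, 0) + np.roll(image, -1, 0)
                       + np.roll(image, 1, 1) + np.roll(image, -1, 1))
    state = 0.5 * (state + smoothed)  # toy recurrent state: running blend
    return state, state


# Zero-filled initial estimate (kept complex, as MRI images are), then a few
# recurrent refinement + data-consistency iterations.
image = np.fft.ifft2(measured)
state = image
err0 = np.linalg.norm(np.abs(image) - truth)
for _ in range(5):
    image, state = refine(image, state)
    image = data_consistency(image, measured, mask)
err = np.linalg.norm(np.abs(image) - truth)
print(f"zero-filled error {err0:.3f} -> refined error {err:.3f}")
```

Because the data-consistency step is applied last, the final estimate agrees exactly with the measured k-space at every sampled location; the refinement only fills in the unmeasured frequencies. A real implementation would learn the refinement network from data rather than hand-code it.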