Paper Title
Learning Deep Interleaved Networks with Asymmetric Co-Attention for Image Restoration
Paper Authors
Paper Abstract
Recently, convolutional neural networks (CNNs) have demonstrated significant success for image restoration (IR) tasks (e.g., image super-resolution, image deblurring, rain streak removal, and dehazing). However, existing CNN-based models are commonly implemented as a single-path stream that enriches feature representations from the low-quality (LQ) input space for final predictions, which fails to fully incorporate preceding low-level contexts into later high-level features within the network, thereby producing inferior results. In this paper, we present a deep interleaved network (DIN) that learns how information at different states should be combined for high-quality (HQ) image reconstruction. The proposed DIN follows a multi-path and multi-branch pattern, allowing multiple interconnected branches to interleave and fuse at different states. In this way, shallow information can guide the prediction of deep representative features and enhance the feature expression ability. Furthermore, we propose asymmetric co-attention (AsyCA), which is attached at each interleaved node to model feature dependencies. AsyCA not only adaptively emphasizes the informative features from different states but also improves the discriminative ability of the network. The presented DIN can be trained end-to-end and applied to various IR tasks. Comprehensive evaluations on public benchmarks and real-world datasets demonstrate that the proposed DIN performs favorably against state-of-the-art methods both quantitatively and qualitatively.
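
To make the interleaved fusion idea concrete, below is a minimal PyTorch sketch of an AsyCA-style fusion node. It assumes a channel-attention design in which statistics of the concatenated branch features produce per-branch, per-channel weights for an adaptive weighted sum; the class name, layer sizes, and reduction ratio are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class AsyCA(nn.Module):
    """Sketch of an asymmetric co-attention fusion node: adaptively weights
    features arriving from two different network states before fusing them."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.gap = nn.AdaptiveAvgPool2d(1)  # squeeze spatial dims to 1x1
        self.fc = nn.Sequential(            # bottleneck over fused statistics
            nn.Conv2d(2 * channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, 2 * channels, kernel_size=1),
        )
        self.softmax = nn.Softmax(dim=1)    # branches compete per channel

    def forward(self, shallow: torch.Tensor, deep: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = shallow.shape
        stats = self.gap(torch.cat([shallow, deep], dim=1))        # (b, 2c, 1, 1)
        weights = self.softmax(self.fc(stats).view(b, 2, c, 1, 1))  # per-branch weights
        # Weighted sum emphasizes the more informative state for each channel.
        return weights[:, 0] * shallow + weights[:, 1] * deep

# Usage example with hypothetical feature maps from two interleaved branches:
fuse = AsyCA(channels=64)
fused = fuse(torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32))
```

In this reading, "asymmetric" refers to the two inputs playing different roles (a shallow guiding state and a deep representative state), while the softmax over branch weights lets the node suppress whichever state is less informative at a given channel.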