Paper title
QISTA-Net: DNN Architecture to Solve $\ell_q$-norm Minimization Problem and Image Compressed Sensing
Paper authors
Paper abstract
In this paper, we reformulate the non-convex $\ell_q$-norm minimization problem with $q\in(0,1)$ as a 2-step problem consisting of one convex and one non-convex subproblem, and propose a novel iterative algorithm, called QISTA ($\ell_q$-ISTA), to solve the $\left(\ell_q\right)$-problem. By leveraging deep learning to accelerate optimization algorithms, together with a speedup strategy that uses the momentum from all previous layers in the network, we propose a learning-based method, called QISTA-Net-s, to solve the sparse signal reconstruction problem. Extensive experimental comparisons demonstrate that QISTA-Net-s yields better reconstruction quality than state-of-the-art $\ell_1$-norm optimization (plus learning) algorithms, even when the original sparse signal is noisy. On the other hand, based on the network architecture associated with QISTA and considering the use of convolution layers, we propose QISTA-Net-n for solving the image compressed sensing (CS) problem; its reconstruction performance still outperforms most state-of-the-art natural image reconstruction methods. QISTA-Net-n is designed by unfolding QISTA and adding a convolutional operator as the dictionary. This makes QISTA-Net-s interpretable. We provide complete experimental results showing that QISTA-Net-s and QISTA-Net-n achieve better reconstruction performance than the competing methods.
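To make the 2-step idea concrete, the following is a minimal illustrative sketch of an ISTA-style iteration for $\ell_q$-minimization: a gradient step on the convex data-fidelity term followed by a shrinkage step whose per-coordinate threshold is reweighted by $\lambda q/(|x_i|+\varepsilon)^{1-q}$. The specific reweighting formula, function name `qista_like`, and all parameter values here are illustrative assumptions for exposition, not the paper's exact update rule.

```python
import numpy as np

def qista_like(A, y, q=0.5, lam=0.1, eps=0.01, iters=500):
    """Sketch of an ISTA-style iteration for lq-norm minimization.

    Assumptions (not the paper's exact method): the non-convex lq
    penalty is handled by soft-thresholding with per-coordinate
    weights lam * q / (|x_i| + eps)^(1 - q); the step size is 1/L
    with L the squared spectral norm of A.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        # Step 1 (convex subproblem): gradient step on 0.5*||Ax - y||^2.
        g = x - (A.T @ (A @ x - y)) / L
        # Step 2 (non-convex part): reweighted soft-thresholding,
        # with weights computed from the current iterate.
        w = lam * q / (np.abs(x) + eps) ** (1.0 - q)
        x = np.sign(g) * np.maximum(np.abs(g) - w / L, 0.0)
    return x
```

Small entries of the iterate receive a large weight (hence a strong threshold), while large entries are barely penalized; this mimics how the $\ell_q$ penalty with $q<1$ shrinks the bias on significant coefficients relative to the $\ell_1$ penalty used by plain ISTA.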