Title
Approximation in shift-invariant spaces with deep ReLU neural networks
Authors
Abstract
We study the expressive power of deep ReLU neural networks for approximating functions in dilated shift-invariant spaces, which are widely used in signal processing, image processing, communications, and related fields. Approximation error bounds are estimated with respect to the width and depth of the neural networks. The network construction is based on the bit-extraction and data-fitting capacity of deep neural networks. As applications of our main results, we obtain approximation rates for classical function spaces such as Sobolev spaces and Besov spaces. We also give lower bounds on the $L^p$ ($1\le p \le \infty$) approximation error for Sobolev spaces, which show that our construction of neural networks is asymptotically optimal up to a logarithmic factor.
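For context, the following is a minimal sketch of the setting in standard notation; the generator $\phi$, the coefficient space, and the dilation convention are assumptions here and may differ from the paper's precise definitions.

```latex
% Standard (assumed) definitions, not taken verbatim from the paper.
% Shift-invariant space generated by a function \phi:
\[
V(\phi) \;=\; \Bigl\{\, \sum_{k \in \mathbb{Z}^d} c_k \, \phi(\cdot - k)
  \;:\; (c_k)_{k \in \mathbb{Z}^d} \in \ell^p(\mathbb{Z}^d) \,\Bigr\}.
\]
% Its dilated version at scale t > 0, whose elements the networks approximate:
\[
V_t(\phi) \;=\; \bigl\{\, f(t \, \cdot) \;:\; f \in V(\phi) \,\bigr\}.
\]
```

Under this convention, classical smoothness spaces such as Sobolev and Besov spaces can be well approximated by elements of $V_t(\phi)$ for suitable generators $\phi$ as $t \to \infty$, which is how results for those spaces follow from bounds on shift-invariant spaces.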