Title
Sparse approximation of triangular transports. Part I: the finite dimensional case
Authors
Abstract
For two probability measures $ρ$ and $π$ with analytic densities on the $d$-dimensional cube $[-1,1]^d$, we investigate the approximation of the unique triangular monotone Knothe-Rosenblatt transport $T:[-1,1]^d\to [-1,1]^d$, such that the pushforward $T_\sharpρ$ equals $π$. It is shown that for $d\in\mathbb{N}$ there exist approximations $\tilde T$ of $T$, based on either sparse polynomial expansions or deep ReLU neural networks, such that the distance between $\tilde T_\sharpρ$ and $π$ decreases exponentially. More precisely, we prove error bounds of the type $\exp(-βN^{1/d})$ (or $\exp(-βN^{1/(d+1)})$ for neural networks), where $N$ refers to the dimension of the ansatz space (or the size of the network) containing $\tilde T$; the notion of distance comprises the Hellinger distance, the total variation distance, the Wasserstein distance and the Kullback-Leibler divergence. Our construction guarantees $\tilde T$ to be a monotone triangular bijective transport on the hypercube $[-1,1]^d$. Analogous results hold for the inverse transport $S=T^{-1}$. The proofs are constructive, and we give an explicit a priori description of the ansatz space, which can be used for numerical implementations.
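For context, the Knothe-Rosenblatt map referenced in the abstract admits a standard componentwise construction from conditional CDFs. The following sketch is background rather than part of the abstract; the notation $f_{\pi,k}$, $F_{\pi,k}$ (marginal densities and conditional CDFs, and analogously for $\rho$) is introduced here only for illustration:

```latex
% Marginal density of the first k coordinates of \pi:
f_{\pi,k}(y_1,\dots,y_k) \;=\; \int_{[-1,1]^{d-k}} f_\pi(y_1,\dots,y_k,z)\,\mathrm{d}z .

% Conditional CDF of the k-th coordinate given the first k-1:
F_{\pi,k}(t \mid y_1,\dots,y_{k-1})
  \;=\; \frac{\int_{-1}^{t} f_{\pi,k}(y_1,\dots,y_{k-1},s)\,\mathrm{d}s}
             {f_{\pi,k-1}(y_1,\dots,y_{k-1})} .

% Triangular components of T, defined recursively:
T_1(x_1) \;=\; F_{\pi,1}^{-1}\bigl(F_{\rho,1}(x_1)\bigr),
\qquad
T_k(x_1,\dots,x_k) \;=\;
  F_{\pi,k}^{-1}\Bigl( F_{\rho,k}(x_k \mid x_1,\dots,x_{k-1})
  \,\Big|\, T_1(x_1),\dots,T_{k-1}(x_1,\dots,x_{k-1}) \Bigr).
```

Each $T_k$ depends only on $x_1,\dots,x_k$ and is monotone in $x_k$, which is the triangular monotone structure the abstract refers to; positivity of the analytic densities on $[-1,1]^d$ makes each conditional CDF strictly increasing, so the inverses above are well defined and $T$ is unique.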