Paper Title
Optimal Bounds between $f$-Divergences and Integral Probability Metrics
Paper Authors
Paper Abstract
The families of $f$-divergences (e.g. the Kullback-Leibler divergence) and Integral Probability Metrics (e.g. total variation distance or maximum mean discrepancies) are widely used to quantify the similarity between probability distributions. In this work, we systematically study the relationship between these two families from the perspective of convex duality. Starting from a tight variational representation of the $f$-divergence, we derive a generalization of the moment-generating function, which we show exactly characterizes the best lower bound of the $f$-divergence as a function of a given IPM. Using this characterization, we obtain new bounds while also recovering in a unified manner well-known results, such as Hoeffding's lemma, Pinsker's inequality and its extension to subgaussian functions, and the Hammersley-Chapman-Robbins bound. This characterization also allows us to prove new results on topological properties of the divergence which may be of independent interest.
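For context, the two families can be recalled as follows (standard definitions from the literature, added here for reference; the notation $D_f$ and $\gamma_{\mathcal{F}}$ is illustrative and may differ from the paper's own):

\[ D_f(P \,\|\, Q) = \int f\!\left(\frac{dP}{dQ}\right) dQ, \qquad f \text{ convex with } f(1) = 0, \]
\[ \gamma_{\mathcal{F}}(P, Q) = \sup_{g \in \mathcal{F}} \big( \mathbb{E}_P[g] - \mathbb{E}_Q[g] \big). \]

Taking $f(t) = t \log t$ recovers the Kullback-Leibler divergence, while $\mathcal{F} = \{ g : \|g\|_\infty \le 1 \}$ yields twice the total variation distance. Pinsker's inequality, one of the results the abstract mentions recovering, is a prototypical bound of the kind studied: $\mathrm{TV}(P, Q) \le \sqrt{\tfrac{1}{2} \mathrm{KL}(P \,\|\, Q)}$.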