Paper title
Early stopping and polynomial smoothing in regression with reproducing kernels
Paper authors
Paper abstract
In this paper, we study the problem of early stopping for iterative learning algorithms in a reproducing kernel Hilbert space (RKHS) in the nonparametric regression framework. In particular, we work with the gradient descent and (iterative) kernel ridge regression algorithms. We present a data-driven rule, based on the so-called minimum discrepancy principle, for performing early stopping without a validation set. The method relies on only one assumption about the regression function: that it belongs to an RKHS. The proposed rule is proved to be minimax-optimal over different types of kernel spaces, including finite-rank kernels and Sobolev smoothness classes. The proof rests on a fixed-point analysis of localized Rademacher complexities, a standard technique for obtaining optimal rates in the nonparametric regression literature. In addition, we present simulation results on artificial datasets showing that the designed rule performs comparably to other stopping rules, such as the one determined by V-fold cross-validation.
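To make the minimum discrepancy principle concrete, below is a minimal Python sketch of kernel gradient descent that stops at the first iteration where the empirical training residual falls below the noise level. This is an illustration under stated assumptions, not the paper's implementation: the function name `kernel_gradient_descent_mdp`, the step-size choice, and the assumption that the noise variance `sigma2` is known are all hypothetical.

```python
import numpy as np

def kernel_gradient_descent_mdp(K, y, sigma2, step=None, max_iter=1000):
    """Kernel gradient descent with minimum-discrepancy-principle stopping.

    Stops at the first iteration t where the empirical residual norm
    (1/n) * ||y - f_t||^2 drops below the (assumed known) noise level sigma2.
    K: (n, n) kernel Gram matrix at the design points; y: (n,) responses.
    """
    n = len(y)
    if step is None:
        # Step size bounded by the largest eigenvalue of K / n, so the
        # iteration is stable (a common choice; not prescribed by the paper).
        step = 1.0 / np.linalg.eigvalsh(K / n).max()
    alpha = np.zeros(n)  # dual coefficients: fitted values are K @ alpha
    for t in range(1, max_iter + 1):
        residual = y - K @ alpha
        # Gradient step on the empirical risk (1/2n) * ||y - K @ alpha||^2.
        alpha += step * residual / n
        # Minimum discrepancy principle: stop once the training residual
        # is no larger than the noise level.
        if np.mean((y - K @ alpha) ** 2) <= sigma2:
            return alpha, t
    return alpha, max_iter
```

The design choice here mirrors the idea described in the abstract: early iterations mostly fit signal, later ones mostly fit noise, so the first time the training error dips below the noise variance is a natural data-driven stopping time. In practice sigma2 would itself have to be estimated, which this sketch leaves aside.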