Paper Title

A new Kernel Regression approach for Robustified $L_2$ Boosting

Author

Chatla, Suneel Babu

Abstract

We investigate $L_2$ boosting in the context of kernel regression. Kernel smoothers, in general, lack appealing traits like symmetry and positive definiteness, which are critical not only for understanding theoretical aspects but also for achieving good practical performance. We consider a projection-based smoother (Huang and Chen, 2008) that is symmetric, positive definite, and shrinking. Theoretical results based on the orthonormal decomposition of the smoother reveal additional insights into the boosting algorithm. In our asymptotic framework, we may replace the full-rank smoother with a low-rank approximation. We demonstrate that the rank $d(n)$ of the low-rank smoother is bounded above by $O(h^{-1})$, where $h$ is the bandwidth. Our numerical findings show that, in terms of prediction accuracy, low-rank smoothers may outperform full-rank smoothers. Furthermore, we show that the boosting estimator with a low-rank smoother achieves the optimal convergence rate. Finally, to improve the performance of the boosting algorithm in the presence of outliers, we propose a novel robustified boosting algorithm, which can be used with any of the smoothers discussed in the study. We investigate the numerical performance of the proposed approaches using simulations and a real-world case.
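To make the abstract's ideas concrete, here is a minimal sketch of $L_2$ boosting with a symmetric kernel smoother, a low-rank truncation of that smoother, and a robustified update. The symmetrized Gaussian-kernel smoother, the rank choice $d \approx h^{-1}$, and the Huber-style clipped-residual step are illustrative assumptions standing in for the paper's projection-based smoother (Huang and Chen, 2008) and its robustified algorithm, not the authors' exact constructions.

```python
import numpy as np

def kernel_smoother_matrix(x, h):
    """Symmetric Gaussian-kernel smoother matrix (an illustrative
    stand-in for the projection-based smoother of Huang and Chen, 2008)."""
    z = (x[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * z**2)
    # Symmetric normalization D^{-1/2} K D^{-1/2} keeps the matrix
    # symmetric positive definite with eigenvalues in (0, 1].
    w = K.sum(axis=1)
    return K / np.sqrt(w[:, None] * w[None, :])

def low_rank(S, d):
    """Rank-d approximation via truncated eigendecomposition."""
    vals, vecs = np.linalg.eigh(S)               # ascending eigenvalues
    vals, vecs = vals[::-1][:d], vecs[:, ::-1][:, :d]
    return (vecs * vals) @ vecs.T                # V_d Lambda_d V_d^T

def l2_boost(S, y, m):
    """Plain L2 boosting: repeatedly smooth the current residuals."""
    f = np.zeros_like(y)
    for _ in range(m):
        f += S @ (y - f)
    return f

def robust_boost(S, y, m, c=1.345):
    """Robustified sketch: smooth Huber-clipped residuals instead
    (a hedged illustration, not the paper's exact algorithm)."""
    f = np.zeros_like(y)
    for _ in range(m):
        r = y - f
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12  # robust scale
        f += S @ np.clip(r, -c * s, c * s)
    return f

rng = np.random.default_rng(0)
n, h = 200, 0.1
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(n)
y[::25] += 3.0                                   # inject a few outliers

S = kernel_smoother_matrix(x, h)
Sd = low_rank(S, d=max(1, round(1 / h)))         # rank of order h^{-1}
for name, fhat in [("full-rank", l2_boost(S, y, 20)),
                   ("low-rank", l2_boost(Sd, y, 20)),
                   ("robust", robust_boost(Sd, y, 20))]:
    mse = np.mean((fhat - np.sin(2 * np.pi * x))**2)
    print(f"{name:9s} boosting MSE: {mse:.4f}")
```

Because the symmetrized smoother is positive definite with eigenvalues at most one, each boosting step shrinks the residuals, mirroring the shrinking property the abstract highlights; the truncated eigendecomposition illustrates why a rank of order $h^{-1}$ can suffice in place of the full-rank smoother.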
