Paper Title
Multiple Descent: Design Your Own Generalization Curve
Paper Authors
Paper Abstract
This paper explores the generalization loss of linear regression in variably parameterized families of models, both under-parameterized and over-parameterized. We show that the generalization curve can have an arbitrary number of peaks, and moreover, the locations of those peaks can be explicitly controlled. Our results highlight the fact that neither the classical U-shaped generalization curve nor the recently observed double descent curve is an intrinsic property of the model family. Instead, their emergence is due to the interaction between the properties of the data and the inductive biases of learning algorithms.
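As a point of reference for the phenomenon discussed above, the following is a minimal sketch (not the authors' construction) that reproduces the basic double descent behavior for minimum-norm least-squares linear regression: the test error peaks near the interpolation threshold, where the number of features used equals the number of training samples. All dimensions, noise levels, and variable names below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_train, n_test, d_max = 40, 1000, 120   # sample sizes and max feature count (assumed)
noise = 0.5                              # label noise level (assumed)

# Latent linear ground truth in the full d_max-dimensional feature space.
w_star = rng.normal(size=d_max) / np.sqrt(d_max)
X_train = rng.normal(size=(n_train, d_max))
X_test = rng.normal(size=(n_test, d_max))
y_train = X_train @ w_star + noise * rng.normal(size=n_train)
y_test = X_test @ w_star + noise * rng.normal(size=n_test)

for p in range(1, d_max + 1, 2):
    # Fit using only the first p features; np.linalg.lstsq returns the
    # minimum-norm solution in the over-parameterized regime (p > n_train).
    w_hat, *_ = np.linalg.lstsq(X_train[:, :p], y_train, rcond=None)
    test_mse = np.mean((X_test[:, :p] @ w_hat - y_test) ** 2)
    print(f"p={p:3d}  test MSE={test_mse:8.3f}")
```

Plotting the printed test MSE against p typically shows the classical U-shape in the under-parameterized regime, a spike around p = n_train, and a second descent as p grows further; the paper's contribution is to show how such peaks can be multiplied and placed deliberately by controlling the data and feature family.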