Paper Title

Empirical Strategy for Stretching Probability Distribution in Neural-network-based Regression

Authors

Eunho Koo and Hyungjun Kim

Abstract

In regression analysis under artificial neural networks, the prediction performance depends on determining appropriate weights between layers. Because randomly initialized weights are updated during back-propagation by gradient descent under a given loss function, the structure of the loss function can affect performance significantly. In this study, we treated the distribution error, i.e., the inconsistency between two distributions (those of the predicted values and of the labels), as the prediction error, and proposed weighted empirical stretching (WES) as a novel loss function that increases the overlap area of the two distributions. The function depends on the distribution of the given labels and is therefore applicable to any distribution shape. Moreover, it contains a scaling hyperparameter, and an appropriate parameter value maximizes the common section of the two distributions. To test the capability of the function, we generated ideal distribution curves (unimodal, skewed unimodal, bimodal, and skewed bimodal) as labels and used Fourier-extracted input data from the curves under a feedforward neural network. In general, WES outperformed widely used loss functions, and its performance was robust across various noise levels. The improved RMSE results in the extreme domain (i.e., both tail regions of the distribution) are expected to be useful for predicting abnormal events in non-linear complex systems such as natural disasters and financial crises.
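
The abstract does not give the closed form of WES, but the general idea of a label-distribution-dependent loss with a scaling hyperparameter can be sketched. Below is a minimal PyTorch sketch under that assumption: the function name `wes_like_loss`, the parameter `alpha`, and the inverse-density weighting scheme are illustrative choices, not the authors' actual formulation.

```python
import torch

def wes_like_loss(pred: torch.Tensor, label: torch.Tensor,
                  alpha: float = 1.0, bins: int = 32) -> torch.Tensor:
    """Distribution-weighted squared error (illustrative only, NOT the
    paper's exact WES formula). Errors are up-weighted where the empirical
    label density is low, encouraging the predicted distribution to
    "stretch" toward the tails of the label distribution. `alpha` is a
    hypothetical stand-in for the paper's scaling hyperparameter."""
    with torch.no_grad():
        # Empirical density of the labels via a histogram.
        hist = torch.histc(label, bins=bins)
        hist = hist / hist.sum()
        # Assign each label to its histogram bin to look up its density.
        lo, hi = label.min(), label.max()
        idx = ((label - lo) / (hi - lo + 1e-12) * (bins - 1)).long()
        density = hist[idx]
        # Inverse-density weights raised to alpha; eps avoids division by zero.
        weights = (1.0 / (density + 1e-6)) ** alpha
        weights = weights / weights.mean()  # keep the scale comparable to MSE
    return (weights * (pred - label) ** 2).mean()
```

As a usage sketch, `wes_like_loss(model(x), y, alpha=2.0)` could be dropped in wherever `torch.nn.functional.mse_loss` would otherwise be used; tuning `alpha` then plays the role the abstract describes for the scaling hyperparameter, trading off overall fit against emphasis on the tail regions.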
