Title
Correspondence between neuroevolution and gradient descent
Authors
Abstract
We show analytically that training a neural network by conditioned stochastic mutation or neuroevolution of its weights is equivalent, in the limit of small mutations, to gradient descent on the loss function in the presence of Gaussian white noise. Averaged over independent realizations of the learning process, neuroevolution is equivalent to gradient descent on the loss function. We use numerical simulation to show that this correspondence can be observed for finite mutations, for shallow and deep neural networks. Our results provide a connection between two families of neural-network training methods that are usually considered to be fundamentally different.
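The correspondence can be illustrated with a short numerical check. The sketch below is not the authors' code: it assumes a Metropolis-style acceptance rule min(1, e^{-beta*dL}) for Gaussian weight mutations of scale sigma, and an effective learning rate of beta*sigma^2/2, which a small-mutation expansion suggests; the toy loss L(w) = ||w||^2/2 and all parameter values are chosen only for convenience. It compares the mutation step, averaged over many independent realizations, against the corresponding gradient-descent step.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    # Toy quadratic loss L(w) = 0.5 * ||w||^2 (an assumption of this sketch).
    return 0.5 * np.sum(w**2, axis=-1)

def grad(w):
    # Gradient of the quadratic loss is simply w.
    return w

w0 = np.array([1.0, -2.0, 0.5])          # arbitrary starting weights
sigma, beta, trials = 0.05, 1.0, 200_000  # mutation scale, inverse temperature

# One neuroevolution step per trial: propose a Gaussian mutation of the
# weights, then accept it with Metropolis probability min(1, exp(-beta*dL)).
eps = sigma * rng.standard_normal((trials, w0.size))
dL = loss(w0 + eps) - loss(w0)
accept = rng.random(trials) < np.minimum(1.0, np.exp(-beta * dL))

# Average the realized weight update over all independent realizations and
# compare with a single gradient-descent step at learning rate beta*sigma^2/2.
mean_step = (eps * accept[:, None]).mean(axis=0)
gd_step = -(beta * sigma**2 / 2) * grad(w0)

print("averaged neuroevolution step:", mean_step)
print("gradient-descent prediction: ", gd_step)
```

With these (assumed) parameters the averaged mutation step agrees with the gradient-descent prediction to within sampling error, and shrinking sigma tightens the agreement, mirroring the small-mutation limit described in the abstract.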