Paper Title
Review and Comparison of Commonly Used Activation Functions for Deep Neural Networks
Paper Authors
Paper Abstract
The primary decision-making units of a neural network are its activation functions. They determine the output of each neural node and are therefore essential to the performance of the whole network, so choosing the most appropriate activation function is a critical step in neural network design. Acharya et al. (2018) note that numerous activation functions have been formulated over the years, though some are now considered deprecated because they fail to operate properly under certain conditions. These functions have a variety of characteristics that are deemed essential to successful learning, among them monotonicity, their individual derivatives, and the boundedness of their range (Bach 2017). This paper evaluates commonly used activation functions such as Swish, ReLU, and Sigmoid, followed by their properties, their respective pros and cons, and recommendations on the applications to which each formula is suited.
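For concreteness, the standard formulas of the three functions named above can be sketched in a few lines of NumPy. This is a minimal illustration under common textbook definitions, not code from the paper itself; the function names and the beta parameter of Swish are our own choices.

import numpy as np

def sigmoid(x):
    # Sigmoid: smooth and monotonic, output bounded to the range (0, 1);
    # its derivative vanishes for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # ReLU: monotonic and unbounded above; outputs zero (with zero
    # gradient) for all negative inputs.
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    # Swish: smooth but non-monotonic, defined as x * sigmoid(beta * x).
    return x * sigmoid(beta * x)

x = np.linspace(-5.0, 5.0, 5)
print(sigmoid(x))
print(relu(x))
print(swish(x))

Evaluating these on a few sample points already illustrates the characteristics the abstract lists: Sigmoid's bounded range, ReLU's one-sided saturation, and Swish's small negative dip before it grows roughly linearly.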