Title
Comparative Analysis of Polynomial and Rational Approximations of Hyperbolic Tangent Function for VLSI Implementation
Authors
Abstract
Deep neural networks yield state-of-the-art results in many computer vision and human-machine interface applications such as object detection, speech recognition, etc. Since these networks are computationally expensive, customized accelerators are designed to achieve the required performance at lower cost and power. One of the key building blocks of these neural networks is the non-linear activation function, such as sigmoid, hyperbolic tangent (tanh), and ReLU. A low-complexity, accurate hardware implementation of the activation function is required to meet the performance and area targets of neural network accelerators. Even though various methods and implementations of the tanh activation function have been published, a comparative study is missing. This paper presents a comparative analysis of polynomial and rational methods and their hardware implementations.
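To illustrate the two families of approximations the abstract contrasts, the sketch below compares a 5th-order Taylor polynomial of tanh with a [3/2] Padé rational approximant over [0, 1]. This is only an illustrative software model, not the paper's fixed-point hardware implementation; the specific approximants chosen here are standard textbook forms, not necessarily the ones evaluated in the paper.

```python
import math

def tanh_poly(x):
    # 5th-order Taylor polynomial of tanh about x = 0:
    # tanh(x) ~ x - x^3/3 + 2x^5/15
    return x - x**3 / 3 + 2 * x**5 / 15

def tanh_rational(x):
    # [3/2] Pade approximant of tanh:
    # tanh(x) ~ x * (15 + x^2) / (15 + 6x^2)
    return x * (15 + x * x) / (15 + 6 * x * x)

# Maximum absolute error of each approximation over [0, 1]
xs = [i / 100 for i in range(101)]
err_poly = max(abs(tanh_poly(x) - math.tanh(x)) for x in xs)
err_rational = max(abs(tanh_rational(x) - math.tanh(x)) for x in xs)

print(f"polynomial max error: {err_poly:.2e}")
print(f"rational max error:   {err_rational:.2e}")
```

For the same number of stored coefficients, the rational form tracks tanh's saturation far more closely on this interval, which is why rational approximations are attractive despite the extra divider in hardware.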