Paper Title
Abstract Neural Networks
Paper Authors
Paper Abstract
Deep Neural Networks (DNNs) are rapidly being applied to safety-critical domains such as drone and airplane control, motivating techniques for verifying the safety of their behavior. Unfortunately, DNN verification is NP-hard, with current algorithms slowing exponentially with the number of nodes in the DNN. This paper introduces the notion of Abstract Neural Networks (ANNs), which can be used to soundly over-approximate DNNs while using fewer nodes. An ANN is like a DNN except weight matrices are replaced by values in a given abstract domain. We present a framework parameterized by the abstract domain and activation functions used in the DNN that can be used to construct a corresponding ANN. We present necessary and sufficient conditions on the DNN activation functions for the constructed ANN to soundly over-approximate the given DNN. Prior work on DNN abstraction was restricted to the interval domain and ReLU activation function. Our framework can be instantiated with other abstract domains such as octagons and polyhedra, as well as other activation functions such as Leaky ReLU, Sigmoid, and Hyperbolic Tangent.
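To make the interval-domain instantiation concrete, here is a minimal sketch (not the paper's full node-merging construction) of the core soundness idea: if a layer's weight matrix is abstracted by entrywise interval bounds, evaluating the layer with interval arithmetic yields output bounds that contain the output of every concrete weight matrix within those bounds. The function name and representation below are illustrative assumptions, not taken from the paper.

```python
def interval_matvec(W_lo, W_hi, x):
    """Interval matrix-vector product.

    W_lo and W_hi are entrywise lower/upper bounds on an unknown
    concrete weight matrix W (lists of lists); x is a concrete input
    vector. Returns (lo, hi) such that for every W with
    W_lo[i][j] <= W[i][j] <= W_hi[i][j], each entry of W @ x lies
    in [lo[i], hi[i]].
    """
    lo, hi = [], []
    for row_lo, row_hi in zip(W_lo, W_hi):
        # Each term W[i][j] * x[j] lies between the two endpoint
        # products (the smaller/larger of a*x_j and b*x_j, since the
        # sign of x[j] may flip the order).
        terms = [(a * xj, b * xj) for a, b, xj in zip(row_lo, row_hi, x)]
        lo.append(sum(min(p, q) for p, q in terms))
        hi.append(sum(max(p, q) for p, q in terms))
    return lo, hi


# Example: bounds on a 2x2 weight matrix and a concrete input.
lo, hi = interval_matvec([[-1, 0], [0.5, -2]],
                         [[1, 2], [1, -1]],
                         [1, -1])
# Any concrete W within the bounds, e.g. W = [[0, 1], [0.7, -1.5]],
# gives W @ x = [-1, 2.2], which falls inside [lo, hi].
```

Because ReLU (and the other activations named above, such as Leaky ReLU, Sigmoid, and Tanh) is monotone, applying it to `lo` and `hi` preserves the enclosure, which is the intuition behind the activation-function conditions the paper studies.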