Paper Title

FO-PINNs: A First-Order Formulation for Physics Informed Neural Networks

Authors

Gladstone, Rini J., Nabian, Mohammad A., Sukumar, N., Srivastava, Ankit, Meidani, Hadi

Abstract

Physics-Informed Neural Networks (PINNs) are a class of deep learning neural networks that learn the response of a physical system without any simulation data, and only by incorporating the governing partial differential equations (PDEs) in their loss function. While PINNs are successfully used for solving forward and inverse problems, their accuracy decreases significantly for parameterized systems. PINNs also have a soft implementation of boundary conditions resulting in boundary conditions not being exactly imposed everywhere on the boundary. With these challenges at hand, we present first-order physics-informed neural networks (FO-PINNs). These are PINNs that are trained using a first-order formulation of the PDE loss function. We show that, compared to standard PINNs, FO-PINNs offer significantly higher accuracy in solving parameterized systems, and reduce time-per-iteration by removing the extra backpropagations needed to compute the second or higher-order derivatives. Additionally, FO-PINNs can enable exact imposition of boundary conditions using approximate distance functions, which pose challenges when applied on high-order PDEs. Through three examples, we demonstrate the advantages of FO-PINNs over standard PINNs in terms of accuracy and training speedup.
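The core idea of the first-order formulation can be sketched in code: for a second-order PDE such as the 1D Poisson equation u''(x) = f(x), the network predicts both u and an auxiliary variable p ≈ u', so the loss contains only first derivatives (the PDE residual p' − f and the compatibility residual p − u'), avoiding the nested backpropagation that u'' would require. The following is a minimal illustrative sketch, not the authors' implementation; the network architecture, source term f, and collocation points are assumptions.

```python
import jax
import jax.numpy as jnp

# Minimal FO-PINN sketch for the 1D Poisson equation u''(x) = f(x).
# The MLP outputs [u(x), p(x)], with p serving as an auxiliary
# approximation of du/dx. All residuals use first derivatives only,
# so a single differentiation pass suffices per output.
# (Architecture, f, and sample points are illustrative assumptions.)

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = {
    "W1": jax.random.normal(k1, (1, 32)) * 0.5,
    "b1": jnp.zeros(32),
    "W2": jax.random.normal(k2, (32, 2)) * 0.5,
    "b2": jnp.zeros(2),
}

def net(params, x):
    h = jnp.tanh(jnp.array([x]) @ params["W1"] + params["b1"])
    return h @ params["W2"] + params["b2"]  # returns [u(x), p(x)]

def f(x):
    return -jnp.sin(x)  # example source term (assumed)

u = lambda params, x: net(params, x)[0]
p = lambda params, x: net(params, x)[1]

# First derivatives via one jax.grad each — no second-order autodiff graph.
u_x = jax.grad(u, argnums=1)
p_x = jax.grad(p, argnums=1)

def loss(params, xs):
    # PDE residual (p' = f) plus compatibility residual (p = u'),
    # both first-order in the derivatives of the network outputs.
    r_pde = jax.vmap(lambda x: p_x(params, x) - f(x))(xs)
    r_cmp = jax.vmap(lambda x: p(params, x) - u_x(params, x))(xs)
    return jnp.mean(r_pde**2) + jnp.mean(r_cmp**2)

xs = jnp.linspace(0.0, 1.0, 64)
value, grads = jax.value_and_grad(loss)(params, xs)
```

Because no second derivative of the network appears in the loss, each training iteration is cheaper than in a standard PINN, which is one of the speedups the abstract describes.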
